Univerza na Primorskem Fakulteta za matematiko, naravoslovje in informacijske tehnologije

Saturday, 14 March 2020, Online seminar: Using EEG for Emotion Experience Design in TV Commercials

On Monday, 16 March 2020, at 16:00, an online lecture of the Departments of Information Sciences and Technologies of UP FAMNIT and UP IAM will be held remotely via online tools.

TIME/PLACE: 16 March 2020 at 16:00, online

The lecture will be given in English via the Zoom online tool.
To access the lecture, connect via the following link: https://zoom.us/j/297328207


Jordan Aiko Deja is a UX practitioner with a background in HCI and AI. He has led several teams and projects behind digital, innovative, and AI-empowered products and initiatives used both locally and internationally. His work fusing HCI and AI is currently applied in marketing, digital production, therapy, and other fields. He is currently a computer science PhD student at UP FAMNIT.

TITLE: Using EEG for Emotion Experience Design in TV Commercials

TV commercials are audio-video segments aired by paying advertisers in between consumers' favorite shows, usually for marketing purposes, especially to sell specific products or services. When creating a TV commercial, marketers and digital agencies undergo a rigorous process to validate the various elements of the audio-video production; this process is usually tedious and time-consuming. This report presents the results of a preliminary study that uses electroencephalogram (EEG) data as input to help a production team validate their marketing content. Target consumers for a specific commercial product underwent one-on-one semi-structured interviews and viewing sessions during which their EEG data were collected using the 5-channel EMOTIV Insight headset. To triangulate the emotion-modeling phase, external observers, referred to as coders, annotated the viewers' emotions with the aid of emotion recognition via facial recognition. Initial analysis of the interviews, together with an early version of the model, revealed several interesting points about the visual and audio elements that triggered viewer valence. The collected EEG data also uncovered insights that aided in designing a better viewing experience for the target consumers. In future work we intend to expand these findings into a model built from a larger data and user base, with a wider array of emotions to choose from, and to employ more advanced experimental designs to determine whether viewer experiences can be directly correlated with actual product and commercial conversion.
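As an illustration of the kind of signal processing such a study involves, the sketch below estimates a valence index from frontal alpha asymmetry, a common heuristic in EEG emotion research. This is not the authors' actual pipeline: the channel names AF3/AF4 merely match the 5-channel EMOTIV Insight layout, the 128 Hz sampling rate is an assumption, and the signals are synthetic stand-ins for recorded data.

```python
# Illustrative sketch (not the study's method): valence estimation via
# frontal alpha asymmetry on synthetic AF3/AF4 signals.
import numpy as np
from scipy.signal import welch

FS = 128  # Hz; assumed sampling rate for this example


def alpha_power(signal, fs=FS):
    """Mean power spectral density in the alpha band (8-12 Hz)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    band = (freqs >= 8) & (freqs <= 12)
    return psd[band].mean()


def valence_index(af3, af4):
    """Frontal alpha asymmetry: ln(right alpha) - ln(left alpha).

    A positive value is conventionally read as relatively greater
    left-frontal activation, i.e. more positive valence.
    """
    return np.log(alpha_power(af4)) - np.log(alpha_power(af3))


# Synthetic 10-second example: a stronger alpha rhythm over the right
# channel (AF4) implies relatively greater left-frontal activation,
# so the index comes out positive (nominally positive valence).
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
alpha_wave = np.sin(2 * np.pi * 10 * t)              # 10 Hz alpha component
af3 = 1.0 * alpha_wave + rng.normal(0, 0.5, t.size)  # left: weaker alpha
af4 = 3.0 * alpha_wave + rng.normal(0, 0.5, t.size)  # right: stronger alpha

print(valence_index(af3, af4) > 0)
```

In a real session, a per-viewer baseline recording would typically be subtracted before interpreting the index, since resting asymmetry varies between individuals.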