The new tools developed in neuromarketing are sometimes mistakenly presented as destined to replace the existing, verbal marketing research methods (surveys, interviews, focus groups), which get maligned as biased, incomplete, and plain outdated. That was not the general sentiment I observed at the Neuromarketing World Forum this year. Nobody doubted the effectiveness of verbal methods; the purpose of the new tools is to augment and complement them, which is a crucial difference.
The work I had the pleasure of presenting was on exactly this topic: how data from verbal and neuroscientific methods (in my case, surveys and EEG) can be effectively combined. Both methods have strengths and weaknesses, and sound application of analytics can extract the best of both worlds, giving an unparalleled level of insight into how a marketing stimulus is really felt by consumers.
The slides of my talk were as follows:
Here is a short outline of the different parts of the talk:
Introduction (slides 1-13)
When pre-testing a TV commercial, surveys and EEG (electroencephalography) both offer important and complementary insights into a viewer's response. EEG, for instance, captures the viewer's state of mind with high temporal precision, pinpointing which parts of the commercial evoke engagement. However, it can only characterize the type of emotion felt in a very loose way, along an arousal-valence scale. In contrast, in a post-screening survey, viewers can give a precise account of how they felt about the ad, but cannot recall their experience on a second-by-second level. Sound integration of this information can lead to a new level of insight: the actual emotion felt at each second.
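To make the arousal-valence idea concrete, here is a minimal sketch of how a (valence, arousal) reading could be bucketed into a coarse emotion quadrant. The labels and thresholds are illustrative only, not the ones used in the study:

```python
def quadrant_emotion(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) reading to a coarse emotion quadrant.

    Values are assumed to be centered on zero; the four labels are
    illustrative circumplex-style names, not the study's actual taxonomy.
    """
    if arousal >= 0:
        return "excited" if valence >= 0 else "distressed"
    return "calm" if valence >= 0 else "depressed"

print(quadrant_emotion(0.4, 0.7))    # → excited
print(quadrant_emotion(-0.3, -0.5))  # → depressed
```

This coarse bucketing is exactly the "loose" resolution EEG provides on its own; the rest of the talk is about sharpening it with survey data.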
Intent (slides 14-19)
In a typical neuromarketing-based advertising study, information is collected through numerous methods, ranging from classical ones (surveys, interviews) to biometrics (eye tracking, facial recognition) and neuroscientific methods proper (EEG, fMRI). However, the integration of these data into an overall result is usually done qualitatively, which makes it costly and prone to error, depending on the researcher's subjectivity. In this talk, we explore how diverse data can be integrated in a quantitative and reliable way.
Simple joint analysis (slides 20-28)
The simplest way to jointly analyze survey and EEG data is to split the subjects of a study into groups based on their survey answers, then compare the EEG signals of each group. The significance of a between-group difference can easily be assessed with basic statistics, such as a t-test. However, this simple approach presupposes an a priori hypothesis about the data, since statistical tests lose their meaning when applied in an exploratory manner.
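The group comparison can be sketched in a few lines using only the standard library. The engagement scores and groups below are made up for illustration, and I use Welch's t statistic (which does not assume equal variances) rather than any particular statistics package:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances)."""
    va, vb = variance(a), variance(b)  # sample variances (n-1 denominator)
    return (mean(a) - mean(b)) / (va / len(a) + vb / len(b)) ** 0.5

# Hypothetical per-viewer EEG engagement scores, grouped by a survey
# answer such as "did you like the ad?" — both groups are made up.
liked    = [0.8, 0.7, 0.9, 0.6]
disliked = [0.4, 0.5, 0.3, 0.4]

t = welch_t(liked, disliked)
print(f"t = {t:.2f}")  # → t = 4.58
```

In practice one would also compute a p-value from the t distribution with the Welch-Satterthwaite degrees of freedom, but the point stands: the test is only meaningful if the grouping was hypothesized before looking at the EEG data.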
Full data integration (slides 29-47)
We developed a framework for the automated integration of survey and EEG data, grounded in Bayesian statistics. We use a classification of emotions along the arousal and valence axes as prior knowledge. We posit a probability distribution over the emotions induced by each second of an ad, and estimate this distribution from the data using Bayesian inference. The model gave a convincing account of the emotions induced by each stimulus for all the videos tested.