Face Reading Technology: Improving Preference Prediction from Self-Reports Using Micro Expressions

with Franziska Krause (EBS University)

and Janina Krick (EBS University)

September 21st, 2023

TIE 2023 Annual Meeting

Goethe University, Frankfurt

Pantelis Karapanagiotis
(EBS University and SAFE)

@pi_kappa_

What Makes a Good Salesperson (Machine)?

  • In the classic HBR article, Mayer & Greenberg (1964) identify two essential qualities of a successful salesperson.
    1. Ego Drive - The drive to sell is relatively easy to program.
    2. Empathy - Can this quality be machine-automated?

Why is this Interesting?

  • Preference Prediction
  • Managers and researchers mainly rely on traditional market research tools

Positives

  • Cost efficient
  • Practical
  • Easy to distribute
  • Fast
  • Scalable
  • Enable access to enormous cohorts of subjects

Negatives

  • Rely mainly on self-reports, which can be biased or inaccurate.
  • Can be difficult to use in real-time applications.
  • Difficult to automate; human intervention is needed.

Why is this Interesting?

  • Preference Prediction
  • Facial expressions are one of the primary channels for humans to transmit emotional signals.

  • Often uncontrolled and spontaneous.

  • Can be collected unobtrusively via facial recognition software.

  • Much cheaper and easier to use than other neuromarketing tools.

Previous Work

  • Hakim et al. (2021). Machines learn neuromarketing: Improving preference prediction from self-reports using multiple EEG measures and machine learning.
  • Höfling & Alpers (2023). Automatic facial coding predicts self-report of emotion, advertisement and brand effects elicited by video commercials.
  • Lu, Xiao, & Ding (2016). A video-based automated recommender (VAR) system for garments.
  • Zhou, Chen, Ferreira, & Smith (2021). Consumer behavior in the online classroom: Using video analytics and machine learning to understand the consumption of video courseware.

Experimental Design


Examples of Questions:

  • Product Knowledge: Do you know the product featured in the ad? [7=Very much, 1=Not at all]
  • Product Consumption: To what extent do you consume the advertised product? [7=Very much, 1=Not at all]

Experimental Design

  • Binary Comparisons:
    • All possible pairs appear to each participant (15 combinations).
    • Products appear in random order.
    • Products appear for 3 seconds and time-outs are counted as mistrials.
  • Take-out Choice: You can now choose one of the products. You will get this product afterward.
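The 15 combinations follow from pairing the 6 products in all possible ways; a minimal sketch of the trial setup (product labels are hypothetical):

```python
from itertools import combinations
import random

products = ["A", "B", "C", "D", "E", "F"]  # 6 hypothetical product labels

# Every unordered pair appears once per participant: C(6, 2) = 15 trials.
pairs = list(combinations(products, 2))

# Products appear in random order, as in the design above.
random.shuffle(pairs)
print(len(pairs))  # 15
```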

Data and Descriptive Statistics

Unprocessed Data

The unprocessed data have three dimensions:

  • Subject dimension (\(N=156\) participants).
  • Product dimension (\(J=6\) products).
    • Binary comparisons (\(P=15\) pairs per participant).
  • Advertisement time.
    • Measurement every 200 milliseconds for the duration of the advertisement (e.g., 100 points for a 20-second ad).
    • Collected 8 core (happy, sad, angry, surprised, scared, disgusted, contempt, and neutral) and 2 synthetic (arousal, valence) emotion measurements.

Pre-processing

  1. Transitivity as a preference attention filter.
    • Remove participants with non-transitive preferences (excludes 21 participants).
  2. Choose participants with preferences revealed to be incentive compatible.
    • Calculated the most preferred item from binary comparisons.
    • Remove participants with disagreements between most preferred and take-out choices (excludes 30 participants).
  3. Advertisement attention filter.
    • Remove participants who reported being unsure about seeing any of the included advertisements (excludes 35 participants).
  4. Facial expression measurement filters
    • Remove the first second of measurements while the face recognition software is still calibrating (no participants excluded).
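The transitivity filter in step 1 can be sketched as follows. For a complete set of binary comparisons, preferences are transitive exactly when the implied "tournament" has no cycles, which holds if and only if the products' win counts are all distinct (a sketch; the choice encoding is hypothetical):

```python
def is_transitive(choices):
    """Check whether binary choices admit a strict preference order.

    `choices` maps each unordered product pair to the chosen product and
    must cover all pairs (a complete tournament). Such a tournament is
    acyclic, hence transitive, iff the win counts are 0, 1, ..., n-1.
    """
    products = {p for pair in choices for p in pair}
    wins = {p: 0 for p in products}
    for winner in choices.values():
        wins[winner] += 1
    return sorted(wins.values()) == list(range(len(products)))

# Example: A > B > C (transitive) vs. the cycle A > B, B > C, C > A.
transitive = {("A", "B"): "A", ("B", "C"): "B", ("A", "C"): "A"}
cycle = {("A", "B"): "A", ("B", "C"): "B", ("A", "C"): "C"}
print(is_transitive(transitive), is_transitive(cycle))  # True False
```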

Pre-processing

  • After pre-processing, we are left with 71 participants,
  • 426 participant-product observations (71 × 6), and
  • 1,065 participant-preference observations (71 × 15).

Emotion consolidation

  • We consolidate emotion measurements in 5 ways:
    1. Average emotions: Mean over Ad measurements.
    2. Weighted average emotions: Weighted mean over each Ad. Weights are the observed frequencies.
    3. Extreme emotions: Max measurement over Ad.
    4. Worst case emotions: Min measurement over Ad.
    5. Fluctuations of emotions: Standard deviation of Ad measurements.
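The five consolidations can be sketched for one emotion's time series over a single ad. This is a sketch; in particular, interpreting the "observed frequencies" weights as the empirical frequency of each measured value is an assumption:

```python
import numpy as np

def consolidate(series):
    """Consolidate one emotion's ad time series into the 5 summaries."""
    series = np.asarray(series, dtype=float)
    # Weight each measurement by the observed frequency of its value
    # (assumed interpretation of "observed frequencies").
    values, counts = np.unique(series, return_counts=True)
    freq = dict(zip(values, counts / len(series)))
    weights = np.array([freq[x] for x in series])
    return {
        "average": series.mean(),                                # 1.
        "weighted_average": float(np.average(series, weights=weights)),  # 2.
        "extreme": series.max(),                                 # 3.
        "worst_case": series.min(),                              # 4.
        "fluctuation": series.std(),                             # 5.
    }

print(consolidate([0.1, 0.1, 0.4]))
```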

Results

Logit model

Boosted Trees

Notes: Decision tree boosting using 100 trees with a depth equal to 5. The trees are estimated with cost parameters \(\lambda={10}^k\) for 100 different \(k\) equidistantly selected from \(\left[-4,\ -1\right]\). The relative influence plot is drawn for the cost parameters \(\lambda\) that minimize the prediction errors.
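A hedged sketch of the grid described in the notes, using scikit-learn. The slides' cost parameter \(\lambda\) is mapped here to the cost-complexity pruning parameter `ccp_alpha` as an assumed analogue, the data are synthetic stand-ins, and the grid is reduced to 10 points (the slides use 100):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Hypothetical stand-in data: emotion/survey features and choice labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Cost parameters lambda = 10^k with k equidistant on [-4, -1].
lambdas = np.logspace(-4, -1, 10)
models = [
    GradientBoostingClassifier(n_estimators=100, max_depth=5,
                               ccp_alpha=lam, random_state=0).fit(X_tr, y_tr)
    for lam in lambdas
]

# Keep the fit that minimizes the prediction error on held-out data;
# its feature importances are the basis of a relative influence plot.
best = max(models, key=lambda m: m.score(X_te, y_te))
influence = best.feature_importances_
```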


Deep Learning Model

Notes: The DL models have 3 dense layers with ReLU activations, each followed by a dropout layer. The final activation is SoftMax. Training obs. = 900. Test obs. = 165. Validation obs. = 225. Trained for 500 epochs.
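The architecture in the notes can be sketched as a plain NumPy forward pass (layer widths and the dropout rate are hypothetical; in practice such a model would be built in a DL framework):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def dropout(x, rate, training):
    if not training:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)  # inverted dropout keeps expectations

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def forward(x, weights, training=False):
    """3 dense layers with ReLU, each followed by dropout; SoftMax head."""
    for W, b in weights[:-1]:
        x = dropout(relu(x @ W + b), rate=0.2, training=training)
    W, b = weights[-1]
    return softmax(x @ W + b)  # class probabilities over the 6 products

# Hypothetical widths: 20 input features -> 64 -> 32 -> 16 -> 6 products.
dims = [20, 64, 32, 16, 6]
weights = [(rng.normal(scale=0.1, size=(a, b)), np.zeros(b))
           for a, b in zip(dims[:-1], dims[1:])]
probs = forward(rng.normal(size=(5, 20)), weights)
```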

Face Reading Technology: Improving Preference Prediction from Self-Reports Using Micro Expressions

  • Collected preference, marketing survey, and facial expression experimental data.
  • Used a comprehensive toolkit of prediction-oriented ML methods.
  • Introduced a DL model using differences of emotion and survey data.
  • Choice predictions improve by 1%-6% when emotion variables are added to survey data.
  • Sketched how our DL model can be streamlined with facial recognition models in practical applications.
  • Argued that this technology can affect the balance of bargaining power in market applications and that regulation amendments might be appropriate.

References

Hakim, A., Klorfeld, S., Sela, T., Friedman, D., Shabat-Simon, M., & Levy, D. J. (2021). Machines learn neuromarketing: Improving preference prediction from self-reports using multiple EEG measures and machine learning. International Journal of Research in Marketing, 38(3), 770–791.
Höfling, T. T. A., & Alpers, G. W. (2023). Automatic facial coding predicts self-report of emotion, advertisement and brand effects elicited by video commercials. Frontiers in Neuroscience, 17, 1125983.
Lu, S., Xiao, L., & Ding, M. (2016). A video-based automated recommender (VAR) system for garments. Marketing Science, 35(3), 484–510.
Mayer, D., & Greenberg, H. M. (1964). What makes a good salesman. Harvard Business Review.
Zhou, M., Chen, G. H., Ferreira, P., & Smith, M. D. (2021). Consumer behavior in the online classroom: Using video analytics and machine learning to understand the consumption of video courseware. Journal of Marketing Research, 58(6), 1079–1100.