Face Emotions and Short Surveys during Automotive Tasks
LEE QUINTANAR, PETE TRUJILLO, AND JEREMY WATSON
J.D. Power, A Global Marketing Information Company
jdpower.com
CASRO Digital Conference, March 2016
Introduction
Facial expressions are a daily occurrence in the communication of human emotions. In 1872,
Charles Darwin described facial expressions as signals of specific emotions (Darwin,
1872/1998), a claim later tested by Paul Ekman and Wallace Friesen (Ekman et al., 1987).
Their team conducted a cross-cultural study demonstrating that interpretations of
facial expressions appear to be universal across cultures. While minor cultural variations
appeared in ratings of emotional intensity, agreement on which emotions a given expression
conveyed was high across cultures. Moreover, a method called the Facial Action Coding System
(FACS) was designed to help classify human facial movements by their appearance on the
face, based on a system originally developed by a Swedish anatomist (Hjortsjö, 1969).
Ekman, Friesen, and Hager later published a significant update to FACS (Ekman et al., 2002),
with variants of the system emerging in modern technologies for computer-based detection
of facial expressions.
The purpose of our research is to better understand the relationship between human facial
expressions—face emotions—and consumer attitudes toward products and services.
J.D. Power’s Voice of the Customer research measures customer satisfaction with products
and services based on consumer surveys. Survey research asks respondents what they think
or how they feel about products/services and then extrapolates that data to actual attitudes
that impact consumer behavior. Biometrics offers an alternative: inferring attitudes from
observations of bodily behaviors that correspond to human emotions. Specifically, this
research evaluated how human facial expressions compare with survey responses as a measure
of attitudes and behaviors. Early research by Cacioppo, Quintanar, Petty, and Snyder
(1979, 1981) evaluated the relationship among facial expressions, emotions, and attitudes,
and expanding that assessment with modern computer technologies seemed promising.
Biometrics takes physical information from a human body and makes it quantifiable.
Biometrics technologies are becoming increasingly refined. A variety of bodily
measurements can be evaluated, including facial expressions, heart rate, eye tracking, pupil
dilation, galvanic skin response (arousal), and voice modulations. How effective is biometrics
in assessing emotions and attitudes? The focus of this study is on facial expressions as an
assessment of respondents' attitudes.
The automation of methods to recognize the emotional content of facial expressions has
been evolving in parallel with psychological research. Research conducted by Bartlett et al.
(2003, 2006) began with prototypes for automatic recognition of spontaneous facial actions.
Littlewort et al. (2006) explored the dynamics of facial expression extracted automatically
from video. Pantic and Bartlett (2007) went further with a machine analysis of facial
expressions. Wu, Bartlett, and Movellan (2010) applied the use of Gabor motion energy
filters in the recognition of facial expressions. Computerized methods improved when
Littlewort, Whitehill, Wu, Fasel, Frank, Movellan, and Bartlett (2011) developed the
Computer Expression Recognition Toolbox (CERT), an end-to-end system for fully automated
facial expression recognition that operates in real time. To assess the affective content of
facial expressions, points on the face are scanned by computer analysis to recognize emotions.
The results of this J.D. Power study are expected to help researchers better understand how
accurately facial expressions (face emotions) can assess consumer attitudes and, in turn,
predict consumer behaviors.
Research Design
Methodology: The research paradigm consisted of a website evaluation scenario in which
participants evaluated a site’s usability. Three 2015 automotive websites (Honda, Kia, and
Hyundai) were presented to participants in a randomized order. Asian automakers were
chosen to reduce preference bias that might emerge with European or U.S.-based
automakers. Participants were asked to use each website’s Build a Car price tool and
afterward complete a short survey to rate Appearance, Navigation, Speed, and their Overall
satisfaction. A webcam captured videos of participants’ faces as they used the Build a Car
tool. Although eye-tracking information was also collected, it was not used in this analysis.
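To make the session protocol concrete, the sketch below shows one way the design could be scripted in Python. The structure and names are illustrative only (this is not the actual experiment script), but it captures the randomized website order, the gray-screen baseline, the Build a Car task, and the short post-task survey.

    import random

    WEBSITES = ["Honda", "Kia", "Hyundai"]
    ATTRIBUTES = ["Appearance", "Navigation", "Speed", "Overall"]

    def session_plan(participant_id, rng):
        """Return the randomized stimulus sequence for one participant."""
        order = WEBSITES[:]
        rng.shuffle(order)  # randomize presentation order across participants
        plan = []
        for site in order:
            plan.append({"participant": participant_id,
                         "stimulus": "gray screen"})          # emotions baseline
            plan.append({"participant": participant_id,
                         "stimulus": f"{site} Build a Car"})  # webcam recorded here
            plan.append({"participant": participant_id,
                         "stimulus": f"{site} survey",
                         "attributes": ATTRIBUTES})           # short post-task survey
        return plan

    plans = [session_plan(pid, random.Random(pid)) for pid in range(1, 31)]  # 30 participants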
Figure 1: Build a Car Websites Were Utilized in Research Procedure
Note: Participants completed short surveys after using Build a Car websites.
Facial encoding: The technology platform we used for the emotions recognition of facial
expressions was the iMotions Biometric Research Platform (Release 5, 2015). Detailed
specifications can be found on the iMotions product website (2016a), which includes
additional resources, e.g., a facial emotions publications list (2016b) and a guide for facial
emotions analysis (2016c). The underlying technology involved three steps: (1) face
detection; (2) feature detection; and (3) classification. The position, orientation, and
information encompassing key facial features were input into classification algorithms that
translated the features into emotional states and affective metrics. These technologies rested
on methods of image processing, edge detection, Gabor filters, and statistical comparisons
with normative databases provided by facial expression software engines. This can be
imagined as a kind of invisible virtual mesh covering the face of a respondent: whenever the
face moves or changes expressions, the face model adapts, follows, and classifies emotions.
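As a schematic illustration of these three steps, the Python sketch below wires a face detector, a feature extractor, and a classifier into a per-frame scoring function. The function bodies are toy stand-ins, not the platform's actual algorithms.

    import numpy as np

    EMOTIONS = ["Joy", "Anger", "Surprise", "Fear", "Sadness",
                "Disgust", "Contempt", "Confusion", "Frustration"]

    def detect_face(frame):
        """Step 1: locate the face; this stub returns the whole frame."""
        h, w = frame.shape[:2]
        return (0, 0, w, h)

    def detect_features(frame, box):
        """Step 2: extract features from the face region (the 'virtual mesh');
        this stub returns mean pixel values in place of Gabor-filtered landmarks."""
        x0, y0, x1, y1 = box
        return frame[y0:y1, x0:x1].mean(axis=(0, 1))

    def classify(features):
        """Step 3: map features to per-emotion evidence scores (stub)."""
        rng = np.random.default_rng(0)
        return dict(zip(EMOTIONS, rng.normal(size=len(EMOTIONS))))

    def score_frame(frame):
        return classify(detect_features(frame, detect_face(frame)))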
The face video collected during each Build a Car website session was analyzed using the
iMotions biometrics platform, which was also used to set up the PC-based experimental
procedure, sequence and timing of stimulus events, baseline “gray screen,” website
presentation, and online survey questions and ratings. Different baseline screens were also
tested but yielded no differences from a “gray screen” for establishing an emotions baseline.
The iMotions platform digitizes facial expressions to measure emotions by using encoding
algorithms based on the FACET scoring methods. Facial encoding scans the various points on
a face and then interprets the patterns into measurable emotion events. Face emotion index
scores were collected simultaneously for the following nine indices: Joy, Anger, Surprise,
Fear, Sadness, Disgust, Contempt, Confusion, and Frustration. Overall sentiment scores were
also collected for generalized positive and negative facial expressions.
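For illustration, one frame-level record under this scheme might look like the following; the column names and values are our assumption, not the platform's actual export schema.

    import pandas as pd

    record = {
        "participant": 7, "website": "Kia", "frame": 1342, "t_sec": 44.7,
        "Joy": 0.12, "Anger": 0.03, "Surprise": 0.41, "Fear": 0.02,
        "Sadness": 0.05, "Disgust": 0.28, "Contempt": 0.04,
        "Confusion": 0.66, "Frustration": 0.31,
        "PositiveSentiment": 0.15, "NegativeSentiment": 0.52,
    }
    frames = pd.DataFrame([record])   # the full table has one row per video frame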
Sample: The sample consisted of 30 college-educated participants, 50% male and 50%
female. Ages ranged from 21 to 67 (mean 37), with a racial mix of 79% white and 7% each
black, Asian, and Hispanic. The average length of each session was 35 minutes.
Figure 2: Facial Encoding for Emotions
[Diagram of the recording setup: a webcam and an eye tracker positioned to capture the participant's face.]
Note: Webcam face videos were collected and computer analyzed for emotions.
Analysis
Face emotions from video recordings: On average, face videos were recorded for 35
minutes for each of the 30 participants at a frame rate of 30 frames per second. Each frame
was computer analyzed for the 11 emotion measures listed above, resulting in a data set of
nearly 2 million observations (records): 35 min × 60 sec/min × 30 frames/sec × 30
participants ≈ 1.9 million. Consequently, proper data aggregation and transformation
methods were required to reduce this big data into a measurable Emotion Index.
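A minimal aggregation sketch in Python (pandas), assuming the hypothetical frame-level layout illustrated earlier, collapses the raw frames into one summary row per participant and website:

    import pandas as pd

    frames = pd.read_csv("face_frames.csv")   # hypothetical export of frame-level records
    emotion_cols = ["Joy", "Anger", "Surprise", "Fear", "Sadness", "Disgust",
                    "Contempt", "Confusion", "Frustration",
                    "PositiveSentiment", "NegativeSentiment"]

    per_condition = (frames
                     .groupby(["participant", "website"])[emotion_cols]
                     .mean()                  # one summary row per person x website
                     .reset_index())

    # Sanity check on the raw volume:
    # 35 min x 60 sec/min x 30 frames/sec x 30 participants
    assert 35 * 60 * 30 * 30 == 1_890_000     # ~1.9 million frame records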
Figure 3: Data Example
[Sample rows of the frame-by-frame emotion-index data set.]
Note: Biometric measures require big data solutions for analysis.
J.D. Power Emotion Index: Emotion Index scores were calculated by deriving the
percentage of emotion change between the highest and lowest ranges for each participant
across all conditions. For each face emotion, a minimum and maximum were determined for
each person across all conditions (within-subjects). An Emotion Index score was derived by
(a) converting to positive integers; (b) calculating the percentage between minimum and
maximum; and (c) converting to a 1,000-point scale.
This indexing approach created a “percent Emotion Index” (from zero to 1,000) based on the
range between within-subject minimum and maximum end points. This was found to be the
most emotion-sensitive data transform among the alternatives considered, such as difference
scores from baseline, threshold binary scores, square roots, and logarithms. Raw data scores
were low and variable across participants, while other transforms changed the scale and/or
data distribution.
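The transform itself is compact. The sketch below implements steps (a) through (c) for each participant and emotion, reusing the illustrative column names from the aggregation sketch above:

    import pandas as pd

    def emotion_index(scores):
        """Within-subject min-max transform of one emotion, scaled to 0-1,000."""
        shifted = scores - scores.min()               # (a) shift to non-negative values
        span = shifted.max()
        pct = shifted / span if span > 0 else shifted * 0.0   # (b) percent of min-max range
        return (pct * 1000).round()                   # (c) 0-1,000 index scale

    indexed = per_condition.copy()   # per_condition: one row per participant x website
    for col in emotion_cols:
        indexed[col] = per_condition.groupby("participant")[col].transform(emotion_index)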
Results
Core Emotional Dimensions
A factor analysis of the 11 face emotions was performed using a principal components
analysis with orthogonal varimax rotation (see Kim & Mueller, 1978a; 1978b), followed by
an oblique Procrustes rotation (SAS ROTATE=PROMAX), with the varimax output as the
target matrix. An oblique rotation method was used because the simultaneous emotion
measurements were expected to be interrelated.
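To convey the flavor of this procedure, the sketch below runs a principal components analysis with a hand-rolled varimax rotation on placeholder data; the subsequent oblique promax step (performed here with SAS ROTATE=PROMAX) is omitted for brevity.

    import numpy as np
    from sklearn.decomposition import PCA

    def varimax(L, gamma=1.0, iters=100, tol=1e-6):
        """Orthogonal varimax rotation of a p x k loading matrix L."""
        p, k = L.shape
        R = np.eye(k)
        crit = 0.0
        for _ in range(iters):
            LR = L @ R
            u, s, vt = np.linalg.svd(
                L.T @ (LR**3 - (gamma / p) * LR @ np.diag((LR**2).sum(axis=0))))
            R = u @ vt
            if s.sum() < crit * (1 + tol):
                break
            crit = s.sum()
        return L @ R

    X = np.random.default_rng(0).normal(size=(90, 11))   # placeholder: obs x 11 emotions
    pca = PCA(n_components=3).fit(X)
    loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
    rotated = varimax(loadings)    # rotate toward simple structure for labeling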
The number of factors retained was determined based on the solution that best satisfied the
following criteria: the percentage of variance explained by each factor; the outcome of a
scree test; the size of the eigenvalue differences between factors; the number of high
loadings on each factor; the persistence of factors over each of the possible rotations; and
the meaningfulness of the factor structures over different rotations. As shown in Figure 4, a
rotation of three factors, accounting for 93.1% of the variance, was selected as representing
the best estimate of the primary emotional judgmental dimensions utilized during the
automotive task evaluation.
After examining the pattern of factor loadings, these factors were labeled Enjoyment, Dislike,
and Perplexed. Factor scores were also calculated for use in later analyses. This method of
collapsing the data matrix by stringing observations out across conditions to assess
dimensionality and derive factor scores for group comparisons has precedent in prior
research (Quintanar, 1982; Osgood, May, & Miron, 1975).
Figure 4: Factor Analysis of Core Emotional Dimensions
[Rotated factor loading matrix for the three retained factors: Enjoyment, Dislike, and Perplexed.]
Note: Factor loadings are multiplied by 100, rounded to integers, and those > 40 are
flagged by asterisks. Scree plots and eigenvalues indicate three primary factors.
Face Emotions
A repeated measures analysis of variance (ANOVA) was used to compare automaker websites
on face emotions. The strongest emotions appeared during the first 2 minutes of the Build a
Car website sessions. Kia’s car build had higher levels of Confusion and Disgust and lower
levels of Joy. An analysis of factor scores also showed that Kia had higher Perplexed and
lower Enjoyment scores.
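A minimal sketch of such a comparison, using statsmodels' repeated-measures ANOVA and assuming the per-condition Emotion Index table sketched earlier:

    from statsmodels.stats.anova import AnovaRM

    # 'indexed' holds one row per participant x website; Confusion is the
    # dependent measure in this example.
    res = AnovaRM(data=indexed, depvar="Confusion",
                  subject="participant", within=["website"]).fit()
    print(res)   # F test across Honda, Hyundai, and Kia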
Figure 5: Face Emotions during Initial Impressions
[Bar chart of mean Emotion Index scores for Honda, Hyundai, and Kia on Confusion, Disgust, and Joy. Kia: Confusion 745, Disgust 609, Joy 237.]
Note: Kia's car build had higher levels of Confusion and Disgust and lower levels of Joy.
The level of Confusion in Kia’s Build a Car evaluation persisted throughout the remainder of
the Web session after the initial impressions. Surprise also emerged at higher levels during
Kia’s car build session.
Figure 6: Face Emotions during Latter Session
[Bar chart of mean Emotion Index scores for Honda, Hyundai, and Kia on Confusion and Surprise. Kia: Confusion 739, Surprise 444.]
Note: Confusion in Kia's car build persisted throughout the remainder of the Web session.
Emotion indices can also be analyzed on a second-by-second basis, which is useful for
comparing emotions when key events occur during a session; fluctuations and overall slope
can also be observed. An example with Confusion is shown in Figure 7.
Figure 7: Confusion across 10 Minutes of Using Build a Car Tool
Note: Confusion in car builds is shown in second-by-second plots.
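Deriving such second-by-second traces from the frame-level data is a simple aggregation. A sketch, assuming the frame table illustrated earlier carries a per-frame timestamp column:

    # 'frames' is the frame-level table sketched earlier; 't_sec' is an
    # assumed per-frame timestamp in seconds.
    frames["second"] = frames["t_sec"].astype(int)
    confusion_by_sec = (frames
                        .groupby(["website", "second"])["Confusion"]
                        .mean())    # one second-by-second trace per website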
Survey Ratings
An analysis of short survey ratings after each Build a Car session found that the Honda car
build scored highest in Navigation, Speed, and Overall Satisfaction. Hyundai was a close
second. Kia’s car build scored lowest in Appearance, Navigation, and Speed. As shown in
Figure 8, these ratings are consistent with the face emotion results.
Figure 8: Ranking Car Builds by Survey Ratings
Note: Highest ratings are marked by green and lowest by orange.
When website satisfaction ratings were divided into low/high categories, ratings of
Appearance, Navigation, and Speed were, as expected, highest when Overall satisfaction was
high (see Figure 9).
Figure 9: Attributes during Low/High Satisfaction
Note: Appearance, Navigation & Speed receive highest ratings when satisfaction is high.
Face Emotions and Survey Ratings
How do face emotions relate to participant evaluations and satisfaction levels? Face
emotions were found to be aligned with survey ratings; that is, there was a directional
relationship between the two. Negative face emotions were at higher levels when
satisfaction ratings were lower.
Figure 10: Face Emotions during Low/High Satisfaction Ratings
Note: Face emotions are aligned with ratings, with negative emotions higher when
satisfaction ratings are lower.
A correlational analysis also showed that negative emotions increased as survey ratings
decreased. Why were negative emotions more prominent than positive ones? It may be that
this website evaluation task felt more like “work” than “fun” to participants.
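Such a check can be expressed as a simple correlation. The sketch below assumes a hypothetical join of the Emotion Index table with a survey ratings table, one row per participant and website:

    from scipy.stats import pearsonr

    merged = indexed.merge(ratings, on=["participant", "website"])  # 'ratings' is hypothetical
    r, p = pearsonr(merged["NegativeSentiment"], merged["Overall"])
    print(f"r = {r:.2f}, p = {p:.3f}")   # expect r < 0: more negative affect, lower ratings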
Conclusions
Overall, face emotions were an accurate measure of participant reactions during the Build a
Car website sessions as measured by the J.D. Power percent Emotion Index (0 to 1,000).
Initial impressions (first 2 minutes) showed higher levels of Confusion (a mean index score
of 745) and Disgust (609), and lower levels of Joy (237) in Kia's car build. During the latter
part of the website session, higher Confusion (739) persisted throughout Kia's car build,
with Surprise emerging (444). Honda and Hyundai did not show these issues.
Factor analysis of the 11 emotions revealed three core underlying emotional dimensions
used by participants during this automotive task: (1) Enjoyment; (2) Dislike; and (3)
Perplexed. Further analysis showed that Kia scored highest on Perplexed and lowest on
Enjoyment (more confusion) during initial impressions.
Correlations were found between attribute ratings and emotions: negative face emotions
were high when overall website satisfaction was low. These results are corroborated by
Kia's 2015 decision to dismiss its Web design firm in order to pursue a redesign.
These findings are also supported by the J.D. Power manufacturer website evaluation
studies. Overall, the Honda and Hyundai car builds were straightforward and allowed
participants to easily build and explore car options. The Kia car build was visually attractive
(nice photos and car views) but more complicated and harder to search and navigate.
Moreover, participants' initial impressions seemed affected when a large pop-up panel
window appeared at the start of Kia's car build and required a ZIP code to continue. Although
the Hyundai car build also asked for a ZIP code, it did so via a small pop-up panel described
as needed for the latest rebates and prices. Honda didn't ask for a ZIP code until the end of
its car build, and only as an optional item.
Future Research and Applications
There are many applications of this research in providing nonobtrusive evaluations of
human emotions to predict consumer behavior and attitudes toward products and services.
Customer video feeds can be used to evaluate consumer reactions in automotive, retail,
travel, hospitality, or similar environments. Face emotions can be gathered from video
sources such as webcams, allowing digitally comfortable consumers (e.g., Millennials) to
leave video-based service feedback or product reviews. There might also be security-based
applications to assess strongly polarized emotions. Moreover, it is possible to survey a
census of branches/facilities to assess and improve customer service.
There is an abundance of video opportunities for recognizing face emotions, so much so that
addressing questions of privacy and obtaining legal permission to record and process such
video for the evaluation of personal emotions will be necessary. One strategy might be to
obtain approval for recordings similar to what is currently done when contacting a call
center, where approval to record for “the purpose of improving customer service” is
requested up front.
Future research is expected to investigate more thoroughly the core judgmental dimensions
used in product evaluation and to assess how they persist across various industries.
Empirical assessments of emotion indexing methods and data aggregation strategies are also
important for hardening research paradigms and tools. Other opportunities include utilizing
scenarios that elicit stronger emotions about products and services.
Future research is also expected to delve deeper into the comparison of face emotions to
survey-measured attitudes with special attention given to the persistence of these feelings
and their power to predict consumer behaviors. For example, are face emotions transitory,
reflecting the moment, rather than enduring attitudes that predict consumer behavior?
Perhaps face emotions are additive, with consistent reactions contributing to the formation
of enduring attitudes. Or perhaps facial expressions are inherent in the processing of
emotions and are involved with attitudes at all levels.
There are many research opportunities available for evaluating more effective ways to blend
biometrics, consumer attitudes, and the prediction of consumer behavior.
Authors
Lee Quintanar, Ph.D., Director, Marketing Science, J.D. Power
Pete Trujillo, Senior Manager, J.D. Power
Jeremy Watson, Ph.D., Senior Statistician, J.D. Power
References
J.D. Power (2015). Biometrics Research Study℠.
Bartlett, M. S., Littlewort, G., Braathen, B., Sejnowski, T. J., & Movellan, J. R. (2003). A
        prototype for automatic recognition of spontaneous facial actions. In S. Becker, S.
        Thrun, & K. Obermayer (Eds.), Advances in Neural Information Processing Systems
        (Vol. 15, pp. 1271-1278). MIT Press.
Bartlett, M. S., Littlewort, G. C., Frank, M. G., Lainscsek, C., Fasel, I., & Movellan, J. R. (2006).
        Automatic recognition of facial actions in spontaneous expressions. Journal of
        Multimedia, 1(6), 22-35.
Cacioppo, J. T., Quintanar, L. R., Petty, R. E., & Snyder, C. W. (1981). Electroencephalographic,
facial EMG, and cardiac changes during equivocal and less equivocal attitudinal
processing [Abstract]. Psychophysiology, 18, 160.
Cacioppo, J. T., Quintanar, L. R., Petty, R. E., & Snyder, C. W. (1979). Changes in cardiac and
facial EMG activity during the forewarning, anticipation, and presentation of
proattitudinal, counterattitudinal, and neutral communications [Abstract].
Psychophysiology, 16, 194.
Darwin, C. (1998). The expression of the emotions in man and animals. New York:
        Philosophical Library. (Original work published 1872)
Ekman, P., et al. (1987). Universals and Cultural Differences in the Judgments of Facial
Expressions of Emotion. Journal of Personality & Social Psychology, 53(4), 712-717.
Ekman, P., Friesen, W. V., & Hager, J. C. (Eds.). (2002). Facial Action Coding System [e-book].
        Salt Lake City, UT: Research Nexus.
Ekman, P., Friesen, W. V., & O'Sullivan, M. (1988). Smiles when lying. Journal of Personality
        and Social Psychology, 54, 414-420.
Hjortsjö, C. H. (1969). Man's face and mimic language. Lund, Sweden: Studentlitteratur.
iMotions (2016a). Biometric Research Platform. Product website: https://imotions.com/.
iMotions (2016b). Publications resources for Facial Expressions Analysis. Retrieved from
https://imotions.com/resources/publications/.
iMotions (2016c). Facial Expression Analysis: Everything you need to know to elevate your
research with emotion analytics - The Definitive Guide. Retrieved from
https://imotions.com/guides/.
Kim, J., & Mueller, C. W. (1978a). Introduction to factor analysis (Sage University Paper
        Series on Quantitative Applications in the Social Sciences, 07-013). Beverly Hills and
        London: Sage Publications.
Kim, J., & Mueller, C. W. (1978b). Factor analysis: Statistical methods and practical issues
        (Sage University Paper Series on Quantitative Applications in the Social Sciences,
        07-014). Beverly Hills and London: Sage Publications.
Littlewort, G., Bartlett, M., Fasel, I., Susskind, J., & Movellan, J. (2006). Dynamics of facial
        expression extracted automatically from video. Image and Vision Computing, 24(6),
        615-625.
Littlewort, G., Whitehill, J., Wu, T., Fasel, I., Frank, M., Movellan, J., & Bartlett, M. (2011). The
        Computer Expression Recognition Toolbox (CERT). Proceedings of the IEEE
        International Conference on Automatic Face and Gesture Recognition.
Osgood, C. E., May, W. H., & Miron, M. S. (1975). Cross-cultural universals of affective
        meaning. Urbana: University of Illinois Press.
Pantic, M., & Bartlett, M. S. (2007). Machine analysis of facial expressions. In K. Delac & M.
        Grgic (Eds.), Face Recognition (pp. 377-416). Vienna, Austria: I-Tech Education and
        Publishing.
Quintanar, L. R. (1982). The interactive computer as a social stimulus in computer-managed
instruction: a theoretical and empirical analysis of the social psychological
processes evoked during human-computer interaction (Doctoral dissertation). The
University of Notre Dame, Notre Dame, Indiana.
Wu, T., Bartlett, M. S., & Movellan, J. (2010). Facial expression recognition using Gabor
        motion energy filters. IEEE CVPR Workshop on Computer Vision and Pattern
        Recognition for Human Communicative Behavior Analysis.
© 2016 J.D. Power and Associates, McGraw Hill Financial. All Rights Reserved.