
What’s in a smile? Neural correlates of facial
embodiment during social interaction
Leonhard Schilbach
University of Cologne, Cologne, and Research Center Juelich, Juelich, Germany
Simon B. Eickhoff
Research Center Juelich, Juelich, Germany
Andreas Mojzisch
Georg-August-University of Goettingen, Goettingen, Germany
Kai Vogeley
University of Cologne, Cologne, and Research Center Juelich, Juelich, Germany
Previous investigations have shown that the perception of socially relevant facial expressions, indicating
someone else’s intention to communicate (e.g., smiling), correlates with increased activity in zygomaticus
major muscle regardless of whether the facial expressions seen are directed towards the human observer
or toward someone else (Mojzisch et al., 2006). These spontaneous, involuntary reactions have been
described as facial mimicry and seem to be of considerable importance for successful interpersonal
communication. We investigated whether specific neural substrates underlie these responses by
performing a finite impulse response (FIR) analysis of an experiment using functional magnetic
resonance imaging (fMRI) to investigate the perception of socially relevant facial expressions (Schilbach
et al., 2006). This analysis demonstrates that differential neural activity can be detected relative to the
FIR time window in which facial mimicry occurs. The neural network found includes but extends beyond
classical motor regions (face motor area) recruiting brain regions known to be involved in social
cognition. This network is proposed to subserve the integration of emotional and action-related processes
as part of a pre-reflective, embodied reaction to the perception of socially relevant facial expressions as
well as a reflective representation of self and other.
INTRODUCTION
The human body is the best picture of the
human soul.
(Wittgenstein, 1974)
Successful interpersonal communication largely depends upon the exchange of nonverbal
information (Mehrabian, 1971). The face is
known to be of particular importance in this
context as facial expressions convey manifold
information about the emotional state of others
as well as their appraisal of a given situation
(Darwin, 1874; Erickson & Schulkin, 2003; Frijda
& Tcherkassof, 1997; Kaiser & Wehrle, 2001).
Additionally, the perception of facial expressions has also been demonstrated to directly affect
a human observer: research has consistently shown
that perceivers across the human lifespan spontaneously imitate the facial gestures of perceived
Correspondence should be addressed to: Leonhard Schilbach, Department of Psychiatry, University of Cologne, Kerpener Str.
62, D-50924 Cologne, Germany. E-mail: [email protected]
© 2007 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business
www.psypress.com/socialneuroscience
DOI:10.1080/17470910701563228
others (Dimberg, 1997a; Doherty, 1998; Esteves,
Parra, Dimberg, & Ohman, 1994; Lang, Greenwald, Bradley, & Hamm, 1993; Lanzetta & Orr,
1986; Lundqvist, 1995; Lundqvist & Dimberg,
1995; Meltzoff & Moore, 1977, 1989; Meltzoff &
Prinz, 2002; O’Toole & Dubin, 1968). This phenomenon of involuntary activity of facial muscles
occurring automatically in a human observer’s face
in response to seeing facial expressions of a
perceived other has been described as facial
mimicry. It has also been suggested that these
automatic, involuntary facial reactions are consistent with how subjects perceive the stimuli and
their own emotions, constituting a form of ‘‘physiological linkage’’ or socio-emotional contagion
(Dimberg, 1982, 1988; Dimberg, Thunberg, &
Elmehed, 2000; Wallbott, 1991), which can influence the selection of consequent response patterns
(Niedenthal, Barsalou, Winkielman, Krauth-Gruber, & Ric, 2005).
In a previous study (Mojzisch et al., 2006) we
have investigated the perception of socially relevant facial expressions as they would occur in
everyday life approach situations to initiate social
interaction (Grammer, Schiefenhövel, Schleidt,
Lorenz, & Eibl-Eibesfeldt, 1988; Kendon &
Ferber, 1973). For this purpose, we created an
experimental situation in which subjects were
socially engaged by anthropomorphic virtual
characters in a controllable mediated scene.
Subjects were either gazed at by virtual characters
or observed the virtual characters looking at
someone else. In dynamic animations, virtual
characters then showed different facial expressions. We were able to show that the perception of
socially relevant facial expressions (e.g., smiling)
elicits differential muscular activity in zygomaticus major muscle in the human observer, as measured by electromyography (EMG), consistent with the idea of facial mimicry. Interestingly,
this effect occurred regardless of whether test
subjects were personally addressed by the virtual
character or not. Put differently, subjects involuntarily ‘‘smiled back’’ even if they observed the
virtual characters smiling at someone else. Due to
their spontaneous nature such automatic facial
reactions could be described as an embodied
response, i.e., involuntary responses to a socially
relevant stimulus by changes in facial musculature
and appearance. Such embodied responses have
been suggested to be of considerable importance
for interpersonal communication. Conversely,
alterations of involuntary facial reactions (e.g.,
due to conditions resulting in facial paralysis)
seem to have a dramatic impact on the quality of
interpersonal communication (Cole, 2001).
In spite of a great wealth of neuroimaging
literature pertaining to face perception (Blair,
2003; Haxby, Hoffman, & Gobbini, 2002) as well
as imitation and the ‘‘mirror neuron system’’
(MNS; Iacoboni et al., 1999; Rizzolatti, Fogassi,
& Gallese, 2002b) the neural correlates of automatically occurring, involuntary facial reactions
in response to certain stimuli are not equally well
researched and remain incompletely understood.
Previous neuroimaging studies suggest the involvement of ‘‘mirror neurons’’ for emotional
facial actions: a largely similar neural network is
activated when subjects either passively view or
deliberately imitate static pictures of facial expressions of basic emotions (Carr, Iacoboni, Dubeau,
Mazziotta, & Lenzi, 2003) and dynamic depictions of smiling and frowning expressions (Leslie,
Johnson-Frey, & Grafton, 2004). Lee, Josephs,
Dolan, and Critchley (2006) also specifically
looked at the perception and imitation of facial
expressions that index emotions (as compared to
‘‘ingestive’’ facial movements) and were able to
show that imitating leads to enhanced activity
within right inferior prefrontal cortex, a pattern
not found for passive viewing (Lee et al., 2006).
In spite of their interesting findings it must be
noted that these studies focused on the intentional, voluntary imitation of mimic behavior and
not on its spontaneous, involuntary occurrence in
response to seeing mimic behavior described as
facial mimicry. Most relevant to our investigation,
Wild, Erb, Eyb, Bartels, and Grodd (2003a) have
shown that involvement of the medial basotemporal lobe might facilitate the occurrence of
automatic, involuntary facial movements in response to the perception of facial expressions.
Different brain areas can, hence, be assumed to contribute to the neural network
subserving facial mimicry, i.e., the automatic form
of facially mediated social contagion which allows
us to ‘‘live in someone else’s facial expressions’’
(Merleau-Ponty, 1964).
In the present study, we specifically aimed at
describing this network by reanalyzing data from
a previous neuroimaging study (Schilbach et al.,
2006) in which the same paradigm had been
employed as in the study by Mojzisch et al.
(2006) described above. Specifically, by using
the temporal information from the study by
Mojzisch et al. we performed a windowed finite
impulse response (FIR) analysis to investigate
whether differential neural activations can be
detected relative to the FIR time window in
which facial mimicry occurs.
This reanalysis was performed to investigate
whether automatically occurring facial mimicry
relies upon differential neuronal activity in the
motor system alone or whether brain regions known to be involved in social cognition also contribute to the neural network subserving this
phenomenon.
METHODS
Stimulus material, tasks and study
design
The stimuli used both in the fMRI and EMG
study (Schilbach et al., 2006; Mojzisch et al., 2006)
consisted of dynamic video animation sequences
designed using the software package Poser 4.0
(Curious Labs). Condition-specific dynamic
changes in facial appearance of the anthropomorphic virtual characters were modeled in concordance with the Facial Action Coding System
(FACS; Ekman & Friesen, 1978). Animation of
facial motion was realized by interpolating images
between the neutral and condition-specific facial
expressions as well as body positions of the virtual
character.
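The interpolation step can be pictured with a minimal sketch: a linear blend between a neutral and an apex set of FACS-style action-unit intensities. The actual animations were produced in Poser; the action-unit names, intensities, frame count and frame rate below are hypothetical placeholders rather than values taken from the original stimuli.

```python
import numpy as np

def interpolate_expression(neutral, apex, n_frames):
    """Linearly interpolate action-unit intensities from a neutral face to the apex expression."""
    frames = []
    for t in np.linspace(0.0, 1.0, n_frames):
        # Each frame is a convex combination of the neutral and apex parameter sets.
        frames.append({au: (1 - t) * neutral[au] + t * apex[au] for au in neutral})
    return frames

# Hypothetical action-unit parameters for a smile ramping up between expression
# onset (3000 ms) and apex (3333 ms); roughly 10 frames if rendered at ~30 frames/s.
neutral = {"AU6_cheek_raiser": 0.0, "AU12_lip_corner_puller": 0.0}
apex = {"AU6_cheek_raiser": 0.8, "AU12_lip_corner_puller": 1.0}
onset_to_apex_frames = interpolate_expression(neutral, apex, n_frames=10)
```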
The video sequences depict anthropomorphic
virtual characters that appear on screen and exhibit
dynamic facial expressions as they would appear in
real-life approach situations when initiating social
interaction (SOC). In contrast, arbitrary facial movements are shown (ARB). The temporal order
of each video clip adhered to a standardized
pattern of 7.5 seconds. Each sequence began with
the entrance of a virtual character (walk in),
followed by positioning (turn) either looking
towards the observer (ME) or towards someone
else who is out of view (OTHER; see Figure 1). The
resulting two factors, (1) ‘‘social interaction’’ [SOC vs. ARB] and (2) ‘‘self-involvement’’ [ME vs. OTHER], thus constitute a two-by-two factorial design. The presentation of mimic behavior, either socially relevant or arbitrary, by the virtual
character always started at 3000 ms and reached its
apex, i.e., the moment of maximal change in facial
appearance, at 3333 ms. This temporal sequence
was maintained across all four conditions. After
showing mimic behavior the virtual character
turned away and walked out of the screen frame
(turn and walk off).
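For concreteness, the trial structure and factorial design can be written down as plain data, as in the sketch below; the segment names and all boundaries other than the expression onset and apex are illustrative placeholders rather than values from the stimulus files.

```python
from itertools import product

# The two-by-two factorial design: "social interaction" (SOC vs. ARB)
# crossed with "self-involvement" (ME vs. OTHER).
CONDITIONS = [f"{soc}_{who}" for soc, who in product(["SOC", "ARB"], ["ME", "OTHER"])]
# -> ['SOC_ME', 'SOC_OTHER', 'ARB_ME', 'ARB_OTHER']

# Standardized 7.5 s trial timeline (ms). Only the expression onset (3000 ms) and
# apex (3333 ms) are specified in the text; the remaining boundaries are assumed.
TRIAL_TIMELINE_MS = {
    "walk_in_and_turn": (0, 3000),       # character enters and orients towards ME or OTHER
    "expression_onset": 3000,            # mimic behavior (SOC or ARB) starts
    "expression_apex": 3333,             # moment of maximal change in facial appearance
    "turn_and_walk_off": (3333, 7500),   # character turns away and leaves the frame
}
```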
All subjects received standardized instructions
before the experiment to consider themselves
part of a virtual scene with virtual characters,
one of which would appear face-to-face throughout the experiment (as shown in Figures 2 and 3)
and express mimic behavior. The other virtual
characters could not be seen on screen at any time
from the test subject’s point of view, but participants were instructed to assume that these
characters stood close at an angular distance of
approximately 30 degrees (right or left; Figure 3).
Participants were instructed that the facial expressions or facial movements of the virtual
character seen vis-à-vis could, henceforth, be
directed towards themselves (0°, ME) or towards ‘‘someone else’’, i.e., another virtual other (30°, OTHER). No additional explicit instructions
concerning the other agents were given.
The examples in Figure 2 show a male virtual
character directed at the observer, demonstrating
a socially relevant facial expression (‘‘eyebrow
flash’’ and smile), and a female virtual character
directed at someone else to the left of the
observer, demonstrating a facial movement perceived as arbitrary (lip contraction). Sample
video sequences are demonstrated at http://www.uk-koeln.de/kliniken/psychiatrie/Bildgebung/ls_videos/schilbach-videos.htm.
Figure 1. Screen shots of one exemplary video (SOC_ME) depicting the temporal event structure (first row: stimulus onset, stimulus apex), the interval of increased EMG activity in zygomaticus major muscle (second row: EMG) as well as the FIR time bins (third row: bins 1-10) whose associated BOLD responses were analyzed.

Figure 2. Screen shots of exemplary videos showing (a) a socially relevant facial expression (eyebrow flash) directed towards the human observer (SOC_ME) and (b) an arbitrary facial movement (lip contraction) directed towards someone else (ARB_OTHER).

Study participants of two previous studies and rationale for reanalysis

Neuroimaging data were acquired in an event-related fMRI study using the described stimulus material. Seventeen healthy, right-handed male volunteers (mean age 25.9 years ± standard deviation of 4.2 years) participated in this
experiment (Schilbach et al., 2006). Functional
magnetic resonance (fMRI) was carried out
using echo planar imaging (EPI) with whole
brain coverage and a 1.5 Tesla MRI system
(SIEMENS Sonata, Erlangen, FRG) with the
standard head coil. An echo planar imaging sequence with the following parameters was employed: repetition time (TR) = 3020 ms, echo time (TE) = 66 ms, field of view = 200 mm × 200 mm, flip angle = 90°, matrix size = 64 × 64, voxel size = 3 mm × 3 mm × 4 mm. Using a midsagittal scout image, 30 axial slices (0.4 mm inter-slice gap) were positioned to cover the whole brain. Scanning was performed continuously over one run and restarted for the subsequent three runs. In addition, anatomical whole-brain images were obtained by using a T1-weighted, 3D gradient-echo pulse sequence (MP-RAGE, magnetization-prepared, rapid acquisition gradient echo) with the following parameters: TR = 2200 ms, TE = 3.93 ms, 15° flip angle, FOV = 256 mm × 256 mm, matrix size = 200 × 256, 128 sagittal slices with 1 mm thickness.

Figure 3. Scene as shown in instructions.
To investigate the EMG correlates of the
perception of self- and other-related facial expressions and in particular to characterize the
temporal dynamics of facial mimicry, a subsequent study including 23 healthy, right-handed
male volunteers (mean age 23.4 years ± standard deviation of 2.4 years) was performed using
exactly the same stimuli (Mojzisch et al., 2006).
This investigation has shown that, in concordance with the available literature (cf. Dimberg, 1997b), responses in activity of zygomaticus
major muscle occur after a delay of approximately 200 ms following the perception of a
relevant stimulus (here: stimulus apex). Our
analysis revealed a significant main effect of
SOC, F(1, 16) = 7.66, p = .01, η2 = .32, when
comparing mean EMG activity across all experimental conditions for a time window ranging
from 3500 to 3700 ms (Figures 1 and 4). This
indicated that zygomaticus major muscle activity
was significantly larger if the virtual character
showed socially relevant facial expressions than if
the facial expression was arbitrary. Outside this
time window, no significant differences in EMG
responses were observed.
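As an illustration of this analysis step (not the original scripts), the sketch below averages each subject's z-transformed zygomaticus EMG within the 3500-3700 ms window for each trial and tests the main effect of SOC by comparing the per-subject SOC and ARB means; for a two-level within-subject factor the repeated-measures F equals the squared paired t. All variable names and array shapes are hypothetical.

```python
import numpy as np
from scipy import stats

def soc_main_effect(emg, times, cond_labels, window=(3500, 3700)):
    """Test the main effect of SOC on windowed zygomaticus EMG.

    emg         : array (n_subjects, n_trials, n_samples) of z-transformed EMG.
    times       : array (n_samples,) of sample times in ms relative to video onset.
    cond_labels : array (n_trials,) of strings such as 'SOC_ME' or 'ARB_OTHER'.
    """
    in_window = (times >= window[0]) & (times < window[1])
    trial_means = emg[:, :, in_window].mean(axis=2)        # (subjects, trials)

    is_soc = np.char.startswith(cond_labels.astype(str), "SOC")
    soc_mean = trial_means[:, is_soc].mean(axis=1)          # per-subject SOC average
    arb_mean = trial_means[:, ~is_soc].mean(axis=1)         # per-subject ARB average

    t, p = stats.ttest_rel(soc_mean, arb_mean)              # paired comparison across subjects
    return t ** 2, p                                        # F = t^2 for a two-level factor
```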
To investigate whether specific neural correlates underlie the occurrence of facial mimicry in
response to seeing socially relevant facial expressions the abovementioned fMRI data set was now
reanalyzed by using the timing information of
the occurrence of involuntary facial movements
during passive viewing of these stimuli as identified in the EMG study. This highlights the
potential value of acquiring and using psychophysiological correlates together with neuroimaging
data as it allows us to refine data analyses by
either providing explanatory confounds or by
providing additional temporal information, as
was the case in this investigation.
Figure 4. Condition-specific mean EMG activity in zygomaticus major muscle for the time interval of 3500-3700 ms (z-transformed, ± standard error of the mean). Data from Mojzisch et al., 2006.
Statistical parametric mapping
The acquired fMRI data was preprocessed according to Schilbach et al. (2006) and analyzed
employing the finite impulse response model
(FIR; Henson, Andersson, & Friston, 2000; Henson, Rugg, & Friston, 2001) implemented in
SPM5 (Wellcome Department of Imaging Neuroscience). This approach is similar to selective
averaging in that it can be thought of as selective
averaging without counterbalancing of trial orders and the need for time-locking stimulus
presentation and data acquisition (Dale & Buckner, 1997). The model fits the measured BOLD
response relative to different temporal segments
of the events by providing a set of basis functions
within the framework of a General Linear Model
(GLM) capturing responses relative to a number
of successive poststimulus time bins as separate
parameters (‘‘mini-boxcars’’; Ollinger, Shulman,
& Corbetta, 2001). In this sense, the parameter
estimate of a specific time bin within the FIR
response space can be regarded as the hemodynamic response to the corresponding time window of the stimulus presentation. That is, the FIR
model allows us to separate the responses to
particular partitions of the stimulus (assuming a
linear temporal relationship between subsequent
stimulus and response components, respectively).
The obtained parameter estimates, capturing the
partition of the response space pertaining to a
particular window of the stimulus presentation
can then, appropriate corrections for nonsphericity provided, be entered into second-level univariate or multivariate analyses for group
inference.
For the present analysis each event, i.e., each
video (Schilbach et al., 2006) was divided into 10
time bins of 750 ms stimulus time (see Figure 1),
whose corresponding BOLD response models
covered a total of 24 s post-stimulus (BOLD)
response. This setting allows us to estimate
BOLD response relative to different time windows, including one (hereafter referred to as the
‘‘window of interest’’, equivalent to time bin 5 in
Figure 1) that corresponds to both the occurrence
of the stimulus apex (3000-3333 ms) as well as the response obtained during EMG measurements (3500-3750 ms) (Mojzisch et al., 2006). Hereby
we could test for BOLD signal change, which is
specifically related to the time window of the
stimulus presentation in which facial mimicry
occurred.
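To illustrate the principle, the sketch below builds ‘‘mini-boxcar’’ FIR regressors of 750 ms width for one condition, samples them at the scan times and estimates their parameters by ordinary least squares, so that the estimate for the fifth bin corresponds to the 3000-3750 ms window of interest. This is a simplified conceptual sketch rather than the SPM5 implementation (the model in the study covered 24 s of post-stimulus response per trial), and the onsets, run length and grid resolution are made up.

```python
import numpy as np

def fir_design_matrix(onsets_s, n_scans, tr=3.02, n_bins=10, bin_width=0.75):
    """Build 'mini-boxcar' FIR regressors for one condition, sampled at scan times."""
    dt = 0.05                                     # resolution of the internal time grid (s)
    n_fine = int(np.ceil(n_scans * tr / dt))
    X_fine = np.zeros((n_fine, n_bins))
    for onset in onsets_s:
        for k in range(n_bins):                   # bin k spans [k, k+1) * bin_width after onset
            start = int((onset + k * bin_width) / dt)
            stop = int((onset + (k + 1) * bin_width) / dt)
            X_fine[start:stop, k] = 1.0
    scan_idx = (np.arange(n_scans) * tr / dt).astype(int)
    return X_fine[scan_idx]                       # regressors evaluated at acquisition times

# Hypothetical jittered onsets (s) for one condition; with many trials at varying
# phases relative to the scans, every 750 ms bin is sampled at least occasionally.
rng = np.random.default_rng(1)
onsets = np.cumsum(7.5 + rng.uniform(3.0, 8.0, size=20))
n_scans = int(np.ceil((onsets[-1] + 30) / 3.02))
X = fir_design_matrix(onsets, n_scans)
y = np.random.randn(n_scans)                      # stand-in for one voxel's BOLD time course
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
window_of_interest_estimate = beta[4]             # fifth bin: 3000-3750 ms of stimulus time
```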
Time-bin width was shorter than the TR used
during data acquisition, because we attempted to
specifically target a stimulus time interval including the stimulus apex and its psychophysiological
response. As detailed above, however, parameter
estimation was not performed on the stimulus, but
on the corresponding response time bins covering
the expected BOLD response. Furthermore, it
has also been shown that it is possible to sample
the impulse response at post-stimulus intervals
shorter than TR by jittering event onsets with
respect to scan onsets (Josephs, Turner, & Friston,
1997). In our study a random interstimulus jitter
of varying duration (3-8 s) was used.
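The sketch below illustrates why such jittering yields sampling of the response at a resolution finer than the TR: expressing the scan times relative to each (hypothetical) jittered onset produces post-stimulus delays that are not simply multiples of the TR. Run length and onset values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
tr = 3.02                                    # repetition time in seconds
scan_times = np.arange(0, 600, tr)           # acquisition times of one run (illustrative length)

# Hypothetical event onsets separated by the 7.5 s trial plus a random 3-8 s jitter.
onsets, t = [], 5.0
while t < scan_times[-1] - 30:
    onsets.append(t)
    t += 7.5 + rng.uniform(3.0, 8.0)

# Post-stimulus delays at which the BOLD response is actually sampled (first 24 s).
delays = np.concatenate([scan_times - o for o in onsets])
delays = np.sort(delays[(delays >= 0) & (delays < 24)])
# Because onsets are not locked to scan onsets, these delays fall between multiples of the TR.
```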
Image analyses were carried out after high-pass filtering (128 s) to remove subject-specific,
low-frequency signal drifts and global intensity
scaling. Following the estimation of the subject-specific general linear models, parameter estimates for all 40 experimental regressors (4 conditions × 10 time bins) were entered in a second-level analysis of variance (ANOVA) allowing
inference to the general population. Violations
of data sphericity were explicitly accounted for by
modeling non-independence across parameter
estimates from the same subject and allowing
unequal variances both between conditions/time
bins and subjects using the standard implementation for variance component estimation in SPM5.
Specific effects for each voxel were tested by
applying appropriate linear contrasts to the parameter estimates of this ANOVA, resulting in a t-statistic for every particular voxel and consequently a statistical parametric map of the t-statistic (SPM{t}). A random effects model with a
height threshold of p < .001 (uncorrected) and an
extent threshold of 10 voxels was used for
inferring significant activations.
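A schematic version of this inference step is sketched below: a voxel-wise group t-map (here a simple one-sample t-test over per-subject contrast images, which glosses over the nonsphericity-corrected ANOVA actually used) is thresholded at p < .001 uncorrected, and clusters of fewer than 10 voxels are discarded. Array names and shapes are hypothetical.

```python
import numpy as np
from scipy import stats, ndimage

def thresholded_clusters(con_maps, p_height=0.001, k_extent=10):
    """Height- and extent-threshold a group t-map.

    con_maps : array (n_subjects, x, y, z) of per-subject contrast images.
    Returns a boolean mask of suprathreshold clusters with >= k_extent voxels.
    """
    n = con_maps.shape[0]
    t_map, _ = stats.ttest_1samp(con_maps, popmean=0.0, axis=0)
    t_crit = stats.t.ppf(1.0 - p_height, df=n - 1)      # one-sided height threshold
    supra = t_map > t_crit

    labels, n_clusters = ndimage.label(supra)            # connected suprathreshold voxels
    keep = np.zeros_like(supra)
    for c in range(1, n_clusters + 1):
        cluster = labels == c
        if cluster.sum() >= k_extent:                    # enforce the extent threshold
            keep |= cluster
    return keep
```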
As we were primarily interested in defining
the neural correlates of facial mimicry, inference
focused on the main effect of SOC in time bin
5, corresponding to the estimated BOLD response evoked by the neuronal activity during
this, our ‘‘window of interest’’. We, hence, tested for the stimulus × time bin interaction, focusing on the stimulus-contrast of the main
effect of SOC as our previous psychophysiological study had demonstrated significant EMG
results for exactly this main effect as described
above (Mojzisch et al., 2006). Using the timing
information of the EMG response in zygomaticus major muscle we consequently tested a
specific hypothesis about neuronal activity in
response to a specific time bin (#5, 3000 3750 ms) for this particular main effect. Based
on our findings contrast estimates were taken
from the data of all subjects at principally
activated voxels (namely left precentral gyrus
and right posterior cingulate cortex) to also
assess condition-specific signal change for all
FIR time bins.
To corroborate whether the observed activations for the main effect of SOC are in fact
specific to this contrast, we additionally performed a time bin × condition interaction analysis
in a single linear contrast comparing the contrast
pertaining to time bin 5, i.e., the time where facial
mimicry occurred, with those pertaining to time
windows that are unrelated to facial mimicry
(time bins 1-4 and time bins 6-10).
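To make these two contrasts explicit, the sketch below writes out the corresponding weight vectors over the 40 regressors (4 conditions × 10 time bins). The regressor ordering assumed here, conditions in blocks with bins nested inside, is an assumption for illustration rather than the actual ordering in the SPM design matrix.

```python
import numpy as np

conditions = ["SOC_ME", "SOC_OTHER", "ARB_ME", "ARB_OTHER"]
n_bins = 10
columns = [(c, b) for c in conditions for b in range(1, n_bins + 1)]  # assumed ordering

def weight(cond, sign, bins):
    """Return a length-40 vector with `sign` at the given condition/bin columns."""
    w = np.zeros(len(columns))
    for i, (c, b) in enumerate(columns):
        if c == cond and b in bins:
            w[i] = sign
    return w

# Main effect of SOC in the window of interest (time bin 5):
# (SOC_ME + SOC_OTHER) - (ARB_ME + ARB_OTHER), restricted to bin 5.
soc_bin5 = (weight("SOC_ME", 1, {5}) + weight("SOC_OTHER", 1, {5})
            - weight("ARB_ME", 1, {5}) - weight("ARB_OTHER", 1, {5}))

# Temporal specificity: the same SOC contrast in bin 5 versus its average over
# the bins unrelated to facial mimicry (bins 1-4 and 6-10).
other_bins = set(range(1, n_bins + 1)) - {5}
soc_other_bins = sum(weight(c, s, other_bins) / len(other_bins)
                     for c, s in [("SOC_ME", 1), ("SOC_OTHER", 1),
                                  ("ARB_ME", -1), ("ARB_OTHER", -1)])
soc_bin5_vs_rest = soc_bin5 - soc_other_bins
```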
Stereotactic Montreal Neurological Institute
(MNI) coordinates of the local maxima of significant activation were anatomically localized using
the SPM Anatomy Toolbox whenever possible
(Eickhoff et al., 2005; available with all published
cyto-architectonic maps from www.fz-juelich.de/
ime/spm_anatomy_toolbox). If no cyto-architectonic maps were available the macro-anatomical
labels of the MNI single-subject template brain
were used after additional comparison with the
mean structural image of the analyzed subjects
after normalization.
RESULTS
FIR analysis of the previously acquired neuroimaging data shows that a distinct pattern of
neuronal activity was related to the difference
between perceiving socially relevant and arbitrary
facial gestures during our ‘‘window of interest’’,
i.e., time bin 5. This FIR window is tantamount to the temporal segment of the stimulus sequence during which all changes in the virtual character’s facial appearance occur, including the
stimulus apex, i.e., the moment of maximal
change in facial appearance. Our previous study
(Mojzisch et al., 2006) has shown that facial
mimicry, indicated by an increase of facial EMG activity in zygomaticus major muscle, occurs with a latency of 200 ms after stimulus
apex and is, therefore, also covered by this FIR
time window (see Figures 1 and 4).
The neural network active during this period
that includes the presentation of socially relevant
facial expressions or arbitrary facial movements
followed by increased EMG activity, i.e., during
the occurrence of facial mimicry, is formed by
significant activations in the precentral motor
area, as well as in several other, non-motor
regions of the brain (Figure 5).
More precisely, our analysis demonstrated
activations pertaining to the window of interest
within the posterior portions of cingulate gyrus on
both hemispheres. Furthermore, differential
neural activity is seen in the right hippocampus.
Differential neural activity was also observed in
the dorsal midbrain, which was anatomically
localized by comparison with the mean structural
image of the analyzed subjects after normalization and thereby interpreted as activity in the
midbrain tectum. Additionally, our results demonstrate activation of the inferior aspects of the
left precentral gyrus, i.e., the face motor area.
Lastly, differential brain activity is found in the
left precuneus more posterior than the activations
observed in cingulate cortices.
In keeping with our idea of employing a time-sensitive approach guided by findings from a previous psychophysiological study to target a specific time interval, results of the analysis of the contrast estimates for precentral gyrus and posterior cingulate cortex across all FIR time bins illustrate significant, condition-specific differences for our ‘‘window of interest’’ (time bin 5) related to the main effect of SOC, (SOC_ME + SOC_OTHER) relative to (ARB_ME + ARB_OTHER) (Figures 6 and 7). More specifically, the contrast estimates for precentral gyrus demonstrate a significant difference between activity related to SOC_ME, i.e., the perception of self-directed socially relevant facial expressions, as compared to SOC_OTHER, i.e., the perception of other-directed socially relevant facial expressions (Figure 6). Interestingly, this finding also parallels the findings from our EMG study, results of which also show the significant main effect for SOC to be influenced by a strong effect for SOC_ME (Mojzisch et al., 2006; see Figure 4). It is also noteworthy that the activations corresponding to the main effect of SOC occur later in precentral gyrus and are restricted to time bin 5 (Table 1), whereas in posterior cingulate cortex this difference is already apparent earlier (Figure 7).

Figure 5. Neural correlates of the main effect of SOC related to facial mimicry: (a) overview of activations*; (b) left precentral cortex activation (-42, -4, 38); (c) posterior cingulate cortex activation (12, -48, 32); (d) midbrain activation (0, -36, -14); (e) right hippocampus activation (16, -32, 0). *Activations in (a) are shown as overlay rendered onto SPM template; activations in (b) to (e) are shown as section overlay onto mean structural image of the analyzed subjects after normalization.

Figure 6. Condition-specific mean activations of all subjects (± standard error of the mean) for all 10 FIR time bins for left precentral gyrus (-42, -4, 38).

Figure 7. Condition-specific mean activations of all subjects (± standard error of the mean) for all 10 FIR time bins for right posterior cingulate cortex (12, -48, 32).

To test for the temporal specificity of the activation results for the main effect of SOC, i.e., targeting a condition-by-time bin interaction, the activations associated with the ‘‘window of interest’’ (time bin 5) were contrasted with those FIR windows not related to stimulus apex and the occurrence of facial mimicry. Hereby, we could confirm that the neural activations found were, indeed, specific to this ‘‘window of interest’’ as the comparison with time bins unrelated to facial mimicry yielded exactly the same pattern of activations as described above (not illustrated).
DISCUSSION
Drawing upon functional imaging data from a
previous study (Schilbach et al., 2006) and the
temporal information from a follow-up EMG
investigation using exactly the same experimental
paradigm (Mojzisch et al., 2006), we are able to
show that specific neural activations do pertain to
the time window of our stimulus sequence in
which facial mimicry occurs. The findings show
that specific brain activations are related to the
occurrence of involuntary facial movements in
human observers in response to the perception of
socially relevant facial expressions shown by
perceived others. These activations comprise but
extend beyond classical motor regions (face
motor area) and include other brain centers
such as the cingulate cortex, the precuneus,
hippocampus and the dorsal midbrain.
Different forms of motor ‘‘resonance’’
Differential neural activity related to facial mimicry in response to the perception of socially
relevant facial expressions regardless of whether
these are self- or other-directed was observed in
the motor system, namely the left precentral
gyrus. More precisely this activation is located
in the inferior part, i.e., the face representation, of
the primary motor cortex. This area has not only
been shown to be differentially activated during
fMRI studies targeting the voluntary production
of facial movement and expressions (Dresel et al.,
2005; Hanakawa, Parikh, Bruno, & Hallett, 2005),
but has also been proposed to be automatically
activated during the perception of emotional
vocalizations (e.g., laughter) potentially preparing
oro-facial gestures in response to the stimuli
(Warren et al., 2006).

TABLE 1
Neural correlates of the main effect of SOC for time bin 5 related to facial mimicry

Region                            MNI coordinates (x, y, z)    T
Right middle cingulate cortex*    12, -48, 32                  4.21
Dorsal midbrain                   0, -36, -14                  3.88
Right hippocampus*                16, -32, 0                   3.58
Left precentral gyrus*            -42, -4, 38                  3.43
Left middle cingulate cortex*     -6, -40, 36                  3.35
Left precuneus*                   -14, -52, 36                 3.33

Note: *Assigned by using the Anatomy Toolbox.

It is also interesting to
note that subjects in the study by Warren and
colleagues are described as reporting the urge to
produce a facial expression when exposed to
emotive auditory stimuli, which bears a strong
resemblance to the phenomenon of facial mimicry. This is in concordance with the idea of
automatic, embodied reactions to emotional and
social stimuli and corroborates our present findings in that they are suggestive of potential
perceptual motor interaction underlying these
phenomena.
There might be several benefits to instantiating
an involuntary, embodied component as part of a
reaction to a stimulus: first of all, it might help to
add an affective, bodily component when appraising a given situation, which might increase
relevance of a stimulus to oneself. Perceptual motor interaction might also facilitate motor
responses altering the organism’s state of action
readiness. Facially embodied reactions, in particular, might have also evolved to serve a communicative function by making a feeling state
accessible to others, when it translates into a
visible facial expression on one’s own face. It has
been suggested that it might be, in fact, this very
evolutionary heritage that intersubjectively conjoins human beings (Cole, 2001). While in adults
these automatic, instinctive facial reactions can be
suppressed and do not necessarily translate into a
visible change in facial appearance, evidence from
developmental psychology demonstrates that this
mechanism and its automatic manifestation are important from birth onwards and foster infant-carer attachment (Meltzoff & Moore, 1977).
Given the deictic dimension of facial expressions and oro-facial movements, it is also relevant
to note that differential activity in the mouth
motor area has also been shown to be important
for different aspects of speech: Wilson, Saygin,
Sereno, and Iacoboni (2004) were able to show
that production of speech sounds activates Brodmann’s areas 4a, 6 and 4p. Similarly, simply
listening to speech sounds not only activated
areas of premotor cortex, but also more posterior
speech production motor areas. We suggest,
therefore, that both premotor and motor areas
can be automatically engaged by the perception
of actions.
It has been proposed that activation of premotor areas can be understood as belonging to
the ‘‘mirror neuron system’’ (MNS), which is
known to play an important role both in the
production and perception of intentional motor
behavior (Chao & Martin, 2000; Grèzes, Armony,
Rowe, & Passingham, 2003; Rizzolatti, Fadiga,
Fogassi, & Gallese, 2002a). Such systems of
‘‘shared circuits’’ have been suggested to be
involved in automatic processes of social contagion or ‘‘intentional attunement’’ by producing
internal representations of the body states of
actions and emotions and, thus, modeling someone else’s behavior as intentional experiences
(Gallese, 2006). Our data are in line with the
idea that these automatic processes contribute to
our understanding of others, their relatedness to
the world and allow for interpersonal sharing of
experiences.
Intriguingly, activations of primary motor areas
also seem to automatically ‘‘resonate’’ when we
passively perceive facial actions, the production
of which would rely on these activations. There
seems to be a close link between the visual
representation of face-based cues and its corresponding motor representation that lends support
to the idea that the process of perceiving faces
always includes an ‘‘enactive’’ element through
which we engage with and respond to stimuli
instead of a mere ‘‘passive’’ perception of face-based cues. Facial expressions might be particularly prone to engaging such mechanisms as they
can serve not only evaluative, but also communicative functions by providing visible cues about
one’s own or someone else’s internal states.
Medial temporal lobe, the dorsal
midbrain and automatic facial
movements
Our reanalysis demonstrates that activation of the
right hippocampus pertains to facial mimicry. This
may be related to mnestic and emotional contributions to the perception of socially relevant
facial expressions using memories to better understand the meaning of a mimic display (Britton,
Taylor, Sudheimer, & Liberzon, 2006). Differential activity in zygomaticus major muscle has been
suggested to indicate positive affect (Cacioppo,
Petty, Losch, & Kim, 1986), which makes sense
given that the socially relevant facial expressions
used to elicit facial mimicry are known to convey
affiliative motives.
Wild and colleagues (2003a) have provided
evidence for the involvement of medial temporal
lobe activations during non-volitional facial
movements. The authors hypothesized that this
region facilitates congruent facial movements
when an emotionally expressive face is perceived.
Unfortunately, they were not able to record facial
EMG during their scanning procedures and might
have missed changes of activity in facial muscles
not accessible to visual inspection. Our data show
that medial temporal lobe structures are involved
in automatic facial reactions. Such involvement of
medial temporal lobe structures as part of a social
contagion has also been demonstrated by Schuermann et al. (2005) who found ‘‘contagious’’
yawning to involve the periamygdalar region.
Given the potentially strong anatomical connections between medial temporal lobe and cortical
face representations, involvement of the medial
temporal lobe might facilitate activations in the
motor system to respond to and mimic perceived
actions (Morecraft, Avramov, Schroeder, Stilwell-Morecraft, & Van Hoesen, 1998; Morecraft &
Van Hoesen, 1998).
Furthermore, our reanalysis also revealed activation of the dorsal midbrain, which might be
related to an orienting or alerting reaction possibly
associated with autonomic arousal as perception of
direct gaze is known to have this effect (Donovan &
Leavitt, 1980). Unfortunately, we were unable to
record psychophysiological parameters (such as
heart rate and galvanic skin response) online for the
reported study, but future investigations will include such measurements to specifically test for this
hypothesis. It has also been argued that the tectal
midbrain, namely the superior colliculi, is part of a
subcortical pathway, which mediates activation of
the amygdala (Morris, Oehman, & Dolan, 1999).
Consistent with this suggestion midbrain activations have been shown in response to emotionally
valenced pictures (Simpson et al., 2000), and it has
been demonstrated that these activations can be
augmented by the expectancy to see such images
(Bermpohl et al., 2006). Kim and colleagues have
also found differential activity of the midbrain
when asking subjects to imagine emotional facial
expressions (Kim et al., 2007). In a review article
concerning the neural correlates of humor and
laughter Wild, Rodden, Grodd, and Ruch (2003b)
summarize a number of studies and postulate the
involvement of an ‘‘involuntary’’ or ‘‘emotionally
driven’’ system including the amygdala, thalamic/
hypo- and subthalamic areas and the dorsal brainstem. Evidence from the macaque monkey suggests
that projections connect the oro-facial region of
primary motor cortex to the superior colliculi,
which implies that the tectal region might be
involved in the control of oro-facial movements
(Tokuno, Takada, Nambu, & Inase, 1995). Taken
together these studies suggest the involvement of
the dorsal midbrain in subcortical pathways influencing brain regions that are involved in processing
emotional aspects of stimuli as well as those parts of
the motor system that can be used to nonverbally
convey inner states via oro-facial expressions.
Another line of evidence to explain midbrain
contributions to the perception of socially relevant facial expressions and concomitant facial
mimicry would be to consider that dopaminergic
influences on the amygdala operate through
midbrain projections. Degeneration of such
projections, the neuropathological hallmark of Parkinson’s disease, is known to affect emotional processing (Benke, Bosch, & Andree,
1998; Blonder, Gur, & Gur, 1989; Borod et al.,
1990; Breitenstein, Van Lancker, Daum, &
Waters, 2001; Pell, 1996; Tessitore et al., 2002).
Patients suffering from Parkinson’s disease have
both deficits in the production of emotional
responses and the perception of facial expressions
(Breitenstein, Daum, & Ackermann, 1998; Jacob,
Shuren, Bowers, & Heilman, 1995).
‘‘Default mode’’ activations and social
cognition
Both activations of posterior aspects of the
cingulate cortex and the precuneus as found in
our present reanalysis have repeatedly been
implicated by social cognition research and the
interesting question has been raised about how
such activations might be related to the so-called
‘‘default mode of brain function’’ (Raichle et al.,
2001; Schilbach et al., 2006).
The posterior cingulate cortex has been
connected to processes in which emotional
experiences trigger memory retrieval (Maddock,
Garrett, & Buonocuore, 2001, 2003), which is in
concordance with the idea that the recognition
of affective content of gestures (like facial
expressions) is shaped by sharing and forming
memories of experiences in interpersonal interaction. Involvement of posterior cingulate areas
has previously been demonstrated for the perception of facial expressions (Moriguchi et al.,
2005). The region has also been shown to be
active in studies of empathy and forgiveness as
well as self-reflection and self-referential processing (Carr et al., 2003; Craik et al., 1999;
Farrow et al., 2001; Fink et al., 1996; Iacoboni
et al., 2004; Johnson et al., 2002; Kelley et al.,
2002).
Brain areas described as belonging to the
‘‘default network’’ are known to be metabolically
active at rest. This activity, however, decreases
when test subjects are confronted with tasks that
impose high cognitive demands, whereas during social cognitive tasks these deactivations are not as pronounced. We suggest that the default-mode-like activations involving the cortical midline structures might represent the brain’s disposition to process social cues and might contribute to self-other differentiation (Northoff
& Bermpohl, 2004). This is in line with results of a
previous study which demonstrated that processing of self-relevant stimuli resulted in recruitment of medial prefrontal cortex whereas the
perception of other-directed stimuli involved
activity in medial parietal cortex (Schilbach et
al., 2006).
Importantly, the activation patterns found to
be related to facial mimicry include activity both
in the motor system and default mode structures.
The two systems, hence, seem to be involved
during the perception of intentional motor behavior, but may potentially contribute to different
aspects of the phenomenon. While activity in
motor cortex might help to generate a representation of the action which, in fact, can translate to
mimicking that behavior oneself, involvement of
default-mode-like activations might contribute to
social cognition by processing the differentiation
between self and other. In dyadic interaction both
mechanisms are crucially important as a facial
expression might highlight someone else’s internal state, but could also refer to some object or
might be expressive of the assessment of the vis-à-vis, behavior or the process of interacting itself.
Indeed, it has been suggested that activity of
cortical midline structures might interact with so-called ‘‘shared circuit’’ activations that help to
represent states of perceived others (Keysers &
Gazzola, 2007; Uddin, Iacoboni, Lange, &
Keenan, 2007). Empirical investigation of such
interactions, we suggest, will rely upon implementing ‘‘truly interactive mind paradigms’’
(Singer, 2006), which would allow us to study
the neural correlates of ‘‘online interaction’’
(Legrand & Iacoboni, in press).
CONCLUSIONS
On the basis of the presented results we suggest
that specific neural activations subserve the
phenomenon of facial mimicry in response to
the perception of socially relevant facial expressions. Our findings show that associated brain
activity is not restricted to motor cortex, but
includes brain regions known to be involved in
social cognition.
We propose that differential activity in the face
area of the motor cortex relates to the occurrence
of involuntary activity of facial muscles in a
human observer in response to seeing socially
relevant facial expressions of a perceived other, i.e., facial mimicry, and can be understood as an
embodied response. This complex, pre-reflective
response, hence, seems to involve perceptual motor interaction leading to the production of
facial movements, which might serve evaluative
and communicative functions. Activation of medial temporal areas might provide emotional and
mnemonic contributions to the perception of
socially relevant facial expressions and has been
implicated in facilitating non-volitional facial
movements as part of the automatic response to
seeing emotional facial expressions. Differential
activity in the tectal midbrain can be understood
as part of subcortical pathways influencing emotion processing and the production of oro-facial
movements prompted by the perception of emotionally salient stimuli.
Additionally, activation of posterior cingulate
cortex and the precuneus found in our analysis
can be described as pertaining to the ‘‘default
mode network’’ of the brain and might be
involved in the differentiation of self and other.
Involuntary activity in zygomaticus major
muscle in response to the perception of socially
relevant facial expressions, i.e., facial mimicry,
and its neural correlates can be understood as a
manifestation of facial embodiment, i.e., imitative, pre-reflective bodily responses to seeing
someone else’s facial expressions, which might
not only serve evaluative functions, but might also
help to reciprocally engage with someone else. By
these mechanisms of perceptual motor interaction the perception of facial expressions or other
social cues might be enriched to give us ‘‘direct’’
access to other people’s minds (Lipps, 1907),
while at the same time producing behavioral
responses visible to the interaction partner, which
help to sustain the process of interaction.
On a more speculative note, we would like to
end by saying that these mechanisms might be
constitutive for entering a ‘‘second-person perspective’’ in which two subjects not only generate
representations of each other, but are actively
engaged with one another’s actions (Legrand &
Iacoboni, in press; Reddy, 2003). In this sense the
dynamic process of social interaction could be
seen as grounded in embodied practice, which
governs and sculpts interpersonal relatedness
(Gallagher, 2001).
Manuscript received 16 April 2007
Manuscript accepted 5 July 2007
REFERENCES
Benke, T., Bosch, S., & Andree, B. (1998). A study of
emotional processing in Parkinson’s disease. Brain
and Cognition, 38, 36 52.
Bermpohl, F., Pascual-Leone, A., Amedi, A., Merabet,
L. B., Fregni, F., Gaab, N., et al. (2006). Attentional
modulation of emotional stimulus processing: an
fMRI study using emotional expectancy. Human
Brain Mapping, 27(8), 662 677.
Blair, R. J. R. (2003). Facial expressions, their communicatory functions and neuro-cognitive substrates.
Philosophical Transactions of the Royal Society of
London, 358, 561 572.
Blonder, L. X., Gur, R. E., & Gur, R. C. (1989). The
effects of right and left hemiparkinsonism on
prosody. Brain and Language, 36, 193 207.
Borod, J. C., Welkowitz, J., Alpert, M., Brozgold, A. Z.,
Martin, C., Peselow, E., et al. (1990). Parameters of
emotional processing in neuropsychiatric disorders:
conceptual issues and a battery of tests. Journal of
Communication Disorders, 23, 247 271.
Breitenstein, C., Daum, I., & Ackermann, H. (1998).
Emotional processing following cortical and subcortical brain damage: contribution of the frontostriatal circuitry. Behavioral Neurology, 11, 29 42.
Breitenstein, C., Van Lancker, D., Daum, I., & Waters,
C. H. (2001). Impaired perception of vocal emotions
in Parkinson’s disease: influence of speech time
processing and executive functioning. Brain and
Cognition, 45, 277 314.
Britton, J. C., Taylor, S. F., Sudheimer, K. D., &
Liberzon, I. (2006). Facial expressions and complex
IAPS pictures: common and differential networks.
NeuroImage, 31(2), 906 919.
Cacioppo, J. T., Petty, R. E., Losch, M. E., & Kim, H. S.
(1986). Electromyographic activity over facial muscle regions can differentiate the valence and the
intensity of affective reactions. Journal of Personality and Social Psychology, 50, 260 268.
Carr, L., Iacoboni, M., Dubeau, M.-C., Mazziotta, J.
C., & Lenzi, G. L. (2003). Neural mechanisms of
empathy in humans: A relay from neural systems for
imitation to limbic areas. Proceedings of the National Academy of Sciences USA, 100(9), 5497 502.
Chao, L. L., & Martin, A. (2000). Representation of
manipulable man-made objects in the dorsal stream.
NeuroImage, 12, 478 484.
Cole, J. (2001). Empathy needs a face. Journal of
Consciousness Studies, 8(5 7), 51 68.
Craik, F. I. M., Moroz, T., Moscovitch, M., Stuss, D. T.,
Winocur, G., Tulving, E., et al. (1999). In search of
the self: A positron emission tomography study.
Psychological Science, 10, 26 34.
Dale, A., & Buckner, R. (1997). Selective averaging of
rapidly presented individual trials using fMRI. Human Brain Mapping, 5, 329 340.
Darwin, C. (1874). The expression of the emotions in
man and animals. London: Murray.
Dimberg, U. (1982). Facial reactions to facial expressions. Psychophysiology, 19, 643 647.
Dimberg, U. (1988). Facial expressions and emotional
reactions: A psychobiological analysis of human
social behaviour. In H. L. Wagner (Ed.), Social
psychophysiology and emotion: Theory and clinical
applications (pp. 131 150). Chichester, UK: Wiley.
Dimberg, U. (1997a). Psychophysiological reactions to
facial expressions. In C. U. Segerstrale & P. Molnar
(Eds.), Nonverbal communication: Where nature
meets culture (pp. 47 60). Mahwah, NJ: Lawrence
Erlbaum Associates, Inc.
Dimberg, U. (1997b). Facial reactions: Rapidly evoked
emotional responses. Journal of Psychophysiology,
11(2), 115 123.
Dimberg, U., Thunberg, M., & Elmehed, K. (2000).
Unconscious facial reactions to emotional facial
expressions. Psychological Science, 11, 86 89.
Doherty, R. W. (1998). Emotional contagion and social
judgment. Motivation and Emotion, 22(3), 187 209.
Donovan, W. L., & Leavitt, L. A. (1980). Physiologic
correlates of direct and averted gaze. Biological
Psychology, 10(3), 189 199.
Dresel, C., Castrop, F., Haslinger, B., Wohlschlaeger, A.
M., Hennenlotter, A., & Ceballos-Baumann, A. O.
(2005). The functional neuroanatomy of coordinated
orofacial movements: sparse sampling fMRI of
whistling. NeuroImage, 28(3), 588 597.
Eickhoff, S. B., Stephan, K. E., Mohlberg, H., Grefkes,
C., Fink, G. R., Amunts, K., et al. (2005). A new
SPM toolbox for combining probabilistic cytoarchitectonic maps and functional imaging data. NeuroImage, 25(4), 1325 1335.
Ekman, P., & Friesen, W. (1978). Facial action coding
system: A technique for the measurement of facial
movement. Palo Alto, CA: Consulting Psychologists
Press.
Erickson, K., & Schulkin, J. (2003). Facial expressions
of emotions: A cognitive neuroscience perspective.
Brain and Cognition, 52, 52 60.
Esteves, F., Parra, C., Dimberg, U., & Ohman, A.
(1994). Non-conscious associative learning: Pavlovian conditioning of skin conductance responses to
masked fear-relevant stimuli. Psychophysiology,
31(4), 375 385.
Farrow, T. F. D., Zheng, Y., Wilkinson, I. D., Spence, S.
A., Deakin, J. F. W., Tarrier, N., et al. (2001, August
8). Investigating the functional anatomy of empathy
and forgiveness. Neuroreport, 12(11), 2433 2438.
Fink, G. R., Markowitsch, H. J., Reinkemeier, M.,
Bruckbauer, T., Kessler, J., & Heiss, W. D. (1996).
Cerebral representation of one’s own past: Neural
networks involved in autobiographical memory.
Journal of Neuroscience, 16, 4275 4282.
Frijda, N. H., & Tcherkassof, A. (1997). Facial expressions as modes of action readiness. In J. A. Russel &
J. M. Fernandez-Dols (Eds.), The psychology of
facial expression (pp. 78 102). New York: Cambridge University Press.
Gallagher, S. (2001). The practice of mind: Theory,
simulation, or interaction? Journal of Consciousness
Studies, 8(5 7), 83 107.
Gallese, V. (2006). Intentional attunement: a neurophysiological perspective on social cognition and its
disruption in autism. Brain Research, 1079(1), 15 24.
Grammer, K., Schiefenhövel, W., Schleidt, M., Lorenz,
B., & Eibl-Eibesfeldt, I. (1988). Patterns on the face:
The eye-brow flash in crosscultural comparison.
Ethology, 77, 279 288.
Grèzes, J., Armony, J. L., Rowe, J., & Passingham, R. E.
(2003). Activations related to ‘‘mirror’’ and ‘‘canonical’’ neurones in the human brain: an fMRI study.
NeuroImage, 18(4), 928 937.
Hanakawa, T., Parikh, S., Bruno, M. K., & Hallett, M.
(2005). Finger and face representations in the
ipsilateral precentral motor areas in humans. Journal
of Neurophysiology, 93(5), 2950 2958.
Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. (2002).
Human neural systems for face recognition and
social communication. Biological Psychiatry, 51,
59 67.
Henson, R. N. A., Andersson, J., & Friston, K. J. (2000).
Multivariate SPM: Application to basis function
characterisations of event-related fMRI responses.
NeuroImage, 11, 468.
Henson, R. N. A., Rugg, M. D., & Friston, K. J. (2001).
The choice of basic functions in event-related fMRI.
HBM01 abstract. NeuroImage, 13, 149.
Iacoboni, M., Lieberman, M. D., Knowlton, B. J.,
Molnar-Szakacs, I., Moritz, M., Throop, C. J., et al.
(2004). Watching social interactions produces dorsomedial prefrontal and medial parietal BOLD
fMRI signal increases compared to a resting baseline. NeuroImage, 21(3), 1167 1173.
Iacoboni, M., Woods, R. P., Brass, M., Bekkering, H.,
Mazziotta, J. C., & Rizzolatti, G. (1999). Cortical
mechanisms of human imitation. Science, 286, 2526 2528.
Jacob, D. H., Shuren, J., Bowers, D., & Heilman, K. M.
(1995). Emotional facial imagery, perception, and
expression in Parkinson’s disease. Neurology, 45,
1696 1702.
Johnson, S. C., Baxter, L. C., Wilder, L. S., Pipe, J. G.,
Heiserman, J. E., & Prigatano, G. P. (2002). Neural
correlates of self-reflection. Brain, 125(8), 1808 1814.
Josephs, O., Turner, R., & Friston, K. (1997). Event
related fMRI. Human Brain Mapping, 5, 243 248.
Kaiser, S., & Wehrle, T. (2001). Facial expressions as
indicators of appraisal processes. In K. R. Scherer,
A. Schorr, & T. Johnstone (Eds.), Appraisal processes in emotions: Theory, methods, research (pp.
285 300). New York: Oxford University Press.
Kelley, W. M., Macrae, C. N., Wyland, C. L., Caglar, S.,
Inati, S., & Heatherton, T. F. (2002). Finding the
self? An event-related fMRI study. Journal of
Cognitive Neuroscience, 14(5), 785 794.
Kendon, A., & Ferber, A. (1973). A description of
some human greetings. In R. P. Michael & J. H.
Crook (Eds.), Comparative ecology and behaviour
of primates (pp. 591 668). London: Academic Press.
Keysers, C., & Gazzola, V. (2007). Integrating simulation and theory of mind: from self to social cognition.
Trends in Cognitive Sciences. (Epub ahead of print)
Kim, S. E., Kim, J. W., Kim, J. J., Jeong, B. S., Choi, E.
A., Jeong, Y. G., et al. (2007). The neural mechanism
of imagining facial affective expression. Brain Research, 11, 128 137.
Lang, P. J., Greenwald, M. K., Bradley, M. M., &
Hamm, A. O. (1993). Looking at pictures: Affective,
facial, visceral, and behavioral reactions. Psychophysiology, 30, 261 273.
Lanzetta, J. T., & Orr, S. P. (1986). Excitatory strength
of expressive faces: effects of happy and fear
expressions and context on the extinction of a
conditioned fear response. Journal of Personality
and Social Psychology, 50(1), 190 194.
Lee, T. W., Josephs, O., Dolan, R. J., & Critchley, H. D.
(2006). Imitating expressions: emotion-specific
neural substrates in facial mimicry. Social Cognitive
and Affective Neuroscience, 1(2), 122 135.
Legrand, D., & Iacoboni, M. (in press). Intersubjective
intentional actions. In F. Grammont, D. Legrand, &
P. Livet (Eds.), Naturalizing intention in action. An
interdisciplinary approach. Cambridge, MA: MIT
Press.
Leslie, K. R., Johnson-Frey, S. H., & Grafton, S. T.
(2004). Functional imaging of face and hand imitation: towards a motor theory of empathy. NeuroImage, 21, 601 607.
Lipps, T. (1907). Das Wissen von fremden Ichen. In
Psychologische Untersuchungen (Vol. 1 pp. 694 722). Leipzig, Germany: W. Engelmann.
Lundqvist, L.-O. (1995). Facial EMG reactions to facial
expressions: A case of facial emotional contagion?
Scandinavian Journal of Psychology, 36, 130 141.
Lundqvist, L.-O., & Dimberg, U. (1995). Facial expressions are contagious. Journal of Psychophysiology, 9,
203 211.
Maddock, R. J., Garrett, A. S., & Buonocore, M. H.
(2001). Remembering familiar people: the posterior
cingulate cortex and autobiographical memory retrieval. Neuroscience, 104(3), 667 676.
Maddock, R. J., Garrett, A. S, & Buonocore, M. H.
(2003). Posterior cingulate cortex activation by
emotional words: fMRI evidence from a valence
decision task. Human Brain Mapping, 18, 30 41.
Mehrabian, A. (1971). Silent messages. Belmont, CA:
Wadsworth.
Meltzoff, A. N., & Moore, M. K. (1977). Imitation of
facial and manual gestures by human neonates.
Science, 198, 75 78.
Meltzoff, A. N., & Moore, M. K. (1989). Imitation in
newborn infants: Exploring the range of gestures
imitated and the underlying mechanisms. Developmental Psychology, 25, 954 962.
Meltzoff, A. N., & Prinz, W. (2002). The imitative mind:
Development, evolution, and brain bases. In Cambridge studies in cognitive perceptual development.
New York: Cambridge University Press.
Merleau-Ponty, M. (1964). The primacy of perception.
Evanston, IL: Northwestern University Press.
Mojzisch, A., Schilbach, L., Pannasch, S., Helmert, J.
R., Velichkovsky, B. M., & Vogeley, K. (2006). The
effects of self-involvement on attention, arousal, and
facial expression during social interaction with
virtual others: A psychophysiological study. Social
Neuroscience, 1, 184 195.
Morecraft, R. J., Avramov, K., Schroeder, C. M.,
Stilwell-Morecraft, K. S., & Van Hoesen, G. W.
(1998). Amygdala connections with the cingulate
motor cortex: Preferential innervation of the face
and arm representations of M3 (area 24c) in the
rhesus monkey. Society for Neurosciences Abstracts,
24, 653.
Morecraft, R. J., & Van Hoesen, G. W. (1998). Convergence of limbic input to the cingulate motor
cortex in the rhesus monkey. Brain Research Bulletin, 45, 209 232.
Moriguchi, Y., Ohnishi, T., Kawachi, T., Mori, T.,
Hirakata, M., Yamada, M., et al. (2005). Specific
brain activation in Japanese and Caucasian people
to fearful faces. Neuroreport, 16(2), 133 136.
Morris, J. S., Oehman, A., & Dolan, R. J. (1999). A
subcortical pathway to the right amygdala mediating
‘‘unseen’’ fear. Proceedings of the National Academy of Sciences USA, 96(4), 1680 1685.
Niedenthal, P. M., Barsalou, L. W., Winkielman, P.,
Krauth-Gruber, S., & Ric, F. (2005). Embodiment in
attitudes, social perception, and emotion. Personality and Social Psychology Review, 9(3), 184 211.
Northoff, G., & Bermpohl, F. (2004). Cortical midlines
structures and the self. Trends in Cognitive Sciences,
8(3), 102 107.
Ollinger, J. M., Shulman, G. L., & Corbetta, M. (2001).
Separating processes within a trial in event-related
functional MRI. NeuroImage, 13, 210 217.
O’Toole, R., & Dubin, R. (1968). Baby feeding and
body sway: An experiment in George Herbert
Mead’s ‘‘Taking the role of the other’’. Journal of
Personality and Social Psychology, 10, 59 65.
Pell, M. D. (1996). On the receptive prosodic loss in
Parkinson’s disease. Cortex, 36, 693 704.
Raichle, M. E., MacLeod, A. M., Snyder, A. Z., Powers,
W. J., Gusnard, D. A., & Shulman, G. L. (2001). A
default mode of brain function. Proceedings of the
National Academy of Sciences USA, 98(2), 676 682.
Reddy, V. (2003). On being the object of attention:
implications for self other consciousness. Trends in
Cognitive Science, 7(9), 397 402.
Rizzolatti, G., Fadiga, L., Fogassi, L., & Gallese, V.
(2002a). From mirror neurons to imitation. In A. N.
Meltzoff & W. Prinz (Eds.), The imitative mind (pp.
247 266). New York: Cambridge University Press.
Rizzolatti, G., Fogassi, L., & Gallese, V. (2002b). Motor
and cognitive functions of the ventral premotor
cortex. Current Opinion in Neurobiology, 12(2),
149 154.
Schilbach, L., Wohlschläger, A. M., Newen, A., Krämer, N., Shah, N. J., Fink, G. R., et al. (2006). ‘‘Being
with others’’: Neural correlates of social interaction.
Neuropsychologia, 44, 718 730.
Schuermann, M., Hesse, M. D., Stephan, K. E., Saarela,
M., Zilles, K., Hari, R., et al. (2005). Yearning to
yawn: the neural basis of contagious yawning.
NeuroImage, 24(4), 1260 1264.
Simpson, J. R., Öngür, D., Akbudak, E., Conturo, T. E.,
Ollinger, J. M., Snyder, A. Z., et al. (2000). The
emotional modulation of cognitive processing: An
fMRI study. Journal of Cognitive Neuroscience, 12,
157 170.
Singer, T. (2006). The neural basis and ontogeny of
empathy and mind reading: Review of literature and
implication for future research. In V. J. Brown & E.
Kelley (Eds.), Methodological and conceptual advances in the study of brain-behaviour dynamics: A
multivariate lifespan perspective. Neuroscience and
Biobehavioral Reviews, 30, 855 863.
Tessitore, A., Hariri, A. R., Fera, F., Smith, W. G.,
Chase, T. N., Hyde, T. M., et al. (2002). Dopamine
modulates the response of the human amygdala: a
study in Parkinson’s disease. Journal of Neuroscience, 22, 9099 9103.
Tokuno, H., Takada, M., Nambu, A., & Inase, M.
(1995). Direct projections from the oro-facial region
of the primary motor cortex to the superior colliculus in the macaque monkey. Brain Research, 703(1 2), 217 222.
Uddin, L. Q., Iacoboni, M., Lange, C., & Keenan, J. P.
(2007). The self and social cognition: the role of
cortical midline structures and mirror neurons.
Trends in Cognitive Sciences, 11(4), 153 157.
Wallbott, H. G. (1991). Recognition of emotion from
facial expression via imitation? Some indirect evidence for an old theory. British Journal of Social
Psychology, 30, 207 219.
Warren, J. E., Sauter, D. A., Eisner, F., Wiland, J.,
Dresner, M. A., Wise, R. J., et al. (2006). Positive
emotions preferentially engage an auditory-motor
‘‘mirror’’ system. Journal of Neuroscience, 26(50),
13067 13075.
Wild, B., Erb, M., Eyb, M., Bartels, M., & Grodd, W.
(2003a). Why are smiles contagious? An fMRI study
of the interaction between perception of facial affect
and facial movement. Psychiatry Research: Neuroimaging, 123, 17 36.
Wild, B., Rodden, F. A., Grodd, W., & Ruch, W.
(2003b). Neural correlates of laughter and humour.
Brain, 126(10), 2121 2138.
Wilson, S. M., Saygin, A. P., Sereno, M. I., & Iacoboni,
M. (2004). Listening to speech activates motor areas
involved in speech production. Nature Neuroscience,
7(7), 701 702.
Wittgenstein, L. (1974). Philosophical investigations.
Oxford, UK: Blackwell.