Brain Networks Involved in Viewing Angry Hands or Faces

Cerebral Cortex August 2006;16:1087--1096
doi:10.1093/cercor/bhj050
Advance Access publication October 12, 2005
Most neuropsychological research on the perception of emotion
concerns the perception of faces. Yet in everyday life, hand actions
are also modulated by our affective state, revealing it, in turn, to the
observer. We used functional magnetic resonance imaging (fMRI)
to identify brain regions engaged during the observation of hand
actions performed either in a neutral or an angry way. We also
asked whether these are the same regions as those involved in
perceiving expressive faces. During the passive observation of
emotionally neutral hand movements, the fMRI signal increased
significantly in dorsal and ventral premotor cortices, with the
exact location of the ‘peaks’ distinct from those induced by face
observation. Various areas in the extrastriate visual cortex were
also engaged, overlapping with the face-related activity. When the
observed hand action was performed with emotion, additional
regions were recruited including the right dorsal premotor, the right
medial prefrontal cortex, the left anterior insula and a region in the
rostral part of the supramarginal gyrus bilaterally. These regions,
except for the supramarginal gyrus, were also activated during the
perception of angry faces. These results complement the wealth of
studies on the perception of affect from faces and provide further
insight into the processes involved in the perception of others,
processes that may underlie social constructs such as empathy.
Keywords: action observation, emotion, empathy, fMRI, fronto-parietal
circuits, occipito-temporal cortex, social cognition
Introduction
By observing someone performing a simple act, such as picking
up a telephone, one can often tell if this person is happy, angry
or sad. Our brain is thus able to extract not only the meaning
and goal of a visually observed action, but also information about
the agent, such as his/her affective state. The expression of
emotions by body movements and posture has been studied
by biologists in the past (Darwin, 1872; Hess and Bruger, 1943),
yet most neuropsychological research on the perception of
emotion has concentrated on the decoding of facial expression
(Adolphs, 2002). The goal of this study is to investigate which
brain regions are engaged when we observe hand actions
performed with an emotion and how this compares with the
perception of emotional face movements.
A wealth of neuropsychological and neuroimaging studies
have demonstrated that when observers are asked to watch faces
depicting an emotion, a brain network that includes the lateral
fusiform gyrus, the superior temporal sulcus, the amygdala, the
orbitofrontal cortex and the insula is engaged consistently
(Adolphs, 2002; LaBar et al., 2003). But it is not clear whether
those brain regions respond specifically to faces, or to emotion in
faces, or whether they might be more generally involved in the
perception of emotion expressed by other people.
© The Author 2005. Published by Oxford University Press. All rights reserved.
For permissions, please e-mail: [email protected]

Marie-Hélène Grosbras¹ and Tomáš Paus¹,²
¹Cognitive Neuroscience Unit, Montreal Neurological Institute, McGill University, Montreal, Canada and ²Brain & Body Centre, University of Nottingham, Nottingham, UK
It is now well established that there are two main neural
systems, first described in non-human primates, dedicated to
the perception of motion of other living beings. First, some
neurons within the superior temporal sulcus (STS) respond
selectively to the presentation of dynamic bodies, body parts or
faces (Perrett et al., 1982, 1985; Allison et al., 2000). Second,
‘mirror neurons’ found within the inferior premotor cortex
and/or inferior frontal gyrus, as well as in the anterior inferior
parietal lobe, are active both when the subject performs
a specific action himself and when he observes another individual performing the same action (reviewed in Rizzolatti and
Craighero, 2004). However, only a few previous studies of this
‘action-observation’ system have considered the emotional
aspect of the action. A few neuroimaging studies in humans
have shown that passively viewing caricatured silhouettes,
point-of-light displays or whole-body postures symbolizing an
emotion engages regions in the superior temporal sulcus, the
fusiform gyrus and the amygdala (Bonda et al., 1996; Hadjikhani
and de Gelder, 2003; de Gelder et al., 2004). Interestingly, those
regions are frequently described as being involved in the
processing of emotion from static or dynamic images of faces.
To date, however, the neural correlates of the perception of
emotion from natural everyday movements have not been
investigated.
We designed the present study to bring together the emotion
and the action-observation research fields. We addressed the
following questions: to what extent does the affect of the agent
modulate the brain activity engaged by observing someone
else’s action? Are specific regions recruited when the observed
agent performs the action with emotion? How does this
compare to the brain network recruited during the observation
of emotional face movements? In addition, we sought to
contribute to the debate on the specificity of face perception
by investigating whether the observation of face movements
and the observation of hand movements share a common neural
basis and emotional modulation.
Materials and Methods
Subjects
Twenty healthy adults (10 females, age range 19--46 years, mean = 28.6
years) participated after providing written informed consent. All were
right-handed and had normal or corrected-to-normal vision. The study
conformed to the Declaration of Helsinki and was approved by the
Research Ethics Board of the Montreal Neurological Institute and
Hospital.
Stimuli
Since many psychophysics studies showed that anger was the most
reliably decoded emotion from dance or gesture (Dittrich et al., 1996;
Boone and Cunningham, 1998; Pollick et al., 2001, 2002), we chose to
compare neutral and angry movements.

Figure 1. Stimuli. Snapshots were taken at the beginning of representative clips of each condition. The video clips were displayed at 30 frames/s. Two consecutive images in the figure are separated by five frames.

The experiment involved
passive viewing of video clips. We used a 2 × 2 factorial design
with body parts (hand and face) and emotional states (neutral and angry)
as independent variables. We also included control non-biological motion
stimuli to assess the effect of each body-part condition separately.
The stimuli consisted of short (2--5 s) black-and-white video clips
depicting either a hand action or a face in movement. They were
digitized and edited using Adobe Premiere. Luminance and contrast
were equalized and gamma correction was applied. Examples of stimuli
can be seen in Figure 1.
Eight actors (four females) were filmed for the face movements. They
were instructed to express happiness, anger, or sadness starting from
a neutral point. We also extracted short video clips from the periods
when the actors were not expressing the emotions but were nonetheless moving their face (e.g. twitching their nose, opening their mouth,
blinking their eyes). Twenty video clips were selected for the angry and
neutral face movements respectively. Four volunteers judged the
intensity of each of three categories of emotion (happiness, sadness,
and anger) from those clips. The average rating for the angry face movements, on a scale of 1 (not angry at all) to 9 (very angry), was 7.94 ± 0.77
(mean ± SD). The average rating for the neutral faces was 2.18 ± 0.84
for anger, 2.97 ± 1.07 for sadness and 3.49 ± 1.03 for happiness. When
combined across the happiness, sadness and anger scales, the rating
was 2.92 ± 1.18.
Three actors performed the hand actions with their right or left hand.
They were instructed to reach, grasp and manipulate eight different
objects (phone, pencil, spoon, computer mouse, glass, hammer, screwdriver, and cup) in a neutral, sad, happy or angry way. The field of view
was such that only the hand and arm were visible; neither the shoulder
nor any other body parts appeared. Each video clip started with the
object alone, placed at about one-third from the left edge of the screen;
the hand, whether a left or a right hand, always arrived from the right
side of the screen to reach the object. After grasping and
manipulating the object, the hand returned the way it had come.
Prior to the experiment, four observers rated ~200 video clips,
indicating for each emotion (sadness, happiness, anger) the emotional
intensity on a scale from 1 to 9. Based on this rating, we selected the 15
clips that provided the highest score for anger (7.23 ± 0.50). For the
neutral stimuli we selected the 15 clips that provided the lowest score
across the three emotions (1.09 ± 0.1 when pooling the three emotions;
1.03 ± 0.09 for anger, 1.03 ± 0.09 for sadness and 1.22 ± 0.30 for
happiness).
For the four observers who rated the stimuli, recognition of
anger from the video clips was high (between 75 and 100%; average
across stimuli and observers: 84% for hands and 96% for faces).
The selected video clips were arranged into 18 s blocks. Each block
included 4--7 video clips. The duration of the video clips was matched
across the faces and hands blocks. In each of the 10 blocks of hand
actions, 20 and 80% of the total video clip time contained, respectively,
the arm movement alone and the movements involving the interaction
between the hand and the object. There was no difference between
the neutral and angry clips [mean hand--object duration for neutral movements across five blocks = 13.49 ± 0.83 s, and for angry movements = 13.63 ± 0.74 s, t(4) = 0.95, P < 0.42; mean arm-movement duration for neutral movements = 3.71 ± 0.22 s and for angry movements = 3.45 ± 0.27 s, t(4) = 0.95, P < 0.41]. The control stimuli consisted of black-and-white concentric circles of various contrasts, expanding and contracting at various speeds, roughly matching the contrast and motion
characteristics of the faces and hands clips. These control stimuli
were adapted from a study by Beauchamp et al. (2003). A pilot fMRI
study conducted in two subjects showed that, in the hand condition, the
expanding circles yielded results highly similar to those obtained
with control stimuli consisting of moving bars matched to the hand movements.
Five blocks of each biological motion condition (neutral hands, angry
hands, neutral faces, angry faces), and 10 blocks of the control condition
were intermixed and presented to the subjects using the software
Presentation (www.neurobs.com). To ensure that the stimuli were
synchronized with the MR image acquisition, a signal sent by the
scanner at the beginning of each image acquisition was converted into
a TTL pulse transmitted to the stimulation computer through a USB port.
The stimulus-delivery software (Presentation) read every sixth TTL pulse as
a signal to start a new block. The experiment lasted 9 min. Two different
stimulus orders were counterbalanced across subjects.
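The timing logic described above can be sketched as a simple pulse counter. This is a hypothetical illustration in Python (the experiment itself used Presentation's own scripting; the function name `block_onsets` is invented here): with TR = 3 s, six pulses span one 18 s block, so a 9 min run of 180 acquisitions yields 30 blocks.

```python
TR_S = 3.0            # repetition time in seconds
PULSES_PER_BLOCK = 6  # 6 pulses x 3 s = one 18 s stimulus block

def block_onsets(n_pulses):
    """For each TTL pulse index, flag whether it starts a new block."""
    return [i % PULSES_PER_BLOCK == 0 for i in range(n_pulses)]

onsets = block_onsets(180)       # 180 acquisitions in one run
n_blocks = sum(onsets)           # 30 blocks (5 x 4 biological + 10 control)
run_minutes = 180 * TR_S / 60.0  # 9.0 min, matching the reported duration
print(n_blocks, run_minutes)
```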
In the scanner, the stimuli were projected on a screen placed at the
feet of the subject. They subtended 10° × 7° of visual angle and were
viewed through a mirror. Subjects were asked to watch the movies
carefully and were told that they would be asked questions about what
they saw after the scan. After the scanning session, we verified that they
could recognize a subset of 10 face and hand stimuli within a set of
14 clips (four foils).
Functional Magnetic Resonance Imaging
Scanning was performed on a 1.5 T Siemens Sonata imager. First, we
acquired a high-resolution T1-weighted 3D structural image (matrix
256 × 256 × 170; 1 mm³ voxels) for anatomical localization and coregistration with the functional time series. A series of blood oxygen-level-dependent (BOLD) T2*-weighted gradient-echo echo-planar
images was then acquired (matrix size 64 × 64; TE = 50 ms; TR = 3 s;
180 frames collected after the gradients had reached steady state; voxel
size 4 × 4 × 4 mm³). Each volume consisted of 32 slices, oriented parallel
to a line connecting the base of the cerebellum to the base of the
orbitofrontal cortex and covering the whole brain.
The images were assessed for head motion and realigned to the first
frame using AFNI (Cox, 1996). Then they were spatially smoothed using
a 6 mm full-width half-maximum Gaussian filter. We checked that
motion did not exceed one millimeter or one degree in any direction.
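For reference, a Gaussian kernel's full-width at half-maximum relates to its standard deviation by σ = FWHM / (2√(2 ln 2)), so the 6 mm filter corresponds to σ ≈ 2.55 mm; a minimal sketch:

```python
import math

def fwhm_to_sigma(fwhm_mm):
    """Convert a Gaussian kernel's full-width at half-maximum to its SD."""
    return fwhm_mm / (2.0 * math.sqrt(2.0 * math.log(2.0)))

sigma = fwhm_to_sigma(6.0)  # the 6 mm FWHM filter used here
print(round(sigma, 2))      # ~2.55 mm
```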
The statistical analysis was performed using fmristat (Worsley et al.,
2002), a dedicated Matlab (Mathworks Inc.) toolbox. For each image,
the signal value at each voxel was expressed as a percentage of the signal
averaged across the whole volume in order to minimize unspecific time
effects. The time course of the response was modeled by convolving
each condition with a hemodynamic response function
(Glover, 1999). Based on this model, regression coefficients were
calculated at each voxel using the general linear model. For each
individual, we computed the t-statistic maps for the contrasts of each
action-observation condition versus the control condition, as well as the
contrasts between the emotional versus neutral conditions.
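The per-voxel model just described can be outlined as follows. This is a simplified sketch on synthetic data, not the fmristat implementation (which additionally handles drift, temporal autocorrelation and spatially varying noise); the gamma-shaped HRF below is only a stand-in for the Glover (1999) function:

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, TR = 180, 3.0   # 180 frames at TR = 3 s, as in the acquisition

# Boxcar regressor for one condition: 18 s (6-frame) blocks, simplified
# here to a single on/off alternation.
box = np.tile(np.r_[np.ones(6), np.zeros(6)], 15)[:n_frames]

# Gamma-shaped HRF sampled at the TR (stand-in for Glover, 1999).
t = np.arange(0, 30, TR)
hrf = t**5 * np.exp(-t / 1.2)
hrf /= hrf.sum()

reg = np.convolve(box, hrf)[:n_frames]         # predicted BOLD time course
X = np.column_stack([reg, np.ones(n_frames)])  # design matrix + intercept

# Synthetic voxel: a 1.5% signal effect plus Gaussian noise.
y = 1.5 * reg + rng.normal(0.0, 0.3, n_frames)

# General linear model: least-squares fit, then a t-statistic for the
# contrast 'condition versus baseline'.
beta, res, _, _ = np.linalg.lstsq(X, y, rcond=None)
dof = n_frames - X.shape[1]
c = np.array([1.0, 0.0])
se = np.sqrt((res[0] / dof) * (c @ np.linalg.inv(X.T @ X) @ c))
t_stat = (c @ beta) / se
print(t_stat > 5.4)  # does this simulated voxel pass a corrected threshold?
```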
To analyse the individual results, we overlaid each subject's
statistical maps on the corresponding high-resolution T1-weighted
image; these overlays were used to compare the hand-observation and face-observation conditions.
To obtain the average group t-maps, all individual maps were first
transformed into a common standard space (MNI 305; Collins et al.,
1994), then combined using a random-effects model using the multistat
function from the fmristat toolbox. The resulting t-statistic images were
thresholded at P < 0.05 using Gaussian random-field theory to correct
for multiple comparisons. For the (more specific) contrast ‘angry hand
movements versus neutral hand movements’, the images were masked
by the results of the (less specific) contrast ‘angry hand movements
versus control motion stimuli’ thresholded at P < 0.001 uncorrected for
multiple comparisons. The same was done for the faces.
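The masking procedure amounts to an element-wise gate between two t-maps; a toy sketch with invented values (t ≈ 3.09 approximates P < 0.001 uncorrected, one-sided, under a Gaussian assumption):

```python
import numpy as np

# Two hypothetical t-statistic maps over the same four voxels.
t_specific = np.array([4.2, 1.0, 3.5, 6.0])  # angry vs. neutral hands
t_broad    = np.array([5.0, 5.0, 2.0, 7.0])  # angry hands vs. control

MASK_T = 3.09  # ~P < 0.001 uncorrected (one-sided normal approximation)

# Keep the specific contrast only where the broader contrast passes
# the mask threshold, as in the group analysis described above.
masked = np.where(t_broad > MASK_T, t_specific, np.nan)
print(masked)  # the third voxel is excluded despite t = 3.5
```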
The group t-maps were superimposed on a group anatomical image
obtained by averaging the T1-weighted images acquired in all 20
subjects and transformed into the MNI-305 standard space. The
anatomical location of the statistical maxima (peaks) of signal change
was determined with the help of the Talairach and Tournoux (1988) and
Duvernoy (1992) atlases. In addition, to characterize further the activity
within the inferior frontal gyrus, the t-statistic maps were also superimposed on the probabilistic cytoarchitectonic maps of Brodmann
areas 44 and 45 (Amunts et al., 1999).
Results
Observation of Angry Movements of the Face and Hand
First we investigated the main effect of emotion for each body
part (hand and face) by comparing the angry and neutral
conditions, separately, to the baseline control (see Table 1).
The obtained results maps were used as masks to test for higher
BOLD signal in the angry condition than in the neutral
condition, for the hand and face movements separately.
Angry Hand Movements (Figs 2 and 3)
As shown in Table 1, viewing angry and neutral hand
actions, compared with control motion stimuli, engaged common regions. These included locations that have been consistently reported in studies involving executing, imagining or
observing reaching, pointing or grasping movements (e.g.
Binkofski et al., 1999; reviewed in Culham and Kanwisher,
2001), part of the middle occipital gyrus and part of the fusiform
gyrus in both hemispheres. These regions included a peak close
to the occipital transverse sulcus that might correspond to the
‘extrastriate body area’ (Downing et al., 2001) and a part of the
region engaged during the observation of faces (see below).
Within this ‘action observation’ network, a significantly
higher BOLD signal could be observed in the right superior
precentral sulcus, in a region of the middle temporal gyrus
bilaterally and in a small focus in the cerebellum. When the
threshold of the mask was lowered further to P < 0.05
uncorrected for multiple comparisons, a region in the right
fusiform cortex was identified, which showed significantly
higher signal for angry than for neutral hands (x = 32 mm,
y = –56 mm, z = –10 mm; t = 3.52).
Some regions were engaged for angry movements, as compared with control stimuli, but not for neutral hand movements.
Those were located in the anterior and lateral part of the
inferior parietal lobule (parietal operculum and supramarginal
gyrus), in the pre-SMA, the medial prefrontal cortex, in a region
of the ventral pars opercularis of the inferior frontal gyrus, in the
anterior part of the superior temporal sulcus and the amygdala.
Most of them also showed significant t statistics when we
contrasted directly ‘angry hand movements versus neutral hand
movements’ (Fig. 3, second row; Table 2). However, in the
amygdala and in the ventral pars opercularis, the difference in
BOLD signal was not statistically significant (t = 2.66, P < 0.0078
uncorrected, and t = 1.88, P < 0.037 uncorrected, respectively).
Angry Face Movements
For the face movements, we did not observe any region showing
significant increase in BOLD signal in the contrast ‘angry faces
versus control’ without seeing the same regions in the contrast
‘neutral faces versus control’. Also, the direct comparison ‘angry
faces versus neutral faces’, masked by the contrast ‘angry faces
versus control’, did not show any significant increases in BOLD
signal. Regions showing a difference close to significance include the left amygdala (x = –20 mm, y = –4 mm, z = –12 mm;
t = 2.81, P < 0.011 uncorrected), the left thalamus, the left insula
(x = –40 mm, y = 16 mm, z = 8 mm; t = 2.67, P < 0.016 uncorrected) and bilaterally a region of the lateral inferior occipital
cortex at the junction with the fusiform gyrus (x = –40 mm,
y = –58 mm, z = –16 mm; t = 2.56, P < 0.02 uncorrected, x = 40
mm, y = –60 mm, z = –16 mm; t = 2.09, P < 0.05 uncorrected).
The other regions, engaged similarly by angry and neutral face
movements compared with control stimuli, included peaks in
the dorsal and ventral premotor cortices in both hemispheres.
The increase in BOLD signal in the inferior precentral sulcus
was located close to the junction with the inferior frontal
sulcus. This peak of signal change extended into the pars
opercularis (area 44 according to the probabilistic map) of the
inferior frontal gyrus. In the right hemisphere, we also observed
an increase in BOLD signal in the dorsomedial part of the
probabilistic area 45 with no overlap with area 44. No significant
signal change was observed in the parietal lobes. However,
when the threshold was lowered to P < 0.001 uncorrected,
a small cluster (seven voxels) could be identified in the middle
part of the intraparietal sulcus (peak x = 32 mm, y = –56 mm,
z = 40 mm; t = 4.91).
Several peaks could be distinguished, in both hemispheres, in
the fusiform gyrus, along the superior temporal sulcus and in
the middle temporal gyrus, including the area described as the
MT complex (Orban et al., 1999; Watson et al., 1993) and the
extrastriate body area (Downing et al., 2001). The latter area
was found in the same location as the one observed during hand
movements.
Angry Hands and Angry Faces: Common Substrate
A conjunction analysis revealed that only one region, namely the
left anterior insula (x = –40 mm, y = 12 mm, z = 8 mm), showed
significantly higher BOLD signal in both ‘angry hand movements
versus neutral hand movements’ and ‘angry face movements
versus neutral face movements’.
Observation of Anger in Hand Movements: Interaction
We next tested the interaction between factors emotion and
body part. Significantly higher BOLD signal in the contrast
‘Angry Hand versus Neutral Hand’ than in the contrast ‘Angry
Face versus Neutral Face’ was found in the left supramarginal
gyrus (t = 5.68, P < 0.05 corrected for multiple comparisons) and, at a
lower threshold (t = 4.30, P < 0.0001 uncorrected), in the right
supramarginal gyrus.

Table 1
Stereotaxic coordinates of the peaks of BOLD signal change while observing hand or face movements relative to the control condition. For each condition (hand neutral, hand angry, face neutral, face angry), peaks (x, y, z and t-value) were tabulated by region and hemisphere: fronto-parietal regions (dorsal and ventral precentral sulcus; IFS/MFG; IFG, pars triangularis; IFG, pars opercularis; medial prefrontal; pre-SMA; AIP/postcentral gyrus; precuneus; supramarginal gyrus; MIP; posterior SPL), occipito-temporal regions (MTG; STS; anterior STG; MOG; fusiform gyrus; occipito-parietal sulcus) and subcortical structures (cerebellum; thalamus; striatum; amygdala; hippocampus; anterior insula). [The numeric entries of this table could not be recovered from the source layout.]
AIP: anterior intraparietal sulcus; IFG: inferior frontal gyrus; IFS: inferior frontal sulcus; MFG: middle frontal gyrus; MIP: middle intraparietal sulcus; MOG: middle occipital gyrus; MTG: middle temporal gyrus; SPL: superior parietal lobule; STG: superior temporal gyrus; STS: superior temporal sulcus.
(a) Junction with inferior frontal sulcus; probability value on cytoarchitectonic maps: area 44, 40%; area 45, 50%. (b) Probability value on cytoarchitectonic maps: area 44, 30%; area 45, 50%. (c) Probability value on cytoarchitectonic maps: area 44, 30%; area 45, 80%. (d) Probability value on cytoarchitectonic maps: area 44, 0%; area 45, 80%.

Another site of significant interaction was also observed in the left cerebellum. In this case, the interaction
was due to a higher signal for angry as compared with neutral
hands and a lower signal for angry as compared with neutral
faces. Finally, we found no regions in which the BOLD signal
was higher in the contrast ‘Angry Face versus Neutral Face’ as
compared with the contrast ‘Angry Hand versus Neutral Hand’.
Comparison of Emotionally Neutral Hands and
Faces (Table 3 and Fig. 4)
Thirdly, we tested the main effect of body part by first
contrasting each neutral movement condition to the baseline
and then comparing directly the two neutral conditions (i.e.
hand versus face neutral movements).
Comparison of ‘Neutral Hand Movements versus Control
Motion Stimuli’ and ‘Neutral Face Movements versus Control
Motion Stimuli’
In the group maps, the ‘hand’ peak in the left superior precentral
sulcus was more medial, dorsal and posterior than the ‘face’ peak
(see Fig. 4 and Table 3). In the inferior precentral sulcus, the overlap was greater and the hand/face segregation was less apparent.
Figure 2. Tri-dimensional rendering of the t-statistic maps for the contrasts between each movement type and the control condition. Note that regions located deeper in the sulci do not necessarily appear clearly in this type of figure. See also Table 1 and Figure 3.

Figure 3. Brain regions showing an increase in BOLD signal when watching angry hand movements as compared with control motion stimuli (first row), angry hand movements as compared with neutral hand movements (second row), and angry face movements as compared with control motion stimuli (third row). The histograms represent the signal change [mean and SD, after converting to percentage (see Materials and Methods)] compared with control stimuli for, from left to right: neutral hands, angry hands, neutral faces, angry faces. The scale ranges from –0.2 to 0.4, except for the MTG and the fusiform gyrus, for which it ranges from –0.2 to 0.6. The color scale indicates the value of Student’s t-statistic. Only those voxels showing a t-value > 5.4 (first and third rows; P < 0.05 corrected for multiple comparisons) or > 2.9 (second row; P < 0.001 uncorrected for multiple comparisons) are represented in the figure. Amg: amygdala; AIP: anterior intraparietal sulcus; Fus: fusiform gyrus; Ins: insula; Mpf: medial prefrontal cortex; MTG: middle temporal gyrus; Pf: inferior prefrontal cortex; Pmd: dorsal premotor cortex; Pmv: ventral premotor cortex; Pre-SMA: pre-supplementary motor area.

In the individual data, the same topological relationship could
be observed in the superior precentral cortex of 15/19 left
hemispheres and 10/16 right hemispheres; we could not
conduct this comparison in the remaining hemispheres because
of the lack of signal change. In the inferior precentral sulcus, the
somatotopic relationship was more difficult to assess because, in many individuals, the observation of hand and face
actions induced several peaks of signal change in the sulcus and
adjacent inferior frontal gyrus (Broca's area and its homologue in
the right hemisphere).
In the group maps, the activity induced by observing
emotionally neutral hand and face movement overlapped
greatly in the right posterior superior temporal sulcus, in the
right middle occipital gyrus and in the fusiform gyrus of both
Table 2
Stereotaxic coordinates of the peaks of BOLD signal changes when watching angry hand movements relative to watching neutral hand movements. Peaks (x, y, z and t-value) were tabulated for the following regions: precentral gyrus (R); medial prefrontal (R); inferior frontal gyrus (L); posterior insula (L); anterior insula (L); postcentral (L); SMG/perisylvian parietal region (L, R); anterior STG (L); STS (L); MTG (L, R); parieto-occipital sulcus (L); cerebellum (R). [The numeric entries of this table could not be recovered from the source layout.]
MTG: middle temporal gyrus; SMG: supramarginal gyrus; STS/STG: superior temporal sulcus/gyrus.
Table 3
Topography of face and hand observation in the frontal and occipito-temporal cortices

Region      Difference (hand--face)   x      y      z      Vector distance (mm)   SD      n
Left Pmd    Averaged                  12.7   5.78   9.222  18.97                  8.479   19
            Group                     20     4      4      20.78
Right Pmd   Averaged                  8.81   6.25   5.813  16.91                  10.2    16
            Group                     12     12     8      18.76
Left Pmv    Averaged                  8.5    9.3    7      18.63                  8.371   11
            Group                     0      12     4      12.65
Right Pmv   Averaged                  0.44   3.22   7      11.79                  7.675   9
            Group                     8      8      4      12
Left Fus    Averaged                  5.18   4.29   6.647  13.88                  6.836   17
            Group                     0      4      4      5.657
Right Fus   Averaged                  2.64   0.09   0.818  12.52                  3.608   10
            Group                     0      4      4      5.657

The values express the difference between the coordinates of the peak for watching neutral hand movements relative to control motion stimuli and the coordinates of the peak for watching neutral face movements relative to control motion stimuli (i.e. x-hand – x-face, y-hand – y-face and z-hand – z-face). A positive difference in x for a left-hemisphere region signifies that the hand region lies medially relative to the face region, whereas a positive difference in x for the right hemisphere signifies that the hand region lies laterally to the face region. For the purpose of this comparison, the threshold was set to P < 0.001 (uncorrected). The last column indicates the number of subjects showing significant activity for both hand and face stimuli at this threshold. Note that even with this relatively low threshold, premotor and occipito-temporal regions could not be identified in every subject. For each region, the first row provides the average of the coordinate differences calculated in each individual; the second row provides the difference between the coordinates identified in the group analysis. The vector-distance column gives the Euclidean distance between the hand and face peaks, i.e. sqrt((x-hand – x-face)² + (y-hand – y-face)² + (z-hand – z-face)²).
Fus: fusiform gyrus; Pmd: dorsal premotor cortex; Pmv: ventral premotor cortex.
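The vector distance in Table 3 is the Euclidean norm of the coordinate differences; for instance, the group-analysis differences for the left dorsal premotor cortex (20, 4 and 4 mm) give:

```python
import math

def vector_distance(dx, dy, dz):
    """Euclidean distance (mm) between the hand and face peaks."""
    return math.sqrt(dx**2 + dy**2 + dz**2)

# Group-analysis coordinate differences for left Pmd.
print(round(vector_distance(20, 4, 4), 2))  # 20.78 mm, matching the table
```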
hemispheres. In the posterior STS as well as in the adjacent
occipital cortex, the topological organization was difficult to
characterize given that several peaks were present in most
individuals. Overall, however, the signal change induced by
watching face movements extended more towards the middle
part of the STS and the signal change induced by watching hand
movements extended more towards the posterior branch of the
STS, middle occipital and inferior temporal gyri. This pattern is
highly similar to the one described by Pelphrey et al. (2005).
In the individual data, the ‘fusiform’ peaks for the face and
hand were separated by more than two voxels in 9/15 left
hemispheres and 4/10 right hemispheres. In the left hemi1092 Perception of Angry Hands and Angry Faces
d
Grosbras and Paus
Figure 4. Localization of increases in BOLD signal when watching neutral hand or face
movements in the premotor (top panel), right superior temporal sulcus and right
fusiform regions (bottom panel). These images were constructed by coding in yellow
the voxels showing a t-value above threshold (P < 0.001 uncorrected for multiple
comparisons) only in the hand actions group-averaged map, in blue those above
threshold only in the face movements group-averaged map and in pink the voxels
above threshold in both maps.
sphere, the peak for the observation of hand movements tended
to be more dorsal and posterior than the one for observation of
face movements.
Direct Comparison of the Observation of Neutral Hand
Movements and Observation of Neutral Face Movements
The BOLD signal was significantly higher during the observation
of hand (versus face) movements in the left dorsal and ventral
premotor cortex and in several locations in the parietal cortex
and the mid-temporal regions; the latter included the region
corresponding to the extrastriate body area as well as the medial
part of the left fusiform gyrus and the posterior part of left
superior temporal sulcus. Observing movements of the face led
to a higher change in BOLD signal, compared with hands, in the
right dorsal premotor cortex and the amygdala bilaterally. No
significant difference could be observed in the region corresponding to the fusiform face area, even when the threshold was lowered
to P < 0.001 uncorrected.
Discussion
Three categories of regions were engaged during the observation of hand movements performed with anger: (i) regions
engaged uniquely during the perception of angry hand movements and not at all during observation of face movements; (ii)
regions involved in the perception of angry hand movements
but also neutral and angry face movements; and (iii) regions
engaged when watching both angry and neutral hand movements. The discussion will focus successively on these three
categories of regions, underlining, when relevant, the similarities and differences between the perception of hand and face
movements. Given the passive nature of the task, however, we cannot ascertain whether the observed differences between the processing of hands and faces are due to differences in the degree of attention or hypothesis generation when watching these video clips; indeed, the average recognition of emotion was slightly better for the faces than for the hands.
A Region Specific to the Perception of Emotional
Hand Movements
We observed only one brain region that was engaged during the
perception of hand movements performed with anger and not
by any other category of movements, as demonstrated by the
significant interaction between emotion (angry/neutral) and
body part (hand/face). This region is located in the supramarginal gyrus close to the adjacent perisylvian cortex in both
hemispheres. Lesions encroaching on this region appear to
impair recognition of emotion from point-of-light displays of
whole-body movements (Heberlein et al., 2004). The supramarginal gyrus, especially in the left hemisphere, has been
implicated in motor attention, i.e. directing attention towards
one’s limb (Rushworth et al., 2001). It is possible that observing
an action performed with an emotion induces an automatic
shift of attention towards one’s own motor repertoire without
any overt movement. Such shifts of attention could perhaps
facilitate interpretation of emotions embedded into the movement. The supramarginal gyrus has also been consistently
implicated in the perceptual analysis of complex hand gestures
independently of overt execution (Hermsdorfer et al., 2001;
Tanaka et al., 2001; Nakamura et al., 2004). In particular, it is
engaged when people proficient in sign language, but not nonsigners, view signs with detailed spatial configuration of fingers
(Emmorey et al., 2002; MacSweeney et al., 2004). Overall, the
above findings suggest that this part of the inferior parietal lobe
is important for the perceptual analysis of action-relevant
sensory input, especially input of a communicative nature. Note
that in our study, as in the Heberlein et al. study of point-of-light
displays, this part of the supramarginal cortex is involved during the observation of gestures that are not by themselves
communicative; it is the conveyed emotion that makes the
movements relevant for social interactions. Thus, this part of the supramarginal gyrus would be complementary to the mirror-neuron system by providing access to the meaning of the observed action that goes beyond the mere description of the action goal.
Fronto-limbic System Engaged during the Observation of
both Angry Hand Actions and Expressive Faces
The only region showing a significant signal change in the
conjunction analysis between the two types of emotional
movements, compared with their neutral homologues, was
the left anterior insula. The engagement of insula is consistently
reported in studies of the perception or experience of emotion
(Kawashima et al., 1999; Wicker et al., 2003; Phillips et al.,
2004). Based on such findings, and the connectivity pattern of
the insula in non-human primates, some authors have proposed
that the insula serves as a relay between fronto-parietal areas
representing action and limbic areas processing emotion (Carr
et al., 2003). Indeed, the anterior insula possesses connections
with primary and secondary somatosensory cortex as well as
with anterior inferior parietal lobule and STS (Augustine, 1996),
which is consistent with the pattern of brain regions we
identified during the observation of anger stimuli. It has also
been shown that the anterior insula engagement during the
observation of painful stimulation applied to another person
correlates with empathy scores of the observer (Singer et al.,
2004). This finding, together with the link between the insula
and the autonomic system (Augustine, 1996), suggests that it
might play a role in inducing a resonance in the viscero-motor
centers of the observer while watching emotion in other people
(Wicker et al., 2003), and thereby plays a key role in empathy.
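The conjunction analysis mentioned above can be sketched as a minimum-statistic test: a voxel counts as commonly activated only if it exceeds the threshold in both emotion-versus-neutral contrasts. A minimal illustration with synthetic data (the threshold and array shapes are assumptions for illustration, not the study's actual parameters):

```python
import numpy as np

np.random.seed(1)
# Synthetic contrast t-maps: angry-minus-neutral for hand and face stimuli.
t_hand_emotion = np.random.randn(8, 8, 8)
t_face_emotion = np.random.randn(8, 8, 8)
t_crit = 3.1  # illustrative threshold

# Minimum-statistic conjunction: min(t1, t2) > t_crit is equivalent to
# requiring both contrasts to exceed the threshold at that voxel.
conjunction = np.minimum(t_hand_emotion, t_face_emotion) > t_crit
```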
The insula, however, was also significantly activated when
the observers watched the neutral faces as compared with the
control condition. Similarly, other regions engaged during the
perception of angry hand movements, but not neutral hand
movements, were also recruited by both neutral and angry faces.
There is evidence that even facial motion that is not part of the typical emotional expressions can be recognized as conveying an emotional message (Wallbott, 1991). Thus, even if
our neutral face stimuli were rated as neutral, they may have
recruited regions involved in the perception or decoding of
emotion. Our data then suggest that a network of brain regions
is commonly activated during the observation of angry hand
actions and dynamic expressive faces. This network included,
besides the left insula, the left anterior superior temporal
gyrus, the right medial prefrontal, the right anterior insula
and, although engaged to a lesser extent, the right amygdala.
Increases in BOLD signal in the anterior superior temporal gyrus
and the medial prefrontal cortex are consistently reported in
studies that involve some kind of social judgment, such as attributing mental states, thinking about others' intentions, or deciding about the social meaning of moving shapes (Castelli
et al., 2002; Frith and Frith, 2003; Schultz et al., 2003; Gallagher
and Frith, 2004). Thus, when watching an angry action, in
addition to attributing a goal to the agent (e.g. picking up the
phone), an observer might also be attributing the agent’s mental
state (i.e. anger) relevant for social interaction.
The amygdala has been identified in brain imaging studies involving the perception of stylized dance movements or silhouette postures (Bonda et al., 1996; Hadjikhani and de Gelder, 2003; de
Gelder et al., 2004). However, a recent study showed that patients with amygdala lesions misinterpreted the anger depicted in complex scenes when faces were visible, but not when faces were hidden, suggesting that the amygdala is crucial for the correct attribution of emotion from faces, but not from other visual cues (Adolphs and Tranel, 2003). In non-human
primates, besides face-selective neurons (Rolls, 1984), the
amygdala contains also neurons that respond to complex social
stimuli including expressive body movements and interactions
(Brothers et al., 1990). The present study, which is the first to
report amygdala engagement during the perception of emotion
embedded in actions that are not by themselves communicative, indicates that the role of the human amygdala goes beyond the processing of faces, although its engagement appears to be stronger for face stimuli.
The ‘co-activation’ of the amygdala and the fusiform gyrus is
interesting and similar to the Hadjikhani and de Gelder (2003)
finding. Lesion studies have demonstrated a direct impact of the
amygdala on emotional modulation of the fusiform cortex
(Vuilleumier et al., 2004). It might be that the activation of
the circuit represents a broader mechanism engaged by the
perception of visual stimuli relevant for social interactions
regardless of the exact nature of the stimuli. This view is
supported by the overlap in changes in the BOLD signal in the
fusiform gyrus when either watching faces or making judgments about a movement of geometric shapes depicting social
interactions (Schultz et al., 2003).
Action Observation System and the Perception of
Emotion
In addition to this common network related to the perception of emotion, watching hand movements and watching face movements induced a similar pattern of brain activity in regions known to be involved in the observation of other people's movements (see Introduction). The emotional modulation was subtle and found only in a few frontal and temporal regions.
These quantitative differences might be simply explained by
a greater stimulus saliency when the agent is perceived as angry,
rather than by specificity of anger perception. They might also
reflect the fact that an action performed with an emotion
induces a greater engagement of the action-observation system
of an observer.
The higher fMRI signal for angry, compared with neutral,
hands in the posterior STS is also interesting. The posterior STS
has been consistently implicated in perception of biological
motion (e.g. Allison et al., 2000; Beauchamp et al., 2003; Grezes
et al., 2003). It is also recruited during observation of dynamic
abstract shapes mimicking social interactions (Schultz et al.,
2003). One possibility is that the posterior STS contains neurons
important for judging biological movements relevant for interactions. The greater involvement of this region when observing
angry movements is consistent with this notion.
Fronto-parietal and Occipito-temporal Circuits Related
to Watching Hands or Faces: Overlap and Dissociation
Despite the lack of extensive emotional modulation, we would like to focus the last part of the discussion on the action-observation system and the similarities and differences between the observation of faces and hands.
The first interesting result is the consistent topographic
organization of signal changes induced by observing face and
hand movements respectively. Such topographic relationship
revealed in the context of passive action observation has already
been reported, but only for the ventral precentral cortex (in
group data: Buccino et al., 2001; in individual data: Wheaton
et al., 2004). The spatial pattern we observed corresponds well
to the somatotopical organization of movement representation
in the premotor cortex of non-human primates (Preuss et al.,
1996; Wise et al., 1997; Rizzolatti et al., 1998; Wu et al., 2000).
As such, the present data are in line with a fine-tuned matching
mechanism that maps the observed body movements to the
complex motor repertoires existing in the brain of the observer
(Calvo-Merino et al., 2004; Rizzolatti and Craighero, 2004).
The activity in the dorsal premotor cortex is not very often
reported in studies of passive observation of actions (Grafton
et al., 1996; Grezes et al., 2003). Such dorsal activation during
passive observation might correspond to the frontal eye field,
which would be involved in visual scanning and/or attentional
requirement of the task. However, this is unlikely given that we
found different peaks for observation of faces and hands. It is also
unlikely that it corresponds to the most dorsal part of the ventral
premotor cortex, as suggested in other studies (Buccino et al.,
2004). We observed two sets of peaks for the hand and face
condition: one clearly within the superior precentral sulcus and
the other within the inferior precentral sulcus and the adjacent
inferior frontal cortex. It is therefore clear that, in the present
study, both ventral and dorsal premotor cortices are engaged
during the passive observation of hand and face movements.
Even if the dorsal premotor cortex is not included as a part of the 'mirror-neuron' system sensu stricto, our results suggest that it might have some 'mirror' properties. Those might simply reflect
the possibility that the motor preparation system, which
encodes the intrinsic properties of movements, is automatically
engaged when we observe someone else’s movements. A recent
electrophysiological study in the macaque monkey strengthens
this hypothesis: the neuronal activity in many dorsal premotor
neurons was similar when individuals performed a learned
conditional reaching task and when they passively observed
the visual output of the task (Cisek and Kalaska, 2004).
In the parietal cortex, only the movements of the hand, but not
the face, led to significant increases in the BOLD signal (but note
the presence of a small ‘peak’ in the right hemisphere observed
during the face movements at a lower threshold). This weaker involvement of the parietal cortex might be due to differences in low-level features of the visual motion depicted
in hand and face stimuli. It might also be linked to the fact that the
face movements were not object-related (Buccino et al., 2001;
Rizzolatti and Craighero, 2004). Our findings are consistent with
results obtained by Thompson et al. (2004), who compared
watching meaningful finger movements and language-related lip
movements and showed involvement of the parietal cortex for
the hand condition only. As in the precentral cortex, we could
distinguish two regions within the parietal cortex. The more
rostral region corresponds to an area engaged when we grasp
objects (Binkofski et al., 1999) and represents a likely homologue of the monkey area AIP (Culham and Kanwisher, 2001),
which is strongly connected with the ventral premotor cortex
(containing mirror neurons). The more caudal intraparietal site
corresponds to an area recruited during reaching movements
and might be a homologue of the monkey area MIP (Colby and
Duhamel, 1996), which is strongly connected with the dorsal
premotor cortex (Rizzolatti et al., 1998). Thus, observing hand
and face movements appears to generate a somatotopic resonance of the motor system including not only circuits involved in
action representation and retrieval (ventral premotor and
anterior intraparietal cortices), but also circuits involved in
motor preparation and coding movements to specific locations
in space (dorsal premotor and middle intraparietal cortices).
In contrast to the fronto-parietal regions, the signal changes induced by the observation of face and hand movements, respectively, overlap in the occipito-temporal cortex. This has
been partly discussed above in the context of the emotional
modulation. The presence of such an overlap calls into question the specificity of the 'fusiform face area' (Kanwisher et al., 1997) for
the processing of faces. In non-human primates, neurons in
a corresponding face-responsive region of the inferotemporal
cortex respond to the presentation of bodies or body parts
(Perrett et al., 1985). The STS contains face-responsive neurons
and body-responsive neurons as well. This fact, together with
the involvement of the fusiform gyrus in processing nonbiological social stimuli (Schultz et al., 2003), suggests common
substrates for the perception of people-related visual stimuli.
Conclusion
Our data demonstrate further that areas involved in generating
action or emotion are also involved in the perception of these
actions and emotions displayed by others. They show that the
observation of everyday life hand actions performed with an
emotion recruits regions involved in the perception of emotion
and/or in communication. We speculate that, in addition to
inducing resonance in the motor program necessary to execute
an action, watching an action performed with emotion induces
a resonance in the emotional system responsible for the
affective modulation of the motor program. Such a mechanism could be a key to understanding how another person feels.
Further work is needed to distinguish the processes involved in
evaluating the emotion of an actor and the processes involved in
subjectively experiencing the feeling of the actor by emotional
contagion.
Notes
This work was supported by the Santa Fe Institute Consortium, the
Canadian Institutes of Health Research and the Fyssen Fondation
(France). The authors are grateful to Jon Rueckerman, Catherine
Poulsen, Valeria Della Maggiore and Kate Watkins for their assistance
in designing the stimuli and running the experiments, as well as for
useful comments on the manuscript. We also thank the reviewers for
providing essential suggestions for reshaping the manuscript.
Address correspondence to Tomáš Paus, Brain & Body Centre,
University of Nottingham, University Park, Nottingham NG7 2RD, UK.
Email: [email protected].
References
Adolphs R (2002) Neural systems for recognizing emotion. Curr Opin
Neurobiol 12:169--177.
Adolphs R, Tranel D (2003) Amygdala damage impairs emotion
recognition from scenes only when they contain facial expressions.
Neuropsychologia 41:1281--1289.
Allison T, Puce A, McCarthy G (2000) Social perception from visual cues:
role of the STS region. Trends Cogn Sci 4:267--278.
Amunts K, Schleicher A, Burgel U, Mohlberg H, Uylings HB, Zilles K
(1999) Broca’s region revisited: cytoarchitecture and intersubject
variability. J Comp Neurol 412:319--341.
Augustine JR (1996) Circuitry and functional aspects of the insular lobe
in primates including humans. Brain Res Brain Res Rev 22:229--244.
Beauchamp MS, Lee KE, Haxby JV, Martin A (2003) FMRI responses to
video and point-light displays of moving humans and manipulable
objects. J Cogn Neurosci 15:991--1001.
Binkofski F, Buccino G, Posse S, Seitz RJ, Rizzolatti G, Freund H (1999)
A fronto-parietal circuit for object manipulation in man: evidence
from an fMRI-study. Eur J Neurosci 11:3276--3286.
Bonda E, Petrides M, Ostry D, Evans AC (1996) Specific involvement of
human parietal systems and the amygdala in the perception of
biological motion. J Neurosci 16:3737--3744.
Boone RT, Cunningham JG (1998) Children’s decoding of emotion in
expressive body movement: the development of cue attunement.
Dev Psychol 34:1007--1016.
Brothers L, Ring B, Kling A (1990) Response of neurons in the macaque
amygdala to complex social stimuli. Behav Brain Res 41:199--213.
Buccino G, Binkofski F, Fink GR, Fadiga L, Fogassi L, Gallese V, Seitz RJ,
Zilles K, Rizzolatti G, Freund HJ (2001) Action observation activates
premotor and parietal areas in a somatotopic manner: an fMRI study.
Eur J Neurosci 13:400--404.
Buccino G, Vogt S, Ritzl A, Fink GR, Zilles K, Freund HJ, Rizzolatti G
(2004) Neural circuits underlying imitation learning of hand actions:
an event-related fMRI study. Neuron 42:323--334.
Calvo-Merino B, Glaser DE, Grezes J, Passingham RE, Haggard P (2004) Action observation and acquired motor skills: an fMRI study with expert dancers. Cereb Cortex. doi:10.1093/cercor/bhi007.
Carr L, Iacoboni M, Dubeau MC, Mazziotta JC, Lenzi GL (2003) Neural
mechanisms of empathy in humans: a relay from neural systems for
imitation to limbic areas. Proc Natl Acad Sci USA 100:5497--5502.
Castelli F, Frith C, Happe F, Frith U (2002) Autism, Asperger syndrome
and brain mechanisms for the attribution of mental states to
animated shapes. Brain 125:1839--1849.
Cisek P, Kalaska JF (2004) Neural correlates of mental rehearsal in the
dorsal premotor cortex. Nature 431:993--996.
Colby CL, Duhamel J-R (1996) Spatial representations for action in
parietal cortex. Cogn Brain Res 5:105--115.
Collins DL, Neelin P, Peters TM, Evans AC (1994) Automatic 3D
intersubject registration of MR volumetric data in standardized
Talairach space. J Comput Assist Tomogr 18:192--205.
Cox RW (1996) AFNI: software for analysis and visualization of
functional magnetic resonance neuroimages. Comput Biomed Res
29:162--173.
Culham JC, Kanwisher NG (2001) Neuroimaging of cognitive functions
in human parietal cortex. Curr Opin Neurobiol 11:157--163.
Darwin C (1872) The expression of the emotions in man and in animals.
London: John Murray.
de Gelder B, Snyder J, Greve D, Gerard G, Hadjikhani N (2004) Fear
fosters flight: a mechanism for fear contagion when perceiving
emotion expressed by a whole body. Proc Natl Acad Sci USA
101:16701--16706.
Dittrich WH, Troscianko T, Lea SE, Morgan D (1996) Perception of
emotion from dynamic point-light displays represented in dance.
Perception 25:727--738.
Downing PE, Jiang Y, Shuman M, Kanwisher N (2001) A cortical area
selective for visual processing of the human body. Science
293:2470--2473.
Duvernoy HM (1992) Le cerveau humain: surface, coupes sériées tridimensionnelles et IRM [The human brain: surface, three-dimensional serial sections and MRI]. Paris: Springer-Verlag.
Emmorey K, Damasio H, McCullough S, Grabowski T, Ponto LL, Hichwa
RD, Bellugi U (2002) Neural systems underlying spatial language in
American Sign Language. Neuroimage 17:812--824.
Frith U, Frith CD (2003) Development and neurophysiology of mentalizing. Philos Trans R Soc Lond B Biol Sci 358:459--473.
Gallagher HL, Frith CD (2004) Dissociable neural pathways for the
perception and recognition of expressive and instrumental gestures.
Neuropsychologia 42:1725--1736.
Glover GH (1999) Deconvolution of impulse response in event-related
BOLD fMRI. Neuroimage 9:416--429.
Grafton ST, Arbib MA, Fadiga L, Rizzolatti G (1996) Localization of
grasp representations in humans by positron emission tomography.
2. Observation compared with imagination. Exp Brain Res
112:103--111.
Grezes J, Armony JL, Rowe J, Passingham RE (2003) Activations related
to ‘mirror’ and ‘canonical’ neurones in the human brain: an fMRI
study. Neuroimage 18:928--937.
Hadjikhani N, de Gelder B (2003) Seeing fearful body expressions
activates the fusiform cortex and amygdala. Curr Biol 13:2201--2205.
Heberlein AS, Adolphs R, Tranel D, Damasio H (2004) Cortical regions
for judgments of emotions and personality traits from point-light
walkers. J Cogn Neurosci 16:1143--1158.
Hermsdorfer J, Goldenberg G, Wachsmuth C, Conrad B, Ceballos-Baumann AO, Bartenstein P, Schwaiger M, Boecker H (2001) Cortical correlates of gesture processing: clues to the cerebral mechanisms underlying apraxia during the imitation of meaningless gestures. Neuroimage 14:149--161.
Hess WR, Bruger M (1943) Das subkorticale Zentrum der affektiven
Abwehrreaktion. Acta Helv Physiol Pharmacol 1:33--52.
Kanwisher N, McDermott J, Chun MM (1997) The fusiform face area:
a module in human extrastriate cortex specialized for face perception. J Neurosci 17:4302--4311.
Kawashima R, Sugiura M, Kato T, Nakamura A, Hatano K, Ito K, Fukuda H,
Kojima S, Nakamura K (1999) The human amygdala plays an
important role in gaze monitoring. A PET study. Brain 122:779--783.
LaBar KS, Crupain MJ, Voyvodic JT, McCarthy G (2003) Dynamic
perception of facial affect and identity in the human brain. Cereb
Cortex 13:1023--1033.
MacSweeney M, Campbell R, Woll B, Giampietro V, David AS, McGuire
PK, Calvert GA, Brammer MJ (2004) Dissociating linguistic and
nonlinguistic gestural communication in the brain. Neuroimage
22:1605--1618.
Nakamura A, Maess B, Knosche TR, Gunter TC, Bach P, Friederici AD
(2004) Cooperation of different neuronal systems during hand sign
recognition. Neuroimage 23:25--34.
Orban GA, Sunaert S, Todd JT, Van Hecke P, Marchal G (1999) Human cortical regions involved in extracting depth from motion. Neuron 24:929--940.
Pelphrey KA, Morris JP, Michelich CR, Allison T, McCarthy G (2005) Functional anatomy of biological motion perception in posterior temporal cortex: an fMRI study of eye, mouth and hand movements. Cereb Cortex. Advance Access published March 2, 2005. doi:10.1093/cercor/bhi064.
Perrett DI, Rolls ET, Caan W (1982) Visual neurones responsive to faces
in the monkey temporal cortex. Exp Brain Res 47:329--342.
Perrett DI, Smith PA, Mistlin AJ, Chitty AJ, Head AS, Potter DD,
Broennimann R, Milner AD, Jeeves MA (1985) Visual analysis of
body movements by neurones in the temporal cortex of the
macaque monkey: a preliminary report. Behav Brain Res 16:
153--170.
Phillips ML, Williams LM, Heining M, Herba CM, Russell T, Andrew C,
Bullmore ET, Brammer MJ, Williams SC, Morgan M, Young AW, Gray
JA (2004) Differential neural responses to overt and covert presentations of facial expressions of fear and disgust. Neuroimage
21:1484--1496.
Pollick FE, Paterson HM, Bruderlin A, Sanford AJ (2001) Perceiving affect
from arm movement. Cognition 82:B51--B61.
Pollick FE, Lestou V, Ryu J, Cho SB (2002) Estimating the efficiency of
recognizing gender and affect from biological motion. Vision Res
42:2345--2355.
Preuss TM, Stepniewska I, Kaas JH (1996) Movement representation in
the dorsal and ventral premotor areas of owl monkeys: a microstimulation study. J Comp Neurol 371:649--676.
Rizzolatti G, Craighero L (2004) The mirror-neuron system. Annu Rev
Neurosci 27:169--192.
Rizzolatti G, Luppino G, Matelli M (1998) The organization of the
cortical motor system: new concepts. Electroencephalogr Clin
Neurophysiol 106:283--296.
Rolls ET (1984) Neurons in the cortex of the temporal lobe and in the
amygdala of the monkey with responses selective for faces. Hum
Neurobiol 3:209--222.
Rushworth MF, Ellison A, Walsh V (2001) Complementary localization
and lateralization of orienting and motor attention. Nat Neurosci
4:656--661.
Schultz RT, Grelotti DJ, Klin A, Kleinman J, van der Gaag C, Marois R, Skudlarski P (2003) The role of the fusiform face area in social cognition: implications for the pathobiology of autism. Philos Trans R Soc Lond B Biol Sci 358:415--427.
Singer T, Seymour B, O’Doherty J, Kaube H, Dolan RJ, Frith CD (2004)
Empathy for pain involves the affective but not sensory components
of pain. Science 303:1157--1162.
Talairach J, Tournoux P (1988) Co-planar stereotaxic atlas of the human brain. Stuttgart: Thieme.
Tanaka S, Inui T, Iwaki S, Konishi J, Nakai T (2001) Neural substrates
involved in imitating finger configurations: an fMRI study. Neuroreport 12:1171--1174.
Thompson JC, Abbott DF, Wheaton KJ, Syngeniotis A, Puce A (2004) Digit representation is more than just hand waving. Brain Res Cogn Brain Res 21:412--417.
Vuilleumier P, Richardson MP, Armony JL, Driver J, Dolan RJ (2004)
Distant influences of amygdala lesion on visual cortical activation
during emotional face processing. Nat Neurosci 7:1271--1278.
Wallbott HG (1991) Recognition of emotion from facial expression via
imitation? Some indirect evidence for an old theory. Br J Soc Psychol
30:207--219.
Watson JD, Myers R, Frackowiak RS, Hajnal JV, Woods RP, Mazziotta JC, Shipp S, Zeki S (1993) Area V5 of the human brain: evidence from a combined study using positron emission tomography and magnetic resonance imaging. Cereb Cortex 3:79--94.
Wheaton KJ, Thompson JC, Syngeniotis A, Abbott DF, Puce A (2004)
Viewing the motion of human body parts activates different regions
of premotor, temporal, and parietal cortex. Neuroimage 22:277--288.
Wicker B, Keysers C, Plailly J, Royet JP, Gallese V, Rizzolatti G (2003)
Both of us disgusted in My insula: the common neural basis of seeing
and feeling disgust. Neuron 40:655--664.
Wise SP, Boussaoud D, Johnson PB, Caminiti R (1997) Premotor and parietal cortex: corticocortical connectivity and combinatorial computations. Annu Rev Neurosci 20:25--42.
Worsley KJ, Liao CH, Aston J, Petre V, Duncan GH, Morales F, Evans AC
(2002) A general statistical analysis for fMRI data. Neuroimage
15:1--15.
Wu CW, Bichot NP, Kaas JH (2000) Converging evidence from microstimulation, architecture, and connections for multiple motor areas
in the frontal and cingulate cortex of prosimian primates. J Comp
Neurol 423:140--177.