Facial mimicry, valence evaluation or emotional reaction? Mechanisms underlying the modulation of congruent and incongruent facial reactions to emotional facial expressions

Inaugural dissertation for the attainment of the doctoral degree of the Philosophische Fakultät II of the Julius-Maximilians-Universität Würzburg

Submitted by Dipl.-Psych. Katja U. Likowski, Würzburg
Würzburg 2011
First supervisor: Professor Dr. Paul Pauli
Second supervisor: Professor Dr. Andreas Mühlberger
Date of the colloquium: 04.08.2011

CONTENTS

Zusammenfassung ... 3
Abstract ... 8
1. General Background ... 12
1.1. Moderators of facial reactions to emotional facial expressions ... 12
1.2. Motor determinants of facial reactions to emotional facial expressions ... 14
1.2.1. (Facial) mimicry ... 14
1.2.2. Perception-Behavior Link and Perception-Action Model ... 15
1.2.3. Mirror Neuron System ... 16
1.3. Affective determinants of facial reactions to emotional facial expressions ... 18
1.3.1. Valence evaluations ... 18
1.3.2. Emotional reactions ... 19
1.4. Mechanisms underlying the modulation of facial reactions to emotional facial expressions ... 23
1.4.1. Deficits of the present body of evidence ... 23
1.4.2. A working model on mechanisms underlying the modulation of facial reactions to emotional facial expressions ... 25
1.5. Aim of the experiments ... 27
2. Experiments ... 28
2.1. Experiments 1 & 2 – Experimental manipulation of mechanisms ... 28
2.1.1. Experiment 1 ... 28
2.1.2. Experiment 2 ... 40
2.1.3. Revision of the working model ... 55
2.2. Experiment 3 – Psychological mediators ... 56
2.2.1. Theoretical Background ... 56
2.2.2. Methods ... 58
2.2.3. Results ... 62
2.2.4. Discussion ... 71
2.2.5. Revision of the working model II ... 74
2.3. Experiments 4 & 5 – Neuronal correlates ... 76
2.3.1. Experiment 4 ... 77
2.3.2. Experiment 5 ... 88
3. General Discussion ... 102
3.1. Motor mechanisms: Integration of findings ... 102
3.2. Affective mechanisms: Integration of findings ... 105
3.3. Revised model on mechanisms underlying the modulation of facial reactions to emotional facial expressions ... 106
3.4. Limitations of the approach ... 108
3.5. Conclusion/Outlook ... 110
4. References ... 111
5. Footnotes ... 128
6. Appendix ... 129
6.1. Materials: Experiment 1 ... 129
6.2. Materials: Experiment 2 ... 142
6.3. Materials: Experiment 3 ... 149
6.4. Materials: Experiment 4 ... 160
6.5. Materials: Experiment 5 ... 165
6.6. Lebenslauf (curriculum vitae) ... 177

ZUSAMMENFASSUNG

Humans have an automatic tendency to respond to other people's emotional facial expressions with congruent muscular reactions. Congruent facial reactions to facial expressions are, however, not a ubiquitous phenomenon; rather, they are modulated by a variety of situational factors. The processes underlying this modulation have so far hardly been investigated. In the existing literature on facial mimicry, the modulation of congruent and incongruent facial reactions has so far been explained almost exclusively by purely motor resonance processes.
Facial reactions, however, have further determinants. Studies show that muscular reactions, particularly of the M. zygomaticus major and M. corrugator supercilii, serve as valence indicators in the evaluation of positive and negative stimuli. Furthermore, the facial expression signals the current emotional state, for example in response to specific emotion-eliciting stimuli. This suggests that the modulations of facial reactions to emotional facial expressions observed in the facial mimicry literature may not have been based on purely motor processes alone. Rather, affective processes may have influenced or supported these reactions. In fact, no study to date has been able to analyze and clarify in detail the interplay of motor and affective mechanisms in the modulation of facial reactions to facial expressions by social situations. This was the goal of the present work. More precisely, the following central question was examined: Which specific psychological mediators and neuronal correlates underlie the social modulation of congruent and incongruent facial reactions to emotional facial expressions? To answer this question, a working model was set up on the basis of a literature review and was further specified and tested for validity in a systematic series of experiments. In Experiments 1 and 2, motor and affective mechanisms were experimentally manipulated by means of specific social context information in order to draw conclusions about their involvement in the modulation of congruent and incongruent facial reactions to emotional facial expressions. From a multitude of conceivable scenarios in which an involvement of affective processes is to be expected, two very different situational manipulations were selected in order to allow for a broad validity of the results. In Experiment 1, the attitude towards three types of computer-generated characters (positive, neutral, negative) was manipulated. In this case, motor process models and affective process models of valence evaluation predict different facial reactions. Motor process models would predict that congruent facial reactions are modulated (enhanced or reduced) by the affiliation motives activated in the respective situation, independent of the displayed expression. Specifically, it would be assumed that more facial mimicry is shown towards both positive and negative expressions of positive characters than towards the corresponding expressions of negative characters, with neutral characters lying in between. Affective process models of valence evaluation would predict that a positive character valence leads to a contraction of the M. zygomaticus major and a negative character valence to a contraction of the M. corrugator supercilii, and correspondingly for the valence of the expression. In Experiments 2a and 2b, the health status of three types of computer-generated characters (healthy, negligently HIV-infected, accidentally HIV-infected) was manipulated. Again, motor process models and affective process models of valence evaluation predicted different facial reactions. In addition, for this context, predictions about affective processes of emotional reactions could also be made.
Affective process models of emotional reactions would predict that the emotional reaction brought about by the interaction of situation and expression is reflected in the facial reaction. The results of Experiment 1 provided no evidence for an involvement of affective mechanisms of valence evaluation in the modulation of congruent and incongruent facial reactions. Compared to positive characters, negative characters elicited weaker congruent reactions to happy and even incongruent reactions to sad facial expressions. Experiments 2a and 2b revealed an absence of zygomaticus reactions to any kind of HIV-positive character. Furthermore, increased M. corrugator supercilii activity was found in response to neutral and sad faces of accidentally infected characters compared to the expressions of healthy (Experiment 2a) and negligently infected characters (Experiment 2b). The corrugator results cannot be explained by motor process models, which would have predicted the strongest corrugator reaction to sad expressions of healthy characters. Nor can they be explained by valence evaluation, which would have predicted the strongest corrugator reaction to the most negative characters, i.e., the negligently infected ones. The pattern of results can only be explained by emotional reactions in the form of pity and sympathy. In Experiment 3, motor and affective mechanisms were measured by means of psychological variables, and their involvement in the modulation of congruent and incongruent facial reactions was determined through mediation analyses. The strength of motor processes was indexed by two measures of cognitive and emotional empathy. Affective processes were assessed via subjective reports of the participants' own emotional reaction to each individual facial expression. These mediators were assessed in a between-subjects design; for this purpose, three independent social situations (cooperation, neutral, competition) were created. The results of Experiment 3 show reduced congruent reactions to happy expressions and incongruent reactions to sad and angry expressions in the competition situation. Mediation analyses revealed that the lack of congruent facial reactions to happy expressions in the competition situation could be explained by a decrease in cognitive empathy. This is the first indication of an involvement of motor processes in the modulation of congruent reactions. Such a mediation was, however, found only for the modulation of congruent reactions. The modulations of incongruent reactions to sad and angry expressions in the competition condition, in contrast, could be partially explained by affective processes, more precisely by the emotional reaction of joy. Accordingly, affective processes proved to be involved in the modulation and the occurrence of incongruent reactions, whereas no involvement of affective processes was found in the modulation of congruent reactions. In addition, the involvement of a third class of processes was found. Specifically, Experiment 3 revealed an unexpected incongruent reaction of the M. zygomaticus major to sad expressions in the cooperation condition. Mediation analyses showed that this reaction could be explained by strategic processes, i.e., by the goal of establishing and maintaining a harmonious interaction.
In Experiment 4, motor processes were assessed directly at the neuronal level by means of EEG and then related to the modulation of congruent and incongruent facial reactions to emotional facial expressions. From motor process models the assumption could be derived that the strength of attention directed at an expression or emotion goes along with the strength of activation of the associated representations and should consequently result in reduced or enhanced mimicry. Accordingly, it was examined whether the modulation of congruent and incongruent facial reactions can be explained by changes in early face perception. For this purpose, the original "attitude" paradigm was used again. The results showed for the first time that the N170, which reflects early face processing, can be modulated by experimentally manipulated attitudes. Furthermore, the modulation of the N170 was correlated with the modulation of congruent reactions of the M. zygomaticus major to happy expressions. Following motor process models, this supports the assumption that motor mechanisms are involved in the modulation of congruent facial reactions. Such an involvement could, however, again not be demonstrated for the modulation of incongruent reactions of the M. corrugator supercilii. In Experiment 5, the neuronal correlates of the modulation of facial reactions were examined by means of fMRI. To clarify the involvement of motor mechanisms, the relationship between the mirror neuron system and the modulation of congruent and incongruent facial reactions was examined. In addition, it was investigated to what extent the modulation of congruent and incongruent facial reactions is related to centers of emotional processing such as the amygdala and the striatum. Experiment 5 demonstrated for the first time a modulation of the mirror neuron system by experimentally manipulated attitudes. More precisely, the corresponding regions were less active when viewing negative compared to positive characters. Furthermore, a significant effect of the attitude manipulation on parts of the striatum (caudate, putamen) was found in response to happy facial expressions. It could also be shown that modulations of the mirror neuron system are related to modulations of congruent facial reactions to happy expressions, but not to incongruent reactions to sad expressions. This suggests that motor processes are involved in the modulation of congruent reactions. The modulation of incongruent reactions to sad expressions could be explained by activations in the nucleus accumbens, which supports the assumption of an involvement of affective processes in the modulation of incongruent reactions. Taken together, the results lead to a revised version of the working model of the mechanisms underlying the modulation of congruent and incongruent facial reactions to emotional facial expressions. Experiments 1-5 support the assumption that predominantly motor processes are involved in the modulation of congruent facial reactions. This confirms that facial mimicry per se exists. In contrast, no evidence was found for an involvement of motor processes in the modulation of incongruent reactions, which is consistent with theoretical considerations and interpretations of the literature.
Furthermore, no evidence was found for an involvement of valence evaluation processes in the modulation of facial reactions. Instead, ample evidence was found that reflective emotional reactions are responsible for the occurrence and the modulation of incongruent reactions. Finally, evidence was also found for the involvement of strategic mechanisms in the modulation of incongruent reactions. Future studies should examine the interplay of motor and affective processes in the modulation of facial reactions in more detail and generalize the present findings to a broader sample and a wider variety of social contexts. The revised working model of the mechanisms underlying the modulation of facial reactions provides a concrete framework for deriving specific hypotheses.

ABSTRACT

Humans have the tendency to react with congruent facial expressions when looking at an emotional face. Interestingly, recent studies revealed that congruent facial reactions are not a ubiquitous phenomenon. Instead, several situational moderators (e.g. group membership, goals, attitudes) can modulate strength and direction of congruent facial reactions. In the current literature, congruent facial reactions to emotional facial expressions are usually described in terms of "facial mimicry" and interpreted as imitative behavior. Thereby, facial mimicry is understood as a process of pure motor resonance resulting from overlapping representations for the perception and the execution of a certain behavior. Motor mimicry, however, is not the only mechanism by which congruent facial reactions can occur. Accordingly, numerous studies have shown that facial muscles also indicate valence evaluations of positive and negative visual and auditory stimuli. Furthermore, facial muscular reactions are also determined by our current emotional state, e.g. in response to specific emotion-evoking stimuli. These considerations suggest that the modulation of congruent facial reactions to emotional expressions can be based on both motor and affective processes. However, a separation of motor and affective processes in facial reactions to facial expressions is hard to make. None of the published studies that attempted, or at least discussed, such a separation has so far been able to demonstrate a clear involvement of one or the other process. Therefore, the aim of the present line of experiments is to shed light on the involvement of motor (mimicry) and affective processes in the modulation of congruent and incongruent facial reactions. Specifically, the experiments are designed to test the assumptions of a working model on mechanisms underlying the modulation of facial reactions and to examine the neuronal correlates involved in such modulations with a broad range of methods (EMG, EEG, fMRI). Experiments 1 and 2 experimentally manipulate motor and affective mechanisms by using specific contexts. Thereby, clear evidence about the involvement of motor and affective mechanisms in facial reactions shall be obtained. Two different contexts have been chosen for which motor and affective mechanisms make competing and testable assumptions concerning the resulting facial reactions. In Experiment 1, the influence of attitudes on facial reactions is assessed. For such a manipulation, motor process models and affective models of valence evaluations make competing predictions about resulting facial reactions.
Motor theories lead to the assumption of an increase in congruent facial reactions towards expressions of positive and a decrease in congruent facial reactions towards expressions of negative characters, independent of the specific emotional expression. Affective models of valence evaluations assume an independent and additive influence of character and expression valence on the facial reaction pattern. A relaxation of the corrugator muscle should indicate a positive valence, an increase in activity a negative valence. On the other hand, an increase in activity of the zygomaticus muscle should indicate a positive valence. In Experiment 2, facial reactions to accidentally or negligently HIV-infected people and to healthy people are examined. Again, motor and affective process models make competing predictions. In addition to the predictions tested in Experiment 1, Experiment 2 also allows predictions about the impact of emotional reactions. The results of Experiment 1 did not support the involvement of valence evaluations in the modulation of congruent and incongruent facial reactions to facial expressions. Compared to positive characters, negative characters elicited less congruent reactions to happy and even incongruent reactions to sad facial expressions. Both Experiments 2a and 2b revealed a lack of M. zygomaticus major reactions to the stigmatized group of HIV-positive people in general. Furthermore, increased M. corrugator supercilii activity in response to neutral and sad faces of accidentally HIV-infected people compared to healthy (Experiment 2a) as well as negligently infected characters (Experiment 2b) was found. The corrugator results cannot be explained by motor processes, which would have predicted the strongest corrugator reactions to sad faces of healthy characters. Nor can they be explained by valence evaluations, which would have predicted that the most negative characters, i.e. negligently HIV-infected people, elicit the largest corrugator activity. Instead, the observed pattern of results suggests that emotional reactions in terms of pity and sympathy are the predominant determinant of facial reactions to stigmatized groups. Experiment 3 aimed at identifying the psychological mediators that indicate motor and affective mechanisms. Mediational analyses were computed to test whether these variables can explain the contextual modulation of congruent and incongruent facial reactions. Motor mechanisms were assessed via the psychological mediator empathy. Additionally, to clarify the role of affective mechanisms in the contextual modulation of congruent and incongruent facial reactions to facial expressions, subjective measures of the participants' current emotional state in response to the presented emotional facial expressions were taken as a further psychological mediator. To assess the relevant mediators effectively, independent groups, i.e. a between-subjects design, were needed. Therefore, three social contexts were created: a cooperation condition, a competition condition and a neutral control condition. The results of Experiment 3 show less congruent reactions to happy and even incongruent reactions to sad expressions in the competition compared to a neutral condition. Mediational analyses show that the lack of congruent facial reactions to happy expressions in the competition condition can be explained by a decrease of state cognitive empathy.
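The logic of these mediational analyses can be sketched in standard single-mediator regression notation; the symbols below are introduced here purely for illustration, and the concrete measures and estimation procedure used in Experiment 3 are reported in its Methods section (2.2.2.). With X coding the social context (e.g. competition vs. neutral control), M the proposed mediator (e.g. state cognitive empathy) and Y the facial reaction (e.g. M. zygomaticus major activity in response to happy expressions), three regression equations are estimated:

Y = i1 + c·X + e1 (total context effect c)
M = i2 + a·X + e2 (effect a of the context on the mediator)
Y = i3 + c'·X + b·M + e3 (direct effect c' and mediator effect b)

The indirect effect a·b expresses how much of the context effect is carried by the mediator. It can be tested against zero, for example with a bootstrap confidence interval; mediation is inferred when a·b is reliable and the direct effect c' is correspondingly reduced (partial mediation) or no longer significant (full mediation).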
This mediation finding is the first evidence in this series of experiments suggesting that motor processes mediate the effects of the context on congruent facial reactions to emotional faces. However, such a mechanism could only be observed for effects on congruent reactions to happy faces. The modulations of reactions to sad and angry competitors were not mediated by empathy and accordingly not by motor mechanisms. Instead, affective processes, i.e. the emotional reaction of joy, mediated the competition effects on incongruent reactions, partially for reactions to sad and fully for reactions to angry expressions. Thus, it can be concluded that affective processes in terms of emotional reactions are involved in incongruent facial reactions. However, no support was found for the involvement of affective processes in the modulation of congruent facial reactions. Additionally, the involvement of a third class of processes was observed. Specifically, in Experiment 3 an unexpected incongruent reaction of the M. zygomaticus major in response to sad expressions in the cooperation condition was found. Mediational analyses revealed that this reaction can be fully explained by strategic processes aiming at creating and maintaining a smooth and harmonious interaction. Experiment 4 aimed at investigating whether a change in the strength of perception can explain the contextual modulation of facial reactions to facial expressions. According to motor process models, the strength of perception is directly related to the strength of the spread of activation from perception to the execution of an action and thereby to the strength of the resulting mimicry behavior. The present experiment therefore aimed at measuring the event-related P100, N170 and P200 potentials as well as facial reactions towards happy, neutral and sad faces for which the attitude the participants held was experimentally manipulated as in Experiment 1. Experiment 4 revealed the first experimental evidence of a modulation of the N170 by attitudes. It further revealed that the difference in the N170, i.e. in early face encoding, between positive and negative characters is correlated with the difference in congruent reactions of the M. zygomaticus major between happy expressions of positive and negative characters. Given the assumption that changes in early face encoding reflect changes in the strength of perception, this suggests that motor mechanisms were involved in the modulation of congruent facial reactions by attitudes. Such an involvement of motor mechanisms could, however, not be observed for the modulation of incongruent reactions of the M. corrugator supercilii. In Experiment 5, the investigation of neuronal correlates was extended to the observation of involved brain areas via fMRI. Thereby, a focus was put on areas underlying motor as well as affective mechanisms and their involvement in the modulation of congruent and incongruent facial reactions. The regions of interest representing motor mechanisms were prominent parts of the mirror neuron system. The regions of interest representing affective processing were the amygdala, the insula and the striatum (nucleus accumbens, caudate, putamen). Experiment 5 gave the first evidence showing that activity of parts of the mirror neuron system is modulated by attitudes. Specifically, the MNS is less active in response to emotional facial expressions of negative compared to positive characters.
Furthermore, a significant difference in activity of parts of the striatum (the caudate and the putamen) was observed in response to happy expressions between positive and negative characters. Moreover, it could be shown that changes in the activity of parts of the MNS are related to the modulation of congruent facial reactions to happy expressions. For the modulation of facial reactions to sad expressions no involvement of the MNS could be observed. Instead, results revealed the involvement of the nucleus accumbens in the modulation of incongruent facial reactions, i.e. a higher activation of the nucleus accumbens in trials with strong incongruent reactions. This probably reflects the involvement of the emotional reaction of schadenfreude in response to a sad negative character. In sum, these results lead to a revised working model on the mechanisms underlying the modulation of facial reactions to emotional facial expressions. The results of the five experiments provide strong support for the involvement of motor mechanisms in congruent facial reactions. Therefore, it can be concluded that facial mimicry per se exists. No evidence was found for the involvement of motor mechanisms in the occurrence or modulation of incongruent facial reactions. This is consistent with theoretical considerations. Furthermore, no evidence was found for the involvement of valence evaluations in the modulation of facial reactions. Instead, reflective emotional reactions were found to be involved in the modulation of mainly incongruent facial reactions. Finally, some evidence suggests that strategic mechanisms reflecting intentions to give a certain impression are probably involved in the occurrence and modulation of incongruent facial reactions. Future studies should examine the interplay of motor and affective mechanisms in the generation of facial reactions. Additionally, the findings of the reported five experiments should be generalized to a broader population and a wider range of social contexts. The revised model on mechanisms of the modulation of facial reactions may provide a concise approach to derive specific hypotheses.

1. GENERAL BACKGROUND

Humans have the tendency to react with congruent facial expressions when looking at an emotional face (e.g. Dimberg, 1982; Dimberg & Thunberg, 1998). They react, for example, with enhanced activity of the M. zygomaticus major (the muscle responsible for smiling) when seeing the happy expression of a vis-à-vis, or with an increase in M. corrugator supercilii (the muscle involved in frowning) activity in response to a sad face. Such congruent facial reactions occur spontaneously and rapidly, already 300-400 ms after stimulus onset (Dimberg & Thunberg, 1998), and even in minimal social contexts (Dimberg, 1982). They appear to be automatic and unconscious, because they occur without awareness or conscious control and cannot be completely suppressed (Dimberg & Lundqvist, 1990; Dimberg, Thunberg, & Grunedal, 2002). They even occur in response to subliminally presented emotional expressions (Dimberg, Thunberg, & Elmehed, 2000).

1.1. MODERATORS OF FACIAL REACTIONS TO EMOTIONAL FACIAL EXPRESSIONS

Interestingly, recent studies revealed that congruent facial reactions are not a ubiquitous phenomenon. Instead, several contexts can modulate strength and direction of congruent facial reactions. Some of these studies on modulations of facial reactions to emotional facial expressions are described here in detail.
Herrera, Bourgeois and Hess (1998) found that negative racial attitudes result in reduced congruent facial reactions to pictures of members of ethnic outgroups: French Canadians did not show congruent expressions to the emotional facial expressions displayed by Japanese actors, although they had no deficits in decoding them. Moreover, the more negative their racial attitudes towards the outgroup members were, the more they showed incongruent facial reactions, i.e. an activation pattern contrary to the actors' expressions. They smiled at expressions of sadness and frowned at expressions of happiness. In a similar design, Bourgeois and Hess (2008) found reduced congruent facial expressions only for sad but not for happy expressions of outgroup members. McHugo, Lanzetta and Bush (1991) had participants watch video sequences showing emotional displays of two prominent politicians. They found that negative attitudes towards one of the politicians resulted in less congruent facial reactions to this speaker's happy expressions. McIntosh (2006) manipulated relationship quality by confronting participants in an interview context with either a friendly acting confederate with similar habits and tastes or with an annoying confederate with dissimilar habits and tastes. He found that after this manipulation, participants showed more congruent facial reactions to the friendly confederate's happy facial expressions than to those of the annoying one. No difference could be observed for other emotional expressions. In a study by Lanzetta and Englis (1989), participants were told that their partner in a game – a confederate of the experimenter – would probably play in either a very competitive or a very cooperative way. In the competitive condition the authors found incongruent facial reactions to happy and distressed expressions of the interaction partner. Participants showed muscle activation typical for frowning in response to expressions of pleasure, and activations typical for smiling in response to expressions of distress. Weyers, Mühlberger, Kund, Hess, and Pauli (2009) found similar results after a subliminal competition priming procedure. Participants had to passively view pictures of happy, neutral and sad expressions while the facial electromyogram (EMG) was recorded. Results revealed a reduced pattern of congruent facial reactions to happy and sad faces in the competition compared to a neutral control group. To sum up, congruent facial reactions are reduced when facing an outgroup member (Bourgeois & Hess, 2008; Herrera et al., 1998), when partners have opposing goals (McHugo et al., 1991), are dissimilar (McIntosh, 2006) or in a competition context (Lanzetta & Englis, 1989; Weyers et al., 2009). In some of these contexts people do not only show reduced congruent facial reactions but even incongruent facial reactions (Blairy, Herrera, & Hess, 1999; Bourgeois & Hess, 2008; Herrera et al., 1998; Hess, 2001; Hess & Blairy, 2001; Vrana & Gross, 2004; Weyers et al., 2009). Such incongruent reactions are, for instance, a M. corrugator supercilii relaxation in response to sad expressions of outgroup members (Herrera et al., 1998) or a smiling expression in response to a distressed expression of a competitor (Lanzetta & Englis, 1989). This short review shows that several studies describe facial reactions to emotional facial expressions in different social contexts.
However, only a few ask for the processes underlying such contextual modulations (Dimberg et al., 2002; Hess, Philippot, & Blairy, 1998; McHugo et al., 1991), and none has answered this question in a satisfactory manner so far. The present dissertation aims to close this gap by examining the mechanisms and neuronal correlates underlying the modulation of facial reactions to emotional facial expressions in a structured, systematic manner and with a broad range of methods.

1.2. MOTOR DETERMINANTS OF FACIAL REACTIONS TO EMOTIONAL FACIAL EXPRESSIONS

In this chapter, motor determinants of facial reactions to emotional facial expressions will be discussed. Motor determinants are processes relying on the imitative resonance of the observer's motor system with an executor's behaviors (see for an overview Uithol, van Rooij, Bekkering, & Haselager, in press). In the psychological literature, such imitative resonance or matching reactions are usually termed mimicry. In the following, empirical evidence on this phenomenon of mimicry is described in detail along with prominent social psychological and neuropsychological models on the underlying mechanisms.

1.2.1. (FACIAL) MIMICRY

In current literature, congruent facial reactions to emotional facial expressions are usually described in terms of "facial mimicry" (for an overview see Chartrand & van Baaren, 2009). Mimicry is thereby defined as the tendency to unwittingly imitate the behaviors of another person (Hatfield, Cacioppo, & Rapson, 1993). Such imitated behaviors can be, for example, movements, postures, voice characteristics or facial expressions. In the latter case of mimicry of facial expressions we talk about facial mimicry. There are two main approaches in the literature that try to explain the purpose or function of facial mimicry. According to a traditional model offered by Lipps (1907), facial mimicry serves as a tool to understand a vis-à-vis' feelings and intentions via facial feedback mechanisms and subsequent emotional contagion. This idea has further been termed the reverse simulation model (Goldman & Sripada, 2005). However, evidence for this model is weak and not convincing (see Blairy et al., 1999; Hess & Blairy, 2001; Niedenthal, Brauer, Halberstadt, & Innes-Ker, 2001; Niedenthal, Halberstadt, Margolin, & Innes-Ker, 2000). Furthermore, in light of the above-mentioned findings it does not seem very plausible to assume emotion recognition to be the main function behind facial mimicry. After all, why should somebody not be interested in decoding what an outgroup member or a competitor is feeling and thinking? From an evolutionary point of view such behavior could have life-threatening consequences and so should not have persisted. Chartrand and colleagues offered a second approach explaining the purpose of mimicry. In their opinion, facial mimicry and mimicry in general serve to create and reinforce smooth and harmonious interactions and to strengthen social bonds and relationships (Chartrand & Bargh, 1999; Lakin & Chartrand, 2003). The sharing of postures and gestures of an interaction partner further creates rapport and liking by communicating attention and understanding (Lakin, Jefferis, Cheng, & Chartrand, 2003) and can be used to facilitate negotiations (Maddux, Mullen, & Galinsky, 2008; Swaab, Maddux, & Sinaceur, 2011). The assumption that congruent facial reactions and their modulations share mechanisms with mimicry and its modulations is a widely held belief in the literature (e.g. Chartrand & van Baaren, 2009).
Indeed, when looking at comparable behavioral mimicry and facial mimicry studies, the results appear quite similar. For example, Yabar, Johnston, Miles and Peace (2006) could show more behavioral mimicry (touching the face) towards ingroup as compared to outgroup members. In a similar study on facial mimicry, Herrera et al. (1998) found that negative racial attitudes towards members of an ethnic outgroup resulted in reduced congruent facial reactions to expressions of outgroup members. Another example used a very different kind of moderator, namely current mood. Van Baaren, Fockenberg, Holland, Janssen, and van Knippenberg (2006) found that people showed more behavioral mimicry (pen-playing) of an interaction partner when being in a positive compared to a negative (sad) mood. Similar results were found for facial reactions. In a study by Likowski, Weyers, Seibt, Stöhr, Pauli, and Mühlberger (2011), participants went through a similar positive or negative mood induction procedure. Here, the authors found a significant decrease in facial muscular reactions in the sad compared to the positive mood condition. These parallels in the results of studies on behavioral and facial mimicry support the assumption of shared underlying mechanisms.

1.2.2. PERCEPTION-BEHAVIOR LINK AND PERCEPTION-ACTION MODEL

According to Chartrand and Bargh (1999), the mechanism behind (facial) mimicry is the so-called perception-behavior link: It is assumed that representations for perceiving and executing an action show a certain overlap. Accordingly, the mere perception of a certain behavior should automatically enhance the probability of executing this very same behavior oneself. Thus, following this model, (facial) mimicry is seen as an automatic imitative behavior and so as a pure motor process. The perception-behavior link is thereby not seen as a fixed and inflexible mechanism. It is rather assumed to be susceptible to the social context. For example, Lakin and Chartrand (2003) explain their finding of an increase in mimicry after activating a goal to affiliate with a strengthened perception-behavior link. Specifically, they assume that "the desire to affiliate may cause people to pay more attention to what occurs in their social environments (i.e., they perceive more)" (p. 338).
But Preston and de Waal go even further and state that this perception‐action mechanism is not only responsible for mimicry but also for several other categories of response behaviors like response preparation, helping behavior or cognitive empathy. 1.2.3. MIRROR NEURON SYSTEM According to numerous authors the neurobiological correlate of the above mentioned perception‐behavior link and thereby the neuronal base of mimicry is the mirror neuron system (e.g. Blakemore & Frith, 2005; Iacoboni & Dapretto, 2006; Niedenthal, 2007). The discovery of mirror neuron dates from studies in the macaque where Giacomo Rizzolatti and colleagues came across a system of cortical neurons in area F5 (premotor cortex in the macaque) that responded not only when the monkey performed an action, but also when the monkey watched the experimenter performing the same action (Di Pellegrino, Fadiga, Fogassi, Gallese, & Rizzolatti, 1992). They named their system of neurons the ‘‘mirror neuron system’’ (MNS) because it appeared that the observed action was reflected or internally simulated within the monkey’s own motor system. The discovery of the mirror neuron system stimulated a wide body of research and our knowledge about it was extended immensely in the past years. Most importantly, there is now specific evidence that a system equivalent to the mirror neuron system exists in humans. Specifically, the human mirror neuron system (see Figure 1) comprises the ventral premotor cortex (Broadmann’s Area 44, i.e. the human homolog of the monkey F5 region), the inferior frontal gyrus and the inferior 16 parietal cortex (Iacoboni & Dapretto, 2006). These regions fit nicely to the macaque’s mirror neuron system. Further mirror neuron activity has been detected in the superior temporal sulcus (Iacoboni & Dapretto, 2006) which is seen as the main visual input to the human mirror neuron system. Figure 1. Schematic overview of the human mirror neuron system. IFG = inferior frontal gyrus, IPL = inferior parietal lobule, PMC = premotor cortex, STS = superior temporal sulcus. Other studies looked specifically on the conditions under which the mirror neuron system is more or less active. Gallese, Fadiga, Fogassi, and Rizzolatti (1996) could e.g. show that it is selective for animate actions. Other studies could characterize it as somatotopically distributed based on the effectors used to perform the action (Buccino et al., 2001; Goldenberg & Karnath, 2006). Further studies show that activity of the MNS depends on several characteristics of the respective action, e.g. whether it is performed by a human or a robotic hand (e.g. Fogassi et al., 2005; Tai, Scherfler, Brooks, Sawamoto, & Castiello, 2004). Such modulating factors led to the assumption that mirror neurons do not only code the “what” of an action but also the “why”, i.e. the intention of the actor (e.g. Gallese & Goldman, 1998). Thus, it is widely proposed that the function of the mirror neuron system is to decode and to understand other people’s actions (e.g. Iacoboni & Dapretto, 2006; Rizzolatti & Craighero, 2004). However, in the past few years a growing amount of researchers doubt this assumption and show evidence for an associative, sensorimotor learning hypothesis (Hickok & Hauser, 2010). For example, Catmur, Walsh, and Heyes (2007) could show that the specific properties of the mirror neuron system are not innate and can be flexibly developed through sensorimotor learning. 
Hickok (2009) could further show that action understanding and motor system function dissociate. Therefore, a prominent current alternative explanation is that the mirror neuron system is a byproduct of associative learning. The description of mirror neurons as responding both to performing an action and to watching somebody else performing the same action (Di Pellegrino et al., 1992) is strongly reminiscent of the assumptions of the perception-behavior link (Chartrand & Bargh, 1999; see above), i.e. overlapping representations for perceiving and executing an action. As the presumed neuronal correlate of the perception-behavior link, the MNS is suggested to be strongly involved in mimicry. Accordingly, there is evidence for activation in Brodmann area 44 when participants imitate other people's facial expressions (Carr, Iacoboni, Dubeau, Mazziotta, & Lenzi, 2003). Further studies could show similar relationships between the conscious imitation of facial expressions and activity of all other parts of the MNS (e.g. Dapretto et al., 2006; Leslie, Johnson-Frey, & Grafton, 2004; van der Gaag, Minderaa, & Keysers, 2007). Schilbach, Eickhoff, Mojzisch, and Vogeley (2008) even found enhanced activity of the MNS during non-conscious facial mimicry.

1.3. AFFECTIVE DETERMINANTS OF FACIAL REACTIONS TO EMOTIONAL FACIAL EXPRESSIONS

Motor mimicry, however, may not be the only mechanism by which congruent and incongruent facial reactions to emotional facial expressions can occur. Facial expressions usually bear a certain meaning and so do the reactions towards them. Accordingly, one could argue that affective mechanisms might additionally or even instead be involved. Such potential affective determinants of facial reactions to emotional facial expressions are discussed in the following.

1.3.1. VALENCE EVALUATIONS

Numerous studies have shown that facial muscles also indicate valence evaluations of objects and persons. M. corrugator supercilii, the muscle responsible for frowning, and M. zygomaticus major, the muscle responsible for smiling, have proven to be reliable markers of valence. A relaxation of the corrugator muscle indicates a positive valence, an increase in activity a negative valence. On the other hand, an increase in activity of the zygomaticus muscle indicates a positive valence. This has been shown for various kinds of positive and negative stimuli like social and non-social scenes, sounds or positive and negative attitude objects, but also for emotional facial expressions (e.g., Cacioppo, Martzke, Petty, & Tassinary, 1988; Cacioppo & Petty, 1979; Larsen, Norris, & Cacioppo, 2003; Neumann, Hess, Schulz, & Alpers, 2005; Tassinary & Cacioppo, 1992). Corresponding reactions are also observed in response to positive and negative attitude objects, contents or persons (e.g. Cacioppo & Petty, 1979; Vanman, Paul, Ito, & Miller, 1997).

1.3.2. EMOTIONAL REACTIONS

Facial muscular reactions to facial expressions are also determined by our current emotional state (Ekman, 1973), e.g. in response to specific emotion-evoking stimuli (Cacioppo et al., 1988; Dimberg, 1997). Magnee, Stekelenburg, Kemner and de Gelder (2007) could recently show that facial reactions in response to emotional facial expressions are comparable to reactions resulting from perceiving emotional voice or body expressions.
Due to this similarity between facial reactions to emotional facial expressions and those to emotional scenes or sounds, the authors conclude that congruent facial reactions to faces are more likely determined by emotional reactions than by facial mimicry. In the following, different models and ideas concerning how facial reactions can be shaped by emotional reactions to facial expressions will be outlined. Thereby, impulsive and reflective emotional reactions will be differentiated. The assumption that two separate mechanisms can drive our emotional reactions has become a common view in the emotion literature of the last decade (Barrett, Niedenthal, & Winkielman, 2005; van Kleef, 2009). Just like other dual process models (e.g. the reflective-impulsive model, RIM; Strack & Deutsch, 2004), dual process models of emotional reactions distinguish between an impulsive or associative system and a reflective or rule-based system. Smith and Neumann (2005) follow emotion theories by Keltner and Haidt (2001), LeDoux (1996) or Leventhal and Scherer (1987) and propose in their dual-process approach the distinction between associative and rule-based processes in the elicitation of emotion. Associative or impulsive emotional reactions are in their terms reactions that are well-learned, fast and automatic. Examples are overlearned reactions to distinct emotional stimuli like fear in response to a snake or emotional contagion in response to a facial expression (Smith & Neumann, 2005). Rule-based, reflective emotional reactions, on the other hand, are elicited by thoughtful processing. Examples are the anticipation of threat in the future or considering knowledge about a person who, for example, violated a norm (Smith & Neumann, 2005).

1.3.2.1. IMPULSIVE EMOTIONAL REACTIONS

Some researchers propose that congruent facial reactions to emotional facial expressions reflect the process of emotional contagion (Hatfield, Cacioppo, & Rapson, 1994; Laird et al., 1994). This idea, that the observation of another person's emotional expression directly activates the neural substrate that is implicated in the experience of the observed emotion, has been referred to as the unmediated resonance model (Goldman & Sripada, 2005) and has been suggested for example by Gallese (2001; 2003) and Wicker et al. (2003). The core assumption of this model is that the observation of an emotional facial expression first triggers the corresponding emotion in the observer (Blairy et al., 1999; Hess & Blairy, 2001; Lundqvist & Dimberg, 1995). This then allows the observer to attribute his/her own emotional state to the observed person. The facial reaction is no more than a byproduct of the emotional contagion in this case. Accordingly, Hess and Blairy (2001) found specific emotional contagion effects after presenting happy, sad, angry and disgusted facial expressions and could also demonstrate a few correlations between facial mimicry and self-reported emotional state. However, clear evidence for the unmediated resonance model is missing so far. Furthermore, according to this account a modulation of facial reactions should be rooted in a modulation of emotional contagion and the capability to recognize a vis-à-vis' emotion. Thereby, the unmediated resonance model has to deal with the same critiques as the reverse simulation model (see chapter 1.2.1.). Embodied theories of emotion assume that congruent facial reactions are part of the reenactment of the experience of another person's state (e.g. Niedenthal, 2007).
Thereby, they parallel in part the assumptions of the unmediated resonance model (Goldman & Sripada, 2005). However, they propose a more comprehensive mechanism underlying the formation of facial mimicry. Specifically, they assume that during an initial emotional experience all the sensory, affective and motor neural systems are activated together. This experience leads to interconnections between the involved groups of neurons. Later on, when one is just thinking about the event or perceiving a related emotional stimulus, the activated neurons in one system spread their activity through the interconnections that were active during the original experience to all the other systems. Thereby the whole original state, or at least the most salient parts of the network, can be reactivated (Niedenthal, 2007; Niedenthal, Winkielman, Mondillon, & Vermeulen, 2009; Oberman, Winkielman, & Ramachandran, 2007). Embodiment theories state that looking at an emotional facial expression means reliving past experience associated with that kind of face. Thus, perceiving an angry face can lead to tension in the muscles used to strike, a rise in blood pressure or the innervation of facial muscles involved in frowning (Niedenthal, 2007). Accordingly, congruent facial reactions reflect an internal simulation of the perceived emotional expression. The suggested purpose of such a simulation is to understand the actor's emotion (Atkinson & Adolphs, 2005; Niedenthal et al., 2001; Wallbott, 1991). Thereby it virtually parallels the classic assumptions about the function of the mirror neuron system (see chapter 1.2.3.). Functional brain imaging studies support the idea of an internal simulation of a person's state in response to an emotional expression. Specifically, it has been shown that seeing an emotional facial expression of another person and experiencing that emotion oneself involves overlapping neural circuits. For example, Wicker et al. (2003) asked participants to either inhale disgusting odors themselves or to watch another person showing the facial expression of disgust. The authors found that both contexts produced similar activations in the anterior insula and, to some extent, the anterior cingulate cortex. Another imaging study revealed similar overlapping brain activities when participants went through a painful stimulation compared to a context in which they watched the painful stimulation of their partner (Singer et al., 2004). In both conditions the authors found activation of the anterior insula, the anterior cingulate cortex (ACC), the brainstem, and the cerebellum. Similar results could be found in a study by Hutchison, Davis, Lozano, Tasker, and Dostrovsky (1999) for receiving painful stimulations oneself and watching the experimenter receiving painful stimulations. These results do not only support embodied theories of emotion; they even allow some further conclusions about the probable role of embodiment in different forms of learning. Thus, embodied theories are not only able to explain learning by experience or conditioning; they might also be applicable to observational learning. The studies by Wicker et al. (2003) and Singer et al. (2004) provide nice examples of learning the consequences of a behavior by watching another person experience these consequences and subsequently simulating these emotional experiences. The study by Singer et al. (2004) can also help to understand another purpose of embodying emotions.
As mentioned above, embodying emotions is assumed to help in understanding the vis-à-vis' emotional state. Singer et al. (2004) could show that the activations in insula and ACC are correlated with individual empathy trait scores. And in a later study, Singer et al. (2006) found a moderation of the empathy-related activation in pain-related brain areas by the perceived fairness of the other person. This leads to the conclusion that embodiment is linked to empathic understanding. It further strongly suggests that the amount of internal simulation of another person's emotional experience can be modulated by social contextual moderators like, for example, perceived fairness.

1.3.2.2. REFLECTIVE EMOTIONAL REACTIONS

Only little research has so far examined facial reactions to emotional facial expressions caused by reflective emotional reactions, i.e. emotional reactions based on thoughtful processing. Moody, McIntosh, Mann and Weisser (2007) conducted a study in which half of the participants were put into a fearful mood. When seeing angry expressions, they responded with enhanced activity of the M. frontalis medialis (the muscle responsible for lifting the brows), whereas a control group did not show any facial reactions. This reaction was interpreted as a fear response after inferring or anticipating threat from the opponent. Thus, facial reactions to emotional facial expressions might also reveal discrete emotional responses. It is further possible to assume that facial reactions to facial expressions are also influenced by reflective emotional reactions in other social contexts. For example, seeing a happy collaborator can lead to the reasoning that something positive happened for "our" common goal and therefore make me happy. Vice versa, seeing a happy competitor can imply that something happened that is disadvantageous for me and thus make me sad or angry. Both reflective emotional reactions can be reflected in facial reactions. The latter example corresponds nicely to the findings of incongruent facial reactions by Lanzetta and Englis (1989). They interpreted the happy reaction they found in response to a suffering competitor as schadenfreude. Similarly, a later study by Herrera et al. (1998) interpreted a relaxation of the corrugator muscle in response to sad expressions of outgroup members as caused by schadenfreude. Because schadenfreude is an emotional reaction that is grounded in reasoning processes (see e.g. Shamay-Tsoory, Tibi-Elhanany, & Aharon-Peretz, 2007), it can be clearly seen as a reflective emotional reaction.

1.4. MECHANISMS UNDERLYING THE MODULATION OF FACIAL REACTIONS TO EMOTIONAL FACIAL EXPRESSIONS

1.4.1. DEFICITS OF THE PRESENT BODY OF EVIDENCE

In the current literature, the modulation of congruent and incongruent facial reactions is nearly exclusively explained by motor mechanisms. Most authors refer to the perception-behavior link or the mirror neuron system with its affiliative function when explaining the underlying mechanism of their moderator variables (e.g. Bourgeois & Hess, 2008; Herrera et al., 1998; Hess et al., 1998; Hess, 2001; Hess & Blairy, 2001; Lanzetta & Englis, 1989; McHugo et al., 1991; McIntosh, 2006; Niedenthal, Barsalou, Winkielman, Krauth-Gruber, & Ric, 2005; Weyers et al., 2009).
Their findings of reduced congruent reactions to emotional expressions of outgroup members, competitors or dissimilar persons are described as reduced "facial mimicry", caused by the fact that smooth interactions and further contact are neither necessary nor desirable in these contexts. But, as outlined above, smiling less when seeing a smile or frowning less in response to a sad face can result from more than reduced imitational tendencies. In the reviewed studies, the modulation of congruent facial reactions could have been influenced at least in part by affective and not only by motor processes. For example, if an emotional expression belongs to an outgroup member, a competitor or a dissimilar person, then the positive or negative valence of the expression is reduced, which in turn might lead to reduced congruent facial reactions. Less activity of M. zygomaticus major in response to happy expressions of outgroup members (Herrera et al., 1998) could also have occurred due to reduced emotional contagion; and reduced congruent reactions to happy expressions in a competition context (Lanzetta & Englis, 1989; Weyers et al., 2009) could also have occurred due to reduced emotional contagion or due to a lack of the emotional reaction of happiness, because positive outcomes for a competitor might mean bad news for me.

The case is very similar for incongruent facial reactions. They are usually referred to as "counter-mimicry" in the literature (Blairy et al., 1999; Bourgeois & Hess, 2008; Herrera et al., 1998; Hess, 2001; Hess & Blairy, 2001; Vrana & Gross, 2004). By using the term "counter-mimicry" it is suggested that these reactions are pure motor processes, too. Unfortunately, the authors do not make clear statements about how exactly such opposite motor reactions could arise. Their arguments suggest some kind of inhibition and counter-activation of muscles, but no explanation is given of the mechanisms by which such reactions would originate. In my opinion, a modulation of the perception-behavior or perception-action link in terms of an inhibition could only lead to a reduction of congruent but in no way to incongruent reactions. Thus, affective processes are again an alternative explanation for the incongruent reactions that have been found in the studies cited above. Smiling in response to a suffering competitor, the negative expression pattern in response to a happy competitor (Lanzetta & Englis, 1989) and frowning at happy outgroup members (Herrera et al., 1998) can also be interpreted as anger and schadenfreude instead of just opposite reactions (i.e. an inhibition or counter-activation) of the involved muscles.

One might not only doubt that motor processes are exclusively involved in facial reactions but also question the existence of facial mimicry at all. In the literature, a lack of congruent reactions in response to angry expressions has often been observed. This has been described above for the study by Moody et al. (2007), but there are further studies without such congruent reactions. For example, Weyers, Mühlberger, Hefele and Pauli (2006) also failed to find activity in M. corrugator supercilii in response to angry faces, and Bourgeois and Hess (2008) observed "mimicry" of anger only if the emotional expression was directed toward a third object as indicated by gaze direction.
Although these findings make perfect sense (why should someone look back angrily, without being aware of any social context or cause, and so risk an argument or a fight?), they call into question the general applicability of the perception-behavior link to emotional facial expressions.

However, a separation of motor and affective processes in facial reactions to facial expressions is hard to make. Some of the published studies tried to show or at least discussed such a separation, but none could demonstrate a clear involvement of one or the other process so far (Dimberg et al., 2000; Dimberg et al., 2002; Hess et al., 1998; McHugo et al., 1991). The most striking reason for this is that both motor and affective mechanisms predict the same facial reactions for the most common emotional expressions, i.e. happy and sad, in most contexts. Up to now, only one study has explicitly tried to disentangle the different explanations. Hess et al. (1998) examined whether mood induction would be an alternative explanation for congruent facial reactions to emotional faces. They hypothesized that if mood induction were the underlying mechanism, then a blockwise presentation of emotional facial expressions should lead to a cumulative addition of congruent emotional reactions. This should in turn result in stronger congruent reactions in the blockwise compared to a randomized presentation. However, they found no significant differences between the two presentation conditions. Thereby, they could reject the hypothesis that congruent facial reactions occur exclusively due to emotional contagion. However, Hess et al. (1998) could not rule out that congruent facial reactions result, for example, from a combination of motor and affective processes.

Another crucial deficit concerns the lack of experimental studies on the neuronal correlates of facial reactions to facial expressions as well as their modulations. Up to now, no study could convincingly show the involvement of motor and/or affective cortical structures in the production of automatic congruent or incongruent facial reactions. In fact, only one study so far examined the neuronal correlates of unconscious facial reactions to facial expressions (Schilbach, Eickhoff, Mojzisch, & Vogeley, 2008), and this study has several methodological problems. The most severe one is that the assessment of muscular activity and of the blood oxygen level dependent response did not take place simultaneously but one after the other. Thus, there is up to now no empirical evidence answering the question about the neuronal structures involved in the modulation of facial reactions to emotional facial expressions.

1.4.2. A WORKING MODEL ON MECHANISMS UNDERLYING THE MODULATION OF FACIAL REACTIONS TO EMOTIONAL FACIAL EXPRESSIONS

The aim of the present dissertation is to address the deficits in the current literature outlined above and to shed light on the involvement of motor (mimicry) and affective processes in the modulation of congruent and incongruent facial reactions. Further, the question about the psychological mediators and neuronal correlates of contextual modulations of facial reactions to emotional facial expressions shall be answered. The review of the literature on facial reactions to emotional facial expressions revealed three main determinants of automatic facial reactions to facial expressions: facial mimicry (in terms of a motor process), valence evaluations and emotional reactions.
Integrating these three determinants with the reported empirical evidence, a working model on mechanisms of facial reactions to emotional facial expressions is proposed (see Figure 2). This model shows by which mechanisms (motor vs. affective) context and emotional facial expression affect facial reactions.

In line with the prevailing opinion in the current literature on mimicry, and with empirical evidence showing a relationship between congruent facial expressions and activity of mirror neurons, it is expected that congruent facial reactions are determined by motor mechanisms, i.e. the perception-behavior link. It is further expected that contextual moderators influence congruent reactions by modulating the strength of the perception-behavior link. This means that all parts of the perception-behavior link (the strength of perception as well as the spreading of activation between perception and behavior, i.e. "the link") can be modulated by a certain context and thereby affect congruent facial reactions. Contexts that require smooth interactions, affiliation and liking are assumed to increase the activity of the perception-behavior link, whereas contexts in which affiliation and liking are not required or desired inhibit it. The contextual modulation of the perception-behavior link is expected to work independently of the specific emotional expression. Importantly, it is assumed that motor mechanisms are not involved in the occurrence or modulation of incongruent facial reactions. The proposed mechanisms underlying (facial) mimicry, i.e. the perception-behavior link, the perception-action model or the mirror neuron system, only support the conclusion of an increased activation or an inhibition of the link between perceived action and executed behavior. They do not specify any way by which an inhibition of a congruent muscle below zero or the activation of a different muscle, e.g. an antagonist, could be predicted.

Affective mechanisms (valence evaluation and emotional reactions) are assumed to determine congruent facial reactions as well. This can be concluded from the evidence on similarities between facial reactions to emotional facial expressions and to positive and negative emotional stimuli. It is further assumed that contextual moderators influence congruent facial reactions by modulating valence evaluations and emotional reactions. This modulation of valence evaluations and emotional reactions is expected to arise from an interaction of context and emotional expression, i.e. from inferring the meaning of the emotional expression in the respective context. This means that different emotional expressions can have completely different meanings and therefore trigger completely different affective reactions in different contexts. For example, a happy face of an interaction partner can mean positive outcomes in a cooperative setting but a disadvantage, which is a negative outcome, in a competitive context. Consequently, the same expression can trigger positive affect in one and negative affect in the other context. In a similar vein it is expected that affective mechanisms also determine incongruent facial reactions. In contrast to the perception-behavior link, emotion theories readily allow the prediction of incongruent emotional reactions and thereby incongruent facial reactions. In general, it is suggested that behavior is a function of schemata that can be controlled by motor and affective mechanisms (Norman & Shallice, 1986).
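Purely as a reading aid, and in ad hoc notation of my own rather than anything proposed in the cited models, these assumptions can be summarized as

\[ R_{\text{congruent}} \propto s(C)\cdot P(E), \qquad R_{\text{affective}} = f\bigl(\mathrm{val}(E, C),\ \mathrm{emo}(E, C)\bigr), \]

where E denotes the perceived emotional expression, C the social context, s(C) the contextually modulated strength of the perception-behavior link, P(E) the strength of the perception of the expression, and val(E, C) and emo(E, C) the valence evaluation and the emotional reaction resulting from interpreting E in the context C. The motor term can only scale congruent output between zero and full mimicry, whereas the affective term can also produce output of the opposite sign, i.e. incongruent reactions.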
Following the distinction introduced above between congruent and incongruent facial reactions, congruent and incongruent behavioral schemata shall be differentiated. It is assumed that, once activated, both kinds of schemata additively contribute to the muscular facial reaction according to their respective strength.

Figure 2: A working model on mechanisms underlying facial reactions to emotional facial expressions (see text for explanations).

In sum, the working model on mechanisms of facial reactions to emotional facial expressions provides a concise framework for deriving specific hypotheses. It allows for a selective experimental manipulation of the two kinds of proposed mechanisms in order to test how they determine congruent and incongruent facial reactions.

1.5. AIM OF THE EXPERIMENTS

Five experiments are designed to test the assumptions of the working model and to examine the neuronal correlates underlying the modulation of facial reactions to facial expressions in a systematic manner and with a broad range of methods. Experiments 1 and 2 experimentally manipulate motor and affective mechanisms by using specific contexts. Thereby, clear evidence about the involvement of motor and affective mechanisms in facial reactions shall be obtained. Experiment 3 aims at identifying the psychological mediators that indicate motor and affective mechanisms. In this experiment, mediational analyses shall be computed to test whether the assessed psychological mediators can explain the contextual modulation of congruent and incongruent facial reactions. Finally, Experiments 4 and 5 are designed to map the neuronal correlates of the motor and affective mechanisms and to determine how they are related to the modulation of facial reactions.

2. EXPERIMENTS

2.1. EXPERIMENTS 1 & 2 – EXPERIMENTAL MANIPULATION OF MECHANISMS

Two different contexts have been chosen for which motor and affective mechanisms make competing and testable assumptions concerning the resulting facial reactions. In Experiment 1, the influence of attitudes on facial reactions is assessed. In Experiment 2, facial reactions to accidentally or negligently HIV-infected people and to healthy people are examined. For this part, it was decided to examine two moderators, and thereby to conduct two experiments, in order to achieve satisfactory validity and generalizability of the results.

2.1.1. EXPERIMENT 1

2.1.1.1. THEORETICAL BACKGROUND AND HYPOTHESES

The primary goal of this experiment was to test whether motor or affective mechanisms are the main determinant of facial reactions. This was done by experimentally manipulating attitudes towards computer-generated characters (so-called avatars). Because attitudes are closely related to valence (see e.g. Cacioppo & Petty, 1979), this allowed the generation of competing hypotheses assuming that mimicry vs. valence evaluations determine the resulting pattern of facial reactions. For a better understanding of the competing hypotheses, Figures 3 and 4 illustrate the respective predicted patterns of results. Only corrugator reactions are displayed in these figures because the predictions of the mimicry and the valence hypothesis do not differ for the M. zygomaticus major. Accordingly, a comparison of the hypothesized facial muscular reaction patterns with the actually observed pattern of results then allows conclusions about the involved mechanisms. This is the first study to examine the influence of attitudes on facial reactions to emotional expressions.
Therefore, the rationale behind the generation of the hypotheses shall be explained in detail.

The above-mentioned findings on moderators of facial reactions to emotional facial expressions (see 1.1.) already point to the possibility that attitudes modulate facial reactions. Herrera et al. (1998) found that negative racial attitudes result in reduced congruent and even incongruent facial reactions to pictures of outgroup members. Furthermore, McHugo et al. (1991) found that negative attitudes towards a politician resulted in less congruent facial reactions to this speaker's expressions. These previous studies, however, did not manipulate attitudes. The results in Herrera et al. (1998) can also be due to shared and non-shared group identity. Shared group identity leads to behavioral assimilation, non-shared identity to contrast (Schubert & Häfner, 2003; Spears, Gordijn, Dijksterhuis, & Stapel, 2004). The results in McHugo et al. (1991) can also have occurred due to shared and non-shared goals. Thus, there is no clear evidence showing that attitudes affect facial reactions to emotional expressions.

Mimicry hypothesis. Evidence supporting the assumption of an influence of attitudes on the perception-behavior link comes from research on motivational orientations. Several studies could show that evaluative processes in response to attitude objects automatically activate approach and avoidance tendencies and behaviors. For example, Solarz (1960) had participants move cards with positive and negative words either towards or away from themselves. He found that participants were faster at moving cards with positive words towards themselves (approach behavior) and cards with negative words away (avoidance behavior) than vice versa. In a similar study, Chen and Bargh (1999) demonstrated that these tendencies can also be activated automatically, i.e. without conscious evaluation of the respective words. Further research affirmed this link between positive attitudes and approach behaviors as well as between negative attitudes and avoidance tendencies for both social and non-social attitude objects (Neumann & Strack, 2000; Neumann, Hülsenbeck, & Seibt, 2004; Seibt, Neumann, Nussinson, & Strack, in press). To conclude, the available evidence suggests that positive attitudes automatically instigate approach behavior towards objects and people, and that negative attitudes automatically instigate avoidance behavior.

As mentioned before (see chapter 1.2.1.), Lakin and Chartrand (2003) found that an implicit as well as an explicit affiliation goal led to an increase in mimicry. This can serve as a model for the mimicry behavior of persons with an approach motivation. Accordingly, positive attitudes towards a person should lead to an approach orientation and hence enhanced mimicry. On the other hand, negative attitudes should lead to an avoidance orientation in terms of social distancing and reduced affiliative signals, and hence reduced mimicry. This leads to the following predictions of motor theories: independent of the specific emotional expression, an increase in congruent facial reactions towards expressions of positive characters and a decrease in congruent facial reactions towards expressions of negative characters are expected. Neutral characters are assumed to range in between. Specifically, an increase in M. zygomaticus major (the muscle involved in smiling) activity in response to happy facial expressions with increasing character valence is expected.
No differential reactions to neutral or sad faces are expected. For the M. corrugator supercilii (the muscle responsible for frowning), an increase in activity in response to sad faces with increasing character valence is expected. No differential reactions to happy or neutral faces are expected. Furthermore, no incongruent reactions are expected. These assumptions are called the mimicry hypothesis and can be seen in Figure 3.

Figure 3. Patterns of results on M. corrugator supercilii in response to happy, neutral and sad faces for positive, neutral and negative characters as predicted by the mimicry hypothesis (see text for explanations).

Valence hypothesis. The predictions of affective theories of valence evaluations are straightforward. We know that a relaxation of the corrugator muscle indicates a positive valence and an increase in its activity a negative valence, whereas an increase in activity of the zygomaticus muscle indicates a positive valence (Larsen et al., 2003). It is assumed that such facial muscular patterns based on valence evaluations are the same for any kind of emotional stimuli and thereby also for facial expressions and attitude objects (e.g. Cacioppo & Petty, 1979; Vanman et al., 1997). Specifically, an independent and additive influence of character and expression valence on the facial reaction pattern is expected. This assumption is called the valence hypothesis (Figure 4).

Figure 4. Patterns of results on M. corrugator supercilii in response to happy, neutral and sad faces for positive, neutral and negative characters as predicted by the valence hypothesis (see text for explanations).

2.1.1.2. METHODS

Design and Participants

The design was a 2 (muscle: M. corrugator supercilii vs. M. zygomaticus major) x 3 (emotion: happy vs. neutral vs. sad) x 3 (type of character: positive vs. neutral vs. negative) within-subjects design. A total of 28 female university students participated in the experiment. They were recruited on campus and received 10 € as compensation. Recruitment was limited to female subjects because of earlier findings (Dimberg & Lundqvist, 1990) indicating that females show more pronounced, but not qualitatively different, mimicry effects than male subjects. Data from 3 participants had to be excluded from analyses due to non-compliance with the instructions. Analyses are based on the remaining sample of 25 females, aged between 19 and 27 years (M = 21.46, SD = 2.24).

Stimuli and Apparatus

Avatar emotional facial expressions. Instead of pictures of humans, avatar facial emotional expressions were used. Avatars (i.e. virtual persons or graphic substitutes for real persons) provide a useful tool for research on emotion and social interactions (Blascovich, Loomis, Beall, Swinth, Hoyt, & Bailenson, 2002), because they allow full control over the facial expression and its dynamics, e.g. its intensity and temporal course (cf. Krumhuber & Kappas, 2005). Furthermore, because the same prototypical faces can be used for all types of characters, there is no need to control for differences in liking and attractiveness between the conditions, and a reduced amount of error variance can be assumed. How successfully an avatar can be used as a research tool for studying interactions has recently been demonstrated by Bailenson and Yee (2005). Subjects rated a digital chameleon, i.e. an avatar which mimics behavior, more favorably even though they were not aware of the mimicry.
Thus, an avatar's mimicry created liking comparable to that created by real individuals (Chartrand & Bargh, 1999). Stimuli were created with Poser software (Curious Labs, Santa Cruz, CA) and the software extension offered by Spencer-Smith, Wild, Innes-Ker, Townsend, Duffy, Edwards, Ervin, Merritt and Paik (2001) to manipulate action units separately according to the facial action coding system (Ekman & Friesen, 1978). Notably, Spencer-Smith et al. (2001) could show that ratings of quality and intensity of the avatar emotional expressions were comparable to those of human expressions from the Pictures of Facial Affect (Ekman & Friesen, 1976). The facial stimuli were presented on a computer screen one meter in front of the participants with a picture size of about 19 x 25 cm.

Three emotional facial expressions were created from a prototypic female and a prototypic male face: a neutral, a happy, and a sad expression (for details see Spencer-Smith et al., 2001). Each male and female emotional expression was then combined with three types of hairstyles (blond, brown, and black hair), resulting in eighteen stimuli (for examples see Figure 5). The three types of characters were created by assigning the three character descriptions (positive, neutral or negative; see below) to the three different types of avatars (blond, brown or black). Assignments of hairstyle to the specific character as well as the order of introduction of the three character types were both counter-balanced.

Figure 5. Examples of avatars with different emotional facial expressions.

Attitude manipulation. The attitude manipulation at the beginning of the session was disguised as an introduction to different types of avatar characters designed to appear in computer games. In order to ensure motivated and systematic processing of the given information, participants had to memorize the avatar characters with their specific traits for a later recall task. The positive avatars were introduced as kind (FREUNDLICH), nice (NETT), likeable (SYMPATHISCH) and self-confident (SELBSTBEWUSST). The traits of the neutral avatar characters were reserved (DISTANZIERT), serious (ERNST), calm (RUHIG) and neat (ORDENTLICH), and the negative characters were described as malicious (BOSHAFT), aggressive (AGGRESSIV), egoistic (EGOISTISCH) and deceitful (HINTERLISTIG). All traits and combinations of traits had been pre-tested to affirm their distinct positive, neutral, and negative nature.

Facial EMG. Activity of the M. zygomaticus major (the muscle involved in smiling) and the M. corrugator supercilii (the muscle responsible for frowning) was recorded on the left side of the face using bipolar placements of 13/7 mm Ag/AgCl surface electrodes according to the guidelines established by Fridlund and Cacioppo (1986). In order to conceal the recording of muscular activity, participants were told that skin conductance would be recorded (see e.g. Dimberg et al., 2000). The EMG raw signal was measured with a BrainAmp amplifier (Brain Products Inc.), digitized by a 16-bit analogue-to-digital converter, and stored on a personal computer with a sampling frequency of 1000 Hz. Raw data were rectified offline and filtered with a 30 Hz low cutoff filter, a 500 Hz high cutoff filter, a 50 Hz notch filter, and a 125 ms moving average filter. The EMG scores are expressed as change in activity from the pre-stimulus level, defined as the mean activity during the last second before stimulus onset.
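Purely for illustration, this signal reduction could be sketched roughly as follows. This is a minimal Python sketch with hypothetical helper names, assuming NumPy/SciPy, and is not the software actually used; the 500 Hz upper cutoff is omitted because it coincides with the Nyquist frequency of the 1000 Hz recording, and filtering is applied before rectification, as is conventional. The artifact criteria and trial averaging described next would then be applied to the resulting change scores.

```python
# Illustrative sketch only (hypothetical helpers, not the original analysis software).
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 1000  # sampling rate in Hz

def preprocess_trial(raw, fs=FS):
    """Filter, rectify and smooth the raw EMG of one trial."""
    b_hp, a_hp = butter(4, 30, btype="highpass", fs=fs)   # 30 Hz low cutoff
    b_no, a_no = iirnotch(50, Q=30, fs=fs)                # 50 Hz notch
    x = filtfilt(b_hp, a_hp, raw)
    x = filtfilt(b_no, a_no, x)
    x = np.abs(x)                                         # rectification
    win = int(0.125 * fs)                                 # 125 ms moving average
    return np.convolve(x, np.ones(win) / win, mode="same")

def change_score(trial, fs=FS):
    """Express activity as change from the mean of the 1 s pre-stimulus baseline.

    `trial` is assumed to contain 1 s of baseline followed by the stimulus period.
    """
    baseline = trial[:fs].mean()
    return trial[fs:] - baseline
```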
Trials with an EMG activity above 8 μV during the baseline period and above 30 µV during stimulus presentation were excluded (less than 5 %). Before statistical analysis, EMG data were collapsed over the 6 trials with the same emotional expression of a specific character, and reactions were averaged over the 6 seconds of stimulus exposure.

Individual difference measures. Since there is evidence that highly empathic or highly extraverted subjects tend to show stronger facial reactions (Sonnby-Borgström, Jönsson, & Svensson, 2003), empathy was assessed with the SPF (Paulus, 2009) and extraversion with the EPQ-RK (Ruch, 1999). Furthermore, the BFNE (Brief Fear of Negative Evaluation Questionnaire, Leary, 1983; German adaptation: Vormbrock & Neuser, 1983) was applied because Dimberg (1997) found differences in facial reactions of subjects high and low in social fear.

Manipulation Check. At the end of the experiment, participants answered three questions about each avatar's happy, neutral and sad facial expression concerning valence, arousal and liking on 9-point Likert scales. The specific questions were "How negative/positive do you find the picture?", "How arousing do you find the picture?", and "How much do you like the displayed person?". Negative characters were expected to be rated as less positive and liked less than positive characters, with the neutral characters falling in between. No differences were expected for arousal ratings.

Procedure

Participants were tested individually in a laboratory room. They were told that they would complete several computer tasks designed to study the suitability and memorability of certain avatars for an upcoming computer game. After signing the consent form, the EMG electrodes were attached. Participants were then asked to start the computer task. At the beginning of this task they were introduced to three different types of characters. For each type of character, the neutral expressions of the two avatars (one male, one female) with the same hairstyle were presented on the screen, followed by their respective positive, neutral or negative traits. The fact that the hairstyle was the only feature in which the three characters differed was not explicitly mentioned to the participants but should have been obvious. After the introduction of the three types of characters, participants completed a recall task. The neutral expression of one of the six avatars was presented on the screen, and participants were asked to press one out of three buttons to indicate its character (positive, neutral or negative). To ensure that participants remembered the traits of the three character types throughout the study, they were further informed that they would have to complete this recall task again right after the next part of the study.

The next task for the participants was to look through the emotional facial expressions. For each participant, three blocks of stimuli were presented, with each block containing all 18 facial expressions in random order, resulting in 54 picture presentations. Each picture was presented for 6 seconds, preceded by a warning pitch tone and a fixation cross appearing in the center of the screen three seconds before picture onset. Inter-trial intervals varied from 19 to 23 seconds. To ensure that participants paid attention to the stimuli, they were told that they would be asked about the pictures later. During picture presentation, M. zygomaticus major and M. corrugator supercilii activity was recorded electromyographically.
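As an illustrative aside, the presentation schedule just described (three blocks of all 18 expressions in random order, a 3 s fixation and warning interval, 6 s stimulus duration, and inter-trial intervals of 19 to 23 s) could be generated along the following lines; the stimulus labels and function names are hypothetical and this is not the original experiment script.

```python
# Illustrative sketch only; labels and timings follow the description in the text.
import itertools
import random

SEXES = ["female", "male"]
EMOTIONS = ["happy", "neutral", "sad"]
HAIRSTYLES = ["blond", "brown", "black"]   # hairstyle encodes the character type

STIMULI = list(itertools.product(SEXES, EMOTIONS, HAIRSTYLES))  # 18 expressions

def build_schedule(n_blocks=3, cue_s=3.0, stim_s=6.0, iti=(19.0, 23.0)):
    """Return 54 trials: each block presents all 18 stimuli in random order."""
    trials = []
    for block in range(1, n_blocks + 1):
        for sex, emotion, hair in random.sample(STIMULI, len(STIMULI)):
            trials.append({
                "block": block,
                "sex": sex, "emotion": emotion, "hair": hair,
                "cue_s": cue_s,            # fixation cross plus warning tone
                "stimulus_s": stim_s,
                "iti_s": round(random.uniform(*iti), 1),
            })
    return trials

schedule = build_schedule()
assert len(schedule) == 54
```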
Afterwards, the recall task was administered again, and the manipulation check and the questionnaires were completed. Finally, participants were asked about their ideas regarding the true purpose of the experiment. None was aware of the hypotheses, and none suspected that facial muscular reactions were measured.

2.1.1.3. RESULTS

Manipulation Check

For the liking, valence and arousal ratings, three separate repeated measures analyses of variance with the within-subject factors Character (positive vs. neutral vs. negative) and avatar's Emotion (happy vs. neutral vs. sad) were computed. The interaction of character and emotion did not reach significance for any rating, all ps > .37. However, the analyses revealed a significant main effect of character for the liking ratings, F(2, 23) = 6.0, p = .010, ηp² = .199. Follow-up t-tests revealed that positive characters (M = 5.69, SD = 1.03) were liked more than negative characters (M = 4.73, SD = 1.50), t(24) = 2.58, p = .016. Neutral characters (M = 5.54, SD = 1.43) were also liked more than negative characters, t(24) = 3.44, p = .002, but not less than positive ones, t(24) = .54, p = .592. These results confirm that different explicit attitudes had indeed been formed towards the negative versus the positive and neutral characters. No main effects of character for the valence (positive characters: M = 5.36, SD = 1.10; neutral characters: M = 5.40, SD = 1.03; negative characters: M = 5.08, SD = 0.98) or arousal (positive characters: M = 4.72, SD = 1.22; neutral characters: M = 4.34, SD = 1.57; negative characters: M = 4.12, SD = 1.34) ratings could be observed, both ps > .11.

Recall Task

Analyses of the first recall task following the attitude manipulation revealed that no participant made more than two errors in the six classification trials (3 participants made two errors, 2 participants made one error). At the second recall task, right after the presentation of the emotional stimuli, none of the participants made any error. Thus, all data were used for further analyses.

Facial Reactions

EMG data were analyzed with a repeated measures analysis of covariance with the within-subject factors Muscle (M. corrugator supercilii vs. M. zygomaticus major), avatar's Emotion (happy vs. neutral vs. sad) and avatar's Character (positive vs. neutral vs. negative). Empathy, extraversion and social fear were entered as covariates to control for potential influences of individual differences. The analyses returned a significant effect of empathy on the Muscle x Emotion x Character interaction, F(4, 40) = 5.2, p = .003, ηp² = .213, but no effects of extraversion or social fear on any of the effects, all ps > .20. Therefore, further analyses were computed with empathy as covariate. A significant Muscle x Emotion x Character interaction was observed, F(4, 40) = 4.5, p = .006, ηp² = .193, indicating different emotion-specific facial reactions to positive, neutral and negative characters. No other main effect or interaction reached significance, all ps > .46. To further specify this interaction, separate follow-up ANCOVAs for the M. zygomaticus major and the M. corrugator supercilii were calculated.

M. zygomaticus major. Activity in M. zygomaticus major in response to happy faces increased with character valence, whereas no difference could be observed for neutral or sad faces (see Figure 6). This was verified by a significant Emotion x Character interaction, F(4, 44) = 2.8, p = .044, ηp² = .119.
Simple comparisons revealed a significant difference between M. zygomaticus major reactions to happy faces of positive characters (M = 0.61) as compared to neutral (M = 0.29) and negative characters (M = 0.18), p = .017 and p = .014, respectively, but not between neutral and negative characters, p = .222. No effects were observed for neutral or sad facial expressions, all ps > .18.

Figure 6. Mean EMG change in μV for M. zygomaticus major in response to happy, neutral and sad faces for positive, neutral and negative characters. Error bars indicate standard errors of the means.

Additionally, one-sample t-tests against zero revealed a significant increase in M. zygomaticus major activity in response to happy facial expressions only for positive and neutral characters, t(24) = 3.0, p = .006 and t(24) = 2.6, p = .016, respectively. For negative characters the activation was only marginally different from zero, t(24) = 1.9, p = .072. Thus, happy faces of positive and neutral, but not of negative characters evoked a smiling response.

M. corrugator supercilii. Congruent muscle reactions to sad facial expressions of positive characters were found, whereas in response to negative characters incongruent facial muscular reactions could be observed (see Figure 7). This was verified by a significant Emotion x Character interaction, F(4, 40) = 3.0, p = .044, ηp² = .132. Simple comparisons revealed significantly stronger reactions to sad faces of positive characters (M = 0.19) as compared to sad faces of neutral (M = 0.03) and negative characters (M = -0.11), p = .020 and p = .005. Reactions to sad faces of neutral and negative characters differed, too, p = .051.

Figure 7. Mean EMG change in μV for M. corrugator supercilii in response to happy, neutral and sad faces for positive, neutral and negative characters. Error bars indicate standard errors of the means.

Notably, the specific reaction to sad faces of negative characters was a significant inhibition of M. corrugator supercilii activity, as a one-sample t-test against zero confirmed, t(24) = 2.1, p = .044. Further t-tests against zero revealed a significant increase in M. corrugator supercilii activity in response to sad faces of positive characters, t(24) = 2.1, p = .049, but no significant change as compared to baseline for neutral characters, t(24) = 0.1, p = .947.

2.1.1.4. DISCUSSION

The results show that the manipulation of participants' attitudes towards avatars influences facial reactions to the avatars' emotional facial expressions. Specifically, the activity of M. zygomaticus major in response to happy faces was reduced if the face belonged to a negative character as compared to a positive character. For M. corrugator supercilii a different pattern was observed: sad faces of positive characters evoked the strongest contraction, whereas sad faces of negative characters evoked a relaxation of that muscle, the latter indicating incongruent facial reactions. It can be concluded that, compared to positive characters, negative characters elicit less congruent reactions to happy and even incongruent reactions to sad facial expressions. These results are the first to demonstrate in a controlled, experimental fashion that facial reactions to facial emotional expressions can be influenced by actual attitudes, i.e. when a specific affective position towards the sender is held. They are further in line with the results reported by Herrera et al. (1998) and McHugo et al. (1991), who both found, with quasi-experimental designs, less congruent or even incongruent reactions in response to disliked politicians and ethnic outgroup members.
The results of Experiment 1 do not support the involvement of valence evaluations in the modulation of congruent and incongruent facial reactions to facial expressions. The valence hypothesis would have predicted a main effect of character valence on M. corrugator supercilii, i.e. a stronger corrugator reaction to all emotional expressions of negative characters. Instead, corrugator reactions did not differ in response to happy and neutral expressions of negative compared to neutral and positive characters. And they were even reduced in response to sad expressions of negative characters. Therefore, affective mechanisms in terms of valence evaluations are probably not involved in the modulation of congruent as well as incongruent facial reactions to emotional facial expressions.

This leads to the conclusion that the mimicry hypothesis could be confirmed. However, the occurrence of incongruent facial reactions argues against the mimicry hypothesis and motor determinants as the only underlying mechanism. According to the assumptions in the working model (see 1.4.2.), a modulation of motor mechanisms cannot lead to muscular patterns like the observed relaxation in M. corrugator supercilii. Therefore, affective mechanisms must have been involved after all. Given the rejection of the valence hypothesis, emotional reactions are the only available explanation for the incongruent reactions. In line with that, the relaxation of the M. corrugator supercilii, which is normally a response to happy facial expressions (see the results for positive characters), could be a sign of schadenfreude, a positive emotional response triggered by a negative event happening to a disliked, negative character.

It should be noted that the manipulation check revealed an unexpected lack of differences in valence ratings. However, the expected difference in liking ratings between positive and negative characters reached significance. The neutral characters did not differ significantly from the positive ones but only from the negative ones, indicating a slight positivity of the neutral characters.

One might raise the question whether the present manipulation affected only explicit or also implicit attitudes. This can be of importance if we consider the above-mentioned study by Yabar et al. (2006). Here, the authors observed behavioral mimicry in response to in- and outgroup members and found a positive correlation between mimicry and implicit liking but a negative correlation between mimicry and explicit liking. In the current literature, two different systems underlying the formation (and change) of explicit and implicit attitudes are postulated (e.g. Rydell, McConnell, Mackie, & Strain, 2006; Rydell & McConnell, 2006; Strack & Deutsch, 2004; Smith & DeCoster, 2000). In the approach introduced by Rydell et al. (2006), explicit attitudes, which refer to conscious knowledge and beliefs (Strack & Deutsch, 2004), are supposed to be formed quickly and flexibly in a non-associative but rather abstract and conscious manner via a so-called fast-learning system. In contrast, implicit attitudes, which are based on associative structures to which people do not have conscious access (Strack & Deutsch, 2004), are assumed to be formed via a slow-learning system, strengthening associations in memory step by step. However, in a recent study Gregg, Seibt and Banaji (2006) found that implicit attitudes could even be formed through abstract supposition.
This suggests that in the present study, where attitudes were formed by reports of others, implicit attitudes might have been manipulated as well. Furthermore, given the abstract nature of the avatar stimuli and the lack of opportunities to gain experience with the characters, it seems very unlikely that participants held contradictory implicit and explicit attitudes.

In sum, the results show that facial reactions can be modified by experimentally manipulated attitudes. The valence hypothesis could not be supported, meaning that valence evaluations were not strongly involved in the modulation of congruent and incongruent facial reactions in this study. Instead, motor mechanisms seem to play a major role. However, they cannot account for the observed incongruent reactions in response to negative characters. Thus, a further determinant must be involved. This might have been the emotional reaction of schadenfreude. However, this is pure speculation; the present design does not allow conclusions about the involvement of emotional reactions in the modulation of congruent and incongruent reactions. This should be achieved in the next experiments.

2.1.2. EXPERIMENT 2

2.1.2.1. THEORETICAL BACKGROUND AND HYPOTHESES

The main purpose of Experiments 2a and 2b was to explore facial reactions to stigmatized people and to determine the involved mechanisms. In addition to Experiment 1, specific predictions are now also possible for emotional reactions. Therefore, three competing hypotheses regarding facial reactions to the stigmatized group of HIV-infected people were derived according to the predictions of motor and affective mechanisms. Again, a comparison of the hypothesized facial muscular reaction patterns with the actually observed pattern then allows conclusions about whether these reactions are mainly driven by reduced facial mimicry, by valence evaluations or by emotional reactions. Additionally, this is the first study to examine the influence of stigmatization on facial reactions to emotional expressions.

For Experiments 2a and 2b, HIV-positive people were chosen as the stigmatized group. Additionally, the attribution of responsibility for the infection was varied. This allowed specific conclusions about two different types of stigmata, i.e. accidentally vs. negligently acquired ones. Whether a person is infected with HIV accidentally without personal responsibility (e.g. by means of a blood transfusion) or negligently as a direct consequence of her own behavior (e.g. engaging in unprotected sexual intercourse or injecting drugs) is important for the evaluation of and the reaction to this person. If HIV-positive people are not responsible for their infection, observers tend to show fewer avoidance reactions, tend to rate them less negatively than when the person is perceived as responsible, and report stronger feelings of pity and sympathy (Dijker & Koomen, 2003; Weiner, Perry, & Magnusson, 1988).

The aim of the two following experiments was to explore facial reactions towards these two stigmatized groups and to determine whether those reactions are mainly driven by reduced facial mimicry, by valence evaluations or by emotional reactions to the negatively evaluated stigma. Interestingly, assumptions based on the three determinants of facial reactions lead to opposing hypotheses about the facial reactions to emotional expressions of stigmatized people, which were tested against each other.
For a better understanding of the competing hypotheses, Figures 8-10 illustrate the respective predicted patterns of results. As in Experiment 1, only corrugator reactions are displayed in these figures because the predictions of the competing hypotheses do not differ for the M. zygomaticus major.

Mimicry hypothesis. People show a wish for social distance and avoidance tendencies vis-à-vis the stigmatized group of AIDS patients even if they are fully aware of the fact that normal contact with an AIDS patient poses no danger to themselves (Corrigan, Edwards, Green, Diwan, & Penn, 2001; Crandall, 1991; Herek, Capitanio, & Widaman, 2002; Neumann et al., 2004; Phelan & Basow, 2007; Wilson, Lindsey, & Schooler, 2000). Such avoidance tendencies imply little desire for affiliation. Thus, given the affiliative function of mimicry (e.g. Lakin & Chartrand, 2003), this leads to the prediction of a reduced amount of facial mimicry towards stigmatized people.

Up to now, no published study has examined the relationship between stigmatization and facial mimicry, and only one study exists on stigmata and behavioral, i.e. non-facial, mimicry. In this study, Johnston (2002) investigated the effects of a stigma's relevance for the to-be-mimicked task on mimicry. Specifically, she measured participants' ice-cream consumption after they had observed a confederate eating a high or low amount of ice-cream. This confederate was either neutral, obese, or carried a facial birthmark. Non-stigmatized confederates as well as those with facial birthmarks, but not obese ones, were mimicked. Thus, stigmata that were associated with negative consequences in the respective task decreased mimicking behavior. However, when it comes to the imitation of emotional facial expressions, it is unlikely that mimicry is seen as a possible way of acquiring the stigma of HIV infection oneself. Therefore, it was assumed that nearly all types of stigmata decrease the amount of facial mimicry.

The mimicry hypothesis predicts reduced facial mimicry towards HIV-positive people (see Figure 8). Specifically, weaker M. corrugator supercilii reactions to sad and weaker M. zygomaticus major reactions to happy faces of accidentally HIV-infected as compared to healthy characters were expected. Because negligently HIV-infected people are expected to evoke the strongest avoidance tendencies, the smallest amount of congruent facial reactions was expected in response to them.

Figure 8. Patterns of results on M. corrugator supercilii in response to happy, neutral and sad faces for healthy, accidentally and negligently HIV-infected characters as predicted by the mimicry hypothesis (see text for explanations).

Valence hypothesis. Valence evaluations indicating negative attitudes towards people with AIDS, as shown in a study by Neumann et al. (2004), would on the other hand lead to the assumption of an unspecific increase in corrugator muscle activity in reaction to HIV-positive people. Accordingly, Vanman et al. (1997) found reduced positive and enhanced negative affective facial reactions to neutral expressions of persons carrying the social stigma of being Black. So, if (negative) valence evaluation is the predominant determinant of facial reactions to stigmatized groups, then M. corrugator supercilii activity should increase in response to all kinds of facial expressions of HIV-positive people, i.e. also in response to neutral facial expressions. This main effect should add to the reactions elicited by the valence of the emotional expression per se.
The valence hypothesis predicts a main effect of health status on M. corrugator supercilii and M. zygomaticus major responses (see Figure 9). Because negligently HIV-infected people are evaluated more negatively than accidentally HIV-infected people (Dijker & Koomen, 2003), they were expected to evoke the strongest M. corrugator supercilii activity across all expressions and the lowest M. zygomaticus major responses. Accidentally infected characters were expected to elicit moderate M. corrugator supercilii and M. zygomaticus major responses. Healthy characters were assumed to evoke the lowest M. corrugator supercilii and the highest M. zygomaticus major responses across all emotions.

Figure 9. Patterns of results on M. corrugator supercilii in response to happy, neutral and sad faces for healthy, accidentally and negligently HIV-infected characters as predicted by the valence hypothesis (see text for explanations).

Emotion hypothesis. The prominent model for deriving emotional reactions to stigmatized people with varying onset controllability is Weiner's (1995) attributional model of helping behavior. According to this model and to empirical evidence by Seacat, Hirschman and Mickelson (2007) and Dooley (1995), the predominant emotion towards AIDS patients per se is sympathy or pity. Specifically, sympathy is increased towards AIDS patients with low onset controllability (e.g. infection due to contaminated blood transfusions) and decreased towards patients with high onset controllability (e.g. infection due to sexual intercourse without protection). So, if emotional reactions are the predominant determinant of facial reactions to stigmatized groups, then M. corrugator supercilii activity should increase most in response to accidentally HIV-infected people because they elicit the strongest feelings of sympathy. This main effect should add to the reactions elicited by the emotional expressions via contagious processes per se. These predictions are made only for responses to neutral and sad, but not happy, expressions because no information is available on what pity reactions to positive emotional expressions might look like.

Accordingly, the emotion hypothesis predicts a main effect of health status on M. corrugator supercilii responses to neutral and sad expressions (see Figure 10). Accidentally infected characters were expected to evoke the strongest sympathy and thereby the strongest M. corrugator supercilii activity in response to neutral and sad expressions. Because healthy characters were expected to elicit no sympathy when displaying a neutral expression and moderate sympathy when displaying a sad expression, no M. corrugator supercilii responses to neutral and moderate M. corrugator supercilii responses to sad faces were predicted. Because negligently HIV-infected people elicit only little pity, they were assumed to evoke the lowest amounts of pity and thereby also the lowest M. corrugator supercilii responses to neutral and sad expressions.

Figure 10. Patterns of results on M. corrugator supercilii in response to happy, neutral and sad faces for healthy, accidentally and negligently HIV-infected characters as predicted by the emotion hypothesis (see text for explanations).

2.1.2.2. EXPERIMENT 2A

I decided to use only two of the three kinds of characters in a within-subjects design.
This was done in order to guarantee that participants would differentiate between all types of characters and to avoid any kind of "merging" of the two types of HIV-positive characters in contrast to the healthy characters. The goal of Experiment 2a was a comparison of healthy and accidentally HIV-infected people. Computer-generated characters with different emotional expressions were introduced as healthy or HIV-positive people, and participants' facial reactions to those expressions were recorded via EMG.

2.1.2.2.1. METHODS

Design and participants

The experiment followed a 2 (health status: healthy vs. accidental HIV infection) x 3 (emotion: happy vs. neutral vs. sad) x 2 (muscle: M. zygomaticus major vs. M. corrugator supercilii) factorial within-subjects design. Participants were twenty-seven psychology students in their first semester, who received course credit for participation. Data from one participant had to be excluded from further analysis due to equipment failure, resulting in a final sample of 26 females, aged between 20 and 29 years (M = 21.44, SD = 2.62).

Materials and apparatus

Stimuli. The same stimuli as in Experiment 1 were used with the following exceptions: each male and female emotional expression was again combined with three types of hairstyles (blond, brown, and black hair), resulting in 18 stimuli, and each avatar was additionally combined with two shirt colors (blue vs. white), resulting in 36 stimuli. The two types of characters were created by assigning the health status to a shirt color. Healthy characters wore blue shirts, HIV-positive characters white shirts¹. Examples of the stimuli can be seen in Figure 11.

Figure 11. Examples of avatar expressions used to elicit facial reactions.

Stigma manipulation. To conceal the real purpose of the study, participants were told about the evaluation of a computerized training for medical students which should enable them to practice future interactions with real HIV patients. Participants were told that there were two groups of avatar characters in the training: one group of healthy individuals and one group of HIV patients. It was explicitly pointed out that the groups could be distinguished by the avatar's shirt color. Afterwards, one representative of each group was presented along with a short description informing participants about their health status. HIV infection was explicitly attributed to circumstances beyond the person's control (i.e. transfusion of contaminated blood after being involved in an accident). This classification was accompanied by a general description of symptoms after the onset of the infection. To ensure that participants paid adequate attention to the given information, they were informed about a subsequent memory task requiring them to assign the characters to the respective health status group.

Facial EMG. The assessment and processing of facial reactions via EMG followed the procedure of Experiment 1.

Explicit measures. Participants were asked to rate all neutral avatar faces in terms of valence and arousal. The exact questions were as follows: "How negative/positive do you find the picture?" and "How arousing do you find the picture?".

Procedure

Participants were tested individually in a laboratory room. After signing a consent form, EMG electrodes were placed. All further instructions appeared on the computer screen. To conceal the EMG recordings, participants were told that skin conductance level was recorded.
Following the introduction of the cover story and of the healthy and HIV-positive characters, participants were asked to complete a memory task in which they had to assign the characters to their respective group by verbal classification. They were also informed that this task would be repeated at the end of the experiment, to ensure that they would keep the characters in mind throughout the study. Then participants passively viewed pictures of the characters showing the emotional expressions, while activity of M. zygomaticus major and M. corrugator supercilii was recorded. The 36 avatar pictures (19 x 25 cm) were presented in randomized order for 6 seconds each, with inter-trial intervals (ITIs) ranging from 19 to 23 seconds. Each expression was preceded by a fixation cross and a short warning pitch tone. Afterwards, the valence and arousal ratings were obtained and the memory task was repeated. None of the participants made more than two errors in the first and second memory task. Thus, no exclusion of participants because of difficulties in remembering the categories was necessary. Finally, participants were asked about their ideas regarding the true purpose of the study, with none of them being aware of the hypotheses.

2.1.2.2.2. RESULTS

Facial reactions

M. zygomaticus major. A repeated measures analysis of variance with health status (healthy vs. accidental HIV infection) and emotion (happy vs. neutral vs. sad) as within-subjects factors returned a main effect of emotion, F(2,24) = 5.24, p = .019, ηp² = .173. This main effect was qualified by a significant Health Status x Emotion interaction (see Figure 12), F(2,24) = 4.41, p = .017, ηp² = .150. Follow-up t-tests revealed that responses to happy expressions of healthy persons (M = 0.43) were larger than responses to happy expressions of HIV-infected persons (M = 0.02), t(25) = 2.46, p = .021. No other differences between the two health status groups emerged, all ps > .22. t-tests against zero revealed facial mimicry of happy expressions of healthy characters, t(25) = 2.39, p = .025, but a complete absence of M. zygomaticus major activity while viewing happy faces of accidentally HIV-infected characters, t(25) = 0.23, p = .817. No other effects reached significance, all ps > .16.

Figure 12. Mean EMG change in µV for M. zygomaticus major in response to happy, neutral and sad faces for healthy vs. accidentally HIV-infected characters. Error bars indicate standard errors of the means.

M. corrugator supercilii. Main effects of health status and emotion were observed in a repeated measures analysis of variance, F(1,25) = 9.44, p = .005, ηp² = .274, and F(2,24) = 8.79, p = .001, ηp² = .260, respectively. Accidentally HIV-infected characters elicited stronger M. corrugator supercilii responses (M = 0.14) than healthy characters (M = -0.08). Furthermore, corrugator reactions to sad faces (M = 0.27) were stronger than reactions to neutral (M = -0.008) and happy (M = -0.18) faces, t(25) = 2.67, p = .013 and t(25) = 3.76, p = .001 (see Figure 13). Reactions to happy and neutral faces did not differ significantly, t(25) = 1.71, p = .100. The interaction Health Status x Emotion did not reach statistical significance, F < 1.

Figure 13. Mean EMG change in µV for M. corrugator supercilii in response to happy, neutral and sad faces for healthy vs. accidentally HIV-infected characters. Error bars indicate standard errors of the means.
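The analyses reported in this and the preceding Results sections follow a common recipe: a repeated measures analysis of (co)variance on the baseline-corrected change scores, follow-up paired comparisons, and one-sample t-tests against zero to establish whether a given cell shows any reaction at all. Purely for illustration, and assuming a long-format table with hypothetical column names (this is not the original analysis script), the corrugator analysis of Experiment 2a could be sketched as follows.

```python
# Illustrative sketch only; column names and the input file are hypothetical.
import pandas as pd
from scipy.stats import ttest_1samp, ttest_rel
from statsmodels.stats.anova import AnovaRM

# One row per participant x health status x emotion, with change scores already
# collapsed over trials: columns subject, health, emotion, corrugator.
df = pd.read_csv("exp2a_corrugator_long.csv")

# 2 (health status) x 3 (emotion) repeated measures ANOVA
res = AnovaRM(df, depvar="corrugator", subject="subject",
              within=["health", "emotion"]).fit()
print(res)

# Follow-up paired comparison, e.g. sad faces of accidentally HIV-infected
# versus healthy characters
sad = df[df.emotion == "sad"].pivot(index="subject", columns="health",
                                    values="corrugator")
print(ttest_rel(sad["HIV_accidental"], sad["healthy"]))

# One-sample t-test against zero: does a given cell show any reaction at all?
cell = df[(df.health == "HIV_accidental") & (df.emotion == "sad")]["corrugator"]
print(ttest_1samp(cell, popmean=0.0))
```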
Ratings

t-tests were performed to test for differences between the health status groups on the valence and arousal measures. Healthy characters were perceived as more positive (M = 4.891) than those belonging to the accidentally infected HIV group (M = 4.56), t(25) = 2.63, p = .014. Furthermore, accidentally infected characters were judged to be significantly more arousing (M = 5.97) than healthy characters (M = 4.97), t(25) = 4.47, p < .001.

2.1.2.2.3. DISCUSSION OF EXPERIMENT 2A

There was a complete lack of M. zygomaticus major reactions to accidentally HIV-infected people. Furthermore, a main effect of health status on M. corrugator supercilii reactions was observed, indicating stronger activity in response to accidentally HIV-infected people compared to healthy characters. This effect rules out the mimicry hypothesis, which predicted weaker corrugator reactions to happy and sad expressions of stigmatized compared to healthy people. Instead, the findings are supportive of both the valence and the emotion hypothesis. However, it is not clear from the observed pattern of results whether the (corrugator) reactions to HIV-infected people are caused by valence evaluations or by emotional reactions. Experiment 2b is needed to shed more light on the nature of facial reactions to HIV-positive people by contrasting accidentally and negligently HIV-infected people. In comparison to the accidentally infected people, negligently infected characters have a more negative valence and elicit weaker feelings of sympathy.

2.1.2.3. EXPERIMENT 2B

The goal of Experiment 2b was to contrast the two types of HIV-positive characters differing in the cause of infection. Therefore, the healthy characters were replaced with HIV-positive characters who caused their infection negligently by incautious behavior. Following the results of Experiment 2a, no activity of M. zygomaticus major in response to facial expressions of either of the two character types was expected. In contrast, it was expected that M. corrugator supercilii reactions would be more informative with respect to the central question. The valence hypothesis predicts stronger corrugator reactions to the negligently infected people due to their more negative valence. The emotion hypothesis, on the other hand, predicts weaker corrugator reactions due to the weaker feelings of sympathy.

2.1.2.3.1. METHODS

Design and participants

The design was a 2 (path of infection: accidental vs. negligent) x 3 (emotion: happy vs. neutral vs. sad) x 3 (muscle: M. zygomaticus major vs. M. corrugator supercilii vs. M. levator labii superioris) design with all three factors manipulated within participants. A total of 36 female university students participated in the experiment, with data from 31 subjects entering statistical analysis. Two participants discovered the true purpose of the study and were excluded from further analyses; another three had to be removed due to technical difficulties. Participants were recruited via the internet and received a compensation of 10 €. Age ranged from 19 to 29 years (M = 22.39, SD = 2.82).

Materials, apparatus and procedure

Materials, apparatus and procedure were identical to Experiment 2a with the following exceptions. The healthy group of characters was replaced by a group who caused the HIV infection negligently by their own incautious behavior (i.e. sexual intercourse without protection). Again, different shirt colors (blue vs. white) indicated the respective path of infection of the characters.
The assignment of shirt color to the specific path of infection was counter‐balanced2. Additionally, the activity of M. Levator labii superioris as an indicator of disgust towards the stigmatized group was recorded. Furthermore, explicit likeability ratings (“How much do you like the displayed person?”) were included in addition to valence and arousal ratings. Again, a memory task was applied before and after the EMG recordings. Analysis of the first memory task revealed that only four participants showed errors in categorization, with none of them making more than two errors. In the second memory task three participants showed one error each. Thus, no exclusion of participants because of difficulties in remembering the categories seemed necessary. 2.1.2.3.2. RESULTS Facial reactions M. zygomaticus major. The main effects emotion and path of infection as well as the Path of Infection x Emotion interaction did not reach statistical significance; all ps > .12 (see Figure 14). Figure 14. Mean EMG change in µV for M. zygomaticus major in response to happy, neutral and sad faces for accidentally vs. negligently HIV‐infected characters. Error bars indicate standard errors of the means. 50 M. corrugator supercilii. A repeated measures analysis of variance with path of infection (accidental vs. negligent) and emotion (happy vs. neutral vs. sad) as within‐subjects factors revealed a significant main effect of path of infection and a marginal main effect of emotion, F(1,30) = 5.03, p = .032, ηp2 = .144 and F(2,29) = 3.06, p = .070, ηp2 = .093, respectively. These main effects were qualified by a significant Path of Infection x Emotion interaction, F(2,29) = 4.82, p = .016, ηp2 = .138 (see Figure 15). Following t‐tests revealed effects of path of infection only for neutral and sad expressions, t(30) = 3.04, p = .005 and t(30) = 1.93, p = .063, respectively, but not for happy expressions, t(30) = 1.29, p = .205. M. corrugator supercilii reactions to sad and neutral faces were stronger when the expression was displayed by a character belonging to the accidental infection group (M = 0.29 for neutral and M = 0.35 for sad faces) as compared to a character who acquired the infection negligently (M = 0.08 for neutral and M = 0.21 for sad faces). One‐sample t‐tests against zero confirmed a significant increase in M. corrugator supercilii activity in response to neutral expressions of accidentally infected characters and to sad faces of members of both infection groups, all ps < .01. No other effects reached significance, all ps > .21. Figure 15. Mean EMG change in µV for M. corrugator supercilii in response to happy, neutral and sad faces for accidentally vs. negligently HIV‐infected characters. Error bars indicate standard errors of the means. M. Levator labii superioris. No main effects and no interactions did reach significance, all ps > .13 51 Ratings t‐tests revealed that characters from the accidental infection group were rated as more positive (M = 5.26) and more likeable (M = 5.63) than those from the negligent infection group (M = 4.67 for valence and M = 4.77 for liking), t(30) = 3.53, p = .001 and t(30) = 3.12, p = .004, respectively. No differences were found for arousal ratings, t(30) = 1.25, p = .222. 2.1.2.3.3. DISCUSSION OF EXPERIMENT 2B Again, a complete lack of M. zygomaticus major reactions to HIV‐infected people was observed. Furthermore, a significant interaction of Path of Infection and Emotion was observed. M. 
corrugator supercilii activity was stronger in response to neutral and sad expressions of accidentally compared to negligently infected HIV‐people. The difference in reactions to neutral expressions of accidentally and negligently HIV‐infected people cannot be explained with differences in valence evaluations since the more negative subtype of HIV positive people evoked less corrugator activity. So, the influence of valence evaluations on facial reactions to HIV positive people seems to be negligible and the valence hypothesis is not supported by the data. Instead, the observed pattern of facial reactions is clearly in line with predictions of the emotion hypothesis. The neutral as well as the sad expressions of the accidentally infected HIV‐ characters apparently provoked feelings of pity and sympathy that are reflected in the facial reactions. The pattern for happy expressions is not so clear. No effects of Path of Infection were observed here. Maybe, the expression of happiness is not congruent with the disease of AIDS and hence participants do not know how to react to such an unclear nonverbal statement. 2.1.2.4. DISCUSSION Both experiments 2a and 2b revealed a lack of M. zygomaticus major reactions to the stigmatized group of HIV positive people in general. This finding is in line with the findings of Vanman et al. (1997) on reduced positive affective facial reactions when facing persons carrying a stigma. Furthermore, increased M. corrugator supercilii activity in response to neutral and sad faces of accidentally HIV‐infected people compared to healthy (Experiment 2a) as well as negligently infected characters (Experiment 2b) was found. As t‐tests against zero revealed these changes in corrugator activity in response to the accidentally infected characters were indeed increases in activity. 52 The corrugator results cannot be explained by the mimicry hypothesis which predicted the strongest corrugator reactions to sad faces of healthy characters. It can also not be explained by the valence hypothesis which would have predicted that the most negative characters, i.e. negligently HIV‐infected people, elicit the largest corrugator activity. Accordingly, the data do not support the mimicry and the valence hypothesis. Facial mimicry and valence evaluations are not the main determinants of facial reactions to stigmatized groups. However, the observed pattern of results can be explained by the emotion hypothesis stating that emotional reactions are the predominant determinant of facial reactions to stigmatized groups. The relevant emotion in the context of a stigmatized group with low onset controllability is pity or sympathy (see Dooley, 1995; Seacat et al., 2007; Weiner et al., 1988; Weiner, 1995) and this is reflected in the strong corrugator reactions to neutral and sad faces of accidentally infected HIV‐ people. Although it would have also been in line with the emotion hypothesis, no effects of HIV infection on activity of M. Levator labii superioris were observed. This muscle’s activity was recorded as an indicator of disgust towards the stigmatized group because few studies reported such feelings towards HIV positive people (e.g. Varas‐Diaz & Marzan‐Rodriguez, 2007) or immoral persons and actions (Jones & Fitness, 2008; Schnall, Haidt, Clore, & Jordan, 2008). However, HIV positive characters in the present study did not evoke facial disgust reactions. 
It has to be mentioned that there was no pronounced corrugator response to neutral expressions of accidentally HIV‐infected people in Experiment 2a. In this experiment, the accidentally HIV‐infected people were not compared with a group of negligently infected characters but with healthy people. This difference in the type of the reference group might have led to different perceptions and evaluations of the accidentally HIV‐infected people. In contrast with the more positive group of healthy people they might have been perceived as more negative and less pitiful than in contrast with the more negative group of negligently infected characters. This assumption is also partly supported by the different valence ratings of accidentally infected characters in Experiment 2a and 2b. Thus, facial reactions might also depend on the actual scope of reference. The explicit ratings differed with respect to arousal. Participants reported to feel more aroused in response to HIV‐infected people than in response to healthy characters (Experiment 2a). Thereby, no differences between accidental and negligent infection were observed (Experiment 2b). This indicates a dissociation between implicit facial reactions and arousal ratings. Emotional arousal is 53 increased in response to the stigma of HIV in general but does not differ between the two ways of infection. On the other hand, facial reactions could very well differentiate between the two groups of HIV infected characters. This suggests that the differing emotional reactions reflected by the facial activity in response to the two groups of HIV‐people are not influenced by the perceived arousal but by different, probably more cognitively mediated processes. Although it is not the central scope of this work, some conclusion shall be drawn on the meaning of the observed results for the social life of stigmatized people. In Experiment 2b, characters who were negligently infected with HIV almost completely failed to evoke any facial reactions. At first sight, this lack of facial reactions to the negligent infection group only seems to reflect a strong lack of emotions. But given the affiliative signal character of facial reactions in general such a lack of automatic nonverbal behavior can also be seen as a strong distancing. Importantly, by signaling a low wish to affiliate this lack of facial reactions to negligently infected HIV‐people prohibits further interactions. It can further lead to social exclusion of stigmatized people as they are getting the feedback of not being welcome. On the other hand, the avoidance and discouragement of contact with HIV positive persons also prevents people from improving their attitudes about HIV and AIDS (see Intergroup Contact Theory; Pettigrew, 1998). In contrast, the enhanced facial reactions to accidentally HIV‐infected people promote further contact and thereby allow an improvement of attitudes. And according to the attributional helping model by Weiner (1995) and an empirical study by Dooley (1995) these enhanced pity reactions can also promote helping intentions and prosocial behavior towards the stigmatized group. However, an increase in facial reactions towards accidentally HIV‐infected people could only be found in response to negative emotional expressions. This means that stigmatized people lack reinforcement in the positive emotional domain. They may have problems to find somebody who shares happy moments with them by showing encouraging or relieved smiles or by just laughing out loud together. 
So, what do the observed results mean for our knowledge about the influence of stigmatization on facial reactions to emotional expressions? Facial reactions in response to people who became member of a stigmatized group without a fault of their own are probably mainly shaped by feelings of pity and empathic concerns. This was concluded from the stronger corrugator reactions to neutral and sad expressions of accidentally HIV‐infected people compared to both healthy and negligently infected characters. In my opinion, one can apply these results of facial reactions to HIV positive people to many other kinds of stigmata. The specific pattern will thereby depend on people’s attributions about the cause of the stigma and their beliefs about the stigmatized persons’ 54 responsibility for obtaining the stigma like e.g. being obese or mentally ill. Finally, it is not clear whether the results can be applied to all kinds of stigmata. The generalizability of the data could be limited to acquired stigmata like illnesses, homelessness or deliberately chosen group memberships. However, it seems plausible that facial reaction patterns to groups with inborn stigmata like race or disabilities look very similar to those to people with accidentally acquired HIV infection. Feelings of pity and sympathy might arise from carrying any kind of stigma that was not acquired negligently or deliberately. Therefore, further research is needed to explore the generalizability of these results to other kinds of stigmata. 2.1.3. REVISION OF THE WORKING MODEL Experiment 1 and Experiment 2 both could not give support for the involvement of valence evaluations in the modulation of congruent and incongruent facial reactions to facial expressions. In Experiment 1 the valence hypothesis would have predicted a main effect of character valence on M. corrugator supercilii for all expressions, i.e. stronger corrugator reactions for expressions of negative characters. Instead, corrugator reactions were the same (for happy and neutral expressions) or less (for sad expressions) in response to expressions of negative compared to neutral and positive characters. In Experiment 2, the valence hypothesis would have predicted that the most negative characters, i.e. negligently HIV‐infected people, elicit the largest corrugator activity. Instead they were found to be lower than for accidentally HIV‐infected people. Therefore, valence evaluations as the mechanism underlying the modulation of congruent as well as incongruent facial reactions to emotional facial expressions are supposed to play a negligible role and are therefore removed from the working model. Whereas Experiment 1 could not clearly differentiate between the mimicry and the emotion hypothesis, Experiments 2a and 2b do not support the mimicry hypothesis. Instead, they support the assumption that the modulation of congruent facial reactions is mainly driven by emotional reactions. No conclusion can be drawn from Experiment 2 concerning the modulation of incongruent reactions since no incongruent reactions were observed. However, incongruent reactions were observed in Experiment 1 and due to the exclusion of valence evaluations from the model emotional reactions are the only available explanation for that. Thus, the modulation of congruent as well as incongruent facial reactions by emotional reactions can be seen as supported and will remain in the working model. Due to the inconclusive results of Experiment 1 the influence of motor determinants 55 is not clearly ruled out so far. 
So, the path from the perception-behavior link to congruent facial reactions will remain in the model, but with a weaker influence. Figure 16: The working model on mechanisms underlying facial reactions to emotional facial expressions after a first revision following Experiments 1 and 2 (see text for explanations). 2.2. EXPERIMENT 3 – PSYCHOLOGICAL MEDIATORS 2.2.1. THEORETICAL BACKGROUND Experiment 3 aims at identifying the psychological mediators that indicate motor and affective mechanisms. Mediational analyses shall be computed to test whether these variables can explain the contextual modulation of congruent and incongruent facial reactions. Motor mechanisms are assessed via the psychological mediator empathy. The choice of this mediator is based on predictions of the perception-action model by Preston and de Waal (2002). As mentioned before (see 1.2.2.), Preston and de Waal posit that the same perception-action mechanism that is responsible for the occurrence of mimicry is also responsible for cognitive empathy. Therefore, a modulation of motor processes, i.e. the perception-action or perception-behavior link, should manifest itself also in a modulation of the strength of empathy. Further evidence supporting a link between empathy and the perception-behavior link comes from a study by Chartrand and Bargh (1999): They showed that people high in trait cognitive empathy, i.e. the understanding of another person's perspective (Decety & Jackson, 2004), engaged in significantly more mimicry than people low in trait cognitive empathy. Given this body of evidence, it is assumed that empathy indicates the involvement of motor mechanisms in the contextual modulation of facial reactions to emotional facial expressions. Thereby, empathy is neither seen as a cause nor as an outcome of facial reactions but as an outcome of the motor mechanisms underlying both phenomena. State measures of cognitive and emotional empathy are thus assessed as potential mediators in the following study. Additionally, a psychological mediator for clarifying the role of affective mechanisms in the contextual modulation of congruent and incongruent facial reactions to facial expressions is required. Therefore, subjective measures of the participants' current emotional state in response to the presented emotional facial expressions are obtained. Of course, responses depicting the current emotional state cannot be seen as depicting the mediating affective processes themselves. They are rather supposed to work as indicators of the emotional reactions that might underlie the facial reactions. To assess the relevant mediators effectively, independent groups, i.e. a between-subjects design, are needed. Specifically, two contexts were chosen for which congruent as well as incongruent facial reactions and different kinds of emotional reactions to a vis-à-vis' facial expressions can be expected: a cooperation condition for which congruent facial reactions have been shown, and a competition condition for which less congruent and even incongruent facial reactions can be expected (Lanzetta & Englis, 1989; Weyers et al., 2009; see 1.1.). Additionally, a neutral control condition is added, resulting in three independent conditions. The present study investigated facial muscular reactions to happy, neutral, sad and angry avatar expressions after creating explicit cooperation and competition contexts as well as a neutral control condition.
Presenting both an affiliative (sad) and a non-affiliative (angry) negative facial expression (Hess, Blairy, & Kleck, 2000; Knutson, 1996) in a cooperative vs. a competitive setting should provoke very different reactions and thereby offer the chance to investigate a wide range of processes involved in facial reactions. For example, facial mimicry is often only observed in response to affiliative expressions (Bourgeois & Hess, 2008), whereas affective reactions are expected in response to both kinds of faces. 2.2.2. METHODS Design and Participants The experiment consisted of a 3 (Context: cooperation vs. neutral vs. competition) x 4 (Emotion: happy vs. neutral vs. sad vs. angry) x 3 (Muscle: M. zygomaticus major vs. M. corrugator supercilii vs. M. Orbicularis oculi) mixed design with Context as between-subjects factor. As in all the other experiments, only female subjects were tested because women show more pronounced, but not qualitatively different, mimicry effects than male subjects (Dimberg & Lundqvist, 1990). Seventy-seven female university students were recruited on campus and received 10 € as compensation. Data from 8 participants had to be excluded from analyses for the following reasons: one participant suspected that muscle activity was being recorded; two reported having a mental disorder, and five showed exceptionally high numbers of trials with artifacts. Analyses were computed for the remaining sample of 69 females (with 23 subjects in each context condition), aged between 19 and 38 years (M = 23.16, SD = 3.40). Procedure Participants were recruited for a study on the evaluation of characters for a computer game. After participants had signed a consent form, EMG electrodes were placed and participants were informed about a game of dice which they would have to play later on. In fact, this game served as a manipulation of the context (cooperation vs. neutral vs. competition). Across all conditions they were told that they could win a prize in this game. They were further told that they would have to play together with an avatar and that both players would throw two dice alternately. In the cooperation condition it was said that both players would win if the sum of their scores after a certain number of rounds exceeded a certain value; if the sum of both players' scores did not exceed this value, both would lose. In the competition condition participants were told that the one with the highest score would win the prize. In the neutral condition it was said that both players would play for their own account. Although participant and avatar would throw the dice alternately, each of them had to reach a certain score to win. If both reached this score, both would win; if only one of them reached the score, only this player would win; and if neither reached the score, no one would win. Participants were also told that they would only see their own results but not those of the avatar. Instead, they would see the avatar's facial expression in response to his result. Thereby, the avatars' facial expressions had a specific meaning in the given context. After these instructions participants played one example round that was not part of the actual game. At first, it was the participant's turn. She started and stopped the dice with the space bar. Afterwards, she saw the avatar's dice, but instead of points the dice only showed question marks. Then, a happy facial expression of a randomly chosen avatar was presented for six seconds.
The next task for the participants was an experiment ostensibly designed to measure skin conductance in response to the avatars in order to test their suitability for the upcoming computer game. This cover story was created to disguise the recording of facial muscular reactions. In this task participants just had to look through facial expressions of 6 avatar characters. Participants were told that the upcoming expressions are potential reactions of the avatars to their throws of dice as they might occur in the game. Participants were also told that one of the avatars in this part would be their partner in the game later on. Each character was shown with a happy, neutral, sad and angry expression resulting in 24 expressions. The order of presentation of the facial expressions was randomized. To ensure that participants paid attention to the stimuli they were told that they will be asked about the pictures later. Each picture was presented for 6 seconds, preceded by a pitch warning tone and a fixation cross appearing in the center of the screen three seconds before picture onset. Inter‐trial‐intervals varied from 19 to 23 seconds. During picture presentation, M. zygomaticus major. M. corrugator supercilii and M. Orbicularis oculi activity were recorded electromyografically. This part of the experiment lasted about 12 minutes. At the end, participants completed the Positive and Negative Affect Schedule (PANAS, Watson, Clark, & Tellegen, 1988, German adaption: Krohne, Egloff, Kohlmann, & Tausch, 1996) to test for potential manipulation effects on mood. In order to maintain the cover story participants then played the game. It consisted of five rounds. In the cooperation condition participant and avatar had to reach 72 points for winning. In the neutral condition they had to reach 36 points. Across all conditions participants won the game. Thereafter, emotional reactions to all the avatars’ facial expressions were assessed, and therefore participants saw the avatar expressions again. They were asked to remember the game context and to indicate their amount of joy, sadness and anger in response to these faces on 9‐point Likert scales. Finally, cognitive and emotional empathy as well as situational demands (see below) were assessed and participants were probed for suspicion and debriefed. None was aware of the hypotheses, and none suspected that facial muscular reactions were measured. 59 Apparatus and Materials Facial EMG. The assessment and processing of facial reactions via EMG followed the procedure of Experiment 1. Avatar emotional facial expressions. The same stimuli as in Experiment 1 were used with the following exceptions: Four emotional facial expressions were created from a prototypical female and a prototypical male face: a neutral, a happy, a sad and an angry expression (for details see Spencer‐ Smith et al., 2001). Each male and female emotional expression was then combined with three types of hairstyles (blond, brown, and black hair), resulting in 24 stimuli. Cognitive Empathy. Cognitive Empathy was assessed with the “Reading the Mind in the Eyes Test” (Baron‐Cohen, Wheelwright, Hill, Raste, & Plumb, 2001, German adaption: Bölte, 2005). In this test, participants see photographs of pairs of eyes and are asked to choose one of four words that best describes what the person in the photograph is feeling or thinking. Originally, this test was designed to measure theory of mind, i.e. the ability to understand mental states of another person (Premack & Woodruff, 1978). 
Therefore, it is well suited for measuring what Decety and Jackson (2004) define as cognitive empathy. It has been shown that performance in this test is sensitive to experimental manipulations (Domes, Heinrichs, Michel, Berger & Herpertz, 2007). Emotional empathy. Since, to my knowledge, there is no existing state measure of emotional empathy, a method applied by Batson et al. (1997), originally used in a study on the influence of empathy on group attitudes, was modified. Participants in Experiment 1 by Batson et al. (1997) heard an interview with a young woman suffering from AIDS. Afterwards, they had to complete a questionnaire assessing several emotional responses to this woman's plight. In the present study, participants were asked to read excerpts of this interview and then to complete a questionnaire on this text. This questionnaire consisted of 24 items measuring distress (8 items), sadness (4 items) and emotional empathy (6 items). Additionally, 6 items assessing positive emotional states were added as filler items. The empathy items were sympathetic, compassionate, soft-hearted, warm, tender, and moved. Situation demands. Participants had to complete a questionnaire evaluating certain characteristics and demands of the respective game context. These demands were assessed as they might reflect reasons for a strategic use of facial expressions. Specifically, participants were asked to remember the game context and rate on a 5-point Likert scale how important the following aspects are in such a situation: to appear likeable, to have a harmonious and smooth interaction, to understand the other person's feelings and thoughts, to communicate one's own feelings and thoughts, nonverbal communication, to hide one's own feelings and thoughts, a cool and non-emotional atmosphere, to appear superior to the other player. Individual difference measures. There is empirical evidence that highly empathic individuals show stronger facial reactions (Sonnby-Borgström et al., 2003) compared to less empathic subjects. Thus, the SPF (Saarbrücker Persönlichkeitsfragebogen, Paulus, 2009), a German version of the Interpersonal Reactivity Index (IRI, Davis, 1980), was used to check for trait empathy differences between the experimental groups. The SPF has reliability and validity coefficients comparable to the IRI, e.g. Cronbach's α = .78 (see Paulus, 2009). Statistical analyses To determine the processes behind the context effects on facial reactions, mediational analyses following the recommendations of Baron and Kenny (1986) were performed. According to this approach, three relationships between the target variables must be demonstrated in a series of regression analyses to establish a basis for testing mediation. The independent variable must predict both the dependent and the mediator variable, and the mediator must predict the dependent variable. Once these conditions are established, the dependent variable is regressed onto the independent variable in a final regression analysis with the mediator as an additional predictor. Support for mediation is obtained by demonstrating that the effect of the independent variable (context) on the dependent variable (muscle activity) is significantly reduced when accounting for the effect of the hypothesized mediator. Finally, the time course of the competing processes was examined by computing the mediational analyses for each second of stimulus presentation. This was done to shed light on the question whether different processes are involved in the muscular reactions during the 6 seconds of stimulus presentation.
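As a concrete illustration of this procedure, the following is a minimal sketch, not the original analysis code, of the three Baron and Kenny (1986) regression steps together with a Sobel test for one of the reported mediation models (context predicting M. zygomaticus major reactions to happy faces via cognitive empathy). The file name, the variable names and the dummy coding of context are hypothetical; the per-second time course analyses simply repeat the same steps on the change score of each one-second bin.

```python
# Minimal sketch of the Baron & Kenny (1986) mediation steps plus a Sobel test.
# Hypothetical data: one row per participant with a dummy-coded context variable
# (competition = 1, neutral = 0), the state cognitive empathy score, and the mean
# zygomaticus change score to happy faces.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

df = pd.read_csv("experiment3_subject_level.csv")
X = sm.add_constant(df["context"])

# Step 1: the independent variable must predict the dependent variable (total effect)
step1 = sm.OLS(df["zygomaticus_happy"], X).fit()

# Step 2: the independent variable must predict the mediator
step2 = sm.OLS(df["cognitive_empathy"], X).fit()

# Step 3: dependent variable regressed on context and mediator together; mediation is
# supported if the context effect shrinks while the mediator remains significant
XM = sm.add_constant(df[["context", "cognitive_empathy"]])
step3 = sm.OLS(df["zygomaticus_happy"], XM).fit()

# Sobel test of the indirect effect a*b
a, se_a = step2.params["context"], step2.bse["context"]
b, se_b = step3.params["cognitive_empathy"], step3.bse["cognitive_empathy"]
z = (a * b) / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
p = 2 * (1 - stats.norm.cdf(abs(z)))
print(step1.params, step3.params)
print(f"Sobel z = {z:.2f}, p = {p:.3f}")
```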
2.2.3. RESULTS Individual difference measures A univariate ANOVA on the trait empathy scores (SPF) with Context as between-subjects factor did not show significant group differences, F(2, 66) = 1.0, p = .367. Facial EMG M. zygomaticus major (see Figure 17). A repeated measures ANOVA revealed a significant main effect of Context, F(2, 66) = 5.3, p = .007, ηp2 = .139, and a marginal main effect of Emotion, F(3, 64) = 2.4, p = .071, ηp2 = .035. These effects were qualified by a significant interaction of Emotion and Context, F(6, 128) = 2.4, p = .031, ηp2 = .068. Figure 17. Mean EMG change in μV for M. zygomaticus major in response to happy, neutral, sad and angry faces in the cooperation, neutral and competition condition. Error bars indicate standard errors of the means. Separate follow-up ANOVAs for each emotion revealed a significant main effect of Context for happy avatar expressions, F(2, 66) = 7.4, p = .001, ηp2 = .184. Follow-up t-tests revealed a significant difference regarding M. zygomaticus major reactions to happy faces between the competition condition (M = -0.07) and the neutral (M = 0.30) as well as the cooperation condition (M = 0.51), t(44) = 2.5, p = .015, and t(44) = 3.9, p < .001, respectively, whereas the neutral and the cooperation condition did not differ, t(44) = 1.3, p = .196. Furthermore, a significant main effect of Context for sad expressions could be observed, F(2, 66) = 4.0, p = .023, ηp2 = .108. M. zygomaticus major activation to sad faces was stronger in the cooperation (M = 0.37) as compared to the neutral (M = -0.01) and the competition condition (M = 0.03), t(44) = 2.2, p = .036, and t(44) = 2.2, p = .039, respectively. No Context effects occurred with neutral or angry facial expressions, all ps > .31. In sum, congruent facial reactions to happy faces were only observed in the neutral and the cooperation, but not the competition condition. Furthermore, there was a significant M. zygomaticus major activation to sad expressions in the cooperation condition. M. corrugator supercilii (see Figure 18). A repeated measures ANOVA revealed main effects of Emotion, F(3, 64) = 11.0, p < .001, ηp2 = .143, and Context, F(2, 66) = 4.1, p = .021, ηp2 = .110, and a significant interaction of Emotion and Context, F(6, 128) = 3.0, p = .020, ηp2 = .084. Follow-up ANOVAs for each emotion separately revealed significant main effects of Context for reactions to sad and angry expressions, F(2, 66) = 7.8, p = .001, ηp2 = .192 and F(2, 66) = 3.5, p = .037, ηp2 = .095, respectively, but not for reactions to happy and neutral expressions, both ps > .13. Follow-up t-tests revealed significant differences between M. corrugator supercilii reactions to sad faces in the competition condition (M = -0.35) compared to the neutral (M = 0.39) and the cooperation condition (M = 0.47), t(44) = 4.2, p < .001, and t(44) = 3.2, p = .003, respectively. Also, M. corrugator supercilii activity in response to angry faces was lower in the competition condition (M = -0.41) compared to the neutral (M = 0.15) and the cooperation condition (M = 0.15), t(44) = 2.3, p = .028 and t(44) = 2.4, p = .019, respectively. The neutral and the cooperation condition did not differ significantly concerning reactions to sad and angry expressions, all ps > .74. Furthermore, the reactions to sad and angry faces in the competition condition were incongruent responses.
This was revealed by significant t‐tests against zero, t(22)= 2.6, p = .017, and t(22) = 2.7, p = .013, respectively, indicating a decrease in M. corrugator supercilii activity. In sum, activity of M. corrugator supercilii in response to sad and angry faces was lower in the competition condition compared to the neutral and the cooperation condition. 63 Figure 18. Mean EMG change in μV for M. corrugator supercilii in response to happy, neutral, sad and angry faces in the cooperation, neutral and competition condition. Error bars indicate standard errors of the means. M. Orbicularis oculi. Neither the main effect of Context nor the interaction Emotion x Context gained significance, F(2, 66) = 1.0, p = .360, ηp2 = .030, and F(6, 128) = 1.5, p = .185, ηp2 = .044, respectively. Emotional Empathy An ANOVA with the between factor Context did not reveal a main effect of Context, F < 1. Thus, emotional empathy was not influenced by competition or cooperation and can therefore be excluded as potential mediator of the facial reactions. Cognitive Empathy Analyses revealed a significant main effect Context on the “Reading the Mind in the Eyes Test”, F(2, 66) = 5.8, p = .005, ηp2 = .149. Participants in the neutral (M = 25.2) and the cooperation (M = 26.3) condition scored higher than those in the competition condition (M = 23.1), t(44) = 2.2, p = .033, and t(44) = 3.2, p = .003, respectively. No difference could be observed between the neutral and the cooperation condition, t(44) = 1.2, p = .237. 64 Mediational analyses. Mediational analyses were computed to test whether the competition effects on facial reactions occurred due to the decrease in cognitive empathy. Therefore, three regression analyses were conducted for the effects of the context condition on facial reactions to happy, sad and angry expressions separately with cognitive empathy as mediator. Regression analyses revealed that Context as well as cognitive empathy predicted M. zygomaticus major reactions in response to happy expressions, β = .358, t(44) = 2.5, p = .015 and β = .684, t(44) = 6.2, p < .001, respectively. When entered together, cognitive empathy still had a significant effect on zygomaticus reactions to happy expressions, β = .634, t(44) = 5.5., p < .001, whereas the effect of Context turned non‐significant, β = .158, t(44) = 1.4, p = .176 (see Figure 19). The corresponding Sobel test confirmed the conclusion of a full mediation, z = 2.0, p = .041. Figure 19. Mediational analysis for cognitive empathy mediating the Context effect on zygomaticus reactions to happy facial expressions. For M. corrugator supercilii reactions to sad and angry expressions no mediation by cognitive empathy was observed. When entered into the regression analysis together with Context, cognitive empathy remained no longer a significant predictor for reactions to both expressions, β = ‐.186, t(54) = 1.4, p = .168, and β = ‐.209, t(54) = 1.4, p = .167, respectively. In summary, cognitive empathy only mediated the effect of competition on M. zygomaticus major reactions to happy facial expressions. 65 Emotional reactions. From the assessed emotional reactions (see Table 1) only joy in response to happy, sad and angry expressions and sadness in response to sad expressions were affected by the context condition, all other ps > .16. Thereby, the effect of Context on joy in response to seeing happy expressions was only marginally significant, F(2,66) = 2.5, p = .088, ηp2 = .071. 
Specifically, in the cooperation condition (M = 6.6) participants reported feeling more joy when seeing happy expressions compared to the neutral (M = 5.0) and the competition condition (M = 5.1), t(44) = 2.0, p = .051 and t(44) = 1.8, p = .073, respectively. No difference could be observed between the neutral and the competition condition, t(44) = 0.2, p = .868.

Table 1. Mean scores (SEMs) in emotional reaction ratings (joy, sadness, anger) for emotional facial expressions (happy, neutral, sad, angry) separated for the cooperation, the neutral, and the competition condition.

Cooperation condition
  Angry expression:   Joy 2.43 (0.53)   Sadness 3.13 (0.49)   Anger 5.39 (0.71)
  Happy expression:   Joy 6.57 (0.59)   Sadness 2.04 (0.43)   Anger 2.57 (0.60)
  Neutral expression: Joy 4.09 (0.34)   Sadness 3.04 (0.40)   Anger 3.00 (0.49)
  Sad expression:     Joy 2.96 (0.58)   Sadness 5.13 (0.55)   Anger 3.26 (0.52)
Neutral condition
  3.52 (0.44)  2.70 (0.42)  4.17 (0.64)  4.96 (0.55)  3.78 (0.51)  4.39 (0.37)  2.39 (0.42)  3.70 (0.51)  2.70 (0.50)  5.30 (0.59)  3.65 (0.62)  5.09 (0.55)  3.61 (0.52)  4.22 (0.49)  2.48 (0.34)  4.26 (0.48)  2.87 (0.45)  2.74 (0.39)  2.30 (0.32)  2.87 (0.51)
Competition condition
  2.78 (0.37)  5.87 (0.52)  3.00 (0.44)  2.39 (0.39)

Note: Rating scales ranged from 1 = not at all to 9 = very much.

Furthermore, the main effects of Context for joy in response to sad and angry expressions as well as for sadness in response to sad expressions reached significance, F(2, 66) = 7.9, p = .001, ηp2 = .193, F(2, 66) = 7.7, p = .001, ηp2 = .189 and F(2, 66) = 4.8, p = .012, ηp2 = .126, respectively. Participants in the competition condition experienced more joy in response to sad avatar expressions (M = 5.9) than participants in the neutral (M = 3.0) and in the cooperation condition (M = 3.7), t(44) = 3.0, p = .005 and t(44) = 3.7, p = .001, respectively. The same pattern could be observed for angry avatar expressions. Participants reported more joy in the competition (M = 5.3) compared to the neutral (M = 3.5) and the cooperation condition (M = 2.4), t(44) = 2.4, p = .020 and t(44) = 3.6, p = .001, respectively. Joy ratings in response to sad and angry avatar expressions did not differ between the neutral and the cooperation condition, t(44) = 1.0, p = .344 and t(44) = 1.6, p = .121, respectively. Furthermore, participants in the competition condition (M = 3.0) reported less sadness in response to sad avatar expressions than those in the neutral (M = 4.3) and the cooperation condition (M = 5.1), t(44) = 2.0, p = .057 and t(44) = 3.0, p = .004, respectively. Again, the neutral and the cooperation condition did not differ, t(44) = 1.2, p = .239. Mediational analyses. Mediational analyses were used to test whether the competition effects on incongruent reactions to sad and angry expressions occurred due to effects on emotional reactions. Regression analyses revealed that Context (competition vs. neutral) as well as joy in response to sad expressions predicted M. corrugator supercilii reactions in response to sad expressions, β = .536, t(44) = 4.2, p < .001 and β = -.524, t(44) = 4.1, p < .001, respectively. When entered together, joy in response to sad expressions and Context both remained significant predictors of corrugator reactions to sad expressions, β = -.366, t(44) = 2.8, p = .007 and β = .387, t(44) = 3.0, p = .005 (see Figure 20). However, the corresponding Sobel test revealed a partial mediation, indicated by a reduction in the effect of Context, z = 2.0, p = .042. No additional mediation of the M.
corrugator supercilii reactions to sad expressions by the emotional reaction of sadness could be observed. When entered together with Context into a regression analysis, sadness in response to sad avatar expressions was no longer a significant predictor, β = -.199, t(44) = 1.5, p = .136. Finally, a full mediation could be observed for the competition effects on incongruent reactions to angry avatar expressions (see Figure 21). M. corrugator supercilii reactions to angry expressions were predicted both by Context (competition vs. neutral) and by joy in response to angry avatar expressions, β = .325, t(44) = 2.3, p = .028 and β = -.532, t(44) = 4.2, p < .001. When entered together, joy in response to angry expressions was still a significant predictor, β = -.477, t(44) = 3.5, p = .001, whereas Context turned non-significant, β = .161, t(44) = 1.2, p = .240. The Sobel test confirmed the influence of the mediator, z = 2.0, p = .045. In sum, emotional reactions mediated the effects of competition on incongruent M. corrugator supercilii reactions to sad and angry avatar expressions. Figure 20. Mediational analysis for joy ratings mediating the Context effect on corrugator reactions to sad facial expressions. Figure 21. Mediational analysis for joy ratings mediating the Context effect on corrugator reactions to angry facial expressions. Situation demands Exploratory analyses were computed with Context as predictor and all situation demand items as dependent variables. A significant main effect of Context could only be observed for the item asking how important one finds having a harmonious and smooth interaction in the respective situation, F(2, 66) = 5.3, p = .008, ηp2 = .141, all other Fs < 1. Specifically, t-tests revealed higher rated demands for harmony and smoothness in the cooperation (M = 4.2) as compared to the neutral (M = 3.5) and the competition condition (M = 3.4), t(44) = 2.4, p = .020 and t(44) = 3.2, p = .002, respectively. The neutral and the competition condition did not differ, t(44) = 0.6, p = .521. Mediational analyses. Further exploratory analyses were computed to examine whether situation demands can account for the Context effects on facial reactions. A mediational analysis was computed on M. zygomaticus major reactions to sad expressions with the harmony and smoothness item as potential mediator. Regression analyses revealed that Context (neutral vs. cooperation) as well as harmony demands predicted M. zygomaticus major reactions to sad expressions, β = .313, t(44) = 2.2, p = .034 and β = .527, t(44) = 4.1, p < .001, respectively. When entered together, harmony demands still had an effect on zygomaticus reactions to sad expressions, β = .476, t(44) = 3.5, p = .001, whereas Context was no longer a significant predictor, β = .146, t(44) = 1.1, p = .295 (see Figure 22). This full mediation was further confirmed by the corresponding significant Sobel test, z = 2.0, p = .048. Figure 22. Mediational analysis for harmony and smoothness demands mediating the Context effect on zygomaticus reactions to sad facial expressions. Thus, the demand to have a harmonious and smooth interaction served as a significant mediator of the effects of cooperation on incongruent M. zygomaticus major reactions to sad avatar expressions. That is, smiling in response to a sad collaborator seems to be a strategic means of keeping the interaction smooth and attaining the common goal. Time course analyses (see Table 2) M. zygomaticus major. The differences in congruent reactions on M. zygomaticus major, i.e.
the main effect of Context for happy expressions, as well as the mediation of these effects by cognitive empathy, could already be detected in the first second and persisted over the whole 6 seconds of stimulus exposure. The increase in M. zygomaticus major activity to sad expressions in the cooperation condition, i.e. the main effect of Context for sad expressions, as well as the mediation of these effects by situation demands could only be observed from the third second onwards.

Table 2. p-values for context effects and mediations of these effects for each second of stimulus exposure. Values are listed for the 1st through 6th second, in that order.

M. zygomaticus major – happy expressions
  Main effect context: .051+, .008*, .004*, .004*, .009*, .034*
  Mediation by cognitive empathy: .059+, .054+, .045*, .043*, .058+, .058+
M. zygomaticus major – sad expressions
  Main effect context: .251, .126, .049*, .025*, .060+, .048*
  Mediation by situation demands: .673, .185, .086+, .048*, .042*, .056+
M. corrugator supercilii – sad expressions
  Main effect context: .007*, .005*, .002*, .014*, .030*, .017*
  Mediation by cognitive empathy: .707, .174, .151, .367, .279
  Mediation by joy: .538, .095+, .038*, .038*, .044*, .066+
  Mediation by sadness: .750, .163, .148, .338, .545, .288
M. corrugator supercilii – angry expressions
  .401 .115 + .079 .268 .217 .919 .984 .981 .788 .418 .051+ .060+ .051+ .116 .160
  Main effect context * .008 .108
  Mediation by cognitive empathy .551
  Mediation by joy .092+

Note: * p < .05, + p < .10

M. corrugator supercilii. The context effects on M. corrugator supercilii reactions to sad expressions, i.e. the main effect of Context for sad expressions, are present for every single second of the 6 seconds of stimulus presentation. These incongruent reactions in the competition condition are mediated by joy ratings in all seconds except for the first second. A mediation by cognitive empathy or sadness could not be observed for any second. The incongruent reactions on M. corrugator supercilii to angry expressions, i.e. the main effect of Context for angry expressions, could be observed in the first and fourth second only, whereas there was a trend in the second and third second. Accordingly, the mediation of these reactions by joy could also only be observed for the first four seconds. There was no mediation of the effects by cognitive empathy in any of the seconds. 2.2.4. DISCUSSION The EMG results replicate findings by Lanzetta and Englis (1989) and Weyers et al. (2009) by showing less congruent reactions to happy and even incongruent reactions to sad expressions in the competition compared to a neutral condition. Specifically, congruent M. zygomaticus major reactions occurred in response to happy facial expressions in the cooperation and the neutral condition. No such reactions could be observed in the competition condition. Furthermore, congruent M. corrugator supercilii reactions to sad expressions could be observed in the cooperation and the neutral condition whereas incongruent reactions, i.e. a relaxation of the corrugator muscle which usually indicates a reaction to a positive stimulus (Larsen et al., 2003), occurred in the competition condition. A similar pattern of incongruent reactions was observed in response to angry expressions in the competition condition. Whereas in the neutral (and the cooperation) condition there were no M. corrugator supercilii reactions, which fits well with the results of Bourgeois and Hess (2008) and the non-affiliative signal value of anger, participants in the competition condition showed the same relaxation of M. corrugator supercilii in response to angry as in response to sad expressions.
Mediational analyses show that the lack of congruent facial reactions to happy expressions in the competition condition can be explained by a decrease of state cognitive empathy. The additional time course analyses showed that this mediation was already present in the first second and persisted for the whole 6 seconds of stimulus exposure. Thus, the reduction of congruent facial reactions seems to be mediated by a modulation of the strength of the perception‐action or perception‐behavior link. This is the first evidence in this series of experiments suggesting that motor processes mediate the effects of the context on congruent facial reactions to emotional faces. 71 However, such a mechanism could only be observed for effects on congruent reactions to happy faces. The modulations of reactions to sad and angry competitors were not mediated by empathy and accordingly motor mechanisms. Instead, affective processes, i.e. the emotional reaction of joy, mediated the competition effects on incongruent reactions, partially for reactions to sad and fully for reactions to angry expressions. A mediation by joy makes sense if we consider that participants were told that the anger expression was directed toward a third object, i.e. the dice result. In this case, a positive affective reaction is reasonable because a negative outcome for the other one means good prospects for oneself. These reflective affective reactions had very early onsets as can be seen from the time course analyses. Thus, it can be concluded that in incongruent facial reactions affective processes in terms of emotional reactions are involved. This holds true for the relaxation of the M. corrugator supercilii (the frowning muscle) in the competition condition in response to sad as well as to angry expressions. Such a relaxation is usually seen in response to positive stimuli (e.g. Larsen et al., 2003) and can be best interpreted as schadenfreude due to a disadvantage of or harm to the competitor. However, no support was found for the involvement of affective processes in the modulation of congruent facial reactions. Although some evidence was found for the phenomenon of emotional contagion at all (participants in the cooperation condition reported feeling more joy in response to happy facial expressions of the interaction partner and less sadness in response to sad expressions in the competition condition), the data do not provide evidence for an influence of emotional contagion on any of the facial reactions. Therefore, it can be concluded that emotional contagion is not a main determinant of congruent facial reactions. Remarkably, an unexpected incongruent reaction on M. zygomaticus major in response to sad expressions in the cooperation condition was observed. At first sight, this reaction seems counterintuitive. However, mediational analyses revealed that this kind of incongruent reactions does not reflect an affective reaction of joy. Further evidence that speaks against a reaction of real joy or felt happiness comes from M. Orbicularis oculi results. According to Ekman, Davidson and Friesen (1990), this muscle should be involved if a smile is genuine and felt. However, the results did not show any involvement of the orbicularis muscle in the reaction to sad expressions in the cooperation condition. Instead, mediational analyses revealed that the M. zygomaticus major response to sad expressions of a collaborator can be fully explained by strategic processes aiming at creating and maintaining a smooth and harmonious interaction. 
Furthermore, this increase in M. 72 zygomaticus major activity did not occur until the third second of stimulus exposure and therefore seems to be based on reflective, cognitive processes. Thus, the increase in activity of the M. zygomaticus major can be interpreted as a strategic and unfelt emotional expression in order to encourage the interaction partner and to prevent the cooperation from failing. Additionally, participants in the cooperation condition did also show M. corrugator supercilii activation in response to a sad expression. Usually, the two muscles zygomaticus and corrugator are either negatively correlated when observing a positive stimulus (e.g. Larsen et al., 2003), or they are not connected to each other at all. The present double activation of both muscles therefore leads to assume that they are not caused by the same process. Instead, it suggests that there are at least two underlying processes that might have contributed independently to the final facial expression. On the one hand those were strategic processes leading to an encouraging smile. And on the other hand a congruent frowning reaction was observed. Unfortunately, it cannot be computed whether the latter resulted from motor or affective processes because there were no differences in corrugator reaction between the cooperation and the control condition. An unexpected finding was the lack of cooperation effects on congruent facial reactions. Research proposing that mimicry enhances liking and smoothens interactions (Chartrand & Bargh, 1999; Chartrand, Cheng, & Jefferis, 2002; Lakin & Chartrand, 2003; Lakin et al., 2003) led to the assumption of an increase in facial mimicry of a vis‐á‐vis’ affiliative expressions in a cooperative setting. But except for the above analyzed M. zygomaticus major reaction in response to sad expressions no differences in reactions between the cooperation and the neutral condition could be found. One possible explanation for this lack of effects might be that the assumption of an enhanced activation of the perception‐behavior link (Chartrand & Bargh, 1999) – the mechanism behind mimicry – was wrong. According to the Perception‐Action Model of Preston and de Waal (2002) such an enhanced activation might have unwanted and negative side effects because it suggests that a strengthened perception‐action link would also mean more empathy. And enhanced empathy would not only mean understanding what the interaction partner feels but also experiencing the same feelings. But reactions like suffering with him or her may not be functional in such a setting because it could lead to a sad mood on both sides and thereby endanger a successful outcome of the cooperation. For a successful cooperation it might be more promising to keep the activation of the perception‐action link at a normal level at the expense of only average harmony and liking. This assumption is supported by the lack of effects on the empathy measures and would explain the lack of differences in mimicry between the neutral and the cooperation condition. Actually, this 73 interpretation of cooperation as an expedient alliance rather than an affiliative interaction is speculative. But if it holds true it would change our understanding of cooperative settings into a less harmonious and more success oriented fashion. In sum, it was shown how explicit competition and cooperation manipulations affect facial reactions to emotional facial expressions. 
Cognitive empathy was identified as a psychological mediator of the context effects on congruent but not incongruent facial reactions indicating that motor mechanisms underlie the modulation of congruent facial reactions. It was further demonstrated that incongruent reactions are mainly reflecting affective processes in terms of emotional reactions. Third, it was found that besides mimicry and emotional reactions a third kind of processes – namely strategic processes – is involved in facial reactions to emotional facial expressions. And there were hints in the data suggesting that different mechanisms can contribute independently to a facial muscular reaction pattern. 2.2.5. REVISION OF THE WORKING MODEL II As the first experiment in this line of research Experiment 3 supported the assumption that motor mechanisms are involved in the contextual modulation of congruent facial reactions. Specifically, cognitive empathy as a psychological indicator of the strength of the perception‐ behavior link mediated the impact of a competition context on M. zygomaticus reactions to happy facial expressions. Thus, the path from the perception‐behavior link to congruent facial reactions is accentuated in the second revised version of the working model (see Figure 23). Experiment 3 also gave the first actual evidence for the involvement of affective mechanisms, namely emotional reactions, in the modulation of incongruent facial reactions. Whereas in Experiment 1 it could only be speculated via logical exclusion that emotional reactions should be the process underlying incongruent reactions, there is now clear evidence. Emotional reactions of joy mediated competition effects on incongruent M. corrugator supercilii reactions to sad as well as angry expressions. Accordingly, the involvement of emotional reactions in the modulation of incongruent reactions is confirmed. In contrast to the findings of Experiment 2, the involvement of affective mechanisms in the modulation of congruent reactions could not be verified. In Experiment 3, none of the mediations with congruent emotional reactions turned significant. Therefore, the path from emotional reactions to congruent facial reactions was qualified in the second revised version of the working model. 74 Additionally, the involvement of a third class of processes was observed. Specifically, in Experiment 3 an unexpected incongruent reaction on M. zygomaticus major in response to sad expressions in the cooperation condition was found. Mediational analyses revealed that this reaction can be fully explained by strategic processes aiming at creating and maintaining a smooth and harmonious interaction. Thus, the increase in activity of the M. zygomaticus major can be interpreted as a strategic and unfelt emotional expression in order to encourage the interaction partner and to prevent the cooperation from failing. Figure 23. The working model on mechanisms underlying facial reactions to emotional facial expressions after a second revision following Experiment 3 (see text for explanations). This third class of mechanisms that can be involved in the modulation of facial reactions shall be termed strategic processes. Our face often reflects intentions to give a certain impression or to appear socially desired. For example, people suppress anger in public or show a Non‐Duchenne smile (e.g. Ekman et al., 1990) when the boss tells a bad joke. 
Other examples of strategically motivated facial behavior are smile expressions in job interview settings that are often produced by applicants for impression management purposes (DePaulo, 1992; Deutsch, 1990). Such masking or faking of emotional expressions often occurs automatically and without conscious awareness (e.g. smiles in an interview; Schmidt, Cohn, & Tian, 2003). However, to my knowledge Experiment 3 is the first study 75 that observed the influence of such strategic processes on facial reactions to emotional facial expressions. Since there was only one case of such strategic incongruent reactions observed in one out of the five studies and no evidence exists whether such mechanisms are also involved in the modulation of congruent reactions, strategic impression formation processes are only tentatively included in the revised model. Though, further studies are required to shed light on the nature of these processes. One idea for such a study would be to suggest a concrete interaction strategy to participants. Promising strategies for gaining knowledge about the involvement of strategic processes in both congruent and incongruent reactions might be for example to give a good impression or to hide thoughts and feelings. 2.3. EXPERIMENTS 4 & 5 – NEURONAL CORRELATES Experiments 4 and 5 are designed to assess the neuronal correlates of the motor and affective mechanisms and to determine how they are related to the modulation of congruent and incongruent facial reactions. The perception‐behavior link (Chartrand & Bargh, 1999) and the perception‐action model (Preston & de Waal, 2002) offer a structured approach to identifying also neural correlates of motor mechanisms. Both models assume that perceiving a certain behavior automatically activates its respective representation. Due to an overlap this activation spreads to autonomic and somatic reactions that are tied to that behavior. Thereby, an execution of the observed behavior is facilitated. Accordingly, by looking at specific motor mechanisms by which contexts can modulate facial reactions one can pay attention to two different parts of the model: On the one hand, already the perception of a facial expression can be modified and on the other hand the social modulation can occur due to a modified strength of the spread of activation. According to Chartrand and Bargh the latter would mean a modulation of the “link” per se. Experiment 4 aims at testing the hypothesis that changes in the early face perception are responsible for the modulation of congruent facial reactions. The strength of perception is thereby assessed via event‐related potentials in the EEG. Experiment 5 will examine the neuronal correlates of the “link”, i.e. the mirror neuron system, as well as neuronal correlates of affective processes via functional magnetic resonance imaging. 76 2.3.1. EXPERIMENT 4 2.3.1.1. THEORETICAL BACKGROUND AND HYPOTHESES Experiment 4 aims at investigating whether a change in the strength of perception can explain the contextual modulation of facial reactions to facial expressions. According to the perception‐ behavior link (Chartrand & Bargh, 1999) and the perception‐action model (Preston & de Waal, 2002) the strength of perception is directly related to the strength of the spread of activation from perception to the execution of an action and thereby to the strength of the resulting mimicry behavior. In the perception‐action model the authors talk about attended perception. 
This suggests that variations of the strength of perception mean variations in the strength of attention to a stimulus. Accordingly, Preston and de Waal (2002) posit that "attended perception of the object's state automatically activates the subject's representations of the state, situation, and object, and that activation of these representations automatically primes or generates the associated autonomic and somatic responses, unless inhibited" (p. 4). They further state: "The more interrelated the subject and object, the more the subject will attend to the event, the more their similar representations will be activated, and the more likely a response." (Preston & de Waal, 2002, p. 5). However, this assumption remains untested in the perception-action model. Nevertheless, it leads to the assumption that the strength of attention that is directed to an emotional expression is related to the strength of the activated representations and thus to the amount of mimicry behavior. Strength of attention is usually assessed via early event-related potentials (ERPs, e.g. Luck & Hillyard, 1994; Wijers, Mulder, Okita, Mulder, & Scheffers, 1989). ERPs are highly sensitive to even very small changes in neural activity and have a high temporal resolution. The latter allows conclusions about the involved processes from the point in time at which changes are observed. Therefore, event-related potentials were chosen as the measure of the strength of attention in the current experiment. There are several empirical studies that support the assumption that social contexts can modulate the strength of attention as indicated via ERPs (N100, N170, P200). The most prominent moderator of the perception of faces in the literature is race. Ito and Urland (2003) presented participants with neutral facial expressions of persons belonging to a racial in- or outgroup. They found a significant modulation of the N100 and the P200, i.e. both components associated with basic attention processes, by race. The N100 is a preattentive negative event-related potential involved in the perception of new and unexpected stimuli (Wang, Mouraux, Liang, & Iannetti, 2008). The P200 is a positive event-related potential associated with attentional processing of perceptual cues like faces. In contrast to other early potentials it is also sensitive to the emotional and motivational significance of a face (Amodio, 2010). Ito, Thompson and Cacioppo (2004) further found a modulation of the N170 by target race. The N170 component is a negative event-related potential which peaks around 170 ms over posterior temporal scalp electrodes and is evoked by faces but not by other animate and inanimate non-face stimuli (Bentin, Allison, Puce, Perez, & McCarthy, 1996). It is further discussed to reflect the structural encoding of a face and therefore to indicate detection rather than identification of human faces (Eimer, 2000). The structural correlate of the N170 is most probably the fusiform gyrus or fusiform region (Eimer, 2000). Accordingly, Golby, Gabrieli, Chiao, and Eberhardt (2001) found that faces belonging to an observer's own race evoke a greater neural response in the fusiform region. Finding an influence of a face's race on the N170 component therefore indicates that early structural face processing is already affected by the social context. But results concerning this issue are mixed. While Herrmann et al.
(2007) as well as Stahl, Wiese and Schweinberger (2008) and Walker, Silvert, Hewstone and Nobre (2008) found larger N170s to other-race compared to own-race faces, Ito and Urland (2005) found the opposite pattern. And other studies did not find an influence of race at all (e.g. Caldara, Rossion, Bovet, & Hauert, 2004). However, these findings are difficult to interpret because in the case of race familiarity and attitudes are confounded. People are most likely more familiar with the perception of own-race faces than other-race faces and therefore have a higher expertise for these faces. At the same time people are also more likely to have positive attitudes towards own-race faces. Therefore it is difficult to conclude whether effects of race on the N170 are really caused by different attitudes or simply by different levels of expertise for these faces. For example, Caharel et al. (2002) reported more negative N170 components for familiar compared to unfamiliar faces. This could be an alternative explanation of the above-mentioned finding by Ito and Urland (2005) of a larger N170 towards own-race faces. However, studies which purely investigate the effect of attitudes towards a face are so far rare. Pizzagalli, Lehmann, Hendrick, Regard, Pascual-Marqui, and Davidson (2002) for example investigated at which stage liking influenced face processing. They found that the attitude towards the face had a significant effect on the N170 amplitude, as reflected in a more negative component in response to liked compared to disliked and neutral faces. However, Pizzagalli et al. (2002) only assessed participants' liking of a face without experimentally manipulating this attitude, and thus many confounding variables, such as attractiveness, may have instead caused the effects. Furthermore, there is also evidence suggesting a relation between the amplitude of attention-indicating ERPs and the amount of congruent facial reactions. Achaibou, Pourtois, Schwartz and Vuilleumier (2008) presented participants with emotional facial expressions while EEG and EMG were recorded simultaneously. They found a significant correlation between congruent facial reactions to happy and sad expressions on M. zygomaticus major and M. corrugator supercilii and the event-related potentials P100 and N170. The authors interpret these effects as based on attention: an increase in the strength of attention leads to a deeper processing of the facial expressions and thereby to an increase in what they call facial mimicry. The present experiment therefore aimed at measuring the event-related P100, N170 and P200 potentials as well as facial reactions towards happy, neutral and sad faces for which the attitude the participants held was experimentally manipulated according to Experiment 1. Attitudes were chosen as the relevant contextual moderator for two reasons. First, no study has so far examined the influence of experimentally manipulated attitudes on ERPs. Second, by applying the same manipulation as in Experiment 1 it should be possible to answer the open question of whether motor or affective processes were involved in the modulation of congruent reactions in Experiment 1. The current working model predicts that motor mechanisms are involved in the modulation of congruent facial reactions. Therefore it is expected that the strength of attention can explain this modulation. No relation should be observed with incongruent reactions.
Furthermore, it is expected that the strength of attention as indicated by the N170 should be moderated by attitudes. The direction of this effect is left open because evidence on race, attitudes and the N170 is mixed (see above).

2.3.1.2. METHODS

Participants
Twenty-five right-handed female participants were investigated. Only female subjects were tested because women show more pronounced, but not qualitatively different mimicry effects than male subjects (Dimberg & Lundqvist, 1990). All participants signed an informed consent form prior to participation and received a 12€ allowance. Two participants had to be excluded from the analysis due to insufficient quality of the recorded EEG. Therefore, analyses were performed for 23 participants, aged between 19 and 38 years (M = 22.35, SD = 3.04).

Stimuli and apparatus
Emotional facial stimuli. The facial stimuli were the same as in Experiment 1, displaying the emotions happiness and sadness as well as a neutral facial expression. The facial stimuli have successfully been used in a previous EEG experiment (Mühlberger et al., 2009).
Attitude manipulation. The attitude manipulation procedure was the same as in Experiment 1 with the following exception: the neutral characters were skipped because they did not satisfactorily stand out from the other two.
Facial EMG. Activity of the M. zygomaticus major (the muscle involved in smiling) and the M. corrugator supercilii (the muscle responsible for frowning) was recorded and processed according to Experiment 1. Before statistical analysis, EMG data were collapsed over the 24 trials with the same emotional expression of a specific character, and reactions were averaged over the 4 seconds of stimulus exposure.
EEG measurement and data analyses. The electroencephalogram (EEG) was recorded according to the 10-20 system from 28 Ag-AgCl electrodes (Fp1, Fpz, Fp2, F7, F3, Fz, F4, F8, T7, C3, Cz, C4, T8, P9, P7, P3, Pz, P4, P8, P10, PO7, PO9, O9, O1, O2, O10, PO10, PO8) at a sampling rate of 1,000 Hz. Electrodes were attached to an elastic cap (BrainCAP, Munich, Germany). The left mastoid (M1) was used as reference during data acquisition. Both vertical and horizontal EOGs were recorded. Electrode impedance was kept below 5 kΩ. Electroencephalograms were amplified and recorded using a Brain-amp-MR amplifier (Brain Products, Munich, Germany) and the software Brain Vision Recorder Version 1.03 (Brain Products, Munich, Germany). The amplifier bandpass was set to 0.1–250 Hz. EEG data were analysed offline using the software Brain Vision Analyzer Version 1.05. Offline, the sampling rate was reduced to 200 Hz and data were mathematically re-referenced to an average reference of all 28 EEG electrodes. Data were low-pass filtered at 35 Hz and corrected for horizontal and vertical ocular artefacts using the method introduced by Gratton, Coles, and Donchin (1983). The continuous data were segmented into trials lasting from 200 ms before stimulus onset to 800 ms after stimulus onset and were baseline-corrected with reference to the mean of the baseline interval (100 ms before stimulus onset). Trials exceeding a gradient criterion of 50 µV were rejected as artefacts. Furthermore, trials that exceeded an amplitude criterion of ±100 µV were excluded from further analysis. Trials were averaged for each experimental condition and participant. The P100 component was defined as the peak positive amplitude between 80 and 120 ms on occipital sites, i.e. O1 and O2.
The N170 component was defined as the peak negative amplitude between 140 and 170 ms on lateral parietal sites, i.e. P7 and P8. Finally, the P200 component was defined as the peak positive amplitude between 150 and 250 ms at Pz.
Manipulation check. At the end of the experiment participants were asked to rate the facial stimuli according to valence, emotional arousal and likeability on a nine-point visual analogue scale. Following Experiment 1, negative characters were expected to score worse on the liking scale than the positive characters. This served as a manipulation check of the explicit attitude towards the characters.

Procedure
After arriving at the laboratory, participants were informed about the procedure of the experiment and were asked to give informed consent. They were told that the experiment was designed to study the avatars' suitability for a future computer game. EMG, EEG and EOG electrodes were then attached in a shielded cabin and the experimental procedure was started. Participants were presented with the blond and black haired avatars and their respective traits. After gaining an impression of the attitudes belonging to the different avatars, participants conducted the short memory task where they were asked to indicate whether a presented avatar was characterised by positive or negative attitudes. Following this, the EEG recording session started. Each stimulus was repeated 12 times, i.e. a total of 144 facial stimuli were presented in a randomized order. Faces were displayed for 4000 ms after a fixation cross had been presented for 2000 ms to ensure that participants were focusing on the centre of the screen. The intertrial interval varied randomly between 4000 and 6000 ms. Participants were instructed to simply view the pictures without any further task. After the EEG recording participants repeated the memory task and were asked to rate the facial stimuli according to valence, emotional arousal and likeability. Finally, participants completed a questionnaire regarding demographic data, were paid and thanked.

2.3.1.3. RESULTS

Manipulation Check
Separate repeated-measures ANOVAs with the within-subject factors attitude (positive vs. negative) and emotion (happy vs. neutral vs. sad) were computed for valence, arousal and liking ratings. Valence ratings revealed a main effect of attitude, F(1,22) = 4.95, p = .04, ηp² = .18. Avatars who were characterized by negative traits were rated as less positive (M = 4.30) than avatars with positive traits (M = 4.66). No main effect of attitude was found for arousal or liking ratings, all ps > .43. Furthermore, there were no significant interactions of attitude and emotion, all ps > .19, for any of the three ratings.
Recall Task
Analyses of the first and second recall task following the attitude manipulation revealed that no participant made more than one error in the six classification trials. Thus, all data were used for further analyses.
EEG measures
A repeated-measures analysis of variance with the within-subject factors attitude (positive vs. negative) and emotion (happy vs. neutral vs. sad) was conducted for the P100, N170 and P200. This revealed a significant main effect of attitude on the N170, F(1,22) = 4.85, p = .04, ηp² = .18. No other attitude main effects or interactions with attitude could be observed. A larger N170 was found in response to pictures of negative (M = -4.07 µV) compared to positive characters (M = -2.96 µV), as can be seen in Figure 24. No other effect gained significance, all ps > .31.
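For illustration, the following sketch shows how the N170 scoring described above (peak negative amplitude between 140 and 170 ms, averaged over P7 and P8) and the attitude x emotion repeated-measures ANOVA could be reproduced. The thesis data were processed with Brain Vision Analyzer; this is only a minimal, hypothetical re-implementation that assumes already preprocessed, artifact-free epochs available in MNE-Python, made-up file and condition names, and pingouin for the ANOVA.

```python
# Hypothetical N170 scoring and 2 (attitude) x 3 (emotion) repeated-measures ANOVA.
# Assumes preprocessed, artifact-free epochs with hierarchical event labels such
# as "positive/happy"; channels, time window and factors follow the text above.
import mne
import pandas as pd
import pingouin as pg

epochs = mne.read_epochs("exp4_subject01-epo.fif")  # hypothetical file name

ATTITUDES = ["positive", "negative"]
EMOTIONS = ["happy", "neutral", "sad"]

rows = []
for attitude in ATTITUDES:
    for emotion in EMOTIONS:
        evoked = epochs[f"{attitude}/{emotion}"].average()
        # N170: peak negative amplitude 140-170 ms, mean of P7 and P8, in µV
        wave = evoked.copy().pick(["P7", "P8"]).crop(0.14, 0.17).data.mean(axis=0)
        rows.append({"subject": 1, "attitude": attitude,
                     "emotion": emotion, "n170": wave.min() * 1e6})

# In practice the rows of all participants would be concatenated here
df = pd.DataFrame(rows)

# 2 x 3 repeated-measures ANOVA on the N170 amplitude
anova = pg.rm_anova(data=df, dv="n170", within=["attitude", "emotion"],
                    subject="subject", detailed=True)
print(anova)
```

Scoring the P100 (80-120 ms, O1/O2, positive peak) and the P200 (150-250 ms, Pz) analogously would only change the channel selection, the time window and the sign of the extracted peak.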
Figure 24. ERPs averaged over electrodes P7 and P8, separated for positive and negative characters. Arrows indicate the N170 component.
EMG measures
A repeated-measures analysis of variance with the within-subject factors muscle (M. zygomaticus major vs. M. corrugator supercilii), attitude (positive vs. negative) and emotion (happy vs. neutral vs. sad) was conducted. This revealed a significant main effect of muscle, F(1,22) = 7.46, p = .01, ηp² = .25, a significant main effect of attitude, F(1,22) = 6.26, p = .02, ηp² = .22, a significant Muscle x Emotion effect, F(2,21) = 3.51, p = .04, ηp² = .14, a significant Attitude x Emotion effect, F(2,21) = 6.52, p = .01, ηp² = .23, and a significant three-way interaction Muscle x Attitude x Emotion, F(2,21) = 4.30, p = .03, ηp² = .16. No other effect gained significance, all ps > .15. To further specify this interaction, separate follow-up ANOVAs for the M. zygomaticus major and the M. corrugator supercilii were calculated.
M. zygomaticus major. As predicted, activity in M. zygomaticus major to happy faces was larger for positive compared to negative characters (see Figure 25). This was verified by a significant Attitude x Emotion interaction, F(2, 21) = 3.53, p = .04, ηp² = .14. Follow-up t-tests revealed a significant difference between M. zygomaticus major reactions to happy faces of positive characters (M = .21) as compared to negative characters (M = -.02), t(22) = 2.95, p = .01. No effects were observed for neutral or sad facial expressions, all ps > .35. Additionally, one-sample t-tests against zero revealed only for positive characters a significant increase in M. zygomaticus major activity in response to happy facial expressions, t(22) = 5.34, p < .01. For negative characters the activation was not different from zero, t(22) = .40, p = .69. Thus, happy faces of positive, but not of negative characters evoked a smiling response.
Figure 25. Mean EMG change in μV for M. zygomaticus major in response to happy, neutral and sad faces for positive and negative characters. Error bars indicate standard errors of the means.
M. corrugator supercilii. As predicted, activity in M. corrugator supercilii to sad faces was larger for positive compared to negative characters (see Figure 26). This was verified by a significant Attitude x Emotion interaction, F(2, 21) = 5.05, p = .02, ηp² = .19. Follow-up t-tests revealed a significant difference between M. corrugator supercilii reactions to sad faces of positive characters (M = .25) as compared to negative characters (M = -.18), t(22) = 2.49, p = .02. No effects were observed for happy or neutral facial expressions, all ps > .21. Additionally, one-sample t-tests against zero revealed only for positive characters a significant increase in M. corrugator supercilii activity in response to sad facial expressions, t(22) = 3.05, p < .01. For negative characters the activation was not different from zero, t(22) = 1.56, p = .13. Thus, sad faces of positive, but not of negative characters evoked a frowning response.
Figure 26. Mean EMG change in μV for M. corrugator supercilii in response to happy, neutral and sad faces for positive and negative characters. Error bars indicate standard errors of the means.
Correlational analyses
Correlational analyses were computed to test whether the modulation of facial reactions by attitudes can be explained by changes in early face perception processes. Therefore, the difference between positive and negative characters (i.e.
the attitude effect) on the N170 amplitude to happy and sad faces was correlated with the differences between positive and negative characters on facial reactions, namely zygomaticus reactions to happy and corrugator reactions to sad faces. A significant correlation between the attitude effect on zygomaticus reactions to happy expressions and the attitude effect on the N170 amplitude to happy faces was observed, r = .66, p < .01 (see Figure 27). The correlation between the attitude effect on corrugator reactions to sad faces and the attitude effect on the N170 amplitude to sad faces was not significant, r = -.03, p = .88 (see Figure 28).
Figure 27. Correlation of the difference between positive and negative characters (i.e. the attitude effect) on the N170 amplitude to happy facial expressions with the difference between positive and negative characters on M. zygomaticus major reactions to happy facial expressions.

2.3.1.4. DISCUSSION

The EMG results of Experiment 1 could be replicated. Happy and sad faces of positive characters evoked congruent facial muscular reactions. Happy faces of negative characters evoked less congruent reactions, and sad faces of negative characters even evoked incongruent reactions, indicated by a relaxation of the M. corrugator supercilii. More importantly, the present experiment revealed the first experimental evidence of a modulation of the N170 by attitudes. Thereby it allows inferring a causal influence of attitudes on the structural encoding of faces. It is furthermore in line with previous findings showing that contexts can already modulate early stages of face perception (e.g. Ito & Urland, 2003). In this experiment a larger negativity of the N170 component was found towards faces for which a negative attitude existed. Given that negative attitudes are usually also held in response to people of ethnic outgroups (see e.g. Stewart, von Hippel, & Radvansky, 2009), this finding is perfectly in line with results by Herrmann et al. (2007) and Stahl et al. (2008), who report larger N170s towards other-race faces. With the present findings, which are based on an experimental manipulation of attitudes, one could now conclude that the effects observed by Herrmann and Stahl were in fact caused by different attitudes instead of differences in familiarity with the faces. The enhanced N170 towards faces for which negative attitudes exist can be interpreted as caused by differences in information processing. Following Stahl et al. (2008), the stronger N170 may reflect a more featural or a disturbed holistic processing of negative compared to positive characters. This argumentation is supported by studies showing that own-race faces, which elicit a smaller N170, are processed rather holistically whereas other-race faces, which elicit a larger N170, are processed rather featurally (Tanaka, Kiefer, & Bukach, 2004), as well as by studies demonstrating larger N170s when holistic, configural processing is disturbed (Rossion et al., 2000). To my knowledge the only other study investigating the isolated influence of attitudes on face processing is Pizzagalli et al. (2002), who also reported an effect of liking of a face on the N170. In their case a stronger N170 was found towards liked faces. Since their study, however, did not experimentally manipulate participants' attitudes and the liking of a face was also confounded with a higher attractiveness of the face, their results might not have been caused by attitudes alone.
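As a concrete illustration of the correlational analysis reported above for Experiment 4, the following minimal sketch computes the attitude effect (positive minus negative characters) separately for the N170 amplitude and for the M. zygomaticus major reaction to happy faces and correlates the two difference scores across participants. The data frame and its column names are hypothetical.

```python
# Minimal sketch of the difference-score correlation from Experiment 4:
# attitude effect on the N170 to happy faces vs. attitude effect on
# M. zygomaticus major reactions to happy faces. Columns are hypothetical.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("exp4_subject_means.csv")  # one row per participant

n170_effect = df["n170_happy_positive"] - df["n170_happy_negative"]
emg_effect = df["zyg_happy_positive"] - df["zyg_happy_negative"]

r, p = pearsonr(n170_effect, emg_effect)
print(f"attitude-effect correlation: r = {r:.2f}, p = {p:.3f}")
```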
The data do not only show that perceptual information is attended differently depending on affective attitudes towards the particular faces. They further reveal that the difference in the N170, i.e. indicating the early face encoding, between positive and negative characters is correlated with the difference in congruent reactions on M. zygomaticus major between happy expressions of positive and negative characters. Given the assumption that changes in early face encoding mean changes in the strength of perception, this suggests that motor mechanisms were involved in the modulation of congruent facial reactions by attitudes. Admittedly, this conclusion is somewhat overstated because the correlative evidence of a relation between the N170 and congruent facial reactions does not allow causal inference. However, given the temporal evidence, i.e. the N170 occurs 140-170 ms after stimulus onset whereas facial reactions cannot be observed until 300-400 ms after stimulus onset, a causal interpretation might be in part applicable. Given that evidence, the mimicry hypothesis of Experiment 1 can be seen as confirmed for the modulation of congruent facial reactions. It might seem counter-intuitive that stronger amplitudes of the N170 in response to negative characters are connected to less congruent facial reactions. This discrepancy could be solved if we assume that a heightened N170 does not reflect a stronger perception in a quantitative sense but rather a different kind of perception in a qualitative sense. According to Stahl et al. (2008) a stronger N170 may reflect a more featural or a disturbed holistic processing. A less holistic processing might in turn mean an inhibition of automatic processes due to the need to consciously examine the stimulus that stands out. Thus, such an inhibition of automatic processes might also affect automatic spreads of activation like the one from perception to action. Such an involvement of motor mechanisms could, however, only be observed for the modulation of congruent facial reactions to happy expressions. No relation between the modulation of the N170 and the incongruent reactions on M. corrugator supercilii to sad expressions was found. This is perfectly in line with the working model's assumption that motor mechanisms cannot lead to incongruent facial reactions. In sum, the current working model in its second revision can be seen as affirmed and needs no further revision. Specifically, evidence has been found that supports the involvement of motor mechanisms in the modulation of congruent but not incongruent facial reactions. Since no affective mechanisms have been assessed, no conclusions concerning their influence can be drawn.

2.3.2. EXPERIMENT 5

2.3.2.1. THEORETICAL BACKGROUND AND HYPOTHESES

In Experiment 5 the investigation of neuronal correlates shall be extended to the observation of involved brain areas. Thereby, a focus will be put on areas underlying motor as well as affective mechanisms and their involvement in the modulation of congruent and incongruent facial reactions. In the present experiment an attitude manipulation according to Experiment 1 will be applied as social context. Participants will see happy, neutral and sad facial expressions of positive and negative characters while BOLD responses and facial reactions are measured simultaneously in an MRI scanner. Current literature (see 1.2.3.) considers the mirror neuron system (MNS) as the brain area(s) depicting the motor mechanism underlying (facial) mimicry.
The MNS consists of the ventral premotor cortex, the inferior parietal cortex (both concerned with the motoric aspects of the action) and the inferior frontal gyrus (concerned with the goal of an action) (Iacoboni & Dapretto, 2006). It can be seen as the instance responsible for what is described as the activation spread from perceptual to executive representations (= the perception-behavior "link") in the perception-behavior link model by Chartrand and Bargh (1999). Evidence suggesting a relationship between activity of the mirror neuron system and mimicry as well as congruent facial reactions is reported in chapter 1.2.3. and will not be repeated at this point. Furthermore, there is also evidence suggesting that the strength of activity of the MNS depends on the social context. Previous studies have reported modulations of mirror system activity by (social) context (Fogassi et al., 2005; Gutsell & Inzlicht, 2010; Iacoboni et al., 2005; Shimada & Abe, 2010; Umilta et al., 2001; van Schie, Mars, Coles, & Bekkering, 2004). For example, Shimada and Abe (2010) could show that the activity of the MNS differs between situations in which a competitor is winning a game and situations in which a competitor is losing a game. Furthermore, Gutsell and Inzlicht (2010) observed a lower suppression of EEG oscillations in the mu frequency at scalp locations over the primary motor cortex when participants watched outgroup compared to ingroup members. Mu suppression has recently been used to index activity of the mirror neuron system, i.e. when mu power is suppressed, motor neurons are active and a simulation of the perceived action can be expected (Marshall, Bouquet, Shipley, & Young, 2009; Perry & Bentin, 2009). These findings indicate that the mirror neuron system does not always similarly resonate to the observed action, but that the degree of its involvement is affected by the context. However, no study has examined the influence of attitudes on the activity of the MNS in response to emotional facial expressions. To sum up, evidence exists showing both the possibility of a contextual modulation of the MNS and the participation of the MNS in mimicry. Thus, changes in the strength of activity in the MNS may be responsible for the modulation of congruent facial reactions. This is also the hypothesis derived from the revised working model. According to a review by Iacoboni and Dapretto (2006) the human mirror neuron system comprises the following regions: the ventral premotor cortex, the inferior frontal gyrus, the inferior parietal cortex and the superior temporal sulcus. Hence, these anatomical areas are defined as regions of interest for the current experiment. Neuronal correlates of affective mechanisms are expected in prominent areas of emotion processing. From the revised working model it is expected that affective mechanisms underlie the modulation of incongruent facial reactions. Specifically, the amygdala is assumed to reflect processing of mainly negative but also positive stimuli (Berntson, Bechara, Damasio, Tranel, & Cacioppo, 2007; for a review see Pessoa, 2010). The insula is expected to play a role in both positive and negative affective processes (e.g. Berntson et al., 2010). The striatum, on the other hand, is often also called the reward system and is therefore the prominent structure connected to the processing of positive stimuli (e.g. Mobbs et al., 2003; Schultz et al., 2002).
To sum up, the regions of interest for the current experiment depicting areas involved in the processing of positive and negative stimuli are: amygdala, insula, striatum (nucleus accumbens, caudate, putamen). Finally, following the results of Experiment 4 it is expected that the fusiform gyrus, which is supposed to be the neuronal structure behind the N170 (Eimer, 2000), is involved in the modulation of congruent reactions. Therefore the fusiform gyrus is also defined as a region of interest for the current fMRI study.

2.3.2.2. METHODS

Participants
Thirty-three right-handed female participants were investigated. As in all the other experiments, only female subjects were tested because women show more pronounced, but not qualitatively different mimicry effects than male subjects (Dimberg & Lundqvist, 1990). All participants signed an informed consent form prior to participation and received a 12€ allowance. Eleven participants had to be excluded from the analysis due to incomplete recordings or insufficient quality of the MRI data. Therefore, analyses were performed for 22 participants, aged between 18 and 28 years (M = 22.41, SD = 2.42).

Stimuli and apparatus
Emotional facial stimuli. The facial stimuli were the same as in Experiment 1, displaying the emotions happiness and sadness as well as a neutral facial expression. The stimuli were presented on a light gray background via MRI-compatible goggles (VisuaStim; Magnetic Resonance Technologies, Northridge, CA).
Attitude manipulation. The attitude manipulation procedure was the same as in Experiment 1. To keep this information present during the fMRI task, participants were furthermore told that they would have to repeat this task at the end of the experiment.
Manipulation check. At the end of the experiment participants were asked to rate the facial stimuli according to valence, emotional arousal and likeability on a nine-point visual analogue scale. Following Experiments 1 and 3, negative characters were expected to score worse on the valence and/or the liking scale than the positive characters. This served as a manipulation check of the explicit attitude towards the characters.
Facial EMG. Activity of the M. zygomaticus major (the muscle involved in smiling) and the M. corrugator supercilii (the muscle responsible for frowning) was recorded and processed according to Experiment 1 with the following exceptions. The EMG raw signal was measured with an MRI-compatible BrainAmp ExG MR amplifier (Brain Products Inc.) using MRI-compatible electrodes (MES Medizinelektronik GmbH, Munich, Germany). Additionally, a specific correction algorithm for eliminating MR artifacts was applied during processing with the software Brain Vision Analyzer 2.0 (Brain Products Inc.). Before statistical analysis, EMG data were collapsed over the 12 trials with the same emotional expression of a specific character, and reactions were averaged over the 4 seconds of stimulus exposure.

Image acquisition
Image acquisition followed the standard procedure in our lab (see Mühlberger et al., 2010): Functional and structural MRI was performed with a Siemens 1.5 T whole-body scanner (SIEMENS Avanto) using a standard head coil and a custom-built head holder. Functional images were obtained using a T2*-weighted single-shot gradient echoplanar imaging (EPI) sequence (TR: 2500 ms, TE: 30 ms, 90° flip angle, FOV: 200 mm, matrix: 64x64, voxel size: 3.1x3.1x3 mm³). Each EPI volume contained 25 axial slices (thickness 5 mm, 1 mm gap), acquired in interleaved order, covering the whole brain.
The orientation of the axial slices was parallel to the AC–PC line. Each session contained 475 functional images. The first eight volumes of each session were discarded to allow for T1 equilibration. In addition, a high-resolution T1-weighted magnetization-prepared rapid gradient-echo imaging (MP-RAGE) 3D MRI sequence was obtained from each subject (TR: 2250 ms, TE: 3.93 ms, 9° flip angle, FOV: 256 mm, matrix: 256x256, voxel size: 1x1x1 mm³).

Image preprocessing and analyses
Image processing and analyses followed the standard procedure in our lab (see Mühlberger et al., 2010): Data were analyzed by using Statistical Parametric Mapping software (SPM8; Wellcome Department of Imaging Neuroscience, London, UK) implemented in Matlab R2010a (Mathworks Inc., Sherborn, MA, USA). Functional images were slice-time corrected and realigned by an affine registration. The mean functional image was subsequently normalized to the Montreal Neurological Institute (MNI) single-subject template (Evans et al., 1992). Normalization parameters were then applied to the functional images and co-registered to the T1 image. Images were re-sampled at a 2x2x2 mm³ voxel size, spatially smoothed using an 8 mm full-width-at-half-maximum Gaussian kernel, and temporally filtered with a high-pass filter (cutoff 128 s). Each experimental condition was modeled using a boxcar reference vector convolved with a canonical hemodynamic response function. Parameter estimates were subsequently calculated for each voxel using weighted least squares to provide maximum likelihood estimates based on the non-sphericity assumption of the data in order to get identically and independently distributed error terms. Realignment parameters for each session were included to account for residual movement-related variance. Parameter estimation was corrected for temporal autocorrelations using a first-order autoregressive model. For each subject, main effects were computed by applying appropriate baseline contrasts (simple effects). Afterwards these first-level individual contrasts were fed into a second-level group analysis. First, simple contrasts of the attitude effects, i.e. positive character > negative character, were computed for each expression. Second, the relationship between the modulation of mimicry (i.e. the difference between facial reactions to positive and facial reactions to negative characters) and the modulation of brain activations (i.e. the difference between changes in brain activity to positive and changes in brain activity to negative characters) was analyzed by inserting muscle activations as covariates into a regression with brain activity. Furthermore, analogous analyses were conducted for the reverse attitude effect, i.e. negative character > positive character, for sad expressions. This was done because incongruent reactions to sad expressions were expected on M. corrugator supercilii, suggesting that for negative characters additional processes might be at work. For a priori expected activations, ROI analyses were carried out in the amygdala, the insula, the ventral and dorsal parts of the striatum (nucleus accumbens, caudate and putamen), the inferior frontal gyrus, the ventral premotor cortex, the inferior parietal lobe and the superior temporal sulcus (human mirror neuron system according to Iacoboni & Dapretto, 2006; Keysers & Gazzola, 2006) based on masks from the WFU Pick Atlas (Maldjian, Laurienti, Kraft, & Burdette, 2003) as implemented in SPM8.
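To make the covariate regression described above concrete, the following sketch shows one way such a second-level analysis could be set up outside of SPM8, which was the software actually used. It is a hypothetical nilearn equivalent: per-subject first-level contrast images ('positive_happy > negative_happy') are regressed on the attitude effect in M. zygomaticus major activity, and the resulting map is thresholded within an a priori ROI mask in a small-volume style (p < .01, extent 5 voxels). All file names are assumptions.

```python
# Hypothetical second-level regression: brain attitude effect regressed on the
# EMG attitude effect, evaluated within an a priori ROI mask (e.g. IFG).
import glob
import pandas as pd
from nilearn.glm import threshold_stats_img
from nilearn.glm.second_level import SecondLevelModel
from nilearn.image import load_img

# One first-level contrast image per subject (positive_happy > negative_happy)
contrast_imgs = sorted(glob.glob("sub-*_posHappy_gt_negHappy.nii.gz"))

# EMG attitude effect per subject, in the same subject order as the images
emg = pd.read_csv("emg_attitude_effect.csv")

design_matrix = pd.DataFrame({
    "emg_diff": emg["zyg_happy_positive_minus_negative"].values,
    "intercept": 1.0,
})

model = SecondLevelModel().fit(contrast_imgs, design_matrix=design_matrix)
z_map = model.compute_contrast("emg_diff", output_type="z_score")

# Small-volume-style thresholding within the ROI: p < .01, cluster extent k = 5
roi_mask = load_img("ifg_mask.nii.gz")
thresholded_map, threshold = threshold_stats_img(
    z_map, mask_img=roi_mask, alpha=0.01,
    height_control="fpr", cluster_threshold=5)
print(f"z threshold within ROI: {threshold:.2f}")
```

The same design, with the corrugator difference score as covariate and the sad-expression contrasts as inputs, would correspond to the analyses of the reverse attitude effect described above.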
These analyses were performed using the small volume correction of SPM8, a height threshold of p < 0.01, and an extent threshold of k = 5 contiguous voxels. Resulting activation peaks were superimposed on standard high-resolution anatomical images.

Procedure
After arriving at the laboratory, participants were informed about the procedure of the experiment and were asked to give informed consent. They were told that the experiment was designed to study the avatars' suitability for a future computer game. EMG electrodes were then attached and participants were placed in the MRI scanner. They were presented with the blond and black haired avatars and their respective traits. After gaining an impression of the attitudes belonging to the different avatars, participants conducted the short memory task where they were asked to indicate whether a presented avatar was characterized by positive or negative attitudes. Following this, the functional MRI session started. Each stimulus was repeated 12 times, i.e. a total of 72 facial stimuli were presented in a randomized order. Faces were displayed for 4000 ms after a fixation cross had been presented for 2000 ms to ensure that participants were focusing on the centre of the screen. The inter-trial interval varied randomly between 8750 and 11250 ms. Participants were instructed to simply view the pictures without any further task. After the functional MRI session, the structural MRI (MP-RAGE) was recorded. Then, participants were taken out of the scanner and electrodes were detached. The final memory task and ratings of the facial stimuli according to valence, emotional arousal and likeability were administered outside the scanner. Finally, participants completed a questionnaire regarding demographic data, were paid and thanked.

2.3.2.3. RESULTS

Manipulation Check
Separate repeated-measures ANOVAs with the within-subject factors attitude (positive vs. negative) and emotion (happy vs. neutral vs. sad) were computed for valence, arousal and liking ratings. Valence ratings revealed a main effect of attitude, F(1,21) = 8.79, p < .01, ηp² = .29. Avatars who were characterized by negative traits were rated as less positive (M = 4.67, SD = .20) than avatars with positive traits (M = 5.46, SD = .24). Analyses of liking ratings also revealed a significant attitude effect, F(1,21) = 11.29, p < .01, ηp² = .35. Negative characters were rated as less likeable (M = 5.08, SD = .29) than positive ones (M = 6.27, SD = .18). No main effect of attitude was found for arousal ratings, all ps > .72. Furthermore, there were no significant interactions of attitude and emotion, all ps > .11, for any of the three ratings.
Recall Task
Analyses of the first and second recall task following the attitude manipulation revealed that no participant made more than one error in the six classification trials. Thus, all data were used for further analyses.
EMG measures
A repeated-measures analysis of variance with the within-subject factors muscle (M. zygomaticus major vs. M. corrugator supercilii), attitude (positive vs. negative) and emotion (happy vs. neutral vs. sad) was conducted. This revealed a significant main effect of muscle, F(1,21) = 6.46, p = .02, ηp² = .24, a significant main effect of attitude, F(1,21) = 21.34, p < .01, ηp² = .50, a significant Muscle x Emotion effect, F(2,20) = 8.83, p < .01, ηp² = .30, a significant Attitude x Emotion effect, F(2,20) = 4.71, p = .02, ηp² = .18, and a significant three-way interaction Muscle x Attitude x Emotion, F(2,20) = 6.64, p < .01, ηp² = .24.
No other effect gained significance, all ps > .84. To further specify this interaction, separate follow-up ANOVAs for the M. zygomaticus major and the M. corrugator supercilii were calculated.
M. zygomaticus major. As predicted, activity in M. zygomaticus major to happy faces was larger for positive compared to negative characters (see Figure 28). This was verified by a significant Attitude x Emotion interaction, F(2, 20) = 7.96, p < .01, ηp² = .27. Follow-up t-tests revealed a significant difference between M. zygomaticus major reactions to happy faces of positive characters (M = 0.52) as compared to negative characters (M = 0.01), t(21) = 3.19, p < .01. No effects were observed for neutral or sad facial expressions, all ps > .13.
Figure 28. Mean EMG change in μV for M. zygomaticus major in response to happy, neutral and sad faces for positive and negative characters. Error bars indicate standard errors of the means.
M. corrugator supercilii. As predicted, activity in M. corrugator supercilii to sad faces was larger for positive compared to negative characters (see Figure 29). This was verified by a significant Attitude x Emotion interaction, F(2, 20) = 4.04, p = .04, ηp² = .16. Follow-up t-tests revealed a significant difference between M. corrugator supercilii reactions to sad faces of positive characters (M = 0.25) as compared to negative characters (M = -0.20), t(21) = 2.56, p = .02. No effects were observed for happy or neutral facial expressions, all ps > .21.
Figure 29. Mean EMG change in μV for M. corrugator supercilii in response to happy, neutral and sad faces for positive and negative characters. Error bars indicate standard errors of the means.
Additionally, one-sample t-tests against zero revealed that the reaction to sad faces of negative characters was indeed a significant inhibition of M. corrugator supercilii activity, t(21) = 2.53, p = .02. This confirms the assumption of an incongruent reaction.
fMRI data
Regions of interest (ROI) analyses (amygdala, insula, nucleus accumbens, caudate, putamen, inferior frontal gyrus (IFG), ventral premotor cortex, inferior parietal lobe (IPL), superior temporal sulcus (STS)) were performed for the attitude effects on emotional expressions, i.e. 'happy expressions of positive characters' > 'happy expressions of negative characters', 'sad expressions of positive characters' > 'sad expressions of negative characters' and 'sad expressions of negative characters' > 'sad expressions of positive characters'. These analyses revealed for the contrast between happy facial expressions of positive and negative characters (positive happy > negative happy) significant activations (p < 0.01, k = 5 voxels) in the left and right fusiform gyrus, the right inferior parietal lobule, right inferior frontal gyrus, left and right superior temporal gyrus, right putamen, left caudate and left insula (see Table 3 for all main effects of the ROI analyses).
Table 3. Significant activations as revealed by the ROI analyses for the reported contrasts. Alpha = 0.01 for ROI analyses with a minimum cluster size of k = 5. L = left, R = right hemisphere, IFG = inferior frontal gyrus, IPL = inferior parietal lobule, STS = superior temporal sulcus, N. acc. = nucleus accumbens. a The cluster with the largest number of significant voxels within each region is reported. Coordinates x, y and z are given in MNI space.
Contrast                           x     y     z     Z      k    Brain region
positive_happy > negative_happy   -22   -36   -14   2.94    74   Fusiform L
                                   30   -30   -16   3.61    54   Fusiform R
                                   54   -54    38   3.14    56   IPL R
                                   38    36    -8   2.82    11   IFG R
                                  -56    -8     6   2.95    66   STS L a
                                   58   -60    28   3.60    86   STS R
                                   22    -6    12   3.32    42   Putamen R
                                  -12    18    12   2.91    14   Caudate L
                                  -40   -10     4   2.90    48   Insula L a
positive_sad > negative_sad        42   -24   -18   3.33    16   Fusiform R
                                  -42    10    26   2.94    33   IFG L
                                   50    34    16   2.73    33   IFG R
                                  -48   -32    22   2.79     9   STS L
                                   42   -34    18   3.20    32   STS R
                                   32    16    10   2.39     5   Insula R
negative_sad > positive_sad       -38   -50   -14   3.70   120   Fusiform L
                                   40    14   -12   2.82    27   Fusiform R
                                  -52   -12    -6   2.55     6   Fusiform R
                                  -44    14     2   3.37    65   Insula L
                                   42   -52   -24   2.72    20   Insula R
                                   16     6     0   1.74     5   N. acc. R

Regression analyses with the contrast 'positive_happy > negative_happy' as dependent variable and the difference between M. zygomaticus major activity to happy expressions of positive and happy expressions of negative characters ('positive_happy – negative_happy') as predictor variable revealed significant effects in the right fusiform gyrus, the right superior temporal sulcus and the right inferior frontal gyrus (see Figure 30).
Figure 30. Statistical parametric maps for the regression analyses of 'positive_happy > negative_happy' with the attitude effect on congruent facial reactions to happy expressions on M. zygomaticus major ('positive_happy – negative_happy') as predictor variable. (A) Significant co-activation in the right fusiform gyrus (x = 26, y = -32, z = -16; Z = 3.49; p < 0.001; k = 34 voxels). (B) Significant co-activation in the right superior temporal sulcus (x = 48, y = -12, z = -14; Z = 2.85; p = 0.002; k = 5 voxels). (C) Significant co-activation in the right inferior frontal gyrus (x = 38, y = 38, z = -6; Z = 2.73; p = 0.003; k = 8 voxels).
Regression analyses for the contrast 'positive_sad > negative_sad' with the difference between M. corrugator supercilii activity to sad expressions of positive and sad expressions of negative characters ('positive_sad – negative_sad') as predictor variable revealed significant effects in the right fusiform gyrus and the right insula (see Figure 31). Finally, regression analyses for the reverse contrast, i.e. 'negative_sad > positive_sad', with the difference between M. corrugator supercilii activity to sad expressions of positive and sad expressions of negative characters ('positive_sad – negative_sad') as predictor variable revealed significant effects in the right superior temporal sulcus and the right nucleus accumbens (see Figure 32).
Figure 31. Statistical parametric maps for the regression analyses of 'positive_sad > negative_sad' with the attitude effect on congruent facial reactions to sad expressions on M. corrugator supercilii ('positive_sad – negative_sad') as predictor variable. (A) Significant co-activation in the right fusiform gyrus (x = 40, y = -56, z = -16; Z = 2.99; p = 0.001; k = 66 voxels). (B) Significant co-activation in the right insula (x = 40, y = 14, z = -8; Z = 3.36; p < 0.001; k = 97 voxels).
Figure 32. Statistical parametric maps for the regression analyses of 'negative_sad > positive_sad' with the attitude effect on facial reactions to sad expressions on M. corrugator supercilii ('positive_sad – negative_sad') as predictor variable. (A) Significant co-activation in the right superior temporal sulcus (x = 68, y = -22, z = 12; Z = 3.769; p < 0.001; k = 42 voxels). (B) Significant co-activation in the right nucleus accumbens (x = 18, y = 10, z = -2; Z = 2.67; p = 0.004; k = 7 voxels).

2.3.2.4. DISCUSSION
For a third time, in line with Experiments 1 and 4, it has been found that people react with a decrease in congruent facial reactions to happy expressions of negative compared to positive characters. Sad faces of positive characters evoked strong congruent reactions; sad faces of negative characters evoked strong incongruent reactions, i.e. a relaxation of the M. corrugator supercilii. This pattern can be seen as very robust and stable. Furthermore, the fMRI data revealed that participants showed significantly stronger activations of the fusiform gyrus, the inferior frontal gyrus, the superior temporal sulcus and the insula in response to both happy and sad expressions of positive compared to negative characters. Additionally, happy expressions of positive characters as compared to happy expressions of negative characters elicited stronger activity in the right inferior parietal lobule, the right putamen and the left caudate. This is the first evidence showing that activity of parts of the mirror neuron system is modulated by attitudes. Specifically, the MNS is less active in response to emotional facial expressions of negative compared to positive characters. Furthermore, a significant difference in activity of parts of the striatum (the caudate and the putamen) was observed in response to happy expressions between positive and negative characters. This suggests that the reward system is more active when someone we like shows a happy expression. Contrary to expectations, no effect of the attitude manipulation was found in the amygdala in response to sad expressions. The reason for that is unclear. Maybe the low arousal of sad facial expressions compared to other negative stimuli hampered the detection of amygdala activation in this case. The effects concerning the fusiform gyrus are surprising when considered in the light of the results of Experiment 4. In Experiment 4, positive characters elicited a smaller N170 compared to negative characters. In the present experiment, the effect is reversed and the data show stronger activations of the fusiform gyrus in response to positive compared to negative characters. The reason for this effect is not clear. Maybe our understanding of the way in which the EEG component N170 is connected to the fusiform gyrus is wrong. It is interesting to note in this context that two fMRI studies (Morris et al., 1998; Vuilleumier, Armony, Driver, & Dolan, 2001) have found that activity within face-specific fusiform areas is modulated by emotional facial expression. The fact that EEG studies find that the N170 is usually insensitive to the emotional expression (e.g. Eimer, Holmes, & McGlone, 2003; Eimer & Holmes, 2007; but see Mühlberger et al., 2009; Frühholz, Jellinghaus, & Herrmann, 2011 for contrary results) could indicate that this component reflects face processing at an anatomical structure other than the fusiform gyrus. Additionally, there is evidence from source localization studies (Itier & Taylor, 2004) that the N170 may at least in part be generated in more lateral temporal regions such as the superior temporal sulcus. This assumption also does not fit with the fMRI results, but it highlights the problem of a structural localization of the N170. Due to this lack of clarity concerning the relation between the N170 and the fusiform gyrus, the fusiform gyrus results will not be discussed further. It shall, however, be noted that these problems do not seriously impair the interpretation of the results of Experiment 4.
There is clear evidence that the N170 reflects the structural encoding of faces (Eimer, 2000) and thus the conclusions drawn in Experiment 4 about its role in the perception of faces are fairly unaffected by the debate about its concrete anatomical localization. When looking at the results on co-variations of the observed differences in activations of the MNS and the affective areas with the differences in congruent facial reactions to positive and negative characters, we can see that not all of the above-mentioned areas show relations to the modulation of facial reactions. Significant co-variations between the attitude effect on brain activations and the attitude effect on congruent facial reactions to happy expressions could be observed in the IFG and the STS. This means that changes in the activity of parts of the MNS are related to the modulation of congruent facial reactions to happy expressions. Although a causal influence of MNS activity on facial reactions cannot be inferred, this finding supports the working model's assumption of the involvement of motor mechanisms in the modulation of congruent facial reactions. For the modulation of facial reactions to sad expressions no involvement of the MNS could be observed. Instead, co-variation analyses revealed significant relations between the attitude effect on activations in the insula and the attitude effect in facial reactions to sad expressions. The current view about the role of the insula is that it plays a broad role in integrating affective and cognitive processes as well as in the recognition, evaluation and processing of emotions (Berntson et al., 2011). Furthermore, it has been shown that the insula plays a central role in the elicitation of social affiliative behavior (Caruana, Jezzini, Sbriscia-Fioretti, Rizzolatti, & Gallese, 2011). Given this broad and unspecific function of the insula, one can only speculate about the meaning of its role in the modulation of facial reactions to sad expressions. Probably, the co-variation reflects processes of evaluation of the meaning of the interaction of context and emotional expression. However, it could also reflect a modulation via affiliative behavioral tendencies. Which of these possibilities is more applicable can only be answered with further research. The evidence coming from the reverse contrast between sad facial expressions of positive and negative characters (negative sad > positive sad) is easier to interpret. Here, significantly stronger activations were found in the fusiform gyrus, the insula and the nucleus accumbens in response to sad expressions of negative compared to positive characters. Significant co-activations with the attitude effect on incongruent facial reactions to sad expressions on M. corrugator supercilii were found for the STS and the nucleus accumbens. This means that the STS, i.e. the main visual input into the MNS (Iacoboni & Dapretto, 2006), but not the actual MNS is involved in the modulation of incongruent reactions. The involvement of the STS means that the higher the activity in the STS, the more incongruent reactions are shown. This could point to the possibility that the STS also has a regulating function in the MNS. The involvement of the nucleus accumbens in the modulation of incongruent facial reactions means a higher activation of the nucleus accumbens in trials with strong incongruent reactions. This probably reveals the involvement of the emotional reaction of schadenfreude in response to a sad negative character.
Evidence supporting this assumption comes from a study by Singer et al. (2006), who found increased activity in reward-related areas when participants watched an unfair compared to a fair person being punished. In sum, these results support the working model regarding the assumption of the involvement of motor mechanisms (the mirror neuron system) in the modulation of congruent facial reactions and the involvement of affective mechanisms in terms of emotional reactions in the modulation of incongruent facial reactions. Thereby, the current working model in its second revision can be seen as affirmed and needs no further revision. Again, and also in line with the current version of the working model, no evidence has been found for the involvement of affective processes in the modulation of congruent reactions. Possible candidates for this would have been the activations of the caudate and the putamen in the contrast 'positive_happy > negative_happy'. However, those contrasts did not show co-activations with the modulation of facial reactions. Finally, one might pose the question of which higher instance in the brain might be responsible for the attitude effects on the MNS. Explorative analyses in the frontal cortex were computed to detect areas that might, for example, be responsible for down-regulating the MNS in response to negative characters. Therefore, a regression analysis with the contrast 'positive_happy > negative_happy' as dependent variable and the attitude effect on M. zygomaticus major reactions to happy expressions ('positive_happy – negative_happy') as predictor variable was computed. In a similar vein, a regression analysis with the contrast 'positive_sad > negative_sad' as dependent variable and the attitude effect on facial reactions to sad expressions on M. corrugator supercilii ('positive_sad – negative_sad') as predictor variable was computed. A co-activation was only found for the analyses on reactions to sad expressions. Here a significant effect was found in the orbital part of the medial frontal cortex (oMFC), x = -38, y = 44, z = -6; Z = 2.86; p = 0.002; k = 14 voxels. This indicates that the medial frontal cortex was involved in the modulation of facial reactions to sad expressions. According to a review by Amodio and Frith (2006) the medial frontal cortex plays a central role in social cognition. The oMFC in particular is assumed to represent the value of possible future outcomes and to be responsible for outcome monitoring and regulation. This proposed function would fit well with the affiliative function of mimicry. If there is no desire to affiliate with a certain character due to, e.g., his or her negative traits, one should take care to control one's facial reactions in order to avoid further contact. The oMFC could therefore represent this desired future outcome of less affiliative facial reactions and regulate, i.e. inhibit, the MNS in contexts like that of a negative interaction partner. However, such an assumption is pure speculation. Connectivity analyses could be one step to further clarify the role of the oMFC in the modulation of facial reactions.

3. GENERAL DISCUSSION

The objective of this thesis was to examine the mechanisms and neuronal correlates underlying the modulation of facial reactions to emotional facial expressions. Based on a review of theoretical models and empirical evidence a working model on mechanisms underlying the modulation of congruent and incongruent facial reactions to emotional facial expressions was proposed.
Thereby, two main mechanisms were posited: motor mechanisms, indicated by the perception-behavior link, and affective mechanisms, i.e. valence evaluations and emotional reactions. In five experiments, potential effects and interactions within the working model were examined.

3.1. MOTOR MECHANISMS: INTEGRATION OF FINDINGS

In sum, the results of the five experiments provide strong support for the involvement of motor mechanisms in congruent facial reactions. Therefore, it can be concluded that facial mimicry per se exists. Although congruent facial reactions cannot reliably be observed in response to all kinds of emotional expressions (see the case of anger in Experiment 3), it can be posited that if they are observed their strength is determined by motor mechanisms. The specific mechanism underlying such a modulation is the perception-behavior link (Chartrand & Bargh, 1999) or perception-action link (Preston & de Waal, 2002). In Experiment 4 it has been shown that a contextual moderator already modulates the early perception of a facial expression. Furthermore, correlational evidence suggested that this modulation of perception further leads to the modulation of congruent facial reactions in response to positive and negative characters. Of course, such a co-variation of both the N170 modulation and the modulation of congruent facial reactions could have occurred without any causal relation, e.g. as a result of the modulation of a common third structure. However, the temporal sequence of the changes, i.e. changes in face perception (140-170 ms after stimulus onset) occurring prior to changes in facial muscular activity (300-400 ms after stimulus onset), might on the other hand suggest that this correlational result could indeed reflect a causal influence. Furthermore, Experiment 3 could also show that the perception-behavior "link" itself, as indicated by a measure of cognitive empathy, is modulated by the context. It could be shown that empathy was reduced in the competition compared to a neutral and a cooperation condition. Furthermore, results revealed that this change in cognitive empathy mediated the modulation of congruent facial reactions. Admittedly, the assumption that the amount of state cognitive empathy reflects the strength of the perception-behavior link is not empirically verified. One might argue that although it may be that a stronger perception-behavior link leads to more cognitive empathy, the presence or absence of empathy does not speak towards the strength of the link because the understanding of another's state can also be achieved via other means, such as the application of general knowledge about emotion-eliciting situations. Experiment 5 might give stronger evidence for the involvement of the perception-behavior link in the modulation of facial reactions. Here, it has been shown that the neuronal structure representing the perception-behavior "link" itself is also modulated by the context. This can be seen as much more reliable evidence since a broad body of literature supports the assumption that the MNS reflects activity of the perception-behavior link and is strongly connected to imitation and mimicry. Specifically, in Experiment 5 the activity of (parts of) the mirror neuron system is decreased when the perceived expression belongs to a negative compared to a positive character. Furthermore, it has been shown that these changes in the strength of the MNS are related to the modulation of congruent facial reactions.
There is also evidence that the orbital part of the medial frontal cortex (oMFC) could be the instance responsible for monitoring the outcome and inhibiting the MNS when required, e.g. in response to negative characters. It is also conceivable that the oMFC regulates to which emotional expressions congruent reactions are shown in the first place. This suggestion relies on the finding that congruent facial reactions are not found in response to all kinds of emotional expressions, e.g. in response to anger. It can be concluded that the modulation of congruent facial reactions occurs due to a modulation of the strength of both parts of the perception-behavior link: the strength of perception as well as the spreading of activation between perception and behavior, i.e. "the link" or the MNS. It is assumed that both perception and link can be affected independently by the context. However, it is also assumed that changes in perception affect the activity of the mirror neuron system. This view is supported by a study by Muthukumaraswamy and Singh (2008), who showed that the activity of the MNS can be modulated by attention. Finally, it shall be noted that there are also some critical views on the mirror neuron system in the literature. First, there is some disagreement about the exact location of the mirror neurons, about whether these neurons actually constitute a "system" (in the sense of interconnected elements), and about whether there actually are specialized neurons dedicated to mirroring or whether regular neurons can simply perform a mirroring function (Niedenthal, 2007). Second, there is disagreement on the function of the MNS. As mentioned before (see chapter 1.2.3), there are two alternative explanations concerning its purpose. The earlier assumed function of decoding and understanding the actions of others (Iacoboni & Dapretto, 2006) is challenged by a growing number of authors proposing that the mirror neuron system evolved simply as a byproduct of associative learning. Despite these criticisms, it can be concluded from this body of research that motor mechanisms are involved in the contextual modulation of congruent facial reactions to emotional facial expressions. No involvement of motor determinants could be shown for incongruent reactions. This was also proposed in the working model, based on conclusions from the perception-behavior link, the perception-action model and the MNS: none of these three models of motor mechanisms of facial reactions allows the prediction of an inhibition of a congruent muscle below baseline or the activation of a different muscle. The results of Experiments 1-5 support this conclusion. Therefore, what is usually termed "counter-mimicry" in the literature seems to be mainly caused by affective processes such as schadenfreude or anger, and researchers should be careful with the label "counter-mimicry" for incongruent reactions in future studies.

3.2. AFFECTIVE MECHANISMS: INTEGRATION OF FINDINGS

Experiments 1 and 2 ruled out the involvement of valence evaluations in the modulation of both congruent and incongruent facial reactions. One possible explanation for this may be that valence evaluations are not independent of emotional reactions. Ruys and Stapel (2008) posit that information processing unfolds from the global to the specific. Accordingly, they found that if emotional stimuli were presented "quick" (for 120 ms, outside the perceptual field) they evoked the corresponding specific emotional experience.
However, if these stimuli were presented "super-quick" (for 40 ms, outside the perceptual field) they only evoked global, valence-based positive and negative feelings. This suggests that valence evaluations might only be a determinant of facial reactions when stimuli are presented for a very short time or when the stimulus has no specific emotional content. Otherwise, our brain extracts the specific emotional content and our face reacts to it with the respective specific emotional muscular pattern. Evidence for the involvement of affective mechanisms, in terms of emotional reactions, in the modulation of congruent reactions comes only from Experiment 2. Here, participants reacted with enhanced activity of the M. corrugator supercilii to accidentally compared to negligently HIV-infected and healthy people. Valence evaluations and motor mechanisms could not explain this pattern. Instead, it was best explained by feelings of pity and sympathy. In contrast, Experiments 3 and 5, which measured affective processes instead of manipulating them, found no evidence for the involvement of emotional reactions in the modulation of congruent reactions. One important and probably crucial difference between these experiments and Experiment 2 is that in Experiments 3 and 5 the emotional reactions that might have been candidates for the modulation of congruent reactions were implicit emotional reactions in the sense of emotional contagion or embodied emotions. The emotional reactions of pity and sympathy that were involved in congruent reactions in Experiment 2, however, were clearly reflective emotional reactions. This suggests that the influence of implicit emotional reactions on congruent facial reactions might be too weak to be detectable in the presence of motor mechanisms, whereas reflective emotional reactions exert a much stronger influence and even dominate influences from motor mechanisms. Such reflective emotional reactions are also the main determinant of incongruent reactions. Experiments 3 and 5 clearly showed that the modulation of incongruent facial reactions could only be explained by the reaction of joy in response to sad or angry expressions of a competitor, or by activity of the nucleus accumbens in response to a sad negative character. All these results were explained as triggered by the emotional reaction of schadenfreude.

3.3. REVISED MODEL ON MECHANISMS UNDERLYING THE MODULATION OF FACIAL REACTIONS TO EMOTIONAL FACIAL EXPRESSIONS

The integration of the results of Experiments 1-5 led to a revision of the working model. The revised version of the working model on mechanisms underlying facial reactions to emotional facial expressions can be seen in Figure 33 and comprises changes following Experiments 1-3. Experiments 4 and 5 affirmed the second revision of the working model (see 2.2.5.). Therefore, the revised model is identical to the second revision. The revised model on mechanisms underlying facial reactions to emotional facial expressions makes the following assumptions:

1. Motor mechanisms, i.e. the perception-behavior link, are involved in the modulation of congruent facial reactions. Contextual moderators can influence congruent reactions by modulating the strength of all parts of the perception-behavior link (the strength of perception as well as the spreading of activation between perception and behavior), independently of the specific emotional expression.
Contexts that require smooth interactions, affiliation and liking are assumed to increase the activity of the perception-behavior link, whereas contexts in which affiliation and liking are not required or desired inhibit it.

2. Motor mechanisms are not involved in the occurrence or modulation of incongruent facial reactions.

3. Valence evaluations are not involved in the modulation of congruent facial reactions.

4. Reflective emotional reactions are involved in the modulation of congruent and incongruent facial reactions. The modulation of emotional reactions is expected to arise from an interaction of context and emotional expression, i.e. from inferring the meaning of the emotional expression in the respective context. Due to the mixed evidence on the involvement of emotional reactions in the modulation of congruent reactions, the path from emotional reactions to congruent facial reactions is dashed.

5. Strategic mechanisms reflecting intentions to give a certain impression are probably involved in the occurrence and modulation of incongruent facial reactions. As in the latter case, the modulation of strategic processes is assumed to work through an interaction of context and emotional expression. Because of the weak evidence on that modulation, the path from strategic processes to incongruent facial reactions is drawn in grey.

6. Due to the lack of empirical evidence, no assumptions concerning the role of strategic processes in the modulation of congruent facial reactions are made.

7. Facial reactions in general are a function of congruent and incongruent behavioral schemata that can be controlled by motor and affective mechanisms. Once activated, both kinds of schemata additively contribute to the muscular facial reaction according to their respective strength. This assumption is made although no evidence for a dual influence of both mechanisms could be observed in the experiments.

Figure 33. The revised model on mechanisms underlying facial reactions to emotional facial expressions (see text for explanations).

3.4. LIMITATIONS OF THE APPROACH

A number of limitations of the approach pursued in this dissertation shall be discussed. First, the revised model makes the assumption that motor and affective processes can additively contribute to the muscular facial reaction according to their respective strength. Unfortunately, no such evidence for a dual influence of both mechanisms was observed in the experiments. Although it is very unlikely that in every case only one of the two mechanisms determines the facial reaction, more studies are needed to investigate the interplay of congruent and incongruent behavioral schemata in the generation of facial reactions. One possibility would be a study applying cognitive load. By occupying cognitive resources, it should be possible to decrease reflective emotional reactions to virtually zero. Given that reflective emotional reactions were the only affective mechanisms involved in the five experiments, this would mean that the facial reactions observed in a high-load condition could be interpreted as determined solely by motor mechanisms. If these reactions differ from those in a condition without (or with low) cognitive load, it can be concluded that under normal conditions both processes contribute to facial reactions. It also has to be noted that in the present experiments the congruency of the facial reaction and the involved facial muscle were confounded. Whereas a modulation of congruent reactions was observed on both the M.
zygomaticus major (Experiments 1-5) and the M. corrugator supercilii (Experiment 2), the modulation of incongruent reactions was observed only on the M. corrugator supercilii. No incongruent reactions could be observed on the M. zygomaticus major, although the occurrence of schadenfreude would have allowed for them. One reason might be that an incongruent reaction on the M. corrugator supercilii in terms of a relaxation is invisible, whereas an incongruent reaction in terms of an increase in activity of the M. zygomaticus major is very well visible to the interaction partner. Therefore, the corrugator relaxation combined with the null reaction of the zygomaticus could have been a means to experience the emotion while concealing it at the same time in the present experiments. However, to be able to further generalize the findings, more muscles covering congruent and incongruent reactions should be assessed in future studies. Another crucial point regarding the design is that all samples consisted only of women. It still has to be shown whether the observed effects can be generalized to men. According to findings by Van Vugt, De Cremer and Janssen (2007), women are more cooperative and less competitive than men. Translated to the results of Experiment 3, this could mean that men should show weaker but not qualitatively different effects in the cooperation condition and enhanced incongruent reactions in the competition condition. But based on the results of Van Vugt et al., one could also assume that cooperation is the default status for women. If this is the case, then the lack of cooperation effects could also be explained by the choice of the sample. With the attitude manipulation, no such gender effects are to be expected. None of the above-mentioned studies on the effects of attitudes on approach and avoidance tendencies (Chen & Bargh, 1999; Neumann & Strack, 2000; Neumann et al., 2004; Seibt et al., in press; Solarz, 1960) and on the effects of affiliation and competition goals on facial reactions and mimicry (Lakin & Chartrand, 2003; Lanzetta & Englis, 1989) found any gender differences. Therefore, no difference between men and women regarding this kind of attitude manipulation is expected. Nevertheless, further studies are needed to replicate the results with different samples. Another limitation can be seen in the exclusive use of avatars, i.e. computer-generated instead of real human characters. Although avatars have the great advantage of full control over the facial expression and its dynamics, e.g. its intensity and temporal course (cf. Krumhuber & Kappas, 2005), one can argue that they are simply not human. In most areas, studies show that reactions to avatars are comparable to those to human actors. For example, Bailenson and Yee (2005) showed that the effects of a mimicking avatar on liking are comparable to those reported in studies with real individuals (Chartrand & Bargh, 1999). Furthermore, Mühlberger et al. (2009) found, in an EEG study, similar emotional effects of artificial and natural facial expressions on N170, EPN and LPP amplitudes. But there are also studies reporting differences between the processing of avatar and real faces. Although Moser et al. (2007) found similar amygdala activations in response to the faces of both kinds of actors, they reported stronger activations in the fusiform gyrus in response to human compared to avatar faces.
Finally, Longo and Bertenthal (2009) report that imitation was reduced in their study when the virtualness of the virtual hand used was emphasized. In sum, these results suggest that reactions to avatars might differ quantitatively from those to human characters. However, no evidence is available suggesting qualitatively different reactions. Therefore, the avatars used in the five experiments can be seen as a good model of human behavior, although they may underestimate the strength of the expected effects to some degree.

3.5. CONCLUSION/OUTLOOK

In sum, this dissertation helped to clarify the involvement of motor and affective processes in the modulation of facial reactions to emotional facial expressions. The reported series of experiments is the first approach in the literature to shed light on the mechanisms involved in congruent and incongruent facial reactions. It has been shown that motor processes are the main determinant underlying the modulation of congruent reactions. Thereby, the existence of facial mimicry has been confirmed. Furthermore, the experiments revealed that affective processes in terms of emotional reactions are the main determinant of the modulation of incongruent facial reactions. For the existing and future body of literature this means that what has so far been called "counter-mimicry" has to be rephrased and, even more importantly, re-conceptualized in affective terms. The additional time course analyses of Experiment 3 showed that these mechanisms are already at work in the first second of the facial reaction and that their influence persists for the whole time of stimulus exposure. Future studies should examine the interplay of motor and affective mechanisms in the generation of facial reactions. Additionally, the findings of the reported five experiments should be generalized to a broader population and a wider range of social contexts. The revised model on mechanisms of the modulation of facial reactions may provide a concise approach for deriving specific hypotheses. Further studies are also required to shed light on the nature of the third class of involved mechanisms, namely strategic impression formation processes, and its neuronal structures. Following such studies, the revised model should be extended. Together with neuroimaging studies examining the connectivity between the neural correlates of the involved mechanisms and higher-order control structures, this can finally lead to an empirically validated functional-anatomical model of the modulation of facial reactions to emotional facial expressions.

4. REFERENCES

Achaibou, A., Pourtois, G., Schwartz, S., & Vuilleumier, P. (2008). Simultaneous recording of EEG and facial muscle reactions during spontaneous emotional mimicry. Neuropsychologia, 46, 1104-1113. Amodio, D. (2010). Coordinated roles of motivation and perception in the regulation of intergroup responses: Frontal cortical asymmetry effects on the P2 event-related potential and behavior. Journal of Cognitive Neuroscience, 22, 2609-2617. Amodio, D. M., & Frith, C. D. (2006). Meeting of minds: The medial frontal cortex and social cognition. Nature Reviews Neuroscience, 7, 268-277. Atkinson, A., & Adolphs, R. (2005). Visual emotion perception: Mechanisms and processes. In L. Feldman-Barret, P. M. Niedenthal, & P. Winkielman (Eds.), Emotion and consciousness (pp. 150-184). New York: Guilford Press. Bailenson, J. N., & Yee, N. (2005). Digital chameleons: automatic assimilation of nonverbal gestures in immersive virtual environments.
Psychological Science, 16, 814‐819. Barrett, L. F., Niedenthal, P. M., & Winkielman, P. (2005). Emotion and consciousness. New York, NY: Guilford Press. Baron, R. M., & Kenny, D. A. (1986). The moderator‐mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51, 1173‐1182. Baron‐Cohen, S., Wheelwright, S., Hill, J., Raste, Y., & Plumb, I. (2001). The "Reading the mind in the eyes" Test revised version: A study with normal adults, and adults with Asperger syndrome or high‐functioning autism. Journal of Child Psychology and Psychiatry, 42, 241‐251. Batson, C. D., Polycarpou, M. P., Harmon‐Jones, E., Imhoff, H. J., Mitchener, E. C., Bednar, L. L., et al. (1997). Empathy and attitudes: Can feeling for a member of a stigmatized group improve feelings toward the group? Journal of Personality and Social Psychology, 72, 105‐118. 111 Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551‐565. Berntson, G.G., Bechara, A., Damasio, H., Tranel, D., & Cacioppo, J.T. (2007). Amygdala contribution to selective dimensions of emotion. Social Cognitive and Affective Neuroscience, 2, 123‐129. Berntson, G. G., Norman, G. J., Bechara, A., Bruss, J., Tranel, D., & Cacioppo, J. T. (2011). The insula and evaluative processes. Psychological Science, 22, 80‐86. Blairy, S., Herrera, P., & Hess, U. (1999). Mimicry and the judgment of emotional facial expressions. Journal of Nonverbal Behavior, 23, 5‐41. Blakemore, S.‐J., & Frith, C. (2005). The role of motor contagion in the prediction of action. Neuropsychologia, 43, pp. 260‐267. Blascovich, J., Loomis, J., Beall, A. C., Swinth, K. R., Hoyt, C. L., & Bailenson, J. N. (2002). Immersive virtual environment technology as a methodological tool for social psychology. Psychological Inquiry, 13, 103‐124. Bölte, S. (2005). Reading Mind in the Eyes Test für Erwachsene (dt. Fassung) von S. Baron‐Cohen, from http://www.kgu.de/zpsy/kinderpsychiatrie/Downloads/Eyes_test_erw.pdf Bourgeois, P., & Hess, U. (2008). The impact of social context on mimicry. Biological Psychology, 77, 343‐352. Buccino, G., Binkofski, F., Fink, G. R., Fadiga, L., Fogassi, L., Gallese, V., et al. (2001). Action observation activates premotor and parietal areas in a somatotopic manner: An fMRI study. European Journal of Neuroscience, 13, 400‐404. Cacioppo, J. T., & Petty, R. E. (1979). Attitudes and cognitive response: An electrophysiological approach. Journal of Personality and Social Psychology, 37, 2181‐2199. Cacioppo, J. T., Martzke, J. S., Petty, R. E., & Tassinary, L. G. (1988). Specific forms of facial EMG response index emotions during an interview: From Darwin to the continuous flow hypothesis of affect‐laden information processing. Journal of Personality and Social Psychology, 54, 592‐604. 112 Caharel, S., Poiroux, S., Bernard, C., Thibaut, F., Lalonde, R., & Rebai, M. (2002). ERPs associated with familiarity and degree of familiarity during face recognition. International Journal of Neuroscience, 112, 1499‐1512. Caldara, R., Rossion, B., Bovet, P., & Hauert, C. A. (2004). Event‐related potentials and time course of the "other‐race" face classification advantage. Neuroreport, 15, 905‐910. Carr, L., Iacoboni, M., Dubeau, M. C., Mazziotta, J. C., & Lenzi, G. L. (2003). Neural mechanisms of empathy in humans: a relay from neural systems for imitation to limbic areas. 
Proceedings of the National Acadademy of Science U S A, 100, 5497‐5502. Caruana, F., Jezzini, A., Sbriscia‐Fioretti, B., Rizzolatti, G., & Gallese, V. (2011). Emotional and social behaviors elicited by electrical stimulation of the insula in the macaque monkey. Current Biology, 21, 195‐199. Catmur, C., Walsh, V., & Heyes, C. (2007). Sensorimotor learning configures the human mirror system. Current Biology, 17, 1527‐1531. Chartrand, T. L., & Bargh, J. A. (1999). The chameleon effect: The perception‐behavior link and social interaction. Journal of Personality and Social Psychology, 76, 893‐910. Chartrand, T. L., Cheng, C. M., & Jefferis, V. E. (2002). You’re just a chameleon: The automatic nature and social significance of mimicry. In M. Jarymowicz & R. K. Ohme (Eds.), Natura automatyzmow (Nature of Automaticity) (pp. 19‐24). Warszawa: IPPAN & SWPS. Chartrand, T. L., & Van Baaren, R. (2009). Human Mimicry. In M. P. Zanna (Ed.), Advances in Experimental Social Psychology (Vol. 41, pp. 219‐274). Burlington: Academic Press. Chen, M., & Bargh, J. A. (1999). Consequences of automatic evaluation: Immediate behavioral predispositions to approach or avoid the stimulus. Personality and Social Psychology Bulletin, 25, 215‐224. Corrigan, P. W., Edwards, A. B., Green, A., Diwan, S. L., & Penn, D. L. (2001). Prejudice, social distance, and familiarity with mental illness. Schizophrenia Bulletin, 27, 219‐225. Crandall, C. S. (1991). Multiple stigma and AIDS: Illness stigma and attitudes toward homosexuals and IV drug users in AIDS‐related stigmatization. Journal of Community & Applied Social Psychology, 1, 165‐172. 113 Dapretto, M., Davies, M. S., Pfeifer, J. H., Scott, A. A., Sigman, M., Bookheimer, S. Y., et al. (2006). Understanding emotions in others: mirror neuron dysfunction in children with autism spectrum disorders. Nature Neuroscience, 9, 28‐30. Davis, M. (1980). A multidimensional approach to individual differences in empathy. JSAS catalogue of Selected Documents in Psychology, 10, 85. Decety, J., & Jackson, P. L. (2004). The functional architecture of human empathy. Behavioral and Cognitive Neuroscience Reviews, 3, 406‐412. DePaulo, B. (1992). Nonverbal behavior and self‐presentation. Psychological Bulletin, 111, 203‐243. Deutsch, F. M. (1990). Status, sex, and smiling: The effect of smiling in men and women. Personality and Social Psychology Bulletin, 16, 531‐540. Dijker, A. J., & Koomen, W. (2003). Extending Weiner's attribution‐emotion model of stigmatization of ill persons. Basic and Applied Social Psychology, 25, 51‐68. Dimberg, U. (1982). Facial reactions to facial expressions. Psychophysiology, 19, 643‐647. Dimberg, U. (1997). Facial reactions: Rapidly evoked emotional responses. Journal of Psychophysiology, 11, 115‐123. Dimberg, U., & Lundqvist, L.‐O. (1990). Gender differences in facial reactions to facial expressions. Biological Psychology, 30, 151‐159. Dimberg, U., & Thunberg, M. (1998). Rapid facial reactions to emotion facial expressions. Scandinavian Journal of Psychology, 39, 39‐46. Dimberg, U., Thunberg, M., & Elmehed, K. (2000). Unconscious facial reactions to emotional facial expressions. Psychological Science, 11, 86‐89. Dimberg, U., Thunberg, M., & Grunedal, S. (2002). Facial reactions to emotional stimuli: Automatically controlled emotional responses. Cognition & Emotion, 16, 449‐472. di Pellegrino, G., Fadiga, L., Fogassi, L., Gallese, V., & Rizzolatti, G. (1992). Understanding motor events: A neurophysiological study. Experimental Brain Research, 91, 176‐180. 
114 Domes, G., Heinrichs, M., Michel, A., Berger, C., & Herpertz, S. C. (2007). Oxytocin Improves 'Mind‐ Reading' in Humans. Biological Psychiatry, 61, 731‐733. Dooley, P. (1995). Perceptions of the Onset Controllability of AIDS and Helping Judgments: An Attributional Analysis. Journal of Applied Social Psychology, 25, 858‐869. Eimer, M. (2000). Event‐related brain potentials distinguish processing stages involved in face perception and recognition. Clinical Neurophysiology, 111, 694‐705. Eimer, M., & Holmes, A. (2007). Event‐related brain potential correlates of emotional face processing. Neuropsychologia, 45, 15‐31. Eimer, M., Holmes, A., & McGlone, F. (2003). The role of spatial attention in the processing of facial expression: An ERP study of rapid brain responses to six basic emotions. Cognitive, Affective, and Behavioral Neuroscience, 3, 97‐110. Ekman, P. (1973). Universal facial expressions in emotion. Studia Psychologica, 15, 140‐147. Ekman, P., & Friesen, W. V. (1976). Pictures of Facial Affect. Palo Alto, CA: Consulting Psychologists Press. Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System. Palo Alto, CA: Consulting Psychologists Press. Ekman, P., Davidson, R. J., & Friesen, W. V. (1990). The Duchenne smile: Emotional expression and brain physiology: II. Journal of Personality and Social Psychology, 58, 342‐353. Evans, A. C., Marrett, S., Neelin, P., Collins, L., Worsley, K., Dai, W., et al. (1992). Anatomical mapping of functional activation in stereotactic coordinate space. Neuroimage, 1, 43‐53. Fogassi, L., Ferrari, P. F., Gesierich, B., Rozzi, S., Chersi, F., & Rizzolatti, G. (2005). Parietal lobe: from action organization to intention understanding. Science, 308, 662‐667. Fridlund, A. J., & Cacioppo, J. T. (1986). Guidelines for human electromyographic research. Psychophysiology, 23, 567‐589. Frühholz, S., Jellinghaus, A., & Herrmann, M. (2011). Time course of implicit processing and explicit processing of emotional faces and emotional words. Biological Psychology, 87, 265‐274. 115 Gallese, V. (2001). The ‘shared manifold’ hypothesis: From mirror neurons to empathy. Journal of Consciousness Studies, 8, 33‐50. Gallese, V. (2003). The manifold nature of interpersonal relations: The quest for a common mechanism. Philosophical Transactions of the Royal Society of London, B, 358, 517‐528. Gallese, V., Fadiga, L., Fogassi, L., & Rizzolatti, G. (1996). Action recognition in the premotor cortex. Brain, 119, 593‐609. Gallese, V. & Goldman, A. (1998). Mirror neurons and the simulation theory of mind‐reading. Trends in Cognitive Science, 2, 493–501. Golby, A. J., Gabrieli, J. D. E., Chiao, J. Y., & Eberhardt, J. L. (2001). Differential responses in the fusiform region to same‐race and other‐race faces. Nature Neuroscience, 4, 845‐850. Goldenberg, G., & Karnath, H.‐O. (2006). The Neural Basis of Imitation is Body Part Specific. The Journal of Neuroscience, 26, 6282‐6287. Goldman, A.I. & Sripada, C.S. (2005). Simulationist models of face‐based emotion recognition. Cognition, 94, 193‐213. Gratton, G., Coles, M. G., & Donchin, E. (1983). A new method for off‐line removal of ocular artifact. Electroencephalography and clinical Neurophysiology, 55, 468‐484. Gregg, A. P., Seibt, B., & Banaji, M. R. (2006). Easier done than undone: Asymmetry in the malleability of implicit preferences. Journal of Personality and Social Psychology, 90, 1‐20. Gutsell, J. N., & Inzlicht, M. (2010). 
Empathy constrained: Prejudice predicts reduced mental simulation of actions during observation of outgroups. Journal of Experimental Social Psychology, 46, 841‐845. Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1993). Emotional contagion. Current Directions in Psychological Science, 2, 96‐99. Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1994). Emotional contagion. New York, NY. Herek, G. M., Capitanio, J. P., & Widaman, K. F. (2002). HIV‐related stigma and knowledge in the United States: Prevalence and trends, 1991‐1999. American Journal of Public Health, 92, 371‐ 377. 116 Herrera, P., Bourgeois, P., & Hess, U. (1998). Counter mimicry effects as a function of racial attitudes. Paper presented at the 38. Annual Meeting of the Society for Psychophysiological Research, Denver, Colorado, USA. Herrmann, M. J., Schreppel, T., Jäger, D., Koehler, S., Ehlis, A.‐C., & Fallgatter, A. (2007). The other‐ race effect for face perception: an event‐related potential study. Journal of Neural Transmission, 114, 951‐957. Hess, U. (2001). The communication of emotion. In A. Kaszniak (Ed.), Emotions, Qualia and Consciousness (pp. 397‐409). Singapore: World Scientific Publishing. Hess, U., Philippot, P., & Blairy, S. (1998). Facial reactions to emotional facial expressions: Affect or cognition? Cognition & Emotion, 12, 509‐531. Hess, U., Blairy, S., & Kleck, R. E. (2000). The influence of facial emotion displays, gender, and ethnicity on judgments of dominance and affiliation. Journal of Nonverbal Behavior, 24, 265‐ 283. Hess, U., & Blairy, S. (2001). Facial mimicry and emotional contagion to dynamic emotional facial expressions and their influence on decoding accuracy. International Journal of Psychophysiology, 40, 129‐141. Hickok, G. (2009). Eight problems for the mirror neuron theory of action understanding in monkeys and humans. Journal of Cognitive Neuroscience, 21, 1229‐1243. Hickok, G., & Hauser, M. (2010). (Mis)understanding mirror neurons. Current Biology, 20, 593‐594. Hutchison, W. D., Davis, K. D., Lozano, A. M., Tasker, R. R., & Dostrovsky, J. O. (1999). Pain‐related neurons in the human cingulate cortex. Nature Neuroscience, 2, 403‐405. Iacoboni, M., Molnar‐Szakacs, I., Gallese, V., Buvvino, G., Mazziotta, J. C., & Rizzo‐ latti, G. (2005). Grasping the intentions of others with one’s own mirror neuron system. PLoS Biology, 3, 529‐ 535. Iacoboni, M., & Dapretto, M. (2006). The mirror neuron system and the consequences of its dysfunction. Nature Reviews Neuroscience, 7, 942‐951. 117 Itier, R. J., & Taylor, M. J. (2004). Source analysis of the N170 to faces and objects. Neuroreport, 15, 1261‐1265. Ito, T. A., & Urland, G. R. (2003). Race and gender on the brain: Electrocortical measures of attention to the race and gender of multiply categorizable individuals. Journal of Personality and Social Psychology, 85, 616‐626. Ito, T. A., Thompson, E., & Cacioppo, J. T. (2004). Tracking the timecourse of social perception: the effects of racial cues on event‐related brain potentials. Personality & Social Psychology Bulletin, 30, 1267‐1280. Ito, T. A., & Urland, G. R. (2005). The influence of processing objectives on the perception of faces: An ERP study of race and gender perception. Cognitive, Affective, & Behavioral Neuroscience, 5, 21‐36. Jakobs, E., Fischer, A. H., & Manstead, A. S. R. (1997). Emotional experience as a function of social context: The role of the other. Journal of Nonverbal Behavior, 21, 103‐130. Johnston, L. (2002). Behavioral mimicry and stigmatization. Social Cognition, 20, 18‐35. 
Jones, A., & Fitness, J. (2008). Moral hypervigilance: The influence of disgust sensitivity in the moral domain. Emotion, 8, 613‐627. Keltner, D., & Haidt, J. (2001). Social functions of emotions. In T. Mayne & G. A. Bonanno (Eds.), Emotions: Current issues and future directions (pp. 192‐213). New York: Guilford Press. Keysers, C., & Gazzola, V. (2006). Towards a unifying neural theory of social cognition. Progress in Brain Research, 156, 379‐401. Knutson, B. (1996). Facial expressions of emotion influence interpersonal trait inferences. Journal of Nonverbal Behavior, 20, 165‐182. Krohne, H. W., Egloff, B., Kohlmann, C.‐W., & Tausch, A. (1996). Investigations with a German version of the Positive and Negative Affect Schedule (PANAS). Diagnostica, 42, 139‐156. Krumhuber, E., & Kappas, A. (2005). Moving smiles: The role of dynamic components for the perception of the genuineness of smiles. Journal of Nonverbal Behavior, 29, 3‐24. 118 Laird, J. D., Alibozak, T., Davainis, D., Deignan, K., & et al. (1994). Individual differences in the effects of spontaneous mimicry on emotional contagion. Motivation and Emotion, 18, 231‐247. Lakin, J. L., & Chartrand, T. L. (2003). Using nonconscious behavioral mimicry to create affiliation and rapport. Psychological Science, 14, 334‐339. Lakin, J. L., Jefferis, V. E., Cheng, C. M., & Chartrand, T. L. (2003). The chameleon effect as social glue: Evidence for the evolutionary significance of nonconscious mimicry. Journal of Nonverbal Behavior, 27, 145‐162. Lanzetta, J. T., & Englis, B. G. (1989). Expectations of cooperation and competition and their effects on observers' vicarious emotional responses. Journal of Personality and Social Psychology, 56, 543‐554. Larsen, J. T., Norris, C. J., & Cacioppo, J. T. (2003). Effects of positive and negative affect on electromyographic activity over zygomaticus major and corrugator supercilii. Psychophysiology, 40, 776‐785. Leary, M. R. (1983). A brief version of the Fear of Negative Evaluation Scale. Personality and Social Psychology Bulletin, 9, 371‐375. LeDoux, J. (1996). The Emotional Brain: The mysterious underpinnings of emotional life. New York: Simon and Schuster. Leslie, K. R., Johnson‐Frey, S. H., & Grafton, S. T. (2004). Functional imaging of face and hand imitation: towards a motor theory of empathy. Neuroimage, 21, 601‐607. Leventhal, H., & Scherer, K. (1987). The relationship of emotion to cognition: A functional approach to a semantic controversy. Cognition & Emotion, 1, 3‐28. Likowski, K. U., Weyers, P., Seibt, B., Stöhr, C., Pauli, P., & Mühlberger, A. (2011). Sad and lonely? Sad mood suppresses facial mimicry. Journal of Nonverbal Behavior, 35, 101‐117. Lipps, T. (1907). Das Wissen von fremden Ichen. In T. Lipps (Ed.), Psychologische Untersuchungen (Vol. 1, pp. 694‐722). Leipzig: W. Engelmann. Longo, M. R., & Bertenthal, B. I. (2009). Attention modulates the specificity of automatic imitation to human actors. Experimental Brain Research, 192, 739‐744. 119 Luck, S. J., & Hillyard, S. A. (1994). Electrophysiological correlates of feature analysis during visual search. Psychophysiology, 31, 291‐308. Lundqvist, L.‐O., & Dimberg, U. (1995). Facial expressions are contagious. Journal of Psychophysiology, 9, 203‐211. Maddux, W. W., Mullen, E., & Galinsky, A. D. (2008). Chameleons bake bigger pies and take bigger pieces: Strategic behavioral mimicry facilitates negotiation outcomes. Journal of Experimental Social Psychology, 44, 461‐468. Magnee, M. J., Stekelenburg, J. J., Kemner, C., & de Gelder, B. (2007). 
Similar facial electromyographic responses to faces, voices, and body expressions. Neuroreport, 18, 369‐ 372. Maldjian, J.A., Laurienti, P.J., Kraft, R.A., Burdette, J.H. (2003). An automated method for neuroanatomic and cytoarchitectonic atlas‐based interrogation of fMRI data sets. Neuroimage, 19, 1233‐1239. Marshall, P. J., Bouquet, C. A., Shipley, T. F., & Young, T. (2009). Effects of brief imitative experience on EEG desynchronization during action observation. Neuropsychologia, 47, 2100‐2106. McHugo, G. J., Lanzetta, J. T., & Bush, L. K. (1991). The effect of attitudes on emotional reactions to expressive displays of political leaders. Journal of Nonverbal Behavior, 15, 19‐41. McIntosh, D. N. (2006). Spontaneous facial mimicry, liking and emotional contagion. Polish Psychological Bulletin, 37, 31‐42. Mobbs, D., Greicius, M. D., Abdel‐Azim, E., Menon, V., & Reiss, A. L. (2003). Humor Modulates the Mesolimbic Reward Centers. Neuron, 40, 1041 ‐1048. Moody, E. J., McIntosh, D. N., Mann, L. J., & Weisser, K. R. (2007). More than mere mimicry? The influence of emotion on rapid facial reactions to faces. Emotion, 7, 447‐457. Morris, J. S., Friston, K. J., Buechel, C., Frith, C. D., Young, A. W., Calder, A. J., et al. (1998). A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47‐57. 120 Moser, E., Derntl, B., Robinson, S., Fink, B., Gur, R. C., & Grammer, K. (2007). Amygdala activation at 3T in response to human and avatar facial expressions of emotions. Journal of neuroscience methods, 161, 126‐133. Mühlberger, A., Wieser, M. J., Gerdes, A. B., Frey, M. C., Weyers, P., & Pauli, P. (2010). Stop looking angry and smile, please: start and stop of the very same facial expression differentially activate threat‐ and reward‐related brain networks. Social Cognitive and Affective Neuroscience. doi: 10.1093/scan/nsq039 Mühlberger, A., Wieser, M. J., Herrmann, M. J., Weyers, P., Tröger, C., & Pauli, P. (2009). Early cortical processing of natural and artificial emotional faces differs between lower and higher socially anxious persons. Journal of Neural Transmission, 116, 735‐746. Mühlberger, A., Wieser, M. J., Herrmann, M. J., Weyers, P., Tröger, C., & Pauli, P. (2009). Early cortical processing of natural and artificial emotional faces differs between lower and higher socially anxious persons. Journal of Neural Transmission 116, 735‐746. Muthukumaraswamy, S. D., & Singh, K. D. (2008). Modulation of the human mirror neuron system during cognitive activity. Psychophysiology, 45, 896‐905. Neumann, R., & Strack, F. (2000). Approach and avoidance: The influence of proprioceptive and exteroceptive cues on encoding of affective information. Journal of Personality and Social Psychology, 79, 39‐48. Neumann, R., Hülsenbeck, K., & Seibt, B. (2004). Attitudes towards people with AIDS and avoidance behavior: Automatic and reflective bases of behavior. Journal of Experimental Social Psychology, 40, 543‐550. Neumann, R., Hess, M., Schulz, S. M., & Alpers, G. W. (2005). Automatic behavioural responses to valence: Evidence that facial action is facilitated by evaluative processing. Cognition & Emotion, 19, 499‐513. Niedenthal, P. M. (2007) Embodying emotion. Science, 316, 1002‐1005. Niedenthal, P. M., Barsalou, L. W., Winkielman, P., Krauth‐Gruber, S., & Ric, F. (2005). Embodiment in Attitudes, Social Perception, and Emotion. Personality and Social Psychology Review, 9, 184‐ 211. 121 Niedenthal, P. M., Brauer, M., Halberstadt, J. B., & Innes‐Ker, A. H. (2001). 
When did her smile drop? Facial mimicry and the influences of emotional state on the detection of change in emotional expression. Cognition & Emotion, 15, 853‐864. Niedenthal, P. M., Halberstadt, J. B., Margolin, J., & Innes‐Ker, A. H. (2000). Emotional state and the detection of change in facial expression of emotion. European Journal of Social Psychology, 30, 211‐222. Niedenthal, P. M., Winkielman, P., Mondillon, L. & Vermeulen, N. (2009) Embodiment of emotion concepts. Journal of Personality and Social Psychology 96, 1120‐1136. Norman, D. A., & Shallice, T. (1986). Attention to action. Willed and automatic control of behavior. In R. J. Davidson, G. E. Schwartz & D. Shapiro (Eds.), Consciousness and self regulation: Advances in research (pp. 1‐18). New York: Plenum. Oberman, L. M., Winkielman, P., & Ramachandran, V. S. (2007). Face to face: Blocking facial mimicry can selectively impair recognition of emotional expressions. Social Neuroscience, 2, 167‐178. Paulus, C. (2009). Der Saarbrücker Persönlichkeitsfragebogen SPF (IRI) zur Messung von Empathie: Psychometrische Evaluation der deutschen Version des Interpersonal Reactivity Index. [The Saarbrueck Personality Questionnaire on Empathy: Psychometric evaluation of the German version of the Interpersonal Reactivity Index]. Web document [On‐line]. Available: http://psydok.sulb.uni‐saarland.de/volltexte/2009/2363/ Perry, A., & Bentin, S. (2009). Mirror activity in the human brain while observing hand movements: A comparison between EEG desynchronization in the µ‐range and previous fMRI results. Brain Research, 1282, 126‐132. Pessoa, L. (2010). Emotion and cognition and the amygdala: From 'what is it?' to 'what's to be done?'. Neuropsychologia, 48, 3416‐3429. Pettigrew, T. F. (1998). Intergroup contact theory. Annual Review of Psychology, 49, 65‐85. Phelan, J. E., & Basow, S. A. (2007). College students' attitudes toward mental illness: An examination of the stigma process. Journal of Applied Social Psychology, 37, 2877‐2902. 122 Pizzagalli, D. A., Lehmann, D., Hendrick, A. M., Regard, M., Pascual‐Marqui, R. D., & Davidson, R. J. (2002). Affective judgments of faces modulate early activity (approximately 160 ms) within the fusiform gyri. Neuroimage, 16, 663‐677. Premack, D., & Woodruff, G. (1978). Does the chimpanzee have a "theory of mind''? Behaviour and Brain Sciences, 4, 515‐526. Preston, S. D., & de Waal, F. B. M. (2002). Empathy: Its ultimate and proximate bases. Behavioral and Brain Sciences, 25, 1‐20. Rizzolatti, G., & Craighero, L. (2004). The mirror neuron system. Annual Review of Neuroscience, 27, 169‐192. Rossion, B., Gauthier, I., Tarr, M. J., Despland, P., Bruyer, R., Linotte, S., et al. (2000). The N170 occipito‐temporal component is delayed and enhanced to inverted faces but not to inverted objects: an electrophysiological account of face‐specific processes in the human brain. Neuroreport, 11, 69‐74. Ruch, W. (1999). The Eysenck Personality Questionnaire‐‐Revised and the construction of the German standard and short versions (EPQ‐R and EPQ‐RK). Zeitschrift fur Differentielle und Diagnostische Psychologie, 20, 1‐24. Ruys, K. I., & Stapel, D. A. (2008). The secret life of emotions. Psychological Science, 19, 385‐391. Rydell, R. J., & McConnell, A. R. (2006). Understanding implicit and explicit attitude change: A systems of reasoning analysis. Journal of Personality and Social Psychology, 91, 995‐1008. Rydell, R. J., McConnell, A. R., Mackie, D. M., & Strain, L. M. (2006). 
Of Two Minds: Forming and Changing Valence‐Inconsistent Implicit and Explicit Attitudes. Psychological Science, 17, 954‐ 958. Schilbach, L., Eickhoff, S. B., Mojzisch, A., & Vogeley, K. (2008). What's in a smile? Neural correlates of facial embodiment during social interaction. Social Neuroscience, 3, 37‐50. Schmidt, K. L., Cohn, J. F., & Tian, Y. (2003). Signal characteristics of spontaneous facial expressions: automatic movement in solitary and social smiles. Biological Psychology, 65, 49‐66. 123 Schnall, S., Haidt, J., Clore, G. L., & Jordan, A. H. (2008). Disgust as embodied moral judgment. Personality and Social Psychology Bulletin, 34, 1096‐1109. Schubert, T. W., & Häfner, M. (2003). Contrast from social stereotypes in automatic behavior. Journal of Experimental Social Psychology, 39, 577‐584. Schultz, W. (2002). Getting formal with dopamine and reward. Neuron,36, 241‐263. Seacat, J. D., Hirschman, R., & Mickelson, K. D. (2007). Attributions of HIV onset controllability, emotional reactions, and helping intentions: Implicit effects of victim sexual orientation. Journal of Applied Social Psychology, 37, 1442‐1461. Seibt, B., Neumann, R., Nussinson, R., & Strack, F. (in press). Movement direction or change in distance? Self‐ and object‐related approach–avoidance motions. Journal of Experimental Social Psychology. Shamay‐Tsoory, S.G., Tibi‐Elhanany, Y., Aharon‐ Peretz, J. (2007). The green‐eyed monster and malicious joy: the neuroanatomical bases of envy and gloating (schadenfreude). Brain, 130, 1663–1678. Shimada, S., & Abe, R. (2010). Outcome and view of the player modulate motor area activity during observation of a competitive game. Neuropsychologia, 48, 1930‐1934. Singer, T., Seymour, B., O'Doherty, J., Kaube, H., Dolan, R. J., & Frith, C. D. (2004). Empathy for pain involves the affective but not sensory components of pain. Science, 303, 1157‐1162. Singer, T., Seymour, B., O'Doherty, J. P., Stephan, K. E., Dolan, R. J., & Frith, C. D. (2006). Empathic neural responses are modulated by the perceived fairness of others. Nature, 439, 466‐469. Smith, E. R., & DeCoster, J. (2000). Dual‐process models in social and cognitive psychology: Conceptual integration and links to underlying memory systems. Personality and Social Psychology Review, 4, 108‐131. Smith, E. R., & Neumann, R. (2005). Emotion Processes Considered from the Perspective of Dual‐ Process Models. In L. F. Barrett, P. M. Niedenthal & P. Winkielman (Eds.), Emotion and consciousness. (pp. 287‐311). New York, NY: Guilford Press. 124 Solarz, A. K. (1960). Latency of instrumental responses as a function of compatibility with the meaning of eliciting verbal signs. Journal of Experimental Psychology, 59, 239‐245. Sonnby‐Borgström, M., Jönsson, P., & Svensson, O. (2003). Emotional empathy as related to mimicry reactions at different levels of information processing. Journal of Nonverbal Behavior, 27, 3‐ 23. Spears, R., Gordijn, E., Dijksterhuis, A., & Stapel, D. A. (2004). Reaction in action: Intergroup contrast in automatic behavior. Personality and Social Psychology Bulletin, 30, 605‐616. Spencer‐Smith, J., Wild, H., Innes‐Ker, A. H., Townsend, J., Duffy, C., Edwards, C., Ervin, K., Merritt, N., & Paik, J. W. (2001). Making faces: Creating three‐dimensional parameterized models of facial expression. Behavior Research Methods, Instruments & Computers, 33, 115‐123. Stahl, J., Wiese, H., & Schweinberger, S. R. (2008). Expertise and own‐race bias in face processing: an event‐related potential study. NeuroReport, 19, 583‐587. 
Stewart, B. D., van Hippel, W., & Radvansky, G. A. (2009). Age, Race, and Implicit Prejudice. Using Process Dissociation to Separate the Underlying Components. Psychological Science, 20, 164‐ 168. Strack, F., & Deutsch, R. (2004). Reflective and impulsive determinants of social behavior. Personality and Social Psychology Review, 8, 220‐247. Swaab, R. I., Maddux, W. W., & Sinaceur, M. (2011). Early words that work: When and how virtual linguistic mimicry facilitates negotiation outcomes. Journal of Experimental Social Psychology, 46, 616‐621. Tai, Y. F., Scherfler, C., Brooks, D. J., Sawamoto, N., & Castiello, U. (2004). The human premotor cortex is 'mirror' only for biological actions. Current Biology, 14, 117‐120. Tanaka, J. W., Kiefer, M., & Bukach, C. M. (2004). A holistic account of the own‐race effect in face recognition: evidence from a cross‐cultural study. Cognition, 93, B1‐9. Tassinary, L. G., & Cacioppo, J. T. (1992). Unobservable facial actions and emotion. Psychological Science, 3, 28‐33. 125 Uithol, S., van Rooij, I., Bekkering, H., & Haselager, P. (in press). Understanding motor resonance. Social Neuroscience. Umilta, M. A., Kohler, E., Gallese, V., Fogassi, L., Fadiga, L., Keysers, C., et al. (2001). I know what you are doing: A neurophysiological study. Neuron, 31, 155‐165. van Baaren, R. B., Fockenberg, D. A., Holland, R. W., Janssen, L., & van Knippenberg, A. (2006). The Moody chameleon: The effect of mood on nonconsious mimicry. Social Cognition, 24, 426‐ 437. van der Gaag, C., Minderaa, R. B., & Keysers, C. (2007). Facial expressions: what the mirror neuron system can and cannot tell us. Social Neuroscience, 2, 179‐222. van Kleef, G. A. (2009). How emotions regulate social life: The emotions as social information (EASI) model. Current Directions in Psychological Science, 18(3), 184‐188. Vanman, E. J., Paul, B. Y., Ito, T. A., & Miller, N. (1997). The modern face of prejudice and structural features that moderate the effect of cooperation on affect. Journal of Personality and Social Psychology, 73, 941‐959. van Schie, H. T., Mars, R. B., Coles, M. G. H., & Bekkering, H. (2004). Modulation of activity in medial frontal and motor cortices during error observation. Nature Neuroscience, 7, 549‐554. van Vugt, M., De Cremer, D., & Janssen, D. P. (2007). Gender differences in cooperation and competition: the male‐warrior hypothesis. Psychological Science, 18, 19‐23. Varas‐Diaz, N., & Marzan‐Rodriguez, M. (2007). The emotional aspect of AIDS stigma among health professionals in Puerto Rico. AIDS Care, 19, 1247‐1257. Vormbrock, F., & Neuser, J. (1983). Konstruktion zweier spezifischer Trait‐Fragebogen zur Erfassung von Angst in sozialen Situationen (SANB und SVSS). Diagnostica, 29, 165‐182. Vrana, S. R., & Gross, D. (2004). Reactions to facial expressions: effects of social context and speech anxiety on responses to neutral, anger, and joy expressions. Biological Psychology, 66, 63‐78. Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event‐ related fMRI study. Neuron, 30, 829‐841. 126 Walker, P. M., Silvert, L., Hewstone, M., & Nobre, A. C. (2008). Social contact and other‐race face processing in the human brain. Social Cognitive and Affective Neuroscience, 3, 16‐25. Wallbott, H. G. (1991). Recognition of emotion from facial expression via imitation? Some indirect evidence for an old theory. British Journal of Social Psychology, 30, 207‐219. Wang A. 
L., Mouraux A., Liang M., & Iannetti G.D. (2008). Enhancement of the N1 wave elicited by sensory stimuli presented at very short inter‐stimulus intervals is a general feature across sensory systems. PLoS ONE, 3, 3929. Watson, D., Clark, L. A., & Tellegen, A. (1988). Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology, 54, 1063‐1070. Weiner, B. (1995). Judgments of responsibility: A foundation for a theory of social conduct. New York, NY, US: Guilford Press. Weiner, B., Perry, R. P., & Magnusson, J. (1988). An attributional analysis of reactions to stigmas. Journal of Personality and Social Psychology, 55, 738‐748. Weyers, P., Mühlberger, A., Hefele, C., & Pauli, P. (2006). Electromyographic responses to static and dynamic avatar emotional facial expressions. Psychophysiology, 43, 450‐453. Weyers, P., Mühlberger, A., Kund, A., Hess, U., & Pauli, P. (2009). Modulation of facial reactions to avatar emotional faces by nonconscious competition priming. Psychophysiology, 46, 328‐335. Wicker, B., Keysers, C., Plailly, J., Royet, J.‐P., Gallese, V., & Rizzolatti, G. (2003). Both of us disgusted in my insula: The common neural basis of seeing and feeling disgust. Neuron, 40, 655‐664. Wijers, A. A., Mulder, G., Okita, T., Mulder, L. J., & Scheffers, M. K. (1989). Attention to color: an analysis of selection, controlled search, and motor activation, using event‐related potentials. Psychophysiology, 26, 89‐109. Wilson, T. D., Lindsey, S., & Schooler, T. Y. (2000). A model of dual attitudes. Psychological Review, 107, 101‐126. Yabar, Y., Johnston, L., Miles, L., & Peace, V. (2006). Implicit Behavioral Mimicry: Investigating the Impact of Group Membership. Journal of Nonverbal Behavior, 30, 97‐113. 127 5. FOOTNOTES 1 The assignment of shirt color to health status was not counterbalanced because there was no reason to assume that shirt color should influence facial mimicry. Both colors are used for patient clothes in hospitals in Germany. However, to be on the safe side the assignment was counterbalanced in study 2 and no effect of shirt color was found (see Footnote 2). 2 There was no effect of shirt color on any of the analyses. The pattern of results was essentially the same as in the main analyses without shirt color. All main effects of shirt color and interactions with that factor were non‐significant, all ps > .10. 128 6. APPENDIX 6.1. MATERIALS: EXPERIMENT 1 129 Institut für Psychologie der Universität Würzburg Lehrstuhl für Psychologie I Dipl.-Psych. Katja Likowski Informationsblatt zur Studie: Körperliche Veränderungen bei der Betrachtung von Gesichtsausdrücken Sie haben Gelegenheit, an einer Untersuchung im Rahmen eines Forschungsprojekts zu Körperlichen Veränderungen bei der Betrachtung von Gesichtsausdrücken teilzunehmen. Ziel der Untersuchung ist, die Wirkung von emotionalen Gesichtsausdrücken auf körperliche Funktionen und auf das Befinden in Abhängigkeit von Spielsituationen zu erfassen. Es werden verschiedene Bilder mit unterschiedlichen Gesichtsausdrücken am Computermonitor präsentiert, die Sie frei betrachten sollen. Dabei handelt es sich um Gesichtsausdrücke von computergenerierten Personen, wie Sie sie z.B. bei Computerspielen oder im Internet auch finden können. Während der Präsentation werden verschiedene körperliche Funktionen wie Herzschlag und Veränderungen der Leitfähigkeit der Haut gemessen. Schließlich sollen Sie noch die Gesichtsausdrücke bewerten und Ihr Befinden beschreiben. 
Zur Messung der Hautleitfähigkeit werden Oberflächenelektroden an Ihrem Brustkorb und Ihrem Gesicht befestigt. Sie können sicher sein, dass dies völlig ungefährlich ist. Vor und während der Untersuchung bitten wir Sie, ein paar Fragebögen zu beantworten. Die Untersuchung wird insgesamt etwa 1,5 Stunden dauern. Die Teilnahme an der Untersuchung ist völlig freiwillig. Sie können jederzeit - ohne Angabe von Gründen - die Teilnahme abbrechen. Dadurch werden Ihnen dann keinerlei persönliche Nachteile entstehen. Alle Daten, die erhoben werden, dienen ausschließlich Forschungszwecken, werden vertraulich behandelt und ohne Angabe des Namens unter einer Codenummer abgespeichert. Außerdem können Sie nachträglich, zu jedem Zeitpunkt, Ihre Einwilligung zur Datenanalyse widerrufen sowie die Löschung Ihrer Daten anordnen. Wenden Sie sich dazu bitte an Frau Dipl.-Psych. Katja Likowski, [email protected], Tel.: 0931/312069. Einverständniserklärung Ich erkläre, dass ich dieses Informationsblatt gelesen und verstanden habe und meine Fragen zufrieden stellend beantwortet wurden. Ich willige ein, an der Untersuchung teilzunehmen. Ebenso willige ich ein, dass die erhobenen Daten in anonymisierter Form wissenschaftlich ausgewertet werden. Ich bin darüber informiert worden, dass ich die Einwilligung zur Datenanalyse jederzeit widerrufen und die Löschung meiner Daten anordnen kann. Ich bin außerdem darüber informiert worden, dass ich die Untersuchung jederzeit abbrechen kann, ohne dass mir persönliche Nachteile entstehen. Würzburg, den _______________________________________________ Unterschrift _______________________________________________ Geburtsdatum _______________________________________________ Name (Druckschrift) _______________________________________________ Anschrift (Druckschrift) _______________________________________________ _______________________________________________ Unterschrift des Untersuchungsleiters ........................................................................ Institut für Psychologie der Universität Würzburg Lehrstuhl für Psychologie I Dipl.-Psych. Katja Likowski Sehr geehrter Teilnehmer, sehr geehrte Teilnehmerin, vielen Dank, dass Sie sich bereit erklärt haben, an dieser Untersuchung zu körperlichen Veränderungen bei der Betrachtung von Gesichtsausdrücken teilzunehmen. In diesem Fragebogen werden Sie gebeten, ihre aktuelle Stimmung einzuschätzen. Danach folgt die Aufgabe am Computer. Anschließend werden wir Sie noch einmal bitten, einige kurze Fragebögen zu Ihrer Person auszufüllen. Füllen Sie die Fragebögen bitte in der vorgegebenen Reihenfolge aus und beantworten Sie alle Fragen vollständig. Ihre Angaben dienen rein wissenschaftlichen Zwecken und werden selbstverständlich anonym behandelt. Zu den Aussagen in diesem Fragebogen gibt es keine richtigen oder falschen Antworten. Wir sind nur an Ihren ehrlichen Reaktionen und Meinungen interessiert. 
Bitte geben Sie an, wie Sie sich im Moment fühlen: gar nicht gut gelaunt bekümmert interessiert freudig erregt verärgert stark schuldig erschrocken feindselig angeregt stolz gereizt schlecht gelaunt beschämt wach nervös entschlossen aufmerksam durcheinander ängstlich aktiv niedergeschlagen verunsichert begeistert zufrieden erschöpft fröhlich ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ äußerst ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ Bitte geben Sie an, wie stark die folgenden Eigenschaften auf Sie persönlich zutreffen: trifft gar nicht zu ruhig freundlich egoistisch selbstbewusst nett sympathisch ernst aggressiv distanziert hinterlistig ordentlich boshaft ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ trifft voll zu ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ Bitte geben Sie an, wie negativ bzw. positiv Sie die folgenden Charaktereigenschaften, jeweils einzeln betrachtet, bei einer fiktiven Person finden würden: sehr negativ ruhig freundlich egoistisch selbstbewusst nett sympathisch ernst aggressiv distanziert hinterlistig ordentlich boshaft ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ sehr positiv neutral ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ Bitte geben Sie im Folgenden an, wie sehr Sie den einzelnen Aussagen zustimmen: trifft gar nicht zu trifft sehr gut zu 1. Ich empfinde oft warmherzige Gefühle für Leute, denen es weniger gut geht als mir. ○ ○ ○ ○ ○ 2. Die Gefühle einer Person in einem Roman kann ich mir oft sehr gut vorstellen. ○ ○ ○ ○ ○ 3. In Notfallsituationen fühle ich mich ängstlich und unbehaglich. ○ ○ ○ ○ ○ 4. Ich versuche, bei einem Streit zuerst beide Seiten zu verstehen, bevor ich eine Entscheidung treffe. ○ ○ ○ ○ ○ 5. Wenn ich sehe, wie jemand ausgenutzt wird, glaube ich, ihn schützen zu müssen. ○ ○ ○ ○ ○ 6. Manchmal fühle ich mich hilflos, wenn ich inmitten einer sehr emotionsgeladenen Situation bin. ○ ○ ○ ○ ○ 7. Ich versuche manchmal, meine Freunde besser zu verstehen, indem ich mir vorstelle, wie die Dinge aus ihrer Sicht aussehen könnten. ○ ○ ○ ○ ○ 8. Nachdem ich einen Film gesehen habe, fühle ich mich manchmal so, als ob ich eine der Personen aus dem Film sei. ○ ○ ○ ○ ○ 9. Oft berühren mich Dinge sehr, die ich nur beobachte. ○ ○ ○ ○ ○ 10. Ich glaube, jedes Problem hat zwei Seiten und versuche deshalb beide zu berücksichtigen. ○ ○ ○ ○ ○ 11. Ich würde mich selbst als eine ziemlich weichherzige Person bezeichnen. ○ ○ ○ ○ ○ 12. Wenn ich einen guten Film sehe, kann ich mich sehr leicht in die Hauptperson hineinversetzen. ○ ○ ○ ○ ○ 13. Ich neide dazu, in Notfällen die Kontrolle über mich zu verlieren. ○ ○ ○ ○ ○ 14. Wenn ich eine interessante Geschichte oder ein gutes Buch lese, versuche ich mir vorzustellen, wie ich mich fühlen würde, wenn mir die Ereignisse des Buches passieren würden. ○ ○ ○ ○ ○ 15. Wenn ich jemanden sehen müsste, der dringend Hilfe in einem Notfall bräuchte, würde ich bestimmt zusammenbrechen. ○ ○ ○ ○ ○ 16. Bevor ich jemanden kritisiere, versuche ich mir vorzustellen, wie ich mich an seiner Stelle fühlen würde. 
Bitte geben Sie im Folgenden an, wie sehr Sie den einzelnen Aussagen zustimmen (Antwortskala von „trifft gar nicht zu“ bis „trifft sehr gut zu“):
1. Ich mache mir Gedanken darüber, was andere Leute von mir denken, auch wenn ich weiß, dass es egal ist. ○ ○ ○ ○ ○
2. Ich verhalte mich unbekümmert, auch wenn ich merke, dass andere Leute einen schlechten Eindruck von mir bekommen. ○ ○ ○ ○ ○
3. Ich habe oft Angst, dass andere Leute meine Fehler bemerken. ○ ○ ○ ○ ○
4. Ich mache mir selten Gedanken darüber, welchen Eindruck ich auf jemand anderes mache. ○ ○ ○ ○ ○
5. Ich habe Angst, dass andere sich nicht positiv über mich äußern. ○ ○ ○ ○ ○
6. Ich habe Angst, dass andere Leute etwas an mir auszusetzen haben. ○ ○ ○ ○ ○
7. Die Meinung anderer Leute über mich lässt mich kalt. ○ ○ ○ ○ ○
8. Wenn ich mit jemandem spreche, mache ich mir Gedanken darüber, was der andere über mich denken könnte. ○ ○ ○ ○ ○
9. Normalerweise mache ich mir Gedanken darüber, wie ich auf andere wirke. ○ ○ ○ ○ ○
10. Wenn ich weiß, dass mich jemand beurteilt, macht es mir kaum etwas aus. ○ ○ ○ ○ ○
11. Manchmal glaube ich, ich beschäftige mich viel zu sehr damit, was andere Leute von mir denken. ○ ○ ○ ○ ○
12. Ich habe oft Angst, dass ich etwas Falsches sagen oder tun würde. ○ ○ ○ ○ ○

Bitte beantworten Sie die folgenden Fragen durch Ankreuzen der für Sie zutreffenden Antwort (ja / nein):
1. Wechselt Ihre Stimmung oft?
2. Sind Sie sehr gesprächig?
3. Macht es Ihnen etwas aus, wenn Sie jemandem Geld schulden?
4. Waren Sie jemals so gierig, dass Sie sich mehr genommen haben, als Ihnen eigentlich zustand?
5. Sind Sie ziemlich lebhaft?
6. Würde es Sie sehr aus der Fassung bringen, wenn Sie ein Kind oder ein Tier leiden sehen?
7. Haben Sie eine Abneigung gegen Leute, die sich nicht zu benehmen wissen?
8. Lassen Sie sich auf einer lebhaften Party gerne gehen und amüsieren Sie sich?
9. Sind Sie leicht reizbar?
10. Sollte man immer das Gesetz befolgen?
11. Machen Sie gerne neue Bekanntschaften?
12. Sind gute Manieren sehr wichtig?
13. Sind Ihre Gefühle leicht verletzt?
14. Sind alle Ihre Gewohnheiten gut und wünschenswert?
15. Halten Sie sich bei geselligen Zusammenkünften lieber im Hintergrund?
16. Würden Sie Drogen nehmen, die seltsame oder gefährliche Auswirkungen haben könnten?
17. Haben Sie es häufig „richtig satt“?
18. Haben Sie jemals etwas genommen (und wenn es nur eine Stecknadel oder ein Knopf war), obwohl es einem anderen gehörte?
19. Sind Sie oft von Schuldgefühlen geplagt?
20. Halten Sie sich für einen nervösen Menschen?
21. Sind Sie ein Typ, der sich oft sorgt?
22. Sind gutes Benehmen und Sauberkeit wichtig für Sie?
23. Haben Sie jemals etwas zerbrochen oder verloren, das einem anderen gehörte?
24. Ergreifen Sie gewöhnlich die Initiative, wenn Sie neue Bekanntschaften machen?
25. Sind Sie meist schweigsam, wenn Sie mit anderen Leuten zusammen sind?
26. Sind Sie der Meinung, dass die Ehe eine altmodische Sache ist und abgeschafft werden sollte?
27. Gelingt es Ihnen leicht, Leben in eine langweilige Party zu bringen?
28. Haben Sie jemals schlecht oder gemein über jemanden gesprochen?
29. Waren Sie als Kind jemals frech zu Ihren Eltern?
30. Sind Sie gern unter Leuten?
31. Stört es Sie, wenn Sie bemerken, dass Sie Fehler in Ihrer eigenen Arbeit gemacht haben?
32. Fühlen Sie sich ab und zu ohne Grund matt und erschöpft?
33. Haben Sie beim Spiel schon mal gemogelt?
34. Finden Sie oft, dass das Leben langweilig ist?
35. Haben Sie schon mal jemanden ausgenützt?
36. Können Sie eine Party in Schwung bringen?
37. Vermeiden Sie es, grob zu anderen Leuten zu sein?
38. Denken Sie oft lange über eine peinliche Erfahrung nach?
39. Ist „erst denken, dann handeln“ Ihr Grundsatz?
40. Haben Sie jemals darauf bestanden, Ihren eigenen Willen durchzusetzen?
41. Haben Sie es „mit den Nerven“?
42. Fühlen Sie sich oft einsam?
43. Handeln Sie auch immer so, wie Sie reden?
44. Ist es besser, sich an die Regeln der Gesellschaft zu halten, als seinen eigenen Weg zu gehen?
45. Sind Sie je zu spät zu einer Verabredung oder zur Arbeit gekommen?
46. Haben Sie gern Geschäftigkeit und Trubel um sich herum?
47. Hätten Sie es gern, dass andere Leute Sie fürchten?
48. Verschieben Sie manchmal etwas auf morgen, was Sie heute tun müssten?
49. Halten andere Sie für sehr lebhaft?
50. Glauben Sie, dass man gegenüber der eigenen Familie eine besondere Verpflichtung hat?

Angaben zur Person
Alter: _______
Geschlecht: ○ weiblich ○ männlich
Schulabschluss: ○ Volks-/Hauptschulabschluss ○ mittlere Reife ○ Fachabitur ○ Abitur ○ Fachhochschulabschluss ○ Hochschulabschluss
Berufsstand: ○ voll berufstätig ○ teilzeitbeschäftigt ○ in Ausbildung ○ arbeitslos ○ Studium
Studienfach: _______________________ Semester: ________
Muttersprache: ○ deutsch ○ andere
Haarfarbe: _______________________

Bitte beantworten Sie uns abschließend noch kurz folgende Fragen:
Wie gut ist es Ihnen Ihrer Meinung nach gelungen, sich die Avatare mit Ihren jeweiligen Eigenschaften einzuprägen?
gar nicht ○ ○ ○ ○ ○ ○ ○ sehr gut
Wie wichtig war es Ihnen, sich die Avatare mit Ihren jeweiligen Eigenschaften einzuprägen?
gar nicht wichtig ○ ○ ○ ○ ○ ○ ○ sehr wichtig
Worin könnte Ihrer Meinung nach die Untersuchungsabsicht bestehen?
............................................................................................................................................
............................................................................................................................................
............................................................................................................................................
............................................................................................................................................
Was ist Ihnen während der Untersuchung ansonsten aufgefallen?
............................................................................................................................................
............................................................................................................................................
............................................................................................................................................
............................................................................................................................................
Haben Sie schon einmal an einer ähnlichen Studie teilgenommen? Wenn ja, worum ging es dabei?
............................................................................................................................................
............................................................................................................................................
............................................................................................................................................
Vielen Dank für Ihre Mitarbeit!

6.2. MATERIALS: EXPERIMENT 2 142

Institut für Psychologie der Universität Würzburg
Lehrstuhl für Psychologie I
Dr. Peter Weyers

Informationsblatt zur Studie: Körperliche Veränderungen bei der Betrachtung von Gesichtsausdrücken

Sie haben Gelegenheit, an einer Untersuchung teilzunehmen, deren Ergebnisse für die Entwicklung eines Computertrainings verwendet werden sollen. Dieses Computertraining soll später Krankenschwesterschülerinnen im Umgang mit AIDS-Patienten schulen. Es werden verschiedene Bilder gesunder sowie an Aids erkrankter Personen mit unterschiedlichen Gesichtsausdrücken am Computermonitor präsentiert, die Sie frei betrachten sollen. Dabei handelt es sich um Gesichtsausdrücke computergenerierter Personen, wie Sie sie z.B. auch in Computerspielen oder im Internet finden können. Während der Präsentation werden Veränderungen der Leitfähigkeit der Haut gemessen. Schließlich sollen Sie noch die Gesichtsausdrücke bewerten und Ihr Befinden beschreiben. Zur Messung der Hautleitfähigkeit werden Oberflächenelektroden an Ihrem Gesicht befestigt. Sie können sicher sein, dass dies völlig ungefährlich ist. Vor und während der Untersuchung bitten wir Sie, ein paar Fragebögen zu beantworten. Die Untersuchung wird insgesamt etwa 1,5 Stunden dauern.

Die Teilnahme an der Untersuchung ist völlig freiwillig. Sie können jederzeit - ohne Angabe von Gründen - die Teilnahme abbrechen. Dadurch werden Ihnen dann keinerlei persönliche Nachteile entstehen. Alle Daten, die erhoben werden, dienen ausschließlich Forschungszwecken, werden vertraulich behandelt und ohne Angabe des Namens unter einer Codenummer abgespeichert. Außerdem können Sie nachträglich, zu jedem Zeitpunkt, Ihre Einwilligung zur Datenanalyse widerrufen sowie die Löschung Ihrer Daten anordnen. Wenden Sie sich dazu bitte an Herrn Dr. Peter Weyers, [email protected], Tel.: 0931/312849.

Einverständniserklärung
Ich erkläre, dass ich dieses Informationsblatt gelesen und verstanden habe und meine Fragen zufrieden stellend beantwortet wurden. Ich willige ein, an der Untersuchung teilzunehmen. Ebenso willige ich ein, dass die erhobenen Daten in anonymisierter Form wissenschaftlich ausgewertet werden. Ich bin darüber informiert worden, dass ich die Einwilligung zur Datenanalyse jederzeit widerrufen und die Löschung meiner Daten anordnen kann. Ich bin außerdem darüber informiert worden, dass ich die Untersuchung jederzeit abbrechen kann, ohne dass mir persönliche Nachteile entstehen.

Würzburg, den _______________________________________________ Unterschrift _______________________________________________ Geburtsdatum _______________________________________________ Name (Druckschrift) _______________________________________________ Anschrift (Druckschrift) _______________________________________________ _______________________________________________ Unterschrift des Untersuchungsleiters

........................................................................

PANAS
Dieser Fragebogen enthält eine Reihe von Wörtern, die unterschiedliche Gefühle und Empfindungen beschreiben. Lesen Sie jedes Wort und tragen Sie dann in die Skala neben jedem Wort die Intensität ein. Sie haben die Möglichkeit, zwischen fünf Abstufungen zu wählen: 1. ganz wenig oder gar nicht 2. ein bißchen 3. einigermaßen 4. erheblich 5. äußerst
Geben Sie bitte an, wie Sie sich im Moment fühlen.
Antwortskala für jedes Wort: ganz wenig oder gar nicht / ein bißchen / einigermaßen / erheblich / äußerst
aktiv, bekümmert, interessiert, freudig erregt, verärgert, stark, schuldig, erschrocken, feindselig, angeregt, stolz, gereizt, begeistert, beschämt, wach, nervös, entschlossen, aufmerksam, durcheinander, ängstlich

B F N E
Name/Code: _____________________ Datum: ___________
Bitte lesen Sie jede der folgenden Feststellungen aufmerksam durch und geben Sie durch Ankreuzen auf der angegebenen Skala an, wie charakteristisch diese Ihrer Meinung nach für Sie ist. 1 = überhaupt nicht charakteristisch für mich, 2 = ein bisschen charakteristisch für mich, 3 = einigermaßen charakteristisch für mich, 4 = sehr charakteristisch für mich, 5 = äußerst charakteristisch für mich
1. Ich mache mir Gedanken darüber, was andere Leute von mir denken, auch wenn ich weiß, dass es egal ist. 1 2 3 4 5
2. Ich verhalte mich unbekümmert, auch wenn ich merke, dass andere Leute einen schlechten Eindruck von mir bekommen. 1 2 3 4 5
3. Ich habe oft Angst, dass andere Leute meine Fehler bemerken. 1 2 3 4 5
4. Ich mache mir selten Gedanken darüber, welchen Eindruck ich auf jemand anderes mache. 1 2 3 4 5
5. Ich habe Angst, dass andere sich nicht positiv über mich äußern. 1 2 3 4 5
6. Ich habe Angst, dass andere Leute etwas an mir auszusetzen haben. 1 2 3 4 5
7. Die Meinung anderer Leute über mich lässt mich kalt. 1 2 3 4 5
8. Wenn ich mit jemandem spreche, mache ich mir Gedanken darüber, was der andere über mich denken könnte. 1 2 3 4 5
9. Normalerweise mache ich mir Gedanken darüber, wie ich auf andere wirke. 1 2 3 4 5
10. Wenn ich weiß, dass mich jemand beurteilt, macht es mir kaum etwas aus. 1 2 3 4 5
11. Manchmal glaube ich, ich beschäftige mich viel zu sehr damit, was andere Leute von mir denken. 1 2 3 4 5
12. Ich habe oft Angst, dass ich etwas Falsches sagen oder tun würde. 1 2 3 4 5

Angaben zur Person
Alter: _______
Geschlecht: ○ weiblich ○ männlich
Schulabschluss: ○ Volks-/Hauptschulabschluss ○ mittlere Reife ○ Fachabitur ○ Abitur ○ Fachhochschulabschluss ○ Hochschulabschluss
Berufsstand: ○ voll berufstätig ○ teilzeitbeschäftigt ○ in Ausbildung ○ arbeitslos ○ Studium
Studienfach: _______________________ Semester: ________
Muttersprache: ○ deutsch ○ andere
Haarfarbe: _______________________

Bitte beantworten Sie uns abschließend noch kurz folgende Fragen:
Welche Farbe hatte das Shirt, durch das die an AIDS erkrankten Avatare gekennzeichnet waren?
............................................................................................................................................
Welche Farbe hatte das Shirt, durch das die gesunden Avatare gekennzeichnet waren?
............................................................................................................................................ Wie gut ist es Ihnen Ihrer Meinung nach gelungen, sich die Kennzeichnung der Avatare (ShirtFarbe) einzuprägen? gar nicht ○ ○ ○ ○ ○ ○ ○ sehr gut Wie wichtig war es Ihnen, sich die Kennzeichnung der Avatare (Shirt-Farbe) einzuprägen? gar nicht wichtig ○ ○ ○ ○ ○ ○ ○ sehr wichtig Worin könnte Ihrer Meinung nach die Untersuchungsabsicht bestehen? ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ Was ist Ihnen während der Untersuchung ansonsten aufgefallen? ............................................................................................................................................ ............................................................................................................................................ Haben Sie schon einmal an einer ähnlichen Studie teilgenommen? Wenn ja, worum ging es dabei? ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ Vielen Dank für Ihre Mitarbeit! 6.3. MATERIALS: EXPERIMENT 3 149 Institut für Psychologie der Universität Würzburg Lehrstuhl für Psychologie I Dipl.-Psych. Katja Likowski Informationsblatt zur Studie: Körperliche Veränderungen bei der Betrachtung von Spielcharakteren Sie haben Gelegenheit, an einer Untersuchung im Rahmen eines Forschungsprojekts zu Körperlichen Veränderungen bei der Betrachtung von Spielcharakteren teilzunehmen. Ziel der Untersuchung ist, die Wirkung von emotionalen Gesichtsausdrücken auf körperliche Funktionen und auf das Befinden in Abhängigkeit von Spielsituationen zu erfassen. Es werden verschiedene Bilder mit unterschiedlichen Gesichtsausdrücken am Computermonitor präsentiert, die Sie frei betrachten sollen. Dabei handelt es sich um Gesichtsausdrücke von computergenerierten Personen, wie Sie sie z.B. bei Computerspielen oder im Internet auch finden können. Während der Präsentation werden verschiedene körperliche Funktionen wie Veränderungen der Leitfähigkeit der Haut gemessen. Zur Messung der Hautleitfähigkeit werden Oberflächenelektroden in Ihrem Gesicht befestigt. Sie können sicher sein, dass dies völlig ungefährlich ist. Vor und während der Untersuchung bitten wir Sie, ein paar Fragebögen zu beantworten. Die Untersuchung wird insgesamt etwa 1,5 Stunden dauern. Die Teilnahme an der Untersuchung ist völlig freiwillig. Sie können jederzeit - ohne Angabe von Gründen - die Teilnahme abbrechen. Dadurch werden Ihnen dann keinerlei persönliche Nachteile entstehen. Alle Daten, die erhoben werden, dienen ausschließlich Forschungszwecken, werden vertraulich behandelt und ohne Angabe des Namens unter einer Codenummer abgespeichert. 
Außerdem können Sie nachträglich, zu jedem Zeitpunkt, Ihre Einwilligung zur Datenanalyse widerrufen sowie die Löschung Ihrer Daten anordnen. Wenden Sie sich dazu bitte an Frau Dipl.-Psych. Katja Likowski, [email protected], Tel.: 0931/312069.

Einverständniserklärung
Ich erkläre, dass ich dieses Informationsblatt gelesen und verstanden habe und meine Fragen zufrieden stellend beantwortet wurden. Ich willige ein, an der Untersuchung teilzunehmen. Ebenso willige ich ein, dass die erhobenen Daten in anonymisierter Form wissenschaftlich ausgewertet werden. Ich bin darüber informiert worden, dass ich die Einwilligung zur Datenanalyse jederzeit widerrufen und die Löschung meiner Daten anordnen kann. Ich bin außerdem darüber informiert worden, dass ich die Untersuchung jederzeit abbrechen kann, ohne dass mir persönliche Nachteile entstehen.

Würzburg, den _______________________________________________ Unterschrift _______________________________________________ Geburtsdatum _______________________________________________ Name (Druckschrift) _______________________________________________ Anschrift (Druckschrift) _______________________________________________ _______________________________________________ Unterschrift des Untersuchungsleiters

........................................................................

Bitte lesen Sie sich den folgenden Auszug aus einem Interview aufmerksam durch:
Im folgenden Interviewmitschnitt beschreibt Julia, eine junge Frau aus Berlin, wie sich ihr Leben unerwartet vor drei Monaten änderte, als sie erfuhr, dass sie HIV-positiv sei:
„Nun, wie Sie sich vorstellen können, ist es ziemlich furchteinflößend. Jedes Mal, wenn ich huste oder mich ein bisschen schlapp fühle, frage ich mich, ob es nun so weit ist. Ist dies nun der Anfang vom Ende? Manchmal fühle ich mich ziemlich gut, aber ich habe die Krankheit immer im Hinterkopf. Jeden Tag könnte sich mein Zustand verschlechtern. Und ich weiß, dass es - zumindest zum jetzigen Zeitpunkt - kein Entkommen gibt. Ich weiß, dass nach Wegen geforscht wird, die Krankheit zu heilen, und ich weiß auch, dass wir alle sterben werden. Aber es erscheint mir alles so ungerecht. So furchtbar. Wie ein Albtraum. (Pause) Ich meine, ich fühle mich, als ob ich gerade angefangen habe zu leben, und stattdessen werde ich jetzt sterben. Es kann einen wirklich runterziehen.“
Das Interview fährt damit fort, dass Julia ihre Vorsichtsmaßnahmen beschreibt, die sie durchführt, um Infektionen zu vermeiden, dass sie sich Gedanken macht, was passiert, wenn ihre Kollegen und ihr Chef von ihrer Krankheit erfahren, und dass sie sich Sorgen um ihre zukünftige finanzielle Situation macht.
Das Interview endet damit, dass der Interviewer Julia fragt, wie sie sich mit AIDS infiziert habe.
„Nun, im Sommer nach dem Abitur ging ich mit ein paar Freunden aus. Wir waren auf dem Weg ins Kino, als ein betrunkener Autofahrer mit unserem Wagen zusammenstieß. Es war wirklich ein schwerer Unfall. Ich verlor viel Blut und erhielt im Krankenhaus einige Bluttransfusionen. Weder ich noch die Ärzte wissen genau wie, aber die Ärzte sagen, dass ich irgendwann in dieser Nacht mit dem HI-Virus in Berührung gekommen bin.
Es gab in dem Zeitraum noch zwei weitere Fälle im gleichen Krankenhaus.“

Bitte geben Sie nun an, wie Sie sich gefühlt haben, während Sie die Geschichte gelesen haben (Antwortskala von „gar nicht“ bis „äußerst“):
mitfühlend, bedrückt, amüsiert, warmherzig, besorgt, fröhlich, aufgebracht, bekümmert, hoffnungsvoll, bewegt, niedergeschlagen, erschüttert, bestürzt, erleichtert, aufgewühlt, beunruhigt, anteilnehmend, verstört, mitleidsvoll, freudig, weichherzig, optimistisch, betrübt, traurig

Bitte beantworten Sie uns abschließend noch kurz folgende Fragen:
Wie stark haben Sie sich während des Lesens vorgestellt, wie sich Julia gefühlt hat?
gar nicht ○ ○ ○ ○ ○ ○ ○ sehr stark
Wie sachlich und objektiv sind Sie während des Lesens geblieben?
gar nicht ○ ○ ○ ○ ○ ○ ○ sehr stark

Bitte erinnern Sie sich nun noch einmal an die Spielsituation. Wie wichtig finden Sie in solch einer Situation die folgenden Aspekte (Antwortskala von „gar nicht wichtig“ bis „sehr wichtig“)?
1. … dem Spielpartner sympathisch erscheinen
2. … eine harmonische, reibungslose Interaktion
3. … verstehen, was der Spielpartner gerade denkt und fühlt
4. … dem Spielpartner die eigenen Gefühle und Gedanken mitteilen
5. … nonverbale Kommunikation mit dem Spielpartner
6. … die eigenen Gefühle und Gedanken verbergen
7. … eine nüchterne, kühle Atmosphäre
8. … dem Spielpartner überlegen erscheinen

Bitte geben Sie im Folgenden an, wie sehr Sie den einzelnen Aussagen zustimmen (Antwortskala von „trifft gar nicht zu“ bis „trifft sehr gut zu“):
1. Ich empfinde oft warmherzige Gefühle für Leute, denen es weniger gut geht als mir. ○ ○ ○ ○ ○
2. Die Gefühle einer Person in einem Roman kann ich mir oft sehr gut vorstellen. ○ ○ ○ ○ ○
3. In Notfallsituationen fühle ich mich ängstlich und unbehaglich. ○ ○ ○ ○ ○
4. Ich versuche, bei einem Streit zuerst beide Seiten zu verstehen, bevor ich eine Entscheidung treffe. ○ ○ ○ ○ ○
5. Wenn ich sehe, wie jemand ausgenutzt wird, glaube ich, ihn schützen zu müssen. ○ ○ ○ ○ ○
6. Manchmal fühle ich mich hilflos, wenn ich inmitten einer sehr emotionsgeladenen Situation bin. ○ ○ ○ ○ ○
7. Ich versuche manchmal, meine Freunde besser zu verstehen, indem ich mir vorstelle, wie die Dinge aus ihrer Sicht aussehen könnten. ○ ○ ○ ○ ○
8. Nachdem ich einen Film gesehen habe, fühle ich mich manchmal so, als ob ich eine der Personen aus dem Film sei. ○ ○ ○ ○ ○
9. Oft berühren mich Dinge sehr, die ich nur beobachte. ○ ○ ○ ○ ○
10. Ich glaube, jedes Problem hat zwei Seiten und versuche deshalb beide zu berücksichtigen. ○ ○ ○ ○ ○
11. Ich würde mich selbst als eine ziemlich weichherzige Person bezeichnen. ○ ○ ○ ○ ○
12. Wenn ich einen guten Film sehe, kann ich mich sehr leicht in die Hauptperson hineinversetzen. ○ ○ ○ ○ ○
13. Ich neige dazu, in Notfällen die Kontrolle über mich zu verlieren. ○ ○ ○ ○ ○
14. Wenn ich eine interessante Geschichte oder ein gutes Buch lese, versuche ich mir vorzustellen, wie ich mich fühlen würde, wenn mir die Ereignisse des Buches passieren würden. ○ ○ ○ ○ ○
15. Wenn ich jemanden sehen müsste, der dringend Hilfe in einem Notfall bräuchte, würde ich bestimmt zusammenbrechen. ○ ○ ○ ○ ○
16. Bevor ich jemanden kritisiere, versuche ich mir vorzustellen, wie ich mich an seiner Stelle fühlen würde.
○ ○ ○ ○ ○ Bitte geben Sie im Folgenden an, wie sehr Sie den einzelnen Aussagen zustimmen: trifft gar nicht zu trifft sehr gut zu 1. Ich mache mir Gedanken darüber, was andere Leute von mir denken, auch wenn ich weiß, dass es egal ist. ○ ○ ○ ○ ○ 2. Ich verhalte mich unbekümmert, auch wenn ich merke, dass andere Leute einen schlechten Eindruck von mir bekommen. ○ ○ ○ ○ ○ 3. Ich habe oft Angst, dass andere Leute meine Fehler bemerken. ○ ○ ○ ○ ○ 4. Ich mache mir selten Gedanken darüber, welchen Eindruck ich auf jemand anderes mache. ○ ○ ○ ○ ○ 5. Ich habe Angst, dass andere sich nicht positiv über mich äußern. ○ ○ ○ ○ ○ 6. Ich habe Angst, dass andere Leute etwas an mir auszusetzen haben. ○ ○ ○ ○ ○ 7. Die Meinung anderer Leute über mich lässt mich kalt. ○ ○ ○ ○ ○ 8. Wenn ich mit jemandem spreche, mache ich mir Gedanken darüber, was der andere über mich denken könnte. ○ ○ ○ ○ ○ 9. Normalerweise mache ich mir Gedanken darüber, wie ich auf andere wirke. ○ ○ ○ ○ ○ 10. Wenn ich weiß, dass mich jemand beurteilt, macht es mir kaum etwas aus. ○ ○ ○ ○ ○ 11. Manchmal glaube ich, ich beschäftige mich viel zu sehr damit, was andere Leute von mir denken. ○ ○ ○ ○ ○ 12. Ich habe oft Angst, dass ich etwas Falsches sagen oder tun würde. ○ ○ ○ ○ ○ Angaben zur Person Alter: _______ Geschlecht: ○ weiblich Schulabschluss: ○ Volks-/Hauptschulabschluss ○ mittlere Reife ○ Fachabitur ○ Abitur ○ Fachhochschulabschluss ○ Hochschulabschluss ○ voll berufstätig ○ teilzeitbeschäftigt ○ in Ausbildung ○ arbeitslos ○ Studium Berufsstand: ○ männlich Studienfach: _______________________ Muttersprache: ○ deutsch ○ Semester: ________ andere Bitte beantworten Sie uns abschließend noch kurz folgende Fragen: Worin könnte Ihrer Meinung nach die Untersuchungsabsicht bestehen? ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ Was ist Ihnen während der Untersuchung ansonsten aufgefallen? ............................................................................................................................................ ............................................................................................................................................ ........................................................................................................................................ ............................................................................................................................................ Haben Sie schon einmal an einer ähnlichen Studie teilgenommen? Wenn ja, worum ging es dabei? ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ Vielen Dank für Ihre Mitarbeit! 6.4. 
MATERIALS: EXPERIMENT 4 160 Lehrstuhl für Psychologie I - Prof. Dr. Paul Pauli Biologische Psychologie, Klinische Psychologie und Psychotherapie „Körperliche Veränderungen bei der Betrachtung von Spielcharakteren“ Sie haben Gelegenheit, an einer Untersuchung im Rahmen eines Forschungsprojekts zu Körperlichen Veränderungen bei der Betrachtung von Spielcharakteren teilzunehmen. Ziel der Untersuchung ist, die Wirkung von emotionalen Gesichtsausdrücken auf körperliche Funktionen und auf das Befinden in Abhängigkeit von Spielsituationen zu erfassen. Es werden verschiedene Bilder mit unterschiedlichen Gesichtsausdrücken am Computermonitor präsentiert, die Sie frei betrachten sollen. Dabei handelt es sich um Gesichtsausdrücke von computergenerierten Personen, wie Sie sie z.B. bei Computerspielen oder im Internet auch finden können. Während der Präsentation werden verschiedene körperliche Funktionen wie Veränderungen der Leitfähigkeit der Haut sowie ihre Hirnströme gemessen. Schließlich sollen Sie noch die Gesichtsausdrücke bewerten. Zur Messung der Hautleitfähigkeit werden Oberflächenelektroden in Ihrem Gesicht befestigt. Die Messung Ihrer Hirnströme (EEG) erfolgt über Elektroden auf ihrer Kopfhaut. Sie können sicher sein, dass dies völlig ungefährlich und schmerzfrei ist. Vor und während der Untersuchung bitten wir Sie, ein paar Fragebögen zu beantworten. Die Untersuchung wird insgesamt etwa 1,5 Stunden dauern. Die Zahlung der Aufwandentschädigung erfolgt im Anschluss an die Untersuchung. Sie haben dann außerdem die Gelegenheit, ihre Kontaktdaten zu hinterlassen, um weitere Informationen zu den Zielen und Ergebnissen dieser Untersuchung zu bekommen. Die Teilnahme an der Untersuchung ist völlig freiwillig. Sie können jederzeit - ohne Angabe von Gründen und ohne, dass Ihnen Nachteile entstehen - die Teilnahme abbrechen. Dadurch werden Ihnen keinerlei persönliche Nachteile entstehen und Sie erhalten die Aufwandsentschädigung anteilig ausbezahlt. Alle Daten, die erhoben werden, dienen ausschließlich Forschungszwecken, werden unter Einhaltung der geltenden Datenschutzbedingungen vertraulich behandelt und ohne Angabe des Namens unter einer Codenummer abgespeichert. Außerdem können Sie nachträglich und zu jedem Zeitpunkt - ohne Angabe von Gründen und ohne, dass Ihnen Nachteile entstehen - Ihre Einwilligung zur Datenanalyse widerrufen sowie die Löschung Ihrer Daten anordnen. Wenden Sie sich dazu bitte an Frau Dipl.-Psych. Katja Likowski, [email protected], Tel.: 0931/312069. Seite 1 von 2 Einverständniserklärung Ich erkläre, dass ich dieses Informationsblatt gelesen und verstanden habe und meine Fragen zufrieden stellend beantwortet wurden. Ich willige ein, freiwillig an der Untersuchung teilzunehmen. Ebenso willige ich ein, dass die erhobenen Daten in anonymisierter Form wissenschaftlich ausgewertet werden. Ich bin darüber informiert worden, dass ich jederzeit - ohne Angabe von Gründen und ohne, dass mir Nachteile entstehen - die Einwilligung zur Datenanalyse widerrufen und die Löschung meiner Daten anordnen kann. Ich bin außerdem darüber informiert worden, dass ich die Untersuchung jederzeit abbrechen kann, ohne dass mir persönliche Nachteile entstehen. 
Würzburg, den _______________________________________________ Unterschrift _______________________________________________ Geburtsdatum _______________________________________________ Name (Druckschrift) _______________________________________________ Anschrift (Druckschrift) _______________________________________________ _______________________________________________ Unterschrift des Untersuchungsleiters ........................................................................ Seite 2 von 2 Angaben zur Person Alter: _______ Geschlecht: ○ weiblich Schulabschluss: ○ Volks-/Hauptschulabschluss ○ mittlere Reife ○ Fachabitur ○ Abitur ○ Fachhochschulabschluss ○ Hochschulabschluss ○ voll berufstätig ○ teilzeitbeschäftigt ○ in Ausbildung ○ arbeitslos ○ Studium Berufsstand: ○ männlich Studienfach: _______________________ Semester: ________ Muttersprache: ○ deutsch ○ andere Sind Sie ○ Linkshänder ○ Rechthänder? Bitte beantworten Sie uns abschließend noch kurz folgende Fragen: Worin könnte Ihrer Meinung nach die Untersuchungsabsicht bestehen? ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ Was ist Ihnen während der Untersuchung ansonsten aufgefallen? ............................................................................................................................................ ............................................................................................................................................ ........................................................................................................................................ ............................................................................................................................................ Haben Sie schon einmal an einer ähnlichen Studie teilgenommen? Wenn ja, worum ging es dabei? ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ Vielen Dank für Ihre Mitarbeit! 6.5. MATERIALS: EXPERIMENT 5 165 Lehrstuhl für Psychologie I - Prof. Dr. Paul Pauli Biologische Psychologie, Klinische Psychologie und Psychotherapie „Körperliche Veränderungen bei der Betrachtung von Spielcharakteren“ Sehr geehrte(r) Proband(in), die Kernspintomographie benutzt anstelle von Röntgenstrahlen oder radioaktiven Kontrastmitteln Radiowellen zur Abbildung des Gehirns und seiner Funktionen. Dazu ist es notwendig, dass Sie sich innerhalb des Magnetfeldes des Kernspintomographen befinden. Ihr Kopf liegt dabei in einer speziellen Kopfspule, die sie nicht belästigt oder drückt. Die von der Kopfspule empfangenen Signale werden im Computer weiterverarbeitet und können zur Erstellung von Bildern verwendet werden. Diese Technik wird weltweit eingesetzt. 
Es sind bislang keine schädigenden Wirkungen aufgetreten. Es werden keine Kontrastmittel gespritzt. Untersuchung Die Untersuchung wird mit einem modernen Kernspintomographen mit 1,5-Tesla (das ist die mittlere magnetische Flussdichte des Magnetfeldes) durchgeführt. Sie liegen dabei auf einer Liege, die in das Magnetfeld hineingefahren wird. Während der Untersuchung hören Sie Klopfgeräusche, die auf elektromagnetischen Schaltvorgängen im Kernspintomographen beruhen. Die Untersuchung im Tomographen dauert ca. 45 Minuten. Während der Untersuchung werden Sie mit Hilfe einer Kamera und einer Gegensprechanlage überwacht. Zusätzlich erhalten Sie einen Alarmknopf in die Hand, so dass die Untersuchung im Bedarfsfall jederzeit abgebrochen werden kann. Geplante Untersuchungen: Anatomie/Volumetrie: Die bei Ihnen geplante Untersuchung ermöglicht die bildliche Darstellung und Vermessung des Gehirns, wie hier dargestellt. Funktionelle Kernspintomographie: Die bei Ihnen geplante Untersuchung ermöglicht die bildliche Darstellung von funktionellen Zentren des Gehirns. Hierzu werden Bildserien in Ruhephasen und während der Ausführung einer Aktivierungsaufgabe (z.B. Betrachten von Bildern) aufgenommen. Die Aktivierungsaufgabe wird Ihnen vor der Untersuchung ausführlich erläutert. Seite 1 von 3 Ziel der Untersuchung ist, die Wirkung von emotionalen Gesichtsausdrücken auf körperliche Funktionen und auf das Befinden in Abhängigkeit von Spielsituationen zu erfassen. Dazu werden Ihnen verschiedene Bilder mit unterschiedlichen Gesichtsausdrücken präsentiert, die Sie frei betrachten sollen. Dabei handelt es sich um Gesichtsausdrücke von computergenerierten Personen, wie Sie sie z.B. bei Computerspielen oder im Internet auch finden können. Während der Präsentation werden neben Ihrer Hirnaktivität verschiedene körperliche Funktionen wie Veränderungen der Leitfähigkeit der Haut gemessen. Schließlich sollen Sie noch die Gesichtsausdrücke bewerten. Die Messung Ihrer Hirnaktivität erfolgt in mehreren Durchgängen im Kernspintomographen. Zusätzlich werden zur Messung der Hautleitfähigkeit Oberflächenelektroden in Ihrem Gesicht befestigt. Sie können sicher sein, dass dies völlig ungefährlich und schmerzfrei ist. Vor und während der Untersuchung bitten wir Sie, ein paar Fragebögen zu beantworten. Die Untersuchung wird insgesamt etwa 1,5 Stunden dauern. Die Zahlung der Aufwandentschädigung erfolgt im Anschluss an die Untersuchung. Sie haben dann außerdem die Gelegenheit, ihre Kontaktdaten zu hinterlassen, um weitere Informationen zu den Zielen und Ergebnissen dieser Untersuchung zu bekommen. Die Teilnahme an der Untersuchung ist völlig freiwillig. Sie können jederzeit - ohne Angabe von Gründen und ohne, dass Ihnen Nachteile entstehen - die Teilnahme abbrechen. Dadurch werden Ihnen keinerlei persönliche Nachteile entstehen und Sie erhalten die Aufwandsentschädigung anteilig ausbezahlt. Alle Daten, die erhoben werden, dienen ausschließlich Forschungszwecken, werden unter Einhaltung der geltenden Datenschutzbedingungen vertraulich behandelt und ohne Angabe des Namens unter einer Codenummer abgespeichert. Außerdem können Sie nachträglich und zu jedem Zeitpunkt - ohne Angabe von Gründen und ohne, dass Ihnen Nachteile entstehen - Ihre Einwilligung zur Datenanalyse widerrufen sowie die Löschung Ihrer Daten anordnen. Wenden Sie sich dazu bitte an Frau Dipl.-Psych. Katja Likowski, [email protected], Tel.: 0931/3189752. 
Seite 2 von 3 Einverständniserklärung Ich erkläre, dass ich dieses Informationsblatt gelesen und verstanden habe und meine Fragen zufrieden stellend beantwortet wurden. Ich willige ein, freiwillig an der Untersuchung teilzunehmen. Ebenso willige ich ein, dass die erhobenen Daten in anonymisierter Form wissenschaftlich ausgewertet werden. Ich bin darüber informiert worden, dass ich jederzeit - ohne Angabe von Gründen und ohne, dass mir Nachteile entstehen - die Einwilligung zur Datenanalyse widerrufen und die Löschung meiner Daten anordnen kann. Ich bin außerdem darüber informiert worden, dass ich die Untersuchung jederzeit abbrechen kann, ohne dass mir persönliche Nachteile entstehen. Würzburg, den _______________________________________________ Unterschrift _______________________________________________ Geburtsdatum _______________________________________________ Name (Druckschrift) _______________________________________________ Anschrift (Druckschrift) _______________________________________________ _______________________________________________ Unterschrift des Untersuchungsleiters ........................................................................ Seite 3 von 3 Forschungszentrum Magnet-Resonanz-Bayern e.V. Am Hubland D-97074 Würzburg Sehr geehrte Probandin, sehr geehrter Proband! Die Kernspintomographie ist ein Untersuchungsverfahren zur Erzeugung von Körperschnittbildern ohne Anwendung von Röntgenstrahlen. Die Untersuchung erfolgt mit starken statischen und dynamischen Magnetfeldern, ohne dass dies von Ihnen wahrgenommen wird. Gesundheitliche Risiken oder Schäden sind nach heutigem Kenntnisstand nicht beschrieben. Gefahren können dann entstehen, wenn sich in ihrem Körper bestimmte metallische Fremdkörper (z.B. Granatsplitter) oder elektrische Geräte (z.B. Herzschrittmacher) befinden. Für einige Untersuchungen im Rahmen von Studien ist es erforderlich Kontrastmittel, Medikamente bzw. Atemgase zu verabreichen um den Gewebekontrast, die Herzleistung bzw. die Sauerstoffsättigung des Gewebes zu verändern. Zum Einsatz kommen folgende Substanzen: keine orales Kontrastmittel intravenöses Kontrastmittel Dipyridamol Atemgasgemische Sauerstoff Carbogen Adenosin Andere: Die Verabreichung von Atemgasen erfolgt über eine Atemmaske. Nur in sehr seltenen Fällen ist mit leichten Nebenwirkungen im Sinne von Kreislaufbeschwerden wie z.B. Schwindel oder Atembeschwerden und Medikamentennebenwirkungen zu rechnen. Wir werden während der Untersuchung zur Sicherheit Kontakt zu Ihnen halten und Sie können bei Beschwerden oder Problemen auch jederzeit durch Betätigung eines Alarm-Ballons auf sich aufmerksam machen, sodass die Messung ggf. sofort abgebrochen werden kann. Um eine gefahrlose Untersuchung zu gewährleisten, bitten wir Sie vor dem Betreten der Anlage folgende Fragen zu beantworten: 1. Tragen Sie einen Herzschrittmacher oder eine Insulinpumpe? Ja Nein 2. Leiden Sie an epileptischen Anfällen ? Ja Nein 3. Sind Sie am Herz oder Kopf operiert worden ? Ja Nein 4. Tragen Sie eine Zahnprothese ? Ja Nein 5. Befinden sich in oder auf Ihrem Körper Metallteile/-partikel ? (z. B. Prothesen, OP-Clips, Splitter, Verhütungsspirale, Tätowierungen, Piercings, “Glitzer-Make-Up“ o.ä.) Ja Nein 6. Könnten Fremdkörper im Auge sein ? Ja Nein 7. Besteht eine Schwangerschaft ? Ja Nein 8. Haben oder hatten Sie beruflich mit Metallverarbeitung zu tun ? Ja Nein Seite 1 von 2 Stand: 10. 
Juli 2007 Vor Betreten des Magnetraumes: Da Metallteile im Magnetfeld gefährlich sind und die Messung stören können, müssen sämtliche ferromagnetischen Gegenstände abgelegt werden. Hierzu zählen vor allem: - Geldbeutel einschließlich Magnetkarten (Scheckkarten, Telefonkarten etc.) - Schmuck, Haarspangen, lose Metallteile (Schlüssel, Münzen, Kugelschreiber, Feuerzeug) - Gürtel, Brille, Zahnspange, oder entfernbare Zahnprothesen, Hörgeräte - Taschenmesser oder Werkzeuge aller Art - BH mit Metallteilen Unsere Mitarbeiter sind verpflichtet, Sie auf diesen Punkt noch einmal gesondert hinzuweisen. Bitte bedenken Sie, dass ferromagnetische Objekte im Magneten eine extrem hohe Kraftwirkung erfahren und zur Gefahr für Leib und Leben werden können! Während der Untersuchung: liegen Sie in einer engen Röhre. Über ein Mikrophon und Kamera besteht während der Untersuchung Kontakt zu den Mitarbeitern. Da die Untersuchung (je nach Vereinbarung) eine halbe bis zu zwei Stunden dauern kann, sollten Sie ganz bequem und ruhig liegen und Bewegungen vermeiden, damit die Messergebnisse nicht unbrauchbar werden. Da das Gerät während der Messung laute Geräusche macht, können wir Ihnen Ohrstöpsel anbieten. Zwischendurch sind immer wieder Pausen in denen der Computer die Bilder errechnet, bzw. Messparameter verändert werden. Wir weisen ausdrücklich darauf hin, dass die erstellten Tomogramme der Überprüfung von Messmethoden dienen, die für medizinische Fragestellungen entwickelt werden. Da diese Methoden noch nicht für bestimmte medizinische Fragestellungen validiert sind, sind die Tomogramme für jegliche Art der Diagnosestellung vollkommen ungeeignet. Im Falle von möglichen Auffälligkeiten die auf eine Pathologie hinweisen könnten, haben Sie die Wahl von unserem Mitarbeiter darüber informiert zu werden: Ja, ich möchte über Auffälligkeiten informiert werden Nein, ich möchte nicht über Auffälligkeiten informiert werden Für weitere Auskünfte stehen wir Ihnen gerne zur Verfügung. Bitte erklären Sie durch Ihre Unterschrift Ihr Einverständnis zur Untersuchung. Würzburg, den _________________ Name: ___________________________ geb. am: _________________ Unterschrift des Probanden: Untersucher: __________________________ _________________________ Seite 2 von 2 Stand: 10. Juli 2007 Lehrstuhl für Psychologie I - Prof. Dr. Paul Pauli Biologische Psychologie, Klinische Psychologie und Psychotherapie AUFKLÄRUNG ZUM UMGANG MIT ZUFALLSBEFUNDEN BEI BILDGEBENDEN VERFAHREN IN DER HIRNFORSCHUNG Name: _____________________________ Vorname: _____________________________ Geb.-Datum: _____._____.19____ Zufallsbefunde sind Signalauffälligkeiten, die bei der wissenschaftlichen Datenanalyse entdeckt werden können und nach denen nicht mit der Absicht einer klinisch orientierten Diagnostik gesucht wurde. Sie werden mit dieser Aufklärung darüber informiert, dass im Rahmen dieser Forschungsstudie kein Arzt-Patient-Verhältnis besteht. In dieser Forschungsstudie wird keine klinische Individualdiagnostik durchgeführt, da die Forschungsstudie ausschließlich auf wissenschaftlichen Erkenntnisgewinn und nicht auf die Entdeckung individueller hirnstruktureller oder -funktioneller Auffälligkeiten abzielt. Bei der Entdeckung eines Zufallsbefundes wird dieser bezüglich einer Empfehlung einer klinischneuroradiologischen Diagnostik eingeschätzt, nicht jedoch in klinischer Hinsicht diagnostisch spezifiziert und beurteilt werden. 
Ein als abklärungsbedürftig eingeschätzter Zufallsfund wird Ihnen vom Forscher zeitnah mitgeteilt und in diesem Fall die Durchführung einer klinisch- neuroradiologischen Diagnostik empfohlen. Zufallsfunde, die nach Einschätzung des Forschers keiner klinisch-diagnostischen Abklärung bedürfen, werden Ihnen nicht mitgeteilt. 2 EINWILLIGUNG Ich willige ein, dass … 1) … der Forscher mir einen abklärungsbedürftigen Zufallsbefund mitteilt. 2) … der Forscher im Falle eines Zufallsbefundes einen externen neuroradiologischen Experten hinzuzieht. 3) … mir ausschließlich solche Zufallsbefunde mitgeteilt werden, die als abklärungsbedürftig eingeschätzt werden. 4) (nur für Patientenprobanden)… Zufallsbefunde den behandelnden Ärzten zur Kenntnis gebracht werden. Darüber hinaus besitze ich einen Krankenversicherungsschutz, so dass die Kosten, die bei einem abklärungsbedürftigen Zufallsfund im Zusammenhang mit einer klinischen Diagnostik entstehen, gedeckt sind. Ich habe diese Aufklärung verstanden und Fragen, die sich ergaben, wurden ausreichend beantwortet. __.__.____ ______________________ ______________________ Datum TeilnehmerIn UntersuchungsleiterIn Bitte geben Sie im Folgenden an, wie sehr Sie den einzelnen Aussagen zustimmen: trifft gar nicht zu trifft sehr gut zu 1. Ich empfinde oft warmherzige Gefühle für Leute, denen es weniger gut geht als mir. ○ ○ ○ ○ ○ 2. Die Gefühle einer Person in einem Roman kann ich mir oft sehr gut vorstellen. ○ ○ ○ ○ ○ 3. In Notfallsituationen fühle ich mich ängstlich und unbehaglich. ○ ○ ○ ○ ○ 4. Ich versuche, bei einem Streit zuerst beide Seiten zu verstehen, bevor ich eine Entscheidung treffe. ○ ○ ○ ○ ○ 5. Wenn ich sehe, wie jemand ausgenutzt wird, glaube ich, ihn schützen zu müssen. ○ ○ ○ ○ ○ 6. Manchmal fühle ich mich hilflos, wenn ich inmitten einer sehr emotionsgeladenen Situation bin. ○ ○ ○ ○ ○ 7. Ich versuche manchmal, meine Freunde besser zu verstehen, indem ich mir vorstelle, wie die Dinge aus ihrer Sicht aussehen könnten. ○ ○ ○ ○ ○ 8. Nachdem ich einen Film gesehen habe, fühle ich mich manchmal so, als ob ich eine der Personen aus dem Film sei. ○ ○ ○ ○ ○ 9. Oft berühren mich Dinge sehr, die ich nur beobachte. ○ ○ ○ ○ ○ 10. Ich glaube, jedes Problem hat zwei Seiten und versuche deshalb beide zu berücksichtigen. ○ ○ ○ ○ ○ 11. Ich würde mich selbst als eine ziemlich weichherzige Person bezeichnen. ○ ○ ○ ○ ○ 12. Wenn ich einen guten Film sehe, kann ich mich sehr leicht in die Hauptperson hineinversetzen. ○ ○ ○ ○ ○ 13. Ich neige dazu, in Notfällen die Kontrolle über mich zu verlieren. ○ ○ ○ ○ ○ 14. Wenn ich eine interessante Geschichte oder ein gutes Buch lese, versuche ich mir vorzustellen, wie ich mich fühlen würde, wenn mir die Ereignisse des Buches passieren würden. ○ ○ ○ ○ ○ 15. Wenn ich jemanden sehen müsste, der dringend Hilfe in einem Notfall bräuchte, würde ich bestimmt zusammenbrechen. ○ ○ ○ ○ ○ 16. Bevor ich jemanden kritisiere, versuche ich mir vorzustellen, wie ich mich an seiner Stelle fühlen würde. ○ ○ ○ ○ ○ Bitte geben Sie im Folgenden an, wie sehr Sie den einzelnen Aussagen zustimmen: trifft gar nicht zu trifft sehr gut zu 1. Ich mache mir Gedanken darüber, was andere Leute von mir denken, auch wenn ich weiß, dass es egal ist. ○ ○ ○ ○ ○ 2. Ich verhalte mich unbekümmert, auch wenn ich merke, dass andere Leute einen schlechten Eindruck von mir bekommen. ○ ○ ○ ○ ○ 3. Ich habe oft Angst, dass andere Leute meine Fehler bemerken. ○ ○ ○ ○ ○ 4. Ich mache mir selten Gedanken darüber, welchen Eindruck ich auf jemand anderes mache. ○ ○ ○ ○ ○ 5. 
Ich habe Angst, dass andere sich nicht positiv über mich äußern. ○ ○ ○ ○ ○ 6. Ich habe Angst, dass andere Leute etwas an mir auszusetzen haben. ○ ○ ○ ○ ○ 7. Die Meinung anderer Leute über mich lässt mich kalt. ○ ○ ○ ○ ○ 8. Wenn ich mit jemandem spreche, mache ich mir Gedanken darüber, was der andere über mich denken könnte. ○ ○ ○ ○ ○ 9. Normalerweise mache ich mir Gedanken darüber, wie ich auf andere wirke. ○ ○ ○ ○ ○ 10. Wenn ich weiß, dass mich jemand beurteilt, macht es mir kaum etwas aus. ○ ○ ○ ○ ○ 11. Manchmal glaube ich, ich beschäftige mich viel zu sehr damit, was andere Leute von mir denken. ○ ○ ○ ○ ○ 12. Ich habe oft Angst, dass ich etwas Falsches sagen oder tun würde. ○ ○ ○ ○ ○ Angaben zur Person Alter: _______ Geschlecht: ○ weiblich Schulabschluss: ○ Volks-/Hauptschulabschluss ○ mittlere Reife ○ Fachabitur ○ Abitur ○ Fachhochschulabschluss ○ Hochschulabschluss ○ voll berufstätig ○ teilzeitbeschäftigt ○ in Ausbildung ○ arbeitslos ○ Studium Berufsstand: ○ männlich Studienfach: _______________________ Muttersprache: ○ deutsch ○ Semester: ________ andere Bitte beantworten Sie uns abschließend noch kurz folgende Fragen: Worin könnte Ihrer Meinung nach die Untersuchungsabsicht bestehen? ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ Was ist Ihnen während der Untersuchung ansonsten aufgefallen? ............................................................................................................................................ ............................................................................................................................................ ........................................................................................................................................ ............................................................................................................................................ Haben Sie schon einmal an einer ähnlichen Studie teilgenommen? Wenn ja, worum ging es dabei? ............................................................................................................................................ ............................................................................................................................................ ............................................................................................................................................ Vielen Dank für Ihre Mitarbeit! 6.6. LEBENSLAUF 177 Lebenslauf Persönliche Angaben Name Katja Ute Likowski Geburtdatum 29. April 1983 Geburtsort Friedrichroda Büroadresse Marcusstr. 9-11 97070 Würzburg Privatadresse Wilhelmstr. 
5 97070 Würzburg Telefon 0931 3189752 0177 8645200 Email [email protected] Homepage http://www.phil2.uni-wuerzburg.de/index.php?id=77098 Bildung 2006 Diplom in Psychologie, Note: 1,5 (sehr gut) 2001 – 2006 Studium der Psychologie (Diplom) an der Universität Jena 1993 – 2001 Abitur am Albert-Schweitzer-Gymnasium Ruhla, Note: 1,3 (sehr gut) Sprachen Englisch: Flüssig in Schrift und Sprache, Französisch: Fortgeschritten Beruflicher Werdegang seit 09/2009 wissenschaftliche Mitarbeiterin im DFG-Projekt "Impulsive und reflektive Mechanismen der Modulation fazialer Reaktionen auf emotionale Gesichtsausdrücke" am Lehrstuhl für Psychologie I, Julius-Maximilians-Universität Würzburg 09/2006-09/2009 wissenschaftliche Mitarbeiterin im DFG-Projekt "Modifikation fazialer Reaktionen auf emotionale Gesichtsausdrücke" am Lehrstuhl für Psychologie I, Julius-Maximilians-Universität Würzburg 2006 Praktikum im Deutschen Zentrum für Luft- und Raumfahrt e.V., Abteilung Luft- und Raumfahrtpsychologie, Hamburg 2003 – 2006 studentische Hilfskraft in der Nachwuchsgruppe „Motivationale und kognitive Determinanten sozialer Diskriminierung“ bei PD Dr. Kai Sassenberg, Universität Jena Workshop-Teilnahmen 2008 1. Springschool der DGPA “Biopsychology of Emotions” im Kloster Seeon, Seeon 2011 Workshop “Networks in the human brain” am Max Planck Institut für Kognitions- und Neurowissenschaften, Leipzig Lehre WS2009/10 Empiriepraktikum “Biological psychology” SS 2009 Seminar “Journal Club” SS 2007 Seminar “Social Neuroscience” 2003 – 2005 diverse Tutorien am Lehrstuhl für Allgemeine Psychologie / Soziale Kognition und am Lehrstuhl für Sozialpsychologie, Universität Jena Mitgliedschaften Society for Personality and Social Psychology Society for Psychophysiological Research Association for Psychological Science Reviewertätigkeit Psychological Science Publikationen Artikel in Zeitschriften mit Peer-Review Verfahren Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (in press). Processes underlying congruent and incongruent facial reactions to emotional facial expressions. Emotion. Likowski, K. U., Weyers, P., Seibt, B., Stöhr, C., Pauli, P., & Mühlberger, A. (in press). Sad and lonely? Sad mood suppresses facial mimicry. Journal of Nonverbal Behavior. Topolinski, S., Likowski, K. U., Weyers, P., & Strack, F. (2009). The face of fluency: Semantic coherence automatically elicits a specific pattern of facial muscle reactions. Cognition and Emotion. Hoefling, A., Likowski, K. U., Deutsch, R., Häfner, M., Seibt, B., Mühlberger, A., Weyers, P, & Strack (2009). When hunger finds no fault with moldy corn: Food deprivation reduces foodrelated disgust. Emotion. Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (2008). Modulation of facial mimicry by attitudes. Journal of Experimental Social Psychology, 44, 1065-1072. publizierte Abstracts Likowski, K. U., Mühlberger, A., Pauli, P., & Weyers, P. (2010). Neural correlates of facial mimicry. Psychophysiology, 47, S64. Reicherts, P., Gerdes, A. B. M., Wieser, M. J., Likowski, K. A., Weyers, P., Mühlberger, A., & Pauli, P. (2010). Pain in your face - is the processing of dynamic expressions of pain special? Psychophysiology, 47, S63. Weyers, P., Likowski, K. U., Seibt, B., Pauli, P., & Mühlberger, A. (2010). Modulation of facial mimicry is connected to changes in early face perception. Psychophysiology, 47, S64. Reicherts, P., Gerdes, A. B. M.,Wieser, M. J., Likowski, K. A., Weyers, P., Mühlberger,A. & Pauli, P. (2010). 
Die Verarbeitung von Schmerzgesichtern und deren modulatorische Wirkung auf das subjektive Schmerzerleben. Schmerz, Suppl1, S110. Likowski, K. U., Mühlberger, A., Pauli, P., Seibt, B., & Weyers, P. (2010). Modulation der Verarbeitung emotionaler Gesichtsausdrücke durch Einstellungen: Elektrokortikale Potentiale und faziale Reaktionen. In C. Frings, A. Mecklinger, D. Wentura, & H. Zimmer (Eds.), Beiträge zur 52. Tagung experimentell arbeitender Psychologen (pp. 72). Lengerich: Papst. Likowski, K. U., Pauli, P., & Weyers, P. (2009). Corrugator relaxation as a sign of a felt smile. Psychophysiology, 46, S128. Likowski, K. U., Pauli, P., & Weyers, P. (2009). Duchenne or not Duchenne? Corrugatorentspannung als Indikator für ein echtes Lächeln. A. B. Eder, K. Rothermund, S. R. Schweinberger, M. C. Steffens, & H. Wiese (Eds.), 51. Tagung experimentell arbeitender Psychologen in Jena (pp. 40). Lengerich: Papst. Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (2008). Mediators of goal effects on mimicry. International Journal of Psychology, 43, p376. Geissler, J., Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (2008). Faziale Reaktionen bei der Betrachtung emotionaler Gesichtsausdrücke von AIDS-Patienten. In P. Khader, K. Jost, H. Lachnit, & F. Rösler (Eds.), Beiträge zur 50. Tagung experimentell arbeitender Psychologen (pp. XXX). Lengerich: Papst. Mühlberger, A., Frey, M. C. M., Likowski, K. U., Wieser, M. J., Pauli, P. & Weyers, P. (2007). A direct comparison of the central nervous processing of social and non-social emotional stimuli. Psychophysiology, 44, S36. Weyers, P., Frey, M. C. M., Likowski, K. U., Wieser, M. J., Pauli, P. & Mühlberger, A. (2007). A direct comparison of facial reactions to social and non-social emotional stimuli. Psychophysiology, 44, S36. Likowski, K. U., Mühlberger, A., Pauli, P., Seibt, B., & Weyers, P. (2007). Good Guys - Bad Guys: Modulation fazialer Mimikry durch Einstellungsmanipulation. In K. F. Wender, S. Mecklenbräuker, G. D. Rey & T. Wehr (Eds.), Beiträge zur 49. Tagung experimentell arbeitender Psychologen (pp. 255). Lengerich: Papst. Eingeladene Vorträge Likowski, K. U., Mühlberger, A., Anderson, C., Pauli, P., Seibt, B., & Weyers, P. (2010). Modulation of facial mimicry is connected to changes in early face perception. Symposium “Are We Chameleons or Not? When Mimicry Reactions are Reduced.” (Marielle Stel) at the Social Psychology Section Conference 2010, Winchester, UK. Likowski, K. U., Mühlberger, A., Pauli, P., Seibt, B., & Weyers, P. (2010). Modulation der Verarbeitung emotionaler Gesichtsausdrücke durch Einstellungen: Elektrokortikale Potentiale und faziale Reaktionen. Symposium “Automatismen in der Verarbeitung emotionaler Reize“ (Andrea Paulus & Michaela Rohr) at the 52. Tagung experimentell arbeitender Psychologen und Psychologinnen, Saarbrücken, Germany. Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (2008). Mediators of goal effects on mimicry. Symposium “Emotion and Behavior” (Fritz Strack & Paul Pauli) at the XXIX International Congress of Psychology, Berlin, Germany). Eingeladene Workshops Likowski, K. U. (2009). Workshop Psychophysiologie. Lehrstuhl für Psychologie II (Prof. Dr. Fritz Strack), University of Würzburg, Germany. Konferenzbeiträge Likowski, K. U., Mühlberger, A., Anderson, C., Pauli, P., Seibt, B., & Weyers, P. (2010). Modulation of facial mimicry is connected to changes in early face perception. 
Paper presented at the Social Psychology Section Conference 2010, Winchester, UK. Likowski, K. U., Mühlberger, A., Anderson, C., Pauli, P., Seibt, B., & Weyers, P. (2010). Modulation of brain potentials and facial reactions to facial expressions by attitudes. Paper presented at the 13th European Conference on Facial Expression, Duisburg, Germany. Reicherts, P., Gerdes, A. B. M., Wieser, M. J., Likowski, K. A.,Weyers, P. & Pauli, P. (2010). Von Angesicht zu Schmerzgesicht - Die Verarbeitung dynamischer Schmerzgesichter. Poster presented at the 36. Tagung Psychologie und Gehirn 2010, Greifswald, Germany. Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (2008). Mimicry and Emotion! On the interplay of motor and affective processes in facial reactions to emotional facial expressions. Paper presented at the 12th European Conference on Facial Expression, Geneva, Switzerland. Likowski, K. U., Mühlberger, A., Wernecke, S. , Seibt, B., Pauli, P., & Weyers, P. (2008). Goal priming effects on mimicry are mediated by empathy. Paper presented at the 15th General meeting of the European Association of Experimental Social Psychology, Opatija, Croatia. Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (2008). The involvement of motor and affective processes in facial reactions to emotional facial expressions. Poster presented at the DGPA-Springschool “Biopsychology of Emotions”, Seeon, Germany. Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (2008). Mimicry or Emo-tion? The involvement of motor and affective processes in facial reactions to emotional facial expressions. Poster presented at the International Symposium "Foundations of Human Social Behavior", Zürich, Schweiz. Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (2008). The involvement of motor and affective processes in facial reactions to emotional facial expressions. Poster presented at the Psychologie und Gehirn, Magdeburg, Germany. Gorges, S., Alpers, G. W., Götz, A.-T., Likowski, K. & Pauli, P. (2008). Herzklopfen im Rampenlicht: Subjektive und physiologische Korrelate von Lampenfieber. Poster presented at the Psychologie und Gehirn, Magdeburg, Germany. Likowski, K. U., Schubert, T. W., Fleischmann, B., Landgraf, J., & Volk, A. (2007). Die Effekte von Synchronität und Mimikry auf Sympathie werden durch Gruppenzugehörigkeit moderiert. Paper presented at the 11. Tagung der Fachgruppe Sozialpsychologie der Deutschen Gesellschaft für Psychologie, Freiburg im Breisgau, Germany. Likowski, K. U., Mühlberger, A., Rieger, S., Seibt, B., Pauli, P., & Weyers, P. (2007). When we prefer men and women at the same time: Effects of gender categorisation on facial mimicry. Paper presented at the Psychologie und Gehirn, Dortmund, Germany. Likowski, K. U. & Jonas, K. J. (2005). Implizite Messung des "Black-Sheep-Effekts". Poster presented at the 10. Fachgruppentagung Sozialpsychologie, Jena, Germany.