Research Digest
Free every fortnight
Issue 162

Contents:
1. Milgram's personal archive reveals how he created the 'strongest obedience situation'
2. Child's play! The developmental roots of the misconception that psychology is easy
3. Scary health messages can backfire
4. Large, longitudinal study finds tentative links between internet use and loneliness
5. A social version of a basic cognitive mechanism
6. The sight of their own blood is important to some people who self-harm

Further information
Email the editor: [email protected]
Download past Digest issues as PDFs: www.researchdigest.org.uk
Visit the Digest blog: www.researchdigest.org.uk/blog
Download a free Digest poster: http://tinyurl.com/59c63v
Subscribe free at www.researchdigest.org.uk/blog

Milgram's personal archive reveals how he created the 'strongest obedience situation'

Stanley Milgram's 1960s obedience-to-authority experiments, in which a majority of participants applied an apparently fatal electric shock to an innocent 'learner', are probably the most famous in psychology, and their findings still appal and intrigue to this day. Now, in a hunt for fresh clues as to why ordinary people were so ready to harm another, Nestar Russell, at Victoria University of Wellington, has reviewed Milgram's personal notes and project applications, which are housed at Yale University's Sterling Memorial Library.

Milgram trained under Solomon Asch, author of the famous conformity experiments, and the obedience project was originally conceived as an extension of Asch's work. Milgram planned to see how the behaviour of a group of cooperating participants (actually confederates working for the researcher) influenced naive participants' willingness to harm another. A condition in which single participants followed the experimenter's orders on their own was planned as a mere control condition. It was during Milgram's extensive pilot work that he discovered the remarkable willingness of participants to obey instructions without the need for group coercion, which changed the direction of his project. The focus shifted to lone participants, and Milgram began a process of trial-and-error pilot work to identify the perfect conditions for inducing obedience - what he described as 'the strongest obedience situation'.

Early on, Milgram recognised the need for an acceptable rationale for harming another, and so he invented the cover story that the experiment was about using punishment to improve learning. To counter participants' reluctance to harm an innocent person, Milgram also devised several other 'strain resolving mechanisms'. These included replacing the final shock-level label 'LETHAL' with the more ambiguous 'XXX'; removing a Nazi-sounding 'pledge to obey' from the experiment instructions; and creating physical distance between the participants and the innocent, to-be-electrocuted learner. In fact, this last factor worked too well. When Milgram removed any sight or sound of the learner, 'virtually all' participants showed a willingness to inflict lethal harm. Milgram realised this near-total obedience was counter-productive and would prevent his paradigm from 'scaling obedient tendencies'. For his first official experiment he therefore settled on auditory feedback only, in the form of the learner banging on the wall in distress.

Another 'strain resolving mechanism' that Milgram devised was increasing the number of levels on the shock generator.
This allowed for exploitation of the 'foot in the door' persuasion effect, whereby people are more likely to cooperate once they have already agreed to a less significant request - a kind of piecemeal compliance. Milgram was also careful about the actors he chose to play the parts of experimenter and learner. Though both were non-professionals, the man acting as learner was chosen because he was 'mild and submissive; not at all academic' and a 'perfect victim', whilst the man playing the experimenter was 'stern' and 'intellectual looking'. Finally, Milgram was careful to plan things so that the 'experimenter', whenever challenged, replied that he was responsible for anything that happened to the learner.

Taken together, Russell's new analysis shows how Milgram used ad hoc, trial-and-error pilot testing to hone his methodology and ensure his first official obedience experiment achieved such a high obedience rate (of 65 per cent). 'Knowing step-by-step how Milgram developed this result may better arm theorists interested in untangling this still enigmatic question of why so many participants inflicted every shock,' Russell said.
_________________________________
Russell, N. (2010). Milgram's obedience to authority experiments: Origins and early evolution. British Journal of Social Psychology. DOI: 10.1348/014466610X492205

Child's play! The developmental roots of the misconception that psychology is easy

The widespread misconception that psychology is easy and mere common sense has its roots in the biased way that children work out whether a topic is challenging or not. Frank Keil and colleagues asked children aged between five and thirteen, and adults, to rate the difficulty of questions from physics (e.g. How does a spinning top stay upright?), chemistry, biology, psychology (e.g. Why is it hard to understand two people talking at once?) and economics. The questions had been carefully chosen from earlier pilot work in which they'd all been rated as equally difficult by adults.

Consistent with the pilot work, the adults in the study proper rated the questions from the different disciplines as equally difficult. However, children from age seven to thirteen rated psychology as easier than the natural sciences - physics, chemistry and biology - which they rated as equally difficult. Young children can't possibly have the depth of understanding to know which scientific questions are more difficult, so they must be resorting to some kind of mental short-cut to reach their verdict. Keil's team think that children's feelings of control over their own psychological faculties - memories, emotions and so forth - together with the superficial familiarity of those kinds of concepts, likely lead them to believe psychological concepts are easier to understand.

A second study provided some support for this account. This time children and adults rated the difficulty of questions from within the various branches of psychology. Similar to the first study, the children, but not the adults, rated questions related to social psychology, personality and emotions as progressively easier, compared with questions related to cognition, perception and biological psychology, which they rated as progressively more difficult. So, when do these childish misconceptions leak through into adult judgments?
For a third study, another batch of children and adults were again presented with the same questions from the different scientific disciplines, but this time they were asked to say whether they would be able to solve each question on their own (or would require expert help) and to estimate what proportion of the adult population would know the answers. This time the adults, as well as the children, tended to say they could solve more psychology questions on their own, compared with questions in the other sciences, and kids and adults alike estimated that more people knew the answers to the psychology questions. Remember, these were psychology questions that adults had already rated as just as difficult and complex as questions in the other sciences. 'Such biases [towards seeing psychology as easy] may be observed when tasks do not so directly ask about difficulty of understanding and instead use measures such as ease of learning on one's own,' the researchers said.

Keil's team said their findings have real-life implications, for example in the courtroom. 'If psychological phenomena are seen as usually quite easy to understand and largely self-evident and if such judgments are inaccurate and underestimate the need for experts,' they warned, 'cases might well be decided in ways that unfairly exclude valuable expert insights.' In fact, the researchers pointed out that such situations have already occurred. In the US trial of former Presidential assistant I. Lewis 'Scooter' Libby, for example, the judge disallowed the use of psychology experts on memory, on the basis that the jury could rely on their common-sense understanding of memory. This is particularly ironic given that prior psychology research has shown that jurors and judges have a woefully poor understanding of how memory actually works.
_________________________________
Keil, F. C., Lockhart, K. L., & Schlegel, E. (2010). A bump on a bump? Emerging intuitions concerning the relative difficulty of the sciences. Journal of Experimental Psychology: General, 139(1), 1-15. PMID: 20121309

Scary health messages can backfire

A short while ago there was a shocking advert on British TV that used slow motion to illustrate the bloody, crunching effects of a car crash. The driver had been drinking. Using this kind of scare tactic for anti-drink-driving and other health campaigns makes intuitive sense. The campaigners want to grab your attention and demonstrate the seriousness of the consequences if their message is not heeded. However, a new study makes the surprising finding that for a portion of the population, scare tactics can backfire, actually undermining a message's efficacy.

Steffen Nestler and Boris Egloff had 297 participants (229 of them female; average age 35) read one of two versions of a fictional news report from a professional medical journal. The report referred to a study showing links between caffeine consumption and a fictional gastro-intestinal disease, 'Xyelinenteritis'. One version was extra-scary, highlighting a link between Xyelinenteritis and cancer and saying that the participant's age group was particularly vulnerable. The other version was lower-key and lacked these two details. Both versions of the article concluded by recommending that readers reduce their caffeine consumption. Before gauging the participants' reaction to the article and its advice, the researchers tested them on a measure of 'cognitive avoidance'.
People who score highly on this personality dimension respond to threats with avoidance tactics such as distracting themselves, denying the threat or persuading themselves that they aren't vulnerable.

The key finding is that participants who scored high on cognitive avoidance actually rated the threat from Xyelinenteritis as less severe after reading the scary version of the report than after the low-key version. Moreover, after reading the scary version, they were less impressed by the advice to reduce caffeine consumption and less likely to say that they planned to reduce their caffeine intake. On the other hand, highly cognitively avoidant participants were more responsive to the low-key report than were the low cognitively avoidant participants. In other words, for people who are cognitively avoidant, scary health messages can actually backfire.

'Practically, our results suggest that instead of giving all individuals the same threat communications, messages should be given that are concordant with their individual characteristics,' Nestler and Egloff said. 'Thus, the present findings are in line with the growing literature on tailoring interventions to individual characteristics, and they highlight the role of individual differences when scary messages are used.'
_________________________________
Nestler, S., & Egloff, B. (2010). When scary messages backfire: Influence of dispositional cognitive avoidance on the effectiveness of threat communications. Journal of Research in Personality, 44(1), 137-141. DOI: 10.1016/j.jrp.2009.10.007

Large, longitudinal study finds tentative links between internet use and loneliness

Internet use is growing at a phenomenal rate and much ink has been spilled by commentators forecasting the psychological consequences of all this extra web-time. A lot of that comment is mere conjecture, whilst many of the studies in the area are cross-sectional, with small samples, producing conflicting results. The latest research contribution comes from Irena Stepanikova and her colleagues and involves a massive sample, some of whom were followed over time. The results suggest that more time on the internet is associated with increased loneliness and reduced life satisfaction. However, it's a complicated picture, because the researchers' different outcome measures produced mixed results.

Over thirteen thousand people answered questions about their internet use, loneliness and life satisfaction in 2004 and in 2005. They'd been chosen at random from a list of US land-line numbers. The majority of the people quizzed in 2004 were different from those quizzed in 2005, but 754 people participated in both phases, thus providing some crucial longitudinal data. An important detail is that the researchers used two measures of internet use. The first, 'time-diary', method required participants to consider six specific hours spread out over the previous day and to estimate how they'd spent their time during those hours. The other, 'global recall', measure was more open-ended and required participants to consider the whole of the previous twenty-four hours and detail as best they could how they'd used that time.

The cross-sectional data showed that participants who reported spending more time browsing the web also tended to report being lonelier and less satisfied with life. This association was larger for the time-diary measure.
The strength of the association was modest but, to put it in perspective, it was five times greater than the (inverse) link between loneliness and amount of time spent with friends and family. Turning to web communication, the global recall measure showed that time spent instant messaging and in chat rooms and news groups (but not on email) was associated with higher loneliness scores. For the time-diary measure, it was increased email use that was linked with more loneliness.

The longitudinal data showed that as a person's web browsing increased from 2004 to 2005, their loneliness also tended to increase (based on the global recall measure only). Both measures showed that increased non-email forms of web communication, including chat rooms, also went hand in hand with increased loneliness. Finally, more web browsing over time was linked with reduced life satisfaction by the time-diary measure, whilst more non-email web communication over time was linked with reduced life satisfaction by the global recall measure.

Perhaps the most important message to come out of this research is that the results varied with the measure of internet use that was used - future researchers should take note. The other message is that more time browsing and communicating online appears to be linked with more loneliness; the two even increase together over time. However, it is important to appreciate that we don't know the direction of causation. Increased loneliness may well encourage people to spend more time online, rather than web time causing loneliness. Or some other factor could be causing both to rise in tandem. It's worth adding too that the web/loneliness link held even after controlling for time spent with friends and family. So if more web use were causing loneliness, it wasn't doing so by reducing time spent socialising face-to-face. 'We are hopeful that our study will stimulate future research ...,' the researchers said, 'but at this point any claims suggesting that as Internet use continues to grow in the future, more people will experience loneliness and low life-satisfaction would be premature.'
_________________________________
Stepanikova, I., Nie, N., & He, X. (2010). Time on the Internet at home, loneliness, and life satisfaction: Evidence from panel time-diary data. Computers in Human Behavior, 26(3), 329-338. DOI: 10.1016/j.chb.2009.11.002

A social version of a basic cognitive mechanism

We're slower to direct our attention to the same location twice in succession, a well-established phenomenon that cognitive psychologists call 'inhibition of return' (IoR). It's thought the mechanism may make our search of the visual scene more efficient by deterring us from looking at the same spot twice. Now Paul Skarratt and his colleagues have documented a new 'social' form of inhibition of return, in which people are slower to attend to a location that social cues, such as gaze direction, suggest another person has already attended to.

Twelve participants sat at a table with an animated character projected opposite. Each participant and their animated partner had two lights and two buttons in front of them, near the middle of the table. One light/button pair was to the left, the other to the right. The basic task was to press the corresponding button as fast as possible when its light came on.
Participants were slower to respond to a light when the animated partner had just responded to the adjacent light on their side of the table: this is what you might call a weak version of social inhibition of return. However, when two large vertical barriers were put up with a gap in the middle, so that the participants could see only their partner's eyes and initial reaching action, and not their actual button presses, this social IoR disappeared.

In a second experiment, the animated partner was replaced with a human. This time, the social IoR effect occurred even when the barriers were erected and only the partner's eye gaze and initial hand movement could be seen. In other words, inferences about where the partner was going to attend, based on their eyes or early hand movement, seemed to be enough to inhibit a participant's own attention to the same location. For some reason, this strong version of social IoR only occurred with a real, human partner, not the animated, computer-controlled partner of the first experiment. The final experiment added yet another visual barrier, which left only the partner's eyes or only their early hand movement visible. This was to try to establish which cue was the more important for provoking social IoR. The answer was that both cues were equally effective.

It's only supposition at this stage, but Skarratt and his team think social IoR could be supported by the postulated mirror neuron system. Monkey research has shown, for example, that there are mirror neurons in the premotor cortex that fire whether a monkey sees another person grasp an object or sees just the initial part of that grasping movement. 'Although the critical mechanisms underlying social IoR remain to be discovered,' the researchers said, 'the current study indicates that it can be generated independently of direct sensory stimulation normally associated with IoR, and can occur instead on the basis of an inference of another person's behaviour.'
_________________________________
Skarratt, P., Cole, G., & Kingstone, A. (2010). Social inhibition of return. Acta Psychologica, 134(1), 48-54. DOI: 10.1016/j.actpsy.2009.12.003

The sight of their own blood is important to some people who self-harm

The sight of their own blood plays a key role in the comfort that some non-suicidal people find in deliberately cutting themselves. That's according to a new study by Catherine Glenn and David Klonsky, which suggests it is those self-harmers with more serious psychological problems who are more likely to say the sight of blood is important.

There are plenty of anecdotal reports hinting at the importance of the sight and taste of blood to self-harmers, as well as references in popular music. 'Yeah you bleed just to know you're alive,' sing the Goo Goo Dolls in 'Iris'. 'I think it's time to bleed / I'm gonna cut myself and / Watch the blood hit the ground,' sing Korn on 'Right Now'. However, this is the first systematic investigation of the topic. Glenn and Klonsky recruited 64 self-harmers from a mass screening of 1,100 new psychology students. With an average age of 19, and 82 per cent of them female, the students answered questions about their self-harming and other psychological problems, and specifically reported on the importance of the sight of blood.
Just over half the participants said that it was important to see blood when they self-harmed, the most common explanation being that it helps relieve tension and induces calmness. Other explanations were that it 'makes me feel real' and shows that 'I did it right/deep enough'. The participants who said blood was important didn't differ in age or gender from those who said it wasn't. However, the blood-important group reported cutting themselves far more often (a median of 30 times compared with 4) and they were more likely to say they self-harmed as a way of regulating their own emotions. The blood-important group also reported more symptoms consistent with bulimia nervosa and borderline personality disorder.

'Overall, these results suggest that self-injurers who report it is important to see blood are a more clinically severe group of skin-cutters,' the researchers said. 'Therefore, a desire to see blood during non-suicidal self-injury may represent a marker for increased psychopathology.'

Glenn and Klonsky said more research was needed to find out why the sight of blood has the significance it does for some people who self-harm. However, they surmised that the sight of one's own blood could, after an initial rise in heart-rate, lead to a rebound effect characterised by reduced heart-rate and feelings of calmness.
_________________________________
Glenn, C., & Klonsky, E. (2010). The role of seeing blood in non-suicidal self-injury. Journal of Clinical Psychology. DOI: 10.1002/jclp.20661