
Unconscious Bias
Department of Life Sciences
Imperial College London
April 2015
Unconscious Bias
As our understanding of the importance of diversity awareness deepens, attention has been
increasingly focussed on the more subtle aspects of diversity such as unconscious bias.
We are constantly making decisions about what is safe or appropriate as we navigate our world, and many of these decisions originate in the fight-or-flight reflex, which helps us assess threat and act accordingly. This reflex has helped our survival as a species and usually works away in the background unnoticed, unless obvious danger makes us more aware of it. It also helps us identify friend or foe, or, in the language of psychology, the in-group (me) and the out-group (other).
Unconscious or hidden beliefs, attitudes and biases underlie many of our patterns of behaviour
beyond our regular perceptions of ourselves and others. However objective we may think
ourselves, we are all susceptible to the pull of our hidden biases.
Becoming aware of unconscious bias can make for a more inclusive culture, a wider pool of talent
and improved performance.
Drawing on current concepts and thinking, this ‘bite-sized’ introduction to the ideas and impact of
unconscious bias will quickly acquaint you with the basics and challenge you to consider your own
bias. We will also introduce you to the rewards and difficulties of ‘fast’ and ‘slow’ thinking, based on the work of Daniel Kahneman, winner of the Nobel Memorial Prize in Economic Sciences, and show you how cognitive bias affects so many of your decisions.
Thinking, Fast and Slow
The unconscious is often thought of as a dark, unknowable and possibly shameful element of our
psychological life. More recently, researchers have been discussing the ‘dual-process’ model of the
brain. Thanks to the work of Kahneman (Thinking, Fast and Slow), we know that we apprehend the
world in two radically opposed ways, employing two fundamentally different modes of thought:
"System 1" and "System 2".
System 1 is always running. It’s fast, intuitive, associative, metaphorical, automatic, and impressionistic. Its operations involve no sense of intentional control, or even sentience; however, we use it to inform so many of our decisions.
System 2 is slow, deliberate, and effortful. Its operations require attention. As you will have seen from our training session today, System 2 takes over, rather reluctantly, when things get tougher and we can’t rely on System 1 for a quick response. We tend to identify with System 2 as our conscious mind; however, Kahneman contends that this is a mistake. System 1 is just as much a part of us, and he compares System 2 to a supporting character who believes herself to be the lead actor yet often has little idea of what's going on. This accounts in part for the hidden bias we are talking about today.
System 2 is lazy and tires easily (a process called "ego depletion"), so it usually accepts what System 1 tells it. This is frequently the right thing to do, as System 1 is usually good at its job: it is sensitive to subtle social or environmental cues and to any signs of danger. However, it is less good at the kind of statistical thinking often needed for good decisions, because System 1 jumps wildly to conclusions and is subject to a range of irrational biases, some of which we will be thinking about in this session. We are also susceptible to being influenced by our unconscious precisely because we are so unaware of it.
Example
We are susceptible to being influenced by features of our surroundings in ways we don't suspect.
One famous experiment centred on a New York City phone booth. Each time a person came out of
the booth after having made a call, an accident was staged – someone dropped papers on the
pavement. Sometimes a dime had been placed in the phone booth, sometimes not. If there was no
dime in the phone booth, only 4% of the exiting callers helped to pick up the papers. If there was a
dime, no fewer than 88% helped.
Since then, thousands of other experiments have been carried out, all to the same general effect.
We are also susceptible to underestimating the role of chance in life (a System 1 tendency).
There is an urban myth circulating that analysis of the performance of fund managers over the
longer term proves conclusively that you'd do just as well if you entrusted your financial decisions to
a monkey throwing darts at a board! People would prefer to believe that any good results are down
to skill, not luck.
Anchoring
Some highly experienced judges were given a description of a shoplifting offence. They were then
"anchored" to different numbers by being asked to roll a pair of dice that had been secretly loaded
to produce only two totals – three or nine. Finally, they were asked whether the prison sentence for
the shoplifting offence should be greater or fewer, in months, than the total showing on the dice.
Normally the judges would have made extremely similar judgments, but those who had just rolled
nine proposed an average of eight months while those who had rolled three proposed an average of
only five months. All were unaware of the anchoring effect.
The same is true for all of us for much of the time. Even Kahneman himself admitted that he could
not always fight the effects of System 1.
In another example, when people reflect on a past experience of pain, they tend to prefer a larger,
longer amount of it to a shorter, smaller amount, so long as the closing stages of the greater pain
were easier to bear than the closing stages of the lesser one!
Daniel Kahneman won the Nobel Memorial Prize in Economic Sciences in 2002 for this work.
Hidden Bias Tests
These measure unconscious, or automatic, biases. Your willingness to examine your own possible
biases is an important step in understanding the roots of stereotypes and prejudice in our society.
The ability to distinguish friend from foe helped early humans survive, and the ability to quickly and
automatically categorize people is a fundamental quality of the human mind.
Categories give order to life, and every day, we group other people into categories based on social
and other characteristics. Details of the most famous test, the IAT, can be found at the end of these
notes. Hidden bias helps form the foundation for stereotypes, prejudice and, ultimately,
discrimination.
Discrimination, prejudice and stereotypes - Definition of terms
A stereotype is an exaggerated belief, image or distorted truth about a person or group — a
generalization that allows for little or no individual differences or social variation. Stereotypes are
based on images in mass media, or reputations passed on by parents, peers and other members of
society. Stereotypes can be positive or negative.
A prejudice is an opinion, prejudgment or attitude about a group or its individual members. A prejudice can be positive, but the term usually refers to a negative attitude.
Prejudices are often accompanied by ignorance, fear or hatred. Prejudices are formed by a complex
psychological process that begins with attachment to a close circle of acquaintances or an "in-group"
such as a family. Prejudice is often aimed at "out-groups."
Discrimination is behaviour that treats people unequally because of their group memberships and
often begins with negative stereotypes and prejudices.
Social scientists believe children begin to acquire prejudices and stereotypes as toddlers. Many
studies have shown that as early as age 3, children pick up terms of racial prejudice without really
understanding their significance.
Reinforcing hidden bias
Once learned, stereotypes and prejudices resist change, even when evidence fails to support them or
points to the contrary. People will embrace anecdotes that reinforce their biases, but disregard
experience that contradicts them. The statement "Some of my best friends are _____" captures this
tendency to allow some exceptions without changing our bias.
Mass media routinely take advantage of stereotypes as shorthand to paint a mood, scene or
character. Particularly in the past, the elderly have been portrayed as being frail and forgetful, while
younger people were often shown as vibrant and able. Stereotypes can also be conveyed by
omission in popular culture, as when TV shows present an all-white world.
More about Hidden Bias
Scientific research has demonstrated that biases thought to be absent or extinguished can remain as
"mental residue" in most of us. Studies show people can be consciously committed to egalitarianism,
and deliberately work to behave without prejudice, yet still possess hidden negative prejudices or
stereotypes. How do we know?
"Implicit Association Tests" (IATs) can tap those hidden, or automatic, stereotypes and prejudices
that avoid conscious control. Project Implicit, a collaborative research effort between researchers at
Harvard University, the University of Virginia, and University of Washington offers various tests to
measure unconscious bias.
Biases and behaviour
A growing number of studies now show a link between hidden biases and actual behaviour. Hidden
biases can reveal themselves in action, especially when a person's efforts to control behaviour
consciously dwindle under stress, distraction, relaxation or competition. Unconscious beliefs and attitudes have been found to be associated with language and certain behaviours such as eye
contact, blinking rates and smiles.
Studies have found, for example, that teachers clearly telegraph prejudices, so much so that some
researchers believe children of colour and white children in the same classroom may effectively
receive different educations.
We also know from a famous experiment that white interviewers sat farther away from black
applicants than from white applicants, made more speech errors and ended the interviews 25%
sooner. It is likely that discrimination of this kind would affect anyone’s performance, regardless of
colour.
Researchers are now trying to discover whether a strong hidden bias in someone results in more
discriminatory behaviour. So far we know that:

- Those who showed greater levels of implicit prejudice toward, or stereotypes of, black or gay people were more unfriendly toward them.
- Subjects who had a stronger hidden race bias had more activity in a part of the brain known to be responsible for emotional learning when shown black faces than when shown white faces.
People who argue that prejudice is not a big problem today are, ironically, demonstrating the
problem of unconscious prejudice. Because these prejudices are outside our awareness, they can
indeed be denied.
The Effects of Prejudice and Stereotypes
Hidden bias has emerged as an important clue to the disparity between public opinion and the
amount of discrimination that still exists. Despite years of equality legislation, discrimination
persists, robbing people of their rights and identities as individuals and preventing them from
challenging the stereotypes.
Conscious attitudes and beliefs can and do change!
The negative stereotypes associated with many immigrant groups, for example, have largely
disappeared over time. In the US, it is believed that for African-Americans, civil rights laws forced integration and non-discrimination, which, in turn, helped to change public opinion.
The first step in addressing unconscious bias may be to admit biases are learned early and go against
our commitment to just treatment. Many studies show that when people work together in a
structured environment to solve shared problems, their attitudes about diversity can change
dramatically.
There is also preliminary evidence that unconscious attitudes are ‘plastic’, or malleable. Imagining strong women leaders or seeing positive role models of black people has been shown to change unconscious biases (researchers are waiting to see whether the effect is long term).
Many test takers can "feel" their hidden prejudices as they perform tests like the IAT, by the time it
takes them to respond to some of the associations. If people are aware of their hidden biases, they
can monitor and attempt to manage these hidden attitudes before they are expressed through
behaviour. It is also likely that a change in behaviour can modify beliefs and attitudes.
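To make the response-time point above more concrete, here is a small illustrative sketch in Python. The numbers are invented and the calculation is a deliberate simplification rather than the scoring procedure Project Implicit actually uses; it only shows why slower responses on "incongruent" pairings are read as a hint of hidden bias.

from statistics import mean, stdev

# Hypothetical response times (in milliseconds) for two blocks of an
# association task: "congruent" pairings that fit a tester's existing
# associations, and "incongruent" pairings that conflict with them.
congruent_ms = [612, 587, 640, 595, 605, 630]
incongruent_ms = [742, 768, 729, 801, 755, 790]

slowdown = mean(incongruent_ms) - mean(congruent_ms)   # average extra delay
spread = stdev(congruent_ms + incongruent_ms)          # overall variability

print(f"Average slowdown on incongruent pairings: {slowdown:.0f} ms")
print(f"Slowdown relative to overall spread: {slowdown / spread:.2f}")

The larger the standardised slowdown, the stronger the implied automatic association; the real IAT applies a more careful version of this comparison, but the idea is the same.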
Unconscious bias and recruitment
A recent discrimination case highlights the risk of hidden bias to the recruitment process. Ms
Francis, of Afro-Caribbean origin, claimed she was passed over for promotion by her employer, the
London Probation Trust, in favour of a white female candidate. Ms Francis was unable to show a
history of racial slurs or poor treatment; however her claim was upheld at the tribunal because
evidence showed that the all-white panel had taken an inconsistent approach to scoring the two
candidates during the process. The tribunal concluded that there was a "de facto glass ceiling" in place
which prevented non-white candidates from achieving more senior roles. Other black employees
gave evidence that white staff often received informal sponsorship and encouragement, while black
staff did not.
It is unlikely that the events of this case stemmed from malicious intent on the part of the organisation’s management, but it does show how easily bias can become part of an organisation's culture without people realising.
More examples of unconscious bias
1) Confirmation Bias
We tend to prefer people who agree with us and to be put off by individuals, groups, data and news sources that make us feel uncomfortable or insecure about our views, a discomfort the psychologist Leon Festinger called cognitive dissonance. It's this preferential mode of behaviour that
leads to the confirmation bias — the often unconscious act of referencing only those perspectives
that fuel our pre-existing views, while at the same time ignoring or dismissing opinions — no matter
how valid — that threaten our world view. And paradoxically, the internet has only made this
tendency even worse.
2) Ingroup Bias
Similar to confirmation bias is ingroup bias, a manifestation of our innate tribalistic tendencies.
Ultimately, the ingroup bias causes us to overestimate the abilities and value of our immediate
group at the expense of people we don't really know.
3) Gambler's Fallacy
It's called a fallacy, but it's more a glitch in our thinking. We tend to put a tremendous amount of
weight on previous events, believing that they'll somehow influence future outcomes. The classic
example is coin-tossing. After flipping heads, say, five consecutive times, our inclination is to predict an increase in the likelihood that the next coin toss will be tails, as if the odds must now be in favour of tails. But in reality, the odds are still 50/50. As statisticians say, the outcomes of different tosses are statistically independent and the probability of either outcome is still 50%.
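As a quick illustration of that independence, the short Python sketch below (illustrative only; the function name and counts are our own) simulates a long run of fair coin flips and looks only at the flips that immediately follow five heads in a row:

import random

def tails_after_heads_streak(streak_length=5, flips=100_000):
    """Proportion of tails among flips that immediately follow a run
    of `streak_length` consecutive heads."""
    run = 0           # current run of consecutive heads
    follow_ups = 0    # flips that came straight after such a run
    tails = 0
    for _ in range(flips):
        flip = random.choice("HT")
        if run >= streak_length:
            follow_ups += 1
            if flip == "T":
                tails += 1
        run = run + 1 if flip == "H" else 0
    return tails / follow_ups

print(tails_after_heads_streak())   # hovers around 0.5, not above it

However long the preceding streak, the proportion of tails stays close to 0.5: the coin has no memory.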
4) Observational Selection Bias
This is the effect of suddenly noticing things we didn't notice much before and wrongly assuming that their frequency has increased. A perfect example is what happens after we buy a new car and inexplicably start to see the same car virtually everywhere, or when new parents suddenly see babies everywhere! Most people don't recognize this as a selection bias, and actually believe these items or events are happening with increased frequency.
5) Status-Quo Bias
We tend to be apprehensive of change, which often leads us to make choices that guarantee that
things remain the same, or change as little as possible. We like to stick to our routines, political
parties, and our favourite meals at restaurants. We may unconsciously assume that another choice
will be inferior or make things worse. The status-quo bias can be summed up in the saying, "If it ain't broke, don't fix it".
6) Negativity Bias
Social scientists have found that our attention is selective: given the choice, we perceive negative news as more important or profound. We also tend to give more credibility to bad news, perhaps because we're suspicious (or bored) of proclamations to the contrary. Steven Pinker, in his book The Better Angels of Our Nature: Why Violence Has Declined, argues that crime, violence, war and other injustices are steadily declining, yet most people would argue that things are getting worse, which is a perfect example of the negativity bias at work.
7) Bandwagon Effect
Though we're often unconscious of it, we prefer to go with the crowd. ‘Groupthink’ doesn’t have to
be a large crowd or the whims of an entire nation; it can include small groups, like a family or a
department at work. The bandwagon effect is what often causes behaviours, social norms, and
memes to propagate among groups of individuals — regardless of the evidence or motives in
support. This is why opinion polls are often maligned, as they can steer the perspectives of
individuals accordingly. Much of this bias has to do with our built-in drive to conform, as many
famous experiments have demonstrated.
8) Projection Bias
We tend to assume that most people think just like us — though there may be no justification for it.
This cognitive shortcoming often leads to a related effect known as the false consensus bias where
we tend to believe that people not only think like us, but that they also agree with us. We
overestimate how ‘typical’ and ‘normal’ we are, and assume that a consensus exists on matters
when there may be none.
9) The Current Moment Bias
Most of us find it hard to imagine what we will really want or do in the future and to alter our behaviours and expectations accordingly. Most of us would rather have pleasure now and pain later. This bias is of particular concern to economists (for example, our unwillingness to curb spending now and save for later) and to health practitioners.
10) Anchoring Effect
This is the tendency we have to compare and contrast only a limited set of items. It's called
the anchoring effect because we tend to fixate on a value or number that in turn gets compared to
everything else. For example, people will often select the second wine from the top of the wine list; we tend to weigh the difference in price, but not necessarily the overall price itself. This is why some restaurants, shops and other service providers offer expensive options, whilst also including more (apparently) reasonably priced ones. The anchoring effect helps to explain why, when given a choice, we tend to pick the middle option: not too expensive, and not too cheap.
Resources
The most effective tool available for testing one’s own unconscious bias is the Implicit Association
Test (IAT), created and maintained by Project Implicit, a consortium made up of researchers from
Harvard University, the University of Virginia, and the University of Washington.
The IAT was created more than 10 years ago and has now been used by millions of people in over 20
countries. Researchers at these three schools, as well as others, have used the test to study many
aspects of organizational and social performance, ranging from healthcare decisions to the
operations of the criminal justice system. To take the IAT, without charge, go to
https://implicit.harvard.edu/implicit/demo/
5 Points for Progress Toolkit – Know Yourself Unconscious Bias Tool
A free toolkit titled ‘5-Points for Progress’ is available to promote employee engagement and give
employers some practical tools and advice on how to be compliant with the Equality Act 2010.
- See more at: http://raceforopportunity.bitc.org.uk/tools-case-studies/toolkits/five-5-points-progress-toolkit-know-yourself-unconscious-bias-tool
Links
http://www.mckinsey.com/insights/strategy/taking_the_bias_out_of_meetings
http://www.mckinsey.com/insights/organization/the_global_gender_agenda
http://www.gladwell.com/blink/blink_excerpt2.html
Science faculty’s subtle gender biases favor male students – Why does John get the STEM job rather than Jennifer? http://www.pnas.org/content/109/41/16474.short
Reading
Lean In: Women, Work, and the Will to Lead, Sheryl Sandberg, 2013
Blindspot: Hidden Biases of Good People, Mahzarin R. Banaji and Anthony G. Greenwald, 2013
The Value of Difference: Eliminating Bias in Organisations, Binna Kandola, 2009
Blink: The Power of Thinking Without Thinking, Malcolm Gladwell, 2005
Thinking, Fast and Slow, Daniel Kahneman, 2011
Video
https://www.youtube.com/watch?v=nLjFTHTgEVU