Daniel Kahneman on the Two Kinds of Thinking, Fast and Slow
By Laurence B. Siegel
June 5, 2012
When advisors want to understand why their clients make seemingly
irrational financial choices, odds are they will find answers in the
research of Nobel-winning behavioral economist Daniel Kahneman.
But guiding clients toward a better financial future is only one way to
apply behavioral finance. Kahneman says we solve virtually all
problems, not just financial ones, with two distinct types of thinking.
His recent book, Thinking, Fast and Slow, was a 2011 bestseller. It
summarizes his lifetime of work on how the mind works, covering many
topics familiar to those who follow behavioral economics and finance:
prospect theory, overconfidence, loss aversion, anchoring, separate
mental accounting, the representativeness bias and the availability
bias.
Kahneman, who, at 78, is still teaching at Princeton, recently discussed these and other
discoveries at the 2012 CFA Institute Annual Conference, which took place in Chicago on
May 6-9.
I’ll look at how Kahneman’s research can be applied in the context of investing, but first
let’s examine the central subject of his book: our two ways of thinking.
Think fast! Or think slowly?
Try this experiment: Just before making a left turn in a busy intersection, begin to multiply
17 by 24. I’m kidding; please don’t. You’ll either quickly abandon the arithmetic problem or
wreck your car. But I’ll bet you can add two plus two while making a left turn without any
problem whatsoever.
What is the difference between the two tasks?
Most people would say that one of the tasks is easy and the other is hard. But Kahneman,
who won the 2002 Nobel Prize in economics for work relating economic decision-making
to psychology, says that there’s more to it – a substantive difference, not merely one of
degree.
Adding two and two is done using what Kahneman calls System 1 thinking, the kind of fast
thinking that feels like it is done on autopilot. The product of 17 and 24 is arrived at using
System 2 thinking – slow, deliberate thinking that involves an entirely different
physiological process, one that (for example) interferes with driving a car.
When you engage in intense System 2 thinking, Kahneman says, something happens to
your body. Your pupils dilate. Your heart rate increases. Your blood glucose level drops.
You become irritable if someone or something interrupts your focus. You become partially
deaf and partially blind to stimuli that ordinarily command your attention. Kahneman writes
that “intense focusing on a task can make people effectively blind.”
My grown son recently reported an occurrence of a related phenomenon, blindness
caused by having made up one’s mind. While he was preparing to perform in a concert,
his girlfriend paid him a surprise visit, hundreds of miles from either his home or hers.
Despite increasing efforts to recognize the strangely familiar person approaching him from
a distance, he couldn’t figure out who she was until she was quite close. There is nothing
wrong with my son’s eyesight. Having decided, using System 1 thinking, that his girlfriend
was far away, he found it effectively impossible to see her until she was right under his
nose. He was unable to invoke System 2 thinking to figure out that maybe she had taken
an unplanned trip. He was temporarily blinded by an idea.
A young boy’s puzzlement about human nature
Kahneman, who was born in Tel Aviv but grew up in Nazi-occupied France, recalls that he first
became interested in psychology when, as a young child, a German police officer asked to
talk to him. Rightly terrified of the officer, young Danny Kahneman, a Jew, discovered that
the officer was interested in him because he reminded the officer of his own son, who had
died. The officer became very emotional when conversing with Danny, gave him money,
and kept him safe.
From that point forward, Danny decided to figure out what made people tick.
Thinking, Fast and Slow reads like a primer, romping through familiar territory, but that is
because Kahneman was instrumental in discovering much of what he discusses. Younger
scholars such as Richard Thaler, Shlomo Benartzi, Hersh Shefrin, and Meir Statman may
have gotten to the reader first, but Kahneman and his late collaborator, Amos
Tversky, are the true source of most of these insights.
Systems 1 and 2 in focus
The phrase “what you see is all there is,” a play on the old adage “what you see is what
you get,” runs through the book like a mantra to describe System 1 thinking. System 1
takes visible evidence as the only source of knowledge, and ignores hidden evidence.
Centered in the brain’s amygdala (a part of the limbic system or “reptile brain”), System 1
evolved in response to the need to obtain quick answers. Over here is a tiger: danger!
Over there is a pheasant: delicious! Those who needed to think slowly and carefully to
arrive at these conclusions did not survive to become our ancestors.
System 2 is more complex, and resides in the brain’s prefrontal cortex, which is well
developed in humans but not in other animals. System 2 recognizes that what you see is
not all there is. Is Steve, “a meek and tidy soul, with a need for order and structure, and a
passion for detail” a businessman or a librarian? While System 1, spotting the
resemblance between the description and the librarian stereotype, shouts out “librarian,”
System 2 recognizes that the number of librarians, relative to businessmen, is tiny and that
Steve is actually more likely to be a businessman, despite personality traits that might
have made library work a better fit.
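The force of the base-rate argument is easy to see with some made-up numbers. In the sketch below, the 20-to-1 ratio of businessmen to librarians and the 4-to-1 likelihood ratio favoring the librarian stereotype are my own illustrative assumptions, not figures from the book; even with the stereotype pulling strongly toward “librarian,” the base rate still makes “businessman” the better bet.

    # Illustrative only: these ratios are assumptions, not figures from Kahneman's book.
    prior_businessman = 20 / 21      # assume 20 businessmen for every librarian
    prior_librarian = 1 / 21
    p_desc_given_librarian = 0.8     # assume the "meek and tidy" description fits librarians well...
    p_desc_given_businessman = 0.2   # ...and businessmen poorly (a 4-to-1 likelihood ratio)

    # Bayes' rule: probability that Steve is a librarian, given the description
    numerator = p_desc_given_librarian * prior_librarian
    denominator = numerator + p_desc_given_businessman * prior_businessman
    p_librarian = numerator / denominator
    print(f"P(librarian | description) = {p_librarian:.0%}")   # roughly 17%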
In Kahneman’s telling, System 2 clearly produces the superior answers, at least in most
situations. A great deal of Thinking, Fast and Slow is devoted to demonstrating, through
psychological experiments, how System 1 gets it wrong. “How many animals of each kind
did Moses take into the Ark?” “Two,” says System 1. “You’re trying to fool me,” says
System 2. “It was Noah.”
These two conflicting brain functions behave differently in noteworthy ways. System 1
doesn’t mind working all the time, for example, because its work is not that hard. When
System 2 is put to work, it requires so much effort that it takes over the whole body, so it
goes to work only reluctantly. People do not shy away from solving problems requiring a
quick, automatic reaction but, perhaps because they anticipate the physical strain
described earlier, they procrastinate in working on questions that require careful thought.
No wonder young kids hate word problems in math class: word problems test the ability to
puzzle out what math problem the questioner wants solved, a task much harder than doing
the underlying arithmetic.
Malcolm Gladwell’s beautifully written Blink is essentially an argument that System 1
thinking produces the superior answers. When Kahneman catalogs the errors of System 1
thinking, some laughable and some tragic, it becomes obvious that Gladwell’s celebration
of snap judgment is terribly flawed. I am being a bit unfair to Gladwell because he does
expend some effort identifying when quick thinking goes awry. But his antagonist, David
Adler, whose book, Snap Judgment, is a response to Gladwell, makes the far better case
that judgments rendered in the blink of an eye are usually wrong, and that it is necessary
to apply System 2 thinking if one is serious about coming to sensible answers to most
questions.
Does economics really depend on perfect rationality?
If there is a weakness in Thinking, Fast and Slow, it is in Kahneman’s critique of standard
economics as relying on unrealistic assumptions. He writes,
[An] essay by a Swiss economist named Bruno Frey, which discussed the
psychological assumptions of economic theory, [begins]: “The agent of
economic theory is rational, selfish, and his tastes do not change.”
I was astonished. My economist colleagues worked in the building next door,
but I had not appreciated the profound difference between our intellectual
worlds. To a psychologist, it is self-evident that people are neither fully
rational nor completely selfish. … Our two disciplines seem to be studying
different species, which the behavioral economist Richard Thaler dubbed
Econs and Humans.
But economics is supposed to rely on unrealistic, simplifying assumptions! Otherwise it
would be mathematically intractable. In this respect, it is like physics, which assumes a
frictionless world to get the first-order solutions to problems. You would never get to the
second-order solutions, which include the effects of friction and other imperfections,
without the first-order ones.
Most economists don’t really believe that people are perfectly rational and completely
selfish – at least I hope they don’t – but such simplifications are necessary for getting
started in economic analysis, and the simplified approach predicts the operation of supply
and demand, which are the heart of economics, quite well. Most economists have long
acknowledged that a more sophisticated analysis, one that takes into account the
psychological factors that Kahneman stresses, is needed for certain types of problem
solving.
The success of an economic theory is determined by how well it explains and predicts
phenomena, not by the realism of its assumptions. Milton Friedman’s classic essay “The
Methodology of Positive Economics,” a foundational work of economic methodology, explains this
principle in beautiful detail, so I don’t need to.
Confused about Bayesian statistics?
Some of the pleasures of reading Kahneman’s book are off the main track of his
arguments. Maybe I wasn’t the best statistics student 35 years ago, but Bayesian
inference has always seemed to me like guesswork – or, worse, deliberate obfuscation.
(Thomas Bayes, a leading 18th century statistician, presented a formula by which one’s
“prior belief” about the likelihood of an event could be combined with observed data to
arrive at the correct likelihood.) The idea that one could have a meaningful “prior belief” is
what bothered me; but no more.
Kahneman is the only person ever to explain Bayesian statistics so I could understand it.
To paraphrase his example, consider a cab involved in a hypothetical late-night hit-and-run.
There was an eyewitness to the crime, who identified the cab as blue. We learn from
further research that 85% of cabs in the area are green and 15% are blue, and we learn
that the witness can correctly distinguish the two colors at night 80% of the time.
Kahneman explains how Bayesian statistics helps us in this case:
In the absence of a witness, the probability of the guilty cab being blue is 15%,
which is the base rate of that outcome. If the two cab companies had been equally
large, the base rate would be uninformative and you would consider only the
reliability of the witness, concluding that the probability is 80%. The two sources
can be combined by Bayes’ rule. The correct answer is 41%.
In other words, the prior expectation comes from a different empirical (and very real)
source: the number of taxicabs of each color. It is not a mysterious “belief” or prejudice,
but an independent source of information that would be the only source if there were no
witness to the accident. The fact that the witness identified the guilty cab as blue does not
mean that it was blue, because he has been shown to be imperfect (but nonetheless pretty
good). The witness’ report means the likelihood that the guilty cab is blue is much higher
than suggested by the fact that only 15% of the cabs are blue.
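For readers who want to see where the 41% comes from, here is a minimal sketch of the arithmetic, using only the two numbers Kahneman supplies (the 15% base rate and the 80% witness reliability):

    # Kahneman's taxicab example: combine the base rate with the witness's reliability.
    p_blue = 0.15       # base rate: 15% of cabs in the area are blue
    p_green = 0.85      # the other 85% are green
    p_correct = 0.80    # the witness identifies colors correctly 80% of the time

    # P(witness says "blue") = correct identification of a blue cab
    #                        + mistaken identification of a green cab
    p_says_blue = p_correct * p_blue + (1 - p_correct) * p_green

    # Bayes' rule: P(cab is blue | witness says "blue")
    p_blue_given_report = (p_correct * p_blue) / p_says_blue
    print(f"P(blue | witness says blue) = {p_blue_given_report:.0%}")   # about 41%

The witness’s report raises the probability from 15% to about 41%: higher, but still well short of the 80% that intuition suggests.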
Thank you, Dr. Kahneman. And phooey on those statistics professors who didn’t explain
what they meant by “prior belief,” making it sound like a feeling or thought, not an
independent piece of data. Maybe sometimes the Bayesian prior is just a feeling, but that
is almost certainly not what Thomas Bayes had in mind.
Behavior and investing
Kahneman’s insights are valuable to the investment management process. If investors
make predictable errors, such as paying too much for popular growth companies or
extrapolating recent returns into long-term asset-class forecasts, then one can trade
against these errors and make money. (By “make money” I mean earn true alpha, the
economic profit that remains after all market-related or beta returns have been subtracted,
and after accounting for all costs.)
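To make that parenthetical definition concrete, here is a rough sketch of a single-factor (CAPM-style) alpha calculation; all of the return, beta, and cost numbers are hypothetical, chosen only for illustration.

    # Hypothetical numbers, for illustration only.
    portfolio_return = 0.09   # 9% realized portfolio return
    market_return = 0.07      # 7% benchmark (market) return
    risk_free_rate = 0.02     # 2% risk-free rate
    beta = 1.1                # the portfolio's sensitivity to the market
    costs = 0.01              # 1% in fees and trading costs

    # True alpha: what remains after subtracting the market-related (beta) return and all costs
    expected_from_beta = risk_free_rate + beta * (market_return - risk_free_rate)
    alpha = portfolio_return - expected_from_beta - costs
    print(f"True alpha = {alpha:.2%}")   # 0.50% in this made-up case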
But while many proponents of behavioral finance are confident that their insights can be
turned to making money, Kahneman is not. When asked at the CFA conference what can
be accomplished by knowing the many errors to which behavioral finance says investors
are prone, he was blunt. “Very little,” he told his audience. “I have 40 years of experience
with this, and I still commit these errors. Knowing the errors is not the recipe [for] avoiding
them.” Kahneman is more optimistic about using his insights to improve the decision-making
processes of organizations than he is about providing actionable insights to
individuals.
Investors should understand all sides of the behavioral debate. Behavioral economics
and finance have recently become so popular that many investors do not realize there is
another side to the question. Perhaps it is more productive to analyze securities and
markets as if investors were making rational decisions based on all the information
available to them – even if we know they’re not.
Certainly behavioral finance cannot help us all become better investors; if about half of all active
managers underperform the relevant market index, those investors would be better off
recognizing their limitations and improving their performance by indexing. But then the
alpha source for the other half of the investor population, the winning half, would
disappear! Recognizing that humans err in processing information is not a panacea.
Sentiment and science
I am baffled as to what people get out of most psychology books, which I place in two
unhelpful categories: “Don’t worry, be happy” and “You want to kill your father and marry
your mother.” Kahneman’s work is a delightful exception.
His approach to psychology is real science, involving testable (and falsifiable) hypotheses,
controlled experiments, and appropriately modest findings. The investment manager and
behavioral finance expert Arnold Wood compares Kahneman’s achievements favorably to
those of Sigmund Freud. While there is room for both approaches, psychology – at least
as the public understands it – has long suffered from an excess of sentiment. The
injection of scientific discipline in the spirit of Daniel Kahneman’s work is long overdue.
Laurence B. Siegel is research director of the Research Foundation of CFA Institute and
senior advisor at Ounavarra Capital LLC. Before he retired in 2009, he was the director of
research for the investment division of the Ford Foundation, which he joined in 1994 from
Ibbotson Associates, a consulting firm that he helped to establish in 1979.
www.advisorperspectives.com
For a free subscription to the Advisor Perspectives newsletter, visit:
http://www.advisorperspectives.com/subscribers/subscribe.php
© Copyright 2012, Advisor Perspectives, Inc. All rights reserved.