Can you trust your instinct? (October 2009)

The South African Index Investor Newsletter
www.indexinvestor.co.za
October 2009
Dealing with the uncertain and unknown future:
Can you trust your instinct?
By
Daniel R Wessels
“To invest successfully over a lifetime does not require a stratospheric IQ, unusual business insights, or inside information. What’s needed is a sound intellectual framework for making decisions and the ability to keep emotions from corroding that framework.”
– Warren Buffett, foreword to The Intelligent Investor by Benjamin Graham
“The radical insight of behavioural finance is that people are human.”
– Werner De Bondt
Among the highly trained and skilled profession of actuaries there is a classic joke: What’s the difference between God and an actuary? Answer: God doesn’t think he (or she) is an actuary! This little joke encapsulates the core message: no matter your training or measured intelligence, you will most likely act and think like a human being when facing multiple choices under uncertainty and an unknown future. Formal training certainly helps, but above all it takes wisdom and a good dose of common sense to avoid the typical pitfalls we are prone to when dealing with the unknown and uncertain.
I am always a bit startled when hearing arguments that because Mr X or Ms Y is so bright and intelligent, they are bound to be successful in their later careers, as if intelligence on its own were a passport to success, while downplaying other critical aspects such as emotional intelligence, social networks, communication skills and pure luck (being at the right place at the right time). I am rather of the opinion that one’s self-confidence (knowing that I am bright) might even be an obstacle when dealing with uncertainty!
Overconfidence in our own abilities may instil a false belief that we always have a ready answer for any problem thrown at us. Moreover, we often trust that experts should know better than anyone else how the future will unfold. In fact, one can hardly imagine an expert claiming not to know, but in reality it would probably be more honest for that expert to acknowledge that he or she does not know how events will develop over time.
To be sure, I certainly believe in the value of experts in the fields of art, literature, mathematics, physics and other natural science disciplines, but once you are dealing with complex, non-linear and dynamic systems such as the economy and financial markets, the notion of an “expert” becomes far more vague and less reliable. That is not because these experts are less intelligent or less well trained than their colleagues in other disciplines, but simply because they are dealing with the uncertain and unknown, where current trends and relationships might deviate suddenly and significantly from the expected or well-trodden paths of the past.
Moreover, economists and financial experts such as money managers and advisors do not only operate in complex environments; the outcome of their advice or recommendations is often uncertain. To make matters worse, the feedback process is slow: no immediate feedback exists to evaluate their decision-making, and it often requires years of evaluation to judge whether the correct course of action was taken. Hence, given all this uncertainty, it is no surprise that experts in these disciplines are most likely to think and act like most of us: like human beings!
Over the years an alternative theory, behavioural finance, has developed within the economic and financial disciplines. It focuses specifically on the study of investors’ behaviour in capital markets, because data there are relatively easy to gather and the results are measurable. Behavioural finance investigates the psychology that underlies and drives financial decision-making. While I do not intend to give an expert overview, I will highlight the most important aspects that influence our financial decision-making, more often than not leading to suboptimal or wrong choices. The idea is to inform the reader along the lines of (pardon the phrase, I am borrowing it from some politician or historian, I think): “if you don’t know where you are coming from, you are unlikely to know where you are heading”!
The Efficient Market Hypothesis lies at the heart of modern economic and finance theory. Among the core assumptions underpinning this theory – that markets will always allocate and price scarce resources correctly – are that the participants (humans) are rational at all times, that none of the participants exercises a dominant market position, and that all participants share one common goal, namely maximising their economic utility (profit). The Credit Crunch of 2008 proved this theory to be a fallacy, or at best a gross over-simplification of market reality.
The rationality of people in modern economic theory rests upon Expected Utility Theory, which states that rational people should always act the same irrespective of how opportunities are presented to them, i.e. they would not act differently when facing potential gains or losses. Furthermore, people in general tend to avoid risk (they are risk averse) in maximising their gains, an idea based on the work of the mathematician Daniel Bernoulli (1700-1782), who developed the concept of declining marginal utility. Basically, the latter concept states that a gain of R1,000 is worth less to a rich man than to a poor man, even though both gain the same amount.
Figure 1: Expected Utility Theory
Source: Taylor, 2000
Behavioural finance, however, identified the loopholes in the above theory three decades ago. More specifically, it was two Israeli psychologists, Daniel Kahneman and Amos Tversky, who in the 1970s laid the foundation with their Prospect Theory, which explained how people typically make financial decisions under different sets of conditions. For example:
Option 1:
An 80% chance of winning R4,000 and 20% chance of winning nothing
Option 2:
A 100% chance of winning R3,000
While Option 1 has a higher expected outcome than Option 2 (80% x R4,000 = R3,200), 80% of the respondents chose the guaranteed return (Option 2). This is consistent with Expected Utility Theory, in that most people prefer a definite gain to a speculative, albeit higher, gain. But then Kahneman and Tversky re-phrased the options as follows:
Option 3:
An 80% chance of losing R4,000 and 20% chance of losing nothing
Option 4:
A 100% chance of losing R3,000
Interestingly, even though the expected loss of Option 3 (R3,200) is bigger than that of Option 4, Kahneman and Tversky found that more than 90% of the respondents chose the gamble (Option 3). Contrary to the prediction of Expected Utility Theory, people do not act the same when the circumstances or reference points differ. Basically, people are risk averse, but when facing losses they become risk seeking. For example, many investors are prepared to sell stocks on which they have made a reasonable profit, but at the same time are prepared to stick with poorly performing stocks in the hope of recouping their losses.
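The arithmetic behind this reversal is easy to verify. As a minimal sketch (purely illustrative Python; the probabilities and amounts are those quoted above):

```python
# Probability-weighted (expected) values of the four options above.
def expected_value(outcomes):
    """Sum of probability * amount over all possible outcomes."""
    return sum(p * x for p, x in outcomes)

option_1 = [(0.80, 4000), (0.20, 0)]    # gamble on a gain
option_2 = [(1.00, 3000)]               # certain gain
option_3 = [(0.80, -4000), (0.20, 0)]   # gamble on a loss
option_4 = [(1.00, -3000)]              # certain loss

print(expected_value(option_1))  # R3,200: yet ~80% chose the certain R3,000
print(expected_value(option_3))  # -R3,200: yet ~90% chose this gamble
```

The expected values are exact mirror images, yet observed choices flip from the certain gain to the risky loss, which is precisely the asymmetry Prospect Theory describes.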
Based on their findings, Kahneman and Tversky could therefore improve on the original Expected Utility
Theory model.
Figure 2: Prospect Theory
Source: Taylor, 2000
From the findings of Prospect Theory it became known that framing (the way something is phrased, for example gains versus losses) is of utmost importance in how people make financial decisions. In general people try to avoid losses, even if it means taking bigger risks and potentially bigger losses. A classic example from framing studies would be:
Option 1:
A 100% chance of losing R1,000
Option 2:
A 25% chance of losing R4,000, 75% chance of losing nothing
From Prospect Theory we know most people would opt for the second option. But let us be smart and re-phrase Option 1:
Option 1:
An insurance premium of R1,000 to avoid a 25% chance of losing R4,000
Option 2:
A 25% chance of losing R4,000, 75% chance of losing nothing
Surprise, surprise: the magic word “insurance” did the trick for most people, and 65% of the subjects opted for Option 1. Undoubtedly the word “insurance” offers a comforting mindset (regret aversion), hence the existence of a mega insurance industry.
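Note that the two framings are actuarially identical; a quick check (illustrative Python, using the figures from the example above):

```python
# Option 1 re-phrased as insurance: pay a certain R1,000 premium.
premium = 1000
# Option 2: face the gamble uninsured; its expected loss is the same R1,000.
expected_loss = 0.25 * 4000
print(premium, expected_loss)  # identical, yet 65% preferred the "insurance"
```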
An example of how people often make decisions narrowly and inconsistently is the following: people were asked whether they would accept a gamble that offered a 50% chance of winning $2,000 and a 50% chance of losing $500. Most of the respondents declined the bet (as predicted by Prospect Theory), even though the gamble had a positive expected outcome. But when they were told they could play the gamble five times, more than 60% of the people accepted the bet. In fact, when people were presented with the most likely outcomes of playing the gamble five times, more than 80% of the people accepted the gamble.
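Those “most likely outcomes” of five plays can be enumerated directly with the binomial distribution (a Python sketch, using the amounts from the example):

```python
from math import comb

# Each play: +$2,000 with probability 0.5, -$500 with probability 0.5.
n = 5
for wins in range(n + 1):
    payoff = wins * 2000 - (n - wins) * 500
    prob = comb(n, wins) * 0.5 ** n
    print(f"{wins} wins: payoff {payoff:+6d}, probability {prob:.4f}")
# Only the zero-win outcome (probability 1/32, about 3%) loses money overall,
# which is why seeing the five-play distribution made the bet attractive.
```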
Likewise, think of investment returns as a series of repeated gambles. Researchers such as Benartzi and Thaler carried out experiments to examine how people make investment decisions. For example, when the expected distribution of one-year equity returns – typically a very wide range of possible outcomes, both positive and negative – was shown, the median allocation to equity funds in the overall investment portfolio was 40%. Yet when the expected distribution of 30-year equity returns – where all the returns are positive and within a narrow band of possible outcomes – was shown, the median allocation to equity funds rose to 90%! Undoubtedly, the way (and frequency) in which information is presented to people has a strong influence on their investment decisions.
Figure 3: Distribution of annualised equity returns over various holding periods of 1 to 15 years (start and end dates between 1 July 1995 and 30 September 2009)
Source: DRW Investment Research
Let us turn to some of our own “internal weaknesses” or biases: the fact that we are human after all when we make judgements and decisions. First, we are not very good at estimating probabilities or adjusting for specific effects over multiple periods. For example, answer the following question within 5 seconds: given a piece of paper 0.1 mm thick, estimate how thick the paper would be if it were folded on itself 100 times. Answer: 127 x 10^21 km (yes, even in these days when we are not intimidated by billions (10^9) or trillions (10^12) anymore, we are talking here about astronomical distances!). Most people would estimate a few metres at most; that is because we tend to produce an estimate by imagining the first few folds and then adjusting for subsequent folds, ignoring the exponential impact of each doubling.
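The exact figure follows from 100 doublings of the original thickness; as a quick check in Python:

```python
# 0.1 mm doubled 100 times, converted to kilometres.
thickness_km = 0.1e-3 * 2 ** 100 / 1000   # metres -> km
print(f"{thickness_km:.3e} km")           # ~1.268e+23 km, i.e. ~127 x 10^21 km
```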
This tendency is generally known as anchoring, which proves to be a very powerful bias in the decision-making process. Researchers have shown that even when absurd anchors were used in experiments, they influenced results markedly compared with those of control groups not exposed to the same anchors. One of the most common effects of anchoring is underreaction, where people fail to react to new information quickly enough and tend to base their views mostly on recent experiences.
Furthermore, people in general do not like negative events and tend to underestimate their occurrence, even when statistics show the contrary; i.e. we may be aware of the real possibility of negative episodes, but prefer to leave them aside when formulating our base-case scenarios. Obviously, it is not easy to deal with negativity in our lives; our friends and family expect us to have a positive outlook, but that does not mean we should ignore realities.
Another classic mistake in our estimation of probabilities occurs when we are given only partial information, or when some information is emphasised more than the rest. For example, consider the following: Bill recently suffered a heart attack. Is he most likely to be 1) a 55-year-old, or 2) a 55-year-old smoker and couch potato? Most people would opt for the second option, but statistically option 1 should always be the correct choice, since option 2 is a subset of option 1. Likewise, many more people die of cancer each year than in car accidents, but because the latter receive more publicity, people automatically ascribe a higher probability to car-accident deaths than statistics would indicate.
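The statistical point here is the conjunction rule: a joint event can never be more probable than either of its components. A sketch with hypothetical base rates (the numbers below are made up purely for illustration):

```python
# P(A and B) = P(A) * P(B given A) <= P(A), whatever the conditionals are.
p_55 = 0.30                    # hypothetical P(victim is a 55-year-old)
p_smoker_given_55 = 0.50       # hypothetical conditional probabilities
p_couch_given_both = 0.60
p_conjunction = p_55 * p_smoker_given_55 * p_couch_given_both
print(p_conjunction <= p_55)   # True: the subset can never be more likely
```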
Secondly, most of us suffer from overconfidence to varying degrees and would only grudgingly admit that we are average, or even below average, in some respects. Interestingly, researchers found that overconfidence is at its greatest in areas of expertise, more specifically in difficult, complex tasks with low predictability and slow feedback! The reason behind this overconfidence is the belief that more information leads to better decisions; since experts have access to more information, they should be able to make better decisions. However, researchers have found that people with access to a lot of information do not necessarily make better decisions. For example, experiments showed that people with 40 pieces of information could not make better predictions than if they had only 5 pieces of information available.
However, even when people know they tend to be overconfident, the tendency is likely to persist and strengthen, because we are prone to two further biases, namely hindsight and confirmation bias. For example, today you may find many experts claiming that the Credit Crisis of 2008 was predictable or unavoidable (all the evidence pointed to overleveraging and irresponsible credit practices), yet before the event only a handful of experts warned of the financial system collapsing as it did. Hindsight bias makes events that did happen seem to have been predictable before the event; conversely, events that did not happen will be thought of as having been unlikely all along.
Confirmation bias leads people to seek evidence confirming their view, while ignoring evidence to the contrary. It may also lead us to see patterns where none actually exist. For example: you are required to establish the general rule behind the series 2, 4, 6, ... Most people would add two to the previous value (2, 4, 6, 8, 10, ...); others might conclude that the next number is obtained by adding the previous two numbers (2, 4, 6, 10, 16, ...), etcetera. However, the actual rule may simply be that each number is larger than the previous one!
A third “internal weakness” is that we have an uncanny way of compartmentalising our decision-making, otherwise known as mental accounting. Kahneman and Tversky demonstrated its importance in a number of experiments. For example, suppose you decided to see a play at the theatre for R100. At the theatre you discover that you have lost a R100 note. Would you still pay R100 for a ticket to the play? The vast majority of people indicated they would. On the other hand, suppose you had already bought the ticket and then realised you had lost it; the seat was not marked and the ticket cannot be replaced. Would you buy another ticket? In this case most people indicated they would not! Although in both cases the monetary loss was R100, the first loss was allocated to a different mental account than the purchase of a theatre ticket. Mental accounting and the money illusion are especially prevalent in financial decision-making. For example, say workers have the option of a R500 pay rise while their cost of living increases by R750, or alternatively a wage reduction of R250 with no increase in the cost of living. Clearly the two are the same, yet most workers would prefer the first option, because under the “wage increase compartment” they would be getting a R500 increase instead of nil!
It was the legendary investor Warren Buffett who once said: “investing is simple, but not easy”. What he meant was that investing per se will reward the patient investor, but he or she must realise that human nature (emotion or instinct) is often one’s worst enemy in investing. An investor must develop a framework to counteract his or her own shortcomings, typically those I described earlier. This is the difficult part of investing, because at times it will feel counterintuitive or like the wrong thing to do, while everybody else is jumping ship or applying for an additional bond on their house for greater market exposure.
We all know the simplest rule of investing: “buy low, sell high”. Yet we do not follow this simple rule, because at times of market depression we feel we should protect our wealth (play it safe), while at times of market buoyancy we want to share in the “never-ending” optimism of the market. We should therefore realise that as human beings we are psychologically hard-wired to buy high and sell low! After all, we are social beings; we want to own the hottest stocks and not be left out in the cold at social gatherings.
How can one then overcome emotional decision-making?
A simple solution would be to include index investing in your investment strategy, because it forces you not to make emotional choices. While an index strategy is not perfect either (for example, one could have too much exposure to an overvalued sector at specific times), active management is not a clear-cut alternative. First, you are again dealing with the frailties of human decision-making (as I explained earlier, experts are not robots); secondly, you have no guarantee that the manager will be able to protect your wealth during sharp downswings (market timing as a strategy only works very well with hindsight!); and thirdly, there is no assurance that the manager will outperform the market over time, which incidentally is no small feat!
Sources:
1) Nigel Taylor, 2000. “Making actuaries less human”. Presented to the Staple Inn Actuarial Society, January 18.
2) Robert Shiller, “Human Behavior and the Efficiency of the Financial System”.