Threats to Rational Decision-Making in Local Government Finance

Bree E. Nation and Tammy R. Waymire

This article has been republished in its entirety with permission from the Journal of Government Financial Management (Winter 2015, Vol. 64, No. 4). Copyright (2015) Association of Government Accountants. AGA® and the Journal of Government Financial Management® are registered trademarks. All rights reserved. Please visit www.agacgfm.org for more information.
Decision-making that is both rational and consistent with ethical values is a noble goal for government financial managers. Ethical challenges arise when there are conflicts of interest that provide the decision-maker an incentive to select a course of action that is not in the best interest of the government and its constituents. While many decisions involve conflicts of interest, all decisions are subject to threats to rational decision-making (see Figure 1). For example, decision-making is inherently challenged by constraints on our mental capacity and the need to reduce some decisions to an automated process free of time-consuming analysis. Furthermore, there are inherent biases that limit our ability to make rational decisions. Most people fail to recognize these challenges in their own decision-making. Consider, for example, the following short problem:

A bat and a ball cost $1.10. The bat costs $1.00 more than the ball. How much does the ball cost?

Did you conclude the ball costs $0.10? If so, you are among the many who answer this way, although you are incorrect. On its face, it seems like an easy problem; but Nobel Prize-winning economist Daniel Kahneman reports that the majority of people answer it incorrectly.1,2

Figure 1: Decision-Making Framework. [Figure: an issue/problem flows into decision-making, which produces an outcome with a feedback loop back to the issue/problem. Conflicts of interest (sometimes present) and biases/threats (always present) bear on decision-making. Conflicts of interest arise where the decision-maker's and the organization's incentives diverge, and ethical dilemmas result. Examples of biases/threats: System 1 v. System 2, overconfidence, anchoring, escalation of commitment, confirmation trap.]

If we use
algebra to solve the problem, we find:
X = Cost of the ball
X + (X + 1.00) = 1.10
2X + 1.00 = 1.10
2X = 0.10
X = 0.05
Correct answer: The ball costs $0.05,
and the bat costs $1.05, for a total cost of
$1.10. So what gives? Kahneman’s life’s
work has centered on decision-making,
and more specifically, threats to rational
decision-making. In describing the two ways we make decisions, he concludes that most individuals answer this problem using System 1 thinking,
which is reflexive, almost automated
thinking. Automated thinking is easier,
and in this example would cause us to
reach the incorrect conclusion that the
ball costs $0.10 and the bat costs $1.00.
In contrast, System 2 thinking is more
deliberate and requires more effort (and
perhaps some algebra). In short, we
answer the problem above incorrectly
because we fail to identify the need for
the kind of analysis only System 2 can
provide.3
It is not easy to identify which system
to employ when making decisions.
If, as in the previous example, we use
System 1 thinking to solve a System
2 problem, we may reach the wrong
answer. As a local government example,
managers may be pressed to pursue
economic development initiatives when
opportunities arise (e.g., a corporation
that will bring jobs to an area requests
tax abatements). In this situation, local
governments may forego the necessary
analysis to make a sound decision.
Alternatively, if we use System 2
thinking, which takes longer, to answer
a System 1 problem, the additional time
required to arrive at a decision may
prove costly. It may be all too tempting
to avoid budget cuts necessitated by
economic conditions, as such decisions
invariably involve personnel, but delays
may translate into even more difficult
cuts in future. Within local government
finance, wrong decisions of either type
can have negative consequences. The
Government Accountability Office
reports a dismal fiscal outlook - both
in the short- and long-term - for state
and local governments.4 Gaps between
spending and revenue, driven by
stagnant growth in property tax receipts
and upward pressure on healthcare
costs, require decisions that are both
expedient and evidence-driven.
Local government finance decisions
are further complicated by other
potential threats to rational decision-making that plague all disciplines and
also our personal lives. In the sections
that follow, we discuss four threats
of particular concern in this setting:
(1) overconfidence, (2) anchoring, (3)
escalation of commitment, and (4)
confirmation trap. In describing each
threat, we provide and analyze a local
government finance example. Although
some of the evidence in the decision-making literature is fairly dismal in its assessment of the ability of individuals to mitigate these threats, one thing is certain: awareness of these threats is a requisite condition of rational decision-making.
Overconfidence: Of Course We
Know the Answer!
Overconfidence may be the most
destructive threat to decision-making,
as it offers the environment necessary
for other threats to thrive. Being
overconfident in our decisions may
prohibit us from applying additional
analysis needed to make a good decision.
Take the bat-and-ball problem. If you
reached the common incorrect answer
(ball cost = $0.10) and were asked to
give a confidence estimate after you
reached that answer, you probably
would estimate the probability of a
correct answer close to 100 percent.
This form of overconfidence is referred
to as overprecision.
Within local government finance,
professionals may be prone to overconfidence because they have significant
education, experience and training. It
may seem counterintuitive to suggest
that expertise can be a detriment, but
one study concludes participants were
just as overconfident in domains where
they were experts as in domains where
they were not.5 While we might expect
experts to be confident, it is alarming that
experts may not be aware of the extent
(and, more importantly, the limitations)
of their own knowledge. One study
analyzed more than 10,000 forecasts
made by financial officers of thousands
of firms and concludes only 33 percent
of the time did actual market returns
fall within the executives’ forecasted
80-percent confidence intervals.6 Overprecision makes individuals, even experts,
too sure of their judgments, too reluctant
to take advice from others and too quick
to act on their opinions.
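
To make overprecision concrete, consider the short Python sketch below, which checks how often actual outcomes land inside the intervals a forecaster claimed as 80-percent confidence intervals. The numbers are invented for illustration and are not drawn from the study cited above; a well-calibrated forecaster's 80-percent intervals should capture roughly 80 percent of outcomes.

# Illustrative (hypothetical) check of interval calibration.
# Each tuple is (lower bound, upper bound, actual outcome); figures are invented.
forecasts = [
    (2.0, 4.0, 5.1),
    (1.5, 3.5, 0.9),
    (0.0, 6.0, 4.2),
    (3.0, 5.0, 6.8),
    (2.5, 4.5, 4.0),
]

hits = sum(1 for low, high, actual in forecasts if low <= actual <= high)
hit_rate = hits / len(forecasts)

print(f"Claimed coverage: 80%   Actual coverage: {hit_rate:.0%}")
# If actual coverage falls well below the claimed 80 percent, the intervals
# were too narrow -- the forecaster was overprecise.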
In the governmental environment,
overprecision can pose significant risks.
Government managers are tasked with
not only making decisions regarding the
net costs of a particular project or activity,
but also with assessing the probability
that their decisions are correct. From
decisions to outsource to assessments
of security threats, managers are likely
to assess the probability of success too
high. Outsourcing decisions, such as
the city of Chicago’s acceptance of a
lump-sum payment in exchange for
managing 36,000 parking meters, are
often the subject of debate. In this
example, the city accepted a lump-sum
payment of $1.15 billion in exchange for
operating and maintaining the parking
meters.7 The decision was challenged
by many, with a conservative estimate
by Chicago’s Office of the Inspector
General that the lump-sum payment to
the city was almost $1 billion less than
fair value.8,9 How could these estimates
be so far apart? Overconfidence in the
form of overprecision likely plagued
the decisions of all parties. While the
city did not release confidence intervals
around their estimates of the value of the
transaction, all news releases suggested
full confidence (i.e., 100 percent
certainty) that the transaction was in
the best interest of taxpayers. This bias
is therefore particularly concerning
in government decisions,
as elected officials are expected to be
confident, and a wider confidence
interval surrounding a point estimate
may be viewed as a weakness.
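
For a sense of scale, the figures cited in this article and in the Inspector General's report (see endnote 9) imply the rough arithmetic below; the sketch is a back-of-the-envelope illustration, not an analysis of the transaction.

# Back-of-the-envelope arithmetic using figures cited in the article:
# a $1.15 billion lump-sum payment that the Inspector General conservatively
# estimated fell about $974 million short of fair value (see endnote 9).
lump_sum = 1.150    # billions of dollars received by the city
shortfall = 0.974   # billions of dollars below estimated fair value

implied_fair_value = lump_sum + shortfall
share_received = lump_sum / implied_fair_value

print(f"Implied fair value: ${implied_fair_value:.2f} billion")   # about $2.12 billion
print(f"Share of fair value received: {share_received:.0%}")      # about 54%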
Another aspect of overconfidence,
overestimation, has to do with one’s
tendency to think a situation or event is
better than it actually is. A component
of this bias is the planning fallacy. There
is a common tendency to overestimate
the speed with which one can complete
projects and tasks.10 As governments take
bids for major projects, they are affected
by this fallacy, as the entire process encourages expressions of confidence. Contractors often
recognize that they must express more
confidence than their rivals. Those who
express confidence are viewed as more
credible and trustworthy. One study
used data from infrastructure projects
in various countries and confirmed
the tendency to underestimate costs
and project duration.11 Additionally,
government officials who propose
that projects be performed in-house face pressures and biases similar to those of contract bidders as they aim to rally
support for their project. In either of
these scenarios (contractor v. in-house),
government managers should be aware
of the role of overconfidence in setting
project budgets and timelines.
Anchoring: Be Careful What You
Hitch Your Wagon To!
Anchoring is an equal opportunity
threat to decision-making. Anchoring
behavior can be observed in people of
all ages, educational backgrounds and
demographics. Although not necessarily
a bad thing, anchoring implies that
individuals develop estimates by starting
with an initial anchor and adjusting
from that anchor as new information is
received. In local government finance,
we note certain purposeful anchoring
behavior, particularly in the setting of
budgets. Incrementalism is an accepted
method of budgeting entirely consistent
with anchoring, i.e., start with the prior
year budget and adjust accordingly.
This form of budgeting simplifies the
decision-making process and provides
some degree of control over long-term
spending. Other budgeting methods may
have intuitive appeal, but likely require
more analysis than incrementalism.
For example, zero-based budgeting
purposefully eliminates the anchor of
prior year budgets and instead links
budgets to needs based on actual activity.
The downside? This approach requires
more System 2 thinking.
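
As a rough illustration of the anchoring built into incrementalism, the sketch below contrasts an incremental budget (prior-year amount plus a flat adjustment) with a zero-based budget built up from expected activity. Department names, amounts, rates and unit costs are hypothetical.

# Hypothetical comparison of incremental vs. zero-based budgeting.
# All figures are invented for illustration.
prior_year_budget = {"parks": 1_200_000, "streets": 2_500_000, "it": 800_000}

# Incremental budgeting: anchor on last year's budget, adjust by a flat rate.
adjustment_rate = 0.02  # assumed 2% across-the-board increase
incremental = {dept: round(amount * (1 + adjustment_rate))
               for dept, amount in prior_year_budget.items()}

# Zero-based budgeting: ignore the prior year and build up from expected
# activity (units of service) times an estimated unit cost.
expected_activity = {"parks": 9_000, "streets": 31_000, "it": 5_500}  # units
unit_cost = {"parks": 115, "streets": 85, "it": 160}                  # $ per unit
zero_based = {dept: expected_activity[dept] * unit_cost[dept]
              for dept in prior_year_budget}

for dept in prior_year_budget:
    gap = zero_based[dept] - incremental[dept]
    print(f"{dept:8s} incremental={incremental[dept]:>10,} "
          f"zero-based={zero_based[dept]:>10,} gap={gap:>+10,}")
# A positive gap suggests the anchored (incremental) budget underfunds the
# department relative to its activity; a negative gap suggests overfunding.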
The problem with anchoring is that individuals overwhelmingly fail to
sufficiently adjust to new information.
In fact, even in the case of zero-based
budgeting, managers are certainly
aware of prior budget amounts and
may still anchor to previous budgets
in setting new budgets. An overfunded
department (i.e., budget exceeds
need) will benefit from this bias, and
an underfunded department will suffer.
More troubling is that anchors may
not even be rational. In an experiment,
Tversky and Kahneman rigged a wheel
with numbers 0 to 100 so it would only
stop on 10 or 65. Participants spun the
wheel, wrote down their number (10 or
65), and were asked two questions: “Is
the percentage of African nations among
UN members larger or smaller than the
number you just wrote?” and “What
is your best guess of the percentage
of African nations in the UN?”12 The
wheel cannot possibly offer useful
information to the participants, but it
was not ignored as it should have been.
On average, participants who spun 10 estimated that African nations make up 25 percent of UN members, while participants who spun 65 estimated 45 percent. If individuals
will anchor unintentionally to irrelevant
information, they may also anchor to
seemingly relevant information (prior
year budget), even if they are unaware
they are doing so.
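
One simple way to summarize how strongly the irrelevant anchor pulled participants' answers in the experiment above is to compare the spread in average estimates with the spread in anchors, as in the sketch below. Treating this ratio as an "anchoring index" is a convention from the broader judgment literature, not something introduced in this article.

# Pull of the irrelevant anchor, using the averages reported for the
# wheel-of-fortune experiment: anchors of 10 and 65, mean estimates of 25% and 45%.
low_anchor, high_anchor = 10, 65
low_estimate, high_estimate = 25, 45

# 0 would mean the anchor was ignored; 1 would mean answers moved
# one-for-one with the anchor.
anchoring_index = (high_estimate - low_estimate) / (high_anchor - low_anchor)
print(f"Anchoring index: {anchoring_index:.2f}")  # roughly 0.36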
Escalation of Commitment:
Doubling Down When You Should
Let Go
Continuing to invest resources in a
failing project is inconsistent with
rational decision-making, but pervasive
in organizations - both business and
government. We fail to correctly
eliminate sunk costs from our analysis,
and we want to believe the decisions we reached were correct and that, with time, any losses will turn around. In an
experiment that demonstrated the
power of escalation of commitment,
participants were assigned to one of two
departments in an organization, with
one of the departments responsible for
an operating decision and the other
not responsible for the decision.13
Participants in each department were
further divided into two additional
groups - one was told the decision
was successful and the other told the
decision was unsuccessful. While
participants in both departments who
were told the decision was successful
allocated approximately the same
amount to continue the project,
differences emerged for those who were
told the decision had been unsuccessful.
Participants in the responsible department who were told the decision had been unsuccessful to date allocated more resources to the project than did participants in the non-responsible department.
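
The rational benchmark that the responsible participants drift away from can be written as a simple incremental comparison: ignore what has already been spent and weigh only the remaining costs against the remaining expected benefits. The figures in the sketch below are hypothetical.

# Hypothetical go/no-go check that ignores sunk costs: only money still to be
# spent and benefits still to be received should matter to the decision.
sunk_cost = 6_000_000                    # already spent; should not sway the choice
remaining_cost = 4_000_000               # still required to finish the project
expected_remaining_benefit = 3_000_000   # value expected if the project is finished

if expected_remaining_benefit > remaining_cost:
    print("Continue the project")
else:
    print(f"Stop, regardless of the ${sunk_cost:,} already spent")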
This threat to rational decisions is
equally pervasive in local government.
Local governments face decisions such
as selecting contractors, evaluating progress on those contracted projects, and deciding whether and when to cut losses on a particular project. On
a grand scale, governments facing
decisions whether to host the Olympics
or other similar events are predisposed
to escalation of commitment to a
failing course of action. For example,
British Columbia made the decision
to host the 1986 World Exposition on
Transportation and Communication, or
Expo 86. The event was planned far in advance; projected losses were estimated at $6 million in 1978 but had escalated to more than $300 million by 1985.14
Despite the increasing losses, British
Columbia held tight to the decision
to host Expo 86. In this and other
government examples, psychological
biases toward supporting a previous
decision and social pressures to save
face may lead to expensive decisions
that divert resources from potentially
successful projects to proven failing
projects.
Confirmation Trap: Seeking Support
for a Decision Already Made
We like to be right. This leads us to search
for evidence supporting decisions we
have made, and to ignore disconfirming
evidence. This confirmation trap is
closely related to two of the prior
biases discussed - overconfidence and
escalation of commitment. Whether
expert or novice in a subject matter,
individuals are overconfident in their
decisions, which may lead them to look
for confirming evidence after the fact.
And, we continue to allocate resources
to projects in the face of evidence
of losses suggesting the project is a
failing one, in part because we want to
demonstrate that the initial decision to
pursue the project was correct. While
a previous decision may be viewed as a
done deal, how we respond to evidence,
particularly that which is disconfirming,
may shape future decisions.
Every local government decision
previously identified is at risk of the
confirmation trap. Local governments
choose projects, select contractors,
decide whether to continue projects, set
budgets, choose economic development
initiatives, and hire and attempt to
retain talented professionals. Each of
these decisions presents opportunities
for managers, after the fact, to
demonstrate their decision was the
correct one. Presumably, Chicago's decision to outsource the operation of its parking meters and British Columbia's decision to host Expo 86 were followed by confirmation traps.
Hope for Eliminating Barriers to
Rational Decision-Making
While Kahneman suggests these
biases are pervasive and persistent,
there are some mitigating steps that
can be taken in our pursuit of rational
decision-making. We make two primary
recommendations: (1) evidence-based
decision-making, and (2) independent
reviews. An evidence-based approach
valuing data and analyses is essential.
Furthermore, our propensity to think
too highly of our own decisions calls
for reviews of critical decisions by an
outside department or another local
government. These steps to reduce
barriers can be taken in formal settings
like Government Finance Officers
Association meetings or merely in informal discussions at lunch meetings.
These two recommendations may serve
to mitigate our overconfidence, as well
as our anchoring behavior, escalation
of commitment, and seeking out of
confirming (rather than disconfirming)
evidence.
About the Authors
Bree Nation, MAS,
is a federal tax
associate in KPMG’s
Chicago office.
Tammy Waymire,
Ph.D., CPA, is an
associate professor
of accountancy at
Northern Illinois
University. Prior to
entering academia,
she worked as an
auditor of governmental and non-profit
organizations, and as an expert witness
in Medicaid fraud cases and utility rate
cases.
Endnotes
1. Kahneman, D. (2011). Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux.
2. Kahneman posed the bat-and-ball problem to thousands of university students at both elite and less selective universities. More than 50 percent of students at the prestigious universities responded with the intuitive, yet incorrect, price of the ball. At the less prestigious universities, more than 80 percent of students failed to engage their System 2 logic and instead responded with the intuitive price of the ball.
3. See Endnote 1.
4. U.S. Government Accountability Office
(GAO). (2013). State and Local Governments’
Fiscal Outlook. Washington, DC: GAO.
5. McKenzie, C.R.M., Liersch, M.J., and
Yaniv, I. (2008). Overconfidence in interval
estimates: What does expertise buy you?
Organizational Behavior & Human Decision
Processes 107 (2): 179-191.
6. Ben-David, I., Graham, J.R., and Harvey,
C.R. (2010). Managerial miscalibration.
Unpublished Manuscript.
7. Goldsmith, S. (2010). Chicago’s parking
meter mishap: Successful ‘fiasco.’ Governing.
http://www.governing.com/columns/mgmtinsights/Chicago-Parking­Meters.html.
8. City of Chicago Office of the Inspector
General (OIG). (2009). Report of Inspector
General’s Findings and Recommendations: An
Analysis of the Lease of the City’s Parking Meters.
Chicago, IL: Chicago OIG.
9. The City of Chicago Office of the Inspector
General reported in 2009 that the parking
meter agreement was driven in large
part by short-term budgetary pressures.
Furthermore, the Inspector General
conservatively estimated the amount by which
the negotiated amount fell short of the fair
value of the agreement was $974 million. The
negotiations were reported to be rushed, with
little time allocated to analysis, resulting in
the decreased amount received by the City.
10. Buehler, R., Griffin, D., and Ross, M. (1994).
Exploring the “planning fallacy”: Why people
underestimate their task completion times.
Journal of Personality and Social Psychology 67
(3): 366-381.
11. Flyvbjerg, B., Bruzelius, N., and
Rothengatter, W. (2003). Megaprojects and
Risk: An Anatomy of Ambition. Cambridge,
UK: Cambridge University Press.
12. Tversky, A., and Kahneman, D. (1974).
Judgment under uncertainty: Heuristics and
biases. Science 185 (4157): 1124-1131.
13. Staw, B.M. (1976). Knee-deep in the Big
Muddy: A study of escalating commitment
to a chosen course of action. Organizational
Behavior and Human Performance 16 (1), 27-44.
14. Ross, J., and Staw, B.M. (1986). Expo 86: An
escalation prototype. Administrative Science
Quarterly 31 (2): 274-297.