
Behavioral Legal Ethics
or
Why Smart Lawyers Sometimes Do Dumb Things
by Kevin Underhill1
Introduction
“Too many ethics programs focus too much on the technicalities or exact wording
of the commentary in the Model Rules,” says Egil Krogh, who had a lot of time to think
about legal ethics after he was convicted and disbarred for his role in the Watergate scandal.
A similar view has been expressed by John Dean III, the former White House counsel who
has done even more thinking about it, partly for the same reasons. And since their actions
and Dean’s Senate testimony are the main reasons we are saddled with mandatory ethics
training today, their views seem worth considering.
This is especially true because, as you may be surprised to learn, despite the
requirement of mandatory ethics training there have still been scandals in which lawyers’
ethics have been called into question, including the savings & loan crisis, Milbank Tweed,
the fall of Enron, Arthur Andersen, Milberg Weiss, Lehman Brothers, Dickie Scruggs, the
Bush Administration, the Clinton Administration, the Bush Administration, and the Obama
Administration. These suggest there is still room for improvement.
But what should we focus on if not the technicalities or exact wording of the
commentary in the Model Rules? As many scholars have suggested recently, we might
benefit from assuming that lawyers are human beings, and applying behavioral psychology
to try to explain and possibly even change behavior. As some have also suggested, this
approach may be both more helpful and (hopefully) more interesting than the traditional
approach of reading and construing ethics rules.
1 Kevin Underhill is a partner in the San Francisco office of Shook, Hardy & Bacon LLP. He specializes in class actions and other complex litigation and is also an experienced appellate practitioner. He also writes the legal-humor blog Lowering the Bar (www.loweringthebar.net) and is the author of The Emergency Sasquatch Ordinance, a book about weird laws.
I. Watergate, Enron, and So Forth
In June 1973, John Dean, who had until very recently been White House Counsel,
testified that during the Watergate cover-up he had compiled a list of those he thought were
at risk for obstruction-of-justice charges. In addition to notations about the possible
criminal penalties involved, Dean had also put asterisks next to many of the names, and he
explained to the Senate committee what that meant:
Mr. DEAN. Now, beside several of the names, after I did the list—just my
first reaction was, there certainly are an awful lot of lawyers involved
here. So I put a little asterisk beside each lawyer, which was Mitchell,
Strachan, Ehrlichman, Dean, Mardian, O’Brien, Parkinson, Colson,
Bittman, and Kalmbach.
***
Senator TALMADGE. Any significance to the star? That they are all
lawyers?
Mr. DEAN. No, that was just a reaction myself, the fact that, how in God’s
name could so many lawyers get involved in something like this?2
Unfortunately for those who would have preferred not to have mandatory ethics CLE
classes, he said this on national television.
Dean has offered at least a partial answer to his question more recently. In 2012—
the 40th anniversary of the Watergate break-in—he began offering a CLE seminar on the
lessons of Watergate.3 That seminar focused on events inside the Nixon White House
during the first week after the Watergate burglars were arrested. "[T]he mistakes made that
week," he has said, "provide[] a powerful teaching tool for what an attorney should not do
when representing an entity with a powerful person in charge." That is certainly true. He
also said, "Had the ABA's Model Rules existed in June 1972, I sincerely believe it would
have changed history." That is certainly not true, although Dean may sincerely believe it.
2 Watergate and Related Activities, Phase I: Watergate Investigation, S. Res. 60, Senate Select Comm. on Presidential Campaign Activities, Book 3, p. 1054 (June 26, 1973).
3 See "The Legacy of Watergate," http://www.watergatecle.com (last visited Apr. 10, 2014).
Thirty years later, detailed accounting rules didn’t prevent the Enron scandal, and
legal-ethics rules didn’t stop its lawyers from downplaying conflicts of interest and
ignoring (or refusing to believe) what was happening.4 Enron hired a law firm to investigate
partnerships that the same law firm had helped set up, and its report was mostly about what
it hadn’t done to investigate. The firm wrote on October 15, 2001, that “the facts disclosed
through our preliminary investigation do not, in our judgment, warrant a widespread
investigation by independent counsel and auditors,” although it did express some concern
about the “bad cosmetics” of the situation. The next day, Enron announced it was taking a
$544 million charge against earnings related to those same partnerships, and within six
weeks it had filed for bankruptcy. In-house counsel at Enron and Arthur Andersen were
also criticized (and in one case, prosecuted). This debacle led to the Sarbanes-Oxley Act
of 2002, which put an end to such shenanigans for good.
Except that less than three years after Enron collapsed, something similar happened
at a company called Refco.5 In that case, a law firm prepared documents to facilitate 17
“round-trip loans” for its client, transactions in which one Refco entity loaned money to
third parties, which then loaned it back to another Refco entity, which used it to pay debts
it owed the first Refco entity. If that seems a little fishy to you, that’s because it was; they
were using the transactions to hide bad debts from potential investors. Two months after
its IPO—and after its principals had sold stock like crazy—Refco announced it had
“discovered” the loans and suggested there might be a problem. A week later it too filed
for bankruptcy. The outside counsel in charge of the project was eventually sentenced to
seven years in prison for fraud. He insisted that he had just been following his client’s
instructions and did not know that the purpose of the transactions was fraudulent. While
that may not be too surprising, the interesting thing is that he actually passed a polygraph
test on the topic. Obviously, the jury wasn't convinced, but had he convinced himself? And
if so, how?

4 See, e.g., Dan Ackman, "Enron's Lawyers: Eyes Wide Shut?" Forbes.com (Jan. 28, 2002); Deborah L. Rhode and Paul D. Paton, "Lawyers, Ethics, and Enron," 8 Stan. J. L. Bus. & Fin. 9 (2002).
5 See generally Paula Schaefer, "Harming Business Clients With Zealous Advocacy: Rethinking the Attorney Advisor's Touchstone," 38 Fla. St. U.L. Rev. 251 (Winter 2011).
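For readers who think better in code than in balance sheets, here is a minimal sketch of the round-trip structure described above (in Python, with hypothetical entity names and a made-up amount; Refco's actual transactions were of course more elaborate):

```python
# Illustration only: hypothetical entities and a made-up amount, not Refco's
# actual books. The point is the circle: A lends to an outsider, the outsider
# lends to B, and B uses that cash to "repay" the bad debt it owes A.

def round_trip(a, b, outsider, amount):
    """Shuffle `amount` through a third party so B's debt to A disappears."""
    # Step 1: Refco entity A lends cash to the unrelated third party.
    a["cash"] -= amount
    a["loan_to_outsider"] = amount
    outsider["cash"] = outsider.get("cash", 0) + amount
    # Step 2: the third party lends the same cash to Refco entity B.
    outsider["cash"] -= amount
    b["cash"] = b.get("cash", 0) + amount
    # Step 3: B repays its (otherwise uncollectible) debt to A.
    b["cash"] -= amount
    b["owed_to_a"] -= amount
    a["cash"] += amount
    a["receivable_from_b"] -= amount

a = {"cash": 100, "receivable_from_b": 100}  # the bad related-party debt
b = {"owed_to_a": 100}
outsider = {}

round_trip(a, b, outsider, 100)
print(a)  # {'cash': 100, 'receivable_from_b': 0, 'loan_to_outsider': 100}
# At the reporting date, A's books show a clean loan to an unrelated borrower
# instead of a doubtful related-party receivable; the loop is quietly unwound
# after the books close.
```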
There are plenty of similar examples, of course, that don’t involve criminal conduct
but do involve ethical violations. In many cases, it seems highly unlikely that the offender
made a deliberate decision to break the rules. Rather, it is often the case that people delude
themselves into believing that what they are doing is not a problem, or at least can be
justified under the circumstances. Studying how this happens may give us some insight
into why highly intelligent people sometimes end up doing things that are—or at least
appear to be—completely stupid.
II. Behavioral Legal Ethics
An excellent introduction to behavioral legal ethics can be found in an article
coincidentally titled “Behavioral Legal Ethics,” by professors Jennifer Robbennolt and
Jean Sternlight.6 As they summarize the approach:
Some have suggested that lawyers behave badly because they are inherently
"bad" or "stupid," because they are susceptible to undue pressure from their
clients, because they are under-regulated, or because they are over-regulated.
Surely some attorneys do deliberately engage in conduct that they know to be
wrong in order to benefit themselves or their client. However, psychological
research suggests a more complex story: That those who commit ethical
infractions are not necessarily "bad apples," but are human beings. Many
ethical lapses result from a combination of situational pressures and all too
human modes of thinking.7
What “modes of thinking” are they thinking of? It turns out there are an awful lot of things
going on in a human brain of which the brain’s owner may often not be aware. The more
we know about them, the better.
6 Jennifer K. Robbennolt & Jean R. Sternlight, "Behavioral Legal Ethics," 45 Ariz. St. L.J. 1107 (Fall 2013) (also citing numerous other articles on the same topic).
7 Id. at 1111.

Daniel Kahneman won the Nobel Prize in 2002 for his work (with Amos Tversky)
on human decision-making, and although it was only a Nobel Prize in economics, that is
still fairly impressive. In his 2011 book, Thinking, Fast and Slow, Kahneman wrote about
how many decisions that we may believe are the product of a conscious, rational process
are actually generated by intuitive processes or “gut feelings.” These processes are pretty
remarkable—if they didn’t work pretty well, we’d all have been eaten by tigers a long time
ago—but they aren’t always right.
These two systems are generally referred to, with scientific precision, as System 1
and System 2. System 1 is the intuitive, automatic system behind the scenes, and System 2
is the conscious, reasoning system that we generally think of when we think of “self.”
System 1 handles most tasks perfectly well, with relatively little effort, and much more
quickly than System 2. But to do this, it has to rely on shortcuts—biases that are often
correct but not always—and the resulting “systematic errors” can lead System 2 astray.
The list of these shortcuts or biases is actually quite long, and only a few are
summarized here:
• The "anchoring effect"—incomplete or even random data tends to influence our
evaluations, because System 1 tries to fill in blanks quickly with whatever
comes to hand. It's a remarkably strong effect that influences even people who
know to beware of it and are consciously trying to resist it.

• The "availability heuristic"—the tendency to estimate likelihood, frequency, or
risk by measuring how easily we can retrieve examples from memory. This is
why people vastly overestimate the risks of flying and of terrorism; it is easier
to remember the rare but dramatic examples of disaster than it is to evaluate the
statistics concerning just how unlikely those events really are.

• "Hindsight" or "outcome bias"—our evaluation of a decision or a decision
maker is heavily influenced by how things actually came out, even though the
result may be due simply to luck (good or bad). This is one facet of the System
1 drive to find causal explanations for events even when there is none.

• The "illusion of skill"—for similar reasons, we tend to attribute successes to
our own skill or other positive qualities, when we may simply have been lucky
in the past. This leads to overconfidence when we face the next challenge.

• "Loss aversion"—people tend to weigh the risk of loss about twice as heavily
as they do a chance of gain. Ordinarily we tend to be risk-averse, but this can
also mean that when there are no good options, people are paradoxically more
willing to take risks (since they are likely to lose something no matter what
option they choose), as the sketch after this list illustrates.
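The "about twice as heavily" figure is the loss-aversion coefficient from Kahneman and Tversky's prospect theory. Here is a minimal sketch of their value function (using commonly cited ballpark parameters, chosen purely for illustration), showing how the same math produces ordinary risk aversion for gains and paradoxical risk-seeking when every option is a loss:

```python
# A minimal sketch of the prospect-theory value function. The parameters
# (alpha ~ 0.88, lam ~ 2) are commonly cited estimates, used here only
# for illustration.

ALPHA = 0.88  # diminishing sensitivity to larger amounts
LAM = 2.0     # losses weigh roughly twice as much as equal gains

def value(x: float) -> float:
    """Subjective value of a change of x dollars from the status quo."""
    return x ** ALPHA if x >= 0 else -LAM * (-x) ** ALPHA

def subjective_worth(outcomes) -> float:
    """Expected subjective value of a gamble: [(probability, amount), ...]."""
    return sum(p * value(x) for p, x in outcomes)

# Loss aversion: a fair 50/50 bet on $100 feels like a bad deal.
print(subjective_worth([(0.5, 100), (0.5, -100)]))   # about -28.8

# Risk aversion for gains: a sure $100 beats a 50% shot at $200.
print(value(100))                                    # about 57.5
print(subjective_worth([(0.5, 200), (0.5, 0)]))      # about 52.9

# Risk seeking for losses: a 50% chance of losing $200 feels *better*
# than a sure $100 loss -- the "no good options" paradox noted above.
print(value(-100))                                   # about -115.1
print(subjective_worth([(0.5, -200), (0.5, 0)]))     # about -105.9
```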
Again, these biases all have an important role to play and are not inherently bad,
but they can lead to systematic errors or poor decisions. Especially as lawyers, we tend to
think that we are always running System 2, but like other human beings we are often not
operating in that mode.
Other psychological factors that are important in behavioral legal ethics include the
tendency towards conformity and the impulse to obey authority figures. The famous
Milgram experiments in the 1960s and 1970s, which showed that ordinary people were
surprisingly willing to administer strong electric shocks to others just because somebody
in a lab coat told them to, showed just how powerful the latter impulse is.8 The tendency
to obey increases along with the perceived status or power of the authority figure. It is
easy to see how these factors could play an important role in bad decision making within
the hierarchical environment of a law firm. And these, too, are generally factors that operate
below the conscious level.
Finally, lawyers are notorious for working long and sometimes stressful hours.
Studies have consistently shown that fatigue dramatically increases the likelihood that a
person will choose easier default options, possibly sticking with a poor status quo or
conforming to an arguably unethical group consensus, rather than take necessary actions.
This effect on the personality is why the phenomenon is sometimes called “ego depletion”
or “cognitive depletion.”9 In other words, an exhausted person does not have the energy to
start up System 2, so he or she is more likely to default to System 1 processes. That person
is also more likely to yield to temptation, whether the consequences are one donut too many
or signing on to a bad ethical decision that seems financially profitable.
8 Stanley Milgram, Behavioral Study of Obedience, 67 J. of Abnormal and Social Psychol. 371, 371-78 (1963); Stanley Milgram, Obedience to Authority: An Experimental View (1974).
9 See Behavioral Legal Ethics, 45 Ariz. St. L.J. at 1140-43 (discussing effect of cognitive depletion on lawyers' ethical choices, or lack of choices).

Conclusion
The really difficult question is how we might try to compensate for processes that
are often operating below the surface. Part of the answer is simply to be more aware of
them; even if we can’t compensate fully, we should be able to do better than the default.
Beyond that, it may be possible to implement structural reforms or create incentives
(economic and otherwise) to help counterbalance the factors discussed above.10 In a sense,
this involves using the strengths of System 1 to compensate for some of its weaknesses.
Because it also involves lawyer compensation to some degree, it may be difficult to
implement, but is worth trying.
10 Nancy B. Rapoport, "'Nudging' Better Lawyer Behavior: Using Default Rules and Incentives to Change Behavior in Law Firms," 4 St. Mary's J. of Legal Ethics & Malpractice 42 (2014).