
Living with High-Risk Systems
Michael S. Tashbook
Department of Computer Science
University of Virginia
September 23, 2002
Categories of Risk
• Not all high-risk systems are created equal
• We can partition the set of high-risk
systems into three classes:
 Hopeless Cases
 Salvageable Systems
 Self-Correcting Systems
Hopeless Cases
• This category is composed of systems where the
(inevitable) risks far outweigh any reasonable
benefits
• These systems should just be abandoned — at
least in Perrow’s view
• Examples:
 Nuclear weapons
 Nuclear power
Salvageable Systems
• Salvageable systems are
 systems that we can’t do without, but that can
be made less risky with considerable effort, or
 systems where the expected benefits are so
great that some risks should be run
• Examples:
 Some marine transport
 DNA research
Self-Correcting Systems
• This category contains systems that are not completely self-correcting, but are self-correcting to some degree
• Only modest efforts are needed to improve
these systems further
• Examples:
 Chemical plants
 Airplanes/Air Traffic Control
Is Abandonment the Answer?
• Should systems in the “Hopeless Cases”
category be abandoned summarily?
• Should drastic modifications be made for
other high-risk systems (namely, those in
the “Salvageable” category)?
• Not necessarily; Perrow’s argument
makes several assumptions that may not
be true
Perrow’s Assumptions
 Current risk assessment theory is flawed
 The public is adequately equipped to make
rational decisions, and its opinions should be
respected by policy experts
 Organizational changes will have little effect
in increasing system safety
1. Risk Assessment
• Analysis of the risks and benefits offered
by new systems — examination of the
tradeoffs (if any)
• Modern risk assessors work to:
 inform and advise on the risks and benefits of
new systems
 legitimate risks and reassure the public
 second-guess regulatory agencies’ actions
How Safe is Safe Enough?
• More accurately, how do we model risk?
• Mathematical models are generally used to quantify risk
• The problem with this kind of analysis is that it only measures things that can be quantified (see the sketch below)
 How much is your life worth?
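To see what such a model looks like, here is a minimal sketch of an expected-value risk calculation; every number in it is hypothetical, and the model's telling feature is the slot that demands a dollar value for a life:

```python
# A minimal sketch of quantitative risk modeling (hypothetical numbers).
# Expected annual loss = P(accident) * monetized consequences.

def expected_annual_loss(p_accident, deaths, value_per_life, damage):
    """Only quantifiable consequences enter; everything else drops out."""
    return p_accident * (deaths * value_per_life + damage)

loss = expected_annual_loss(
    p_accident=1e-4,      # hypothetical chance of a serious accident per year
    deaths=100,           # hypothetical fatalities per accident
    value_per_life=5e6,   # the contested slot: "how much is your life worth?"
    damage=1e9,           # hypothetical monetized property damage
)
print(f"Expected annual loss: ${loss:,.0f}")  # $150,000
```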
Biased Interpretations
• Problem of systematic biases and public
opinion
 Does every death have the same impact?
 Is a death from diabetes or cancer as bad as a
murder?
 The public doesn’t seem to think so.
 Are fifty thousand annual highway deaths
really equivalent to a single nuclear
catastrophe?
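The question above can be restated as arithmetic. In the expected-value frame, a rare catastrophe and a steady stream of dispersed deaths are interchangeable whenever their expected tolls match; the probability and toll below are hypothetical, chosen only to make the two sides equal:

```python
# Two risks with identical expected annual deaths (hypothetical numbers).
highway_deaths_per_year = 50_000            # steady, dispersed deaths

p_catastrophe = 1e-3                        # hypothetical annual probability
catastrophe_toll = 50_000_000               # hypothetical single-event toll

expected_toll = p_catastrophe * catastrophe_toll
print(expected_toll == highway_deaths_per_year)   # True: "equivalent" on paper
```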
Systematic Biases
• Risk assessment differentiates between
voluntary risks and involuntary risks
• However, the system doesn’t discriminate
between the imposition of risks and the
acceptance of risks
• This dispassionate cost-benefit approach often leads to “the tyranny of the bean-counters”
Cost-Benefit Analysis (CBA)
• CBA ignores the distribution of wealth in
society
 Risk assessments ignore the social class
distribution of risks
• CBA relies heavily on current market
prices
 Thus, low-paid employees are worth less
when risks are considered
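As a minimal sketch of how that happens, assume the crude lifetime-earnings method of valuing a life (one shortcut cost-benefit analyses have historically leaned on); the wage figures are hypothetical:

```python
# If "value of life" is derived from earnings, lower-paid workers are
# literally worth less in the analysis. All figures are hypothetical.

def earnings_based_life_value(annual_wage, working_years=40):
    """A crude lifetime-earnings valuation of a life."""
    return annual_wage * working_years

print(earnings_based_life_value(20_000))    # low-paid worker:   800,000
print(earnings_based_life_value(200_000))   # executive:       8,000,000
```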
More CBA Assumptions
• New risks should not be higher than others
we have already accepted
 if other systems become riskier, we can lower
safety levels on new systems
• Competitive markets require risky
endeavors
More RA/CBA Criticisms
• RA/CBA does not distinguish between:
 Addiction and free choice
 Active risks and passive risks
 This isn’t just a matter of voluntary vs. involuntary risk — it’s a question of control
• Risk assessors would prefer to exclude
the public from decisions that affect their
interests
2. Decision-Making
• Risk assessors assert that the public is ill-equipped to make decisions on its own behalf, and cognitive psychologists agree
• Humans don’t reason well:
 We magnify some dangers while minimizing others
 We don’t calculate odds “properly”
Three Types of Rationality
• Absolute rationality
 Risks and benefits are calculated exactly,
offering a clear view of what to do
• Bounded rationality
 Employs heuristics to make decisions
• Social and cultural rationality
 Limited rationality has social benefits
Bounded Rationality
• People don’t make absolutely rational
decisions, possibly due to:
 neurological limitations
 memory/attention limits
 lack of education
 lack of training in statistics and probability
• Instead, we tend to use hunches, rules of
thumb, estimates, and guesses
More on Bounded Rationality
“There are two reasons for perfect or deductive rationality to break
down under complication. The obvious one is that beyond a certain
complicatedness, our logical apparatus ceases to cope—our rationality
is bounded. The other is that in interactive situations of complication,
agents can not rely upon the other agents they are dealing with to
behave under perfect rationality, and so they are forced to guess their
behavior. This lands them in a world of subjective beliefs, and
subjective beliefs about subjective beliefs. Objective, well-defined,
shared assumptions then cease to apply. In turn, rational, deductive
reasoning—deriving a conclusion by perfect logical processes from
well-defined premises—itself cannot apply. The problem becomes ill-defined.”
— W. Brian Arthur, “Inductive Reasoning
and Bounded Rationality” (1994)
The Efficiency of Heuristics
• Heuristics are useful; they save time, even
if they are wrong on occasion
• Heuristics:
 prevent decision-making “paralysis”
 drastically reduce search costs
 improve (are refined) over time
 facilitate social life
 work best in loosely-coupled (slack, buffered)
environments
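To make the search-cost point concrete, here is a minimal sketch in the spirit of Herbert Simon's "satisficing" heuristic: accept the first option that clears a threshold rather than scoring them all. The payoff function and threshold below are hypothetical:

```python
# Exhaustive evaluation vs. a satisficing heuristic (hypothetical payoff).

def exhaustive(options, score):
    """Absolute rationality: score every option, pick the best."""
    return max(options, key=score)              # cost: one evaluation per option

def satisfice(options, score, good_enough):
    """Bounded rationality: stop at the first acceptable option."""
    for opt in options:
        if score(opt) >= good_enough:
            return opt                          # usually far fewer evaluations
    return None                                 # nothing cleared the bar

options = range(1, 1_000_000)
score = lambda x: x % 97                        # hypothetical payoff function
print(exhaustive(options, score))               # scans all ~10^6 options -> 96
print(satisfice(options, score, 90))            # stops after 90 evaluations -> 90
```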
Pitfalls of Heuristics
• Heuristics rely on the problem context; if
this is wrong, then the resulting action will
be inappropriate
• Context definition is subtle and difficult
• Heuristics are related to intuitions
 Intuitions are a form of heuristic
 Intuitions may be held even in the face of
contrary evidence
Rationality and TMI
• The TMI accident occurred shortly after it
was put into service
• Absolute rationality acknowledges that a problem was bound to happen eventually; it just happened sooner rather than later
• Is this comparable to the “1×10⁻⁹ standard”? (see the arithmetic below)
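For reference, the arithmetic usually invoked alongside such per-hour standards (the fleet-hours figure here is hypothetical): if a system fails with probability $p$ in each independent hour of operation, then over $N$ hours

$$P(\text{at least one failure}) = 1 - (1 - p)^{N} \approx Np \qquad (Np \ll 1),$$

so even at $p = 10^{-9}$, a fleet accumulating $10^{8}$ operating hours per year still runs roughly a 10% chance ($1 - e^{-0.1} \approx 0.095$) of at least one failure in that year. "Rare per hour" does not mean "never".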
Rationality and TMI (cont’d)
• This may be true, but is it the point?
• TMI was a new type of system, and no
heuristics existed for it at the time
• Even though problems may be rare, they
can be very serious
• Experts predicted that an accident like TMI was highly unlikely, yet it happened; could they have been wrong?
Bounded Rationality vs. TMI
• The logic of the public response to TMI
was technically faulty; even so, it was
efficient and understandable
• Experts have been wrong before; it’s
efficient to question them
• Bounded rationality is efficient because it
avoids extensive effort
 Can John Q. Public make a truly informed
decision about nuclear power?
Social and Cultural Rationality
• Our cognitive limits are a blessing rather
than a curse
• There are two reasons for this:
 Individuals vary in their relative cognitive
abilities (multiple intelligences theory)
 These differences encourage social bonding
 Individual limitations or abilities lead to
different perspectives on (and solutions to) a
given problem
Risk Assessment Studies
• Clark University study of experts and the lay
public
 The two groups disagreed on how to judge the risk of
some activities
 Disaster potential seemed to explain the discrepancy
between perceived and actual risk
 For the public, dread/lethality ratings were accurate
predictors of risk assessments
• Subsequent study identified three “factors”
(clusters of interrelated judgments)
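For readers unfamiliar with the statistical technique, "identifying factors" means factor analysis: finding a few latent dimensions behind many correlated ratings. Below is a minimal, purely illustrative sketch on randomly generated data (the rater count, scale count, and data are hypothetical, not the study's):

```python
# Illustrative factor analysis on synthetic risk ratings (not study data).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
ratings = rng.normal(size=(200, 9))   # 200 hypothetical raters x 9 risk scales

fa = FactorAnalysis(n_components=3)   # look for three latent factors
fa.fit(ratings)
print(fa.components_.shape)           # (3, 9): each factor's loadings on the scales
```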
Dread Risk
• Associated with:
 lack of control over activity
 fatal consequences
 high catastrophic potential
 reactions of dread
 inequitable risk-benefit distribution
 belief that risks are not reducible
• Correlation with interactively complex,
tightly-coupled systems
Unknown Risk
• This factor includes risks that are:
 unknown
 unobservable
 new
 delayed in their manifestation
• This factor is not as conceptually related to interactive complexity and tight coupling as dread risk is
Societal/Personal Exposure
• This factor measures risks based on:
 the number of people exposed
 the rater’s personal exposure to the risk in
question
• Of all three factors, dread risk was the
best predictor of perceived risk
Thick vs. Thin Descriptions
• A “thin description” is quantitative, precise,
logically consistent, economical, and
value-free
• A “thick description” recognizes subjective dimensions and cultural values, and expresses a skepticism about human-made systems
3. Organizational Solutions
• In general, risky enterprises are
organizational enterprises
• Tightly controlled, highly centralized,
authoritarian organizations should be put
into place to run risky systems and
eliminate “operator error”
• But does this really help things?
Suggested Organization Types
• Perrow’s coupling × interaction matrix (encoded as a lookup table below):
 Tight coupling + linear interactions: centralization, which suits both demands
 Tight coupling + complex interactions: tight coupling demands centralization, but complex interactions demand decentralization; these demands are incompatible!!!
 Loose coupling + linear interactions: centralization and decentralization are both feasible
 Loose coupling + complex interactions: decentralization, which suits both demands
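Read programmatically, the matrix is a lookup from (coupling, interaction) to a recommended structure; the sketch below is my encoding of it, not anything from the source:

```python
# Perrow's 2x2 recommendation as a lookup table (my labels, not Perrow's).
RECOMMENDATION = {
    ("tight", "linear"):  "centralize: both demands agree",
    ("tight", "complex"): "incompatible: tight coupling needs centralization, "
                          "complex interactions need decentralization",
    ("loose", "linear"):  "either centralization or decentralization is feasible",
    ("loose", "complex"): "decentralize: both demands agree",
}

print(RECOMMENDATION[("tight", "complex")])   # the quadrant Perrow worries about
```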
Where Does the Problem Lie?
• Technology?
 “[W]e are in the grip of a technological
imperative that threatens to wipe out cultural
values….”
• Capitalism?
 Private profits lead to short-run concerns
 Social costs are borne by everyone
• Greed?
 Private gain versus the public good
The Problem of Externalities
• Externalities are the social costs of an
activity (pollution, injuries, anxieties) that
are not reflected in its price
• Social costs are often borne by those who
receive no benefit from the activity, or who
are even unaware of it
• Systems with identifiable/predictable
victims are more likely to consider
externalities
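A minimal numeric sketch of the gap, with hypothetical per-unit costs: the market price only has to cover the private cost, so the external cost disappears from the ledger even though society still pays it:

```python
# Private vs. social cost per unit of output (hypothetical numbers).
private_cost = 10.00     # producer's cost, reflected in the price
external_cost = 4.50     # pollution, injuries, anxieties -- unpriced

price_floor = private_cost                  # what the market requires
social_cost = private_cost + external_cost  # what society actually pays

print(f"Priced in: ${price_floor:.2f}; true social cost: ${social_cost:.2f}")
```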
A New Cost-Benefit Analysis
• How risky are the systems we have been considering, judged solely in terms of their catastrophic potential?
• How costly are the alternative ways (if
any) of producing the same outputs?
The Final Analysis
 Systems are human constructs,
whether carefully designed or
unplanned emergences
 These systems are resistant to change
 System catastrophes are warning
signals, but not the ones we think
 These signals come not from individual
errors, but from the systems
themselves