
Preparing for the Next Emergency
Andrew Lakoff
Working Paper
Laboratory for the Anthropology of the Contemporary
19 January 2006

1. This paper is an attempt to map out the historical background to a study of how future threats are brought into the present by contemporary security experts. The study is one part of a collaborative project on the anthropology of bio-security, under the auspices of the Laboratory for the Anthropology of the Contemporary. Many of the ideas in this paper emerged in conversation with the co-principal investigators on this project, Stephen Collier and Paul Rabinow. It has also benefited from comments by members of the ICAS seminar at NYU, as well as the suggestions of Craig Calhoun, Nils Gilman, and Chris Otter.
This paper describes the emergence and extension of a novel form of security
rationality in the postwar United States. This form of rationality, “preparedness,”
provides security experts with a way of seeing uncertain future threats and bringing them
into a space of present intervention. An analysis of the underlying logic and structuring
elements of preparedness helps to address a puzzling aspect of state-based security
practices in the contemporary United States: how a series of seemingly disparate types of
events – ranging from terrorist attacks, to hurricanes and earthquakes, to epidemics –
have been brought into the same framework of “national security threats.” More broadly,
such an analysis allows us to address the question: what is the logic through which threats
to collective life are being taken up in the space of politics?
In order to show what is distinctive about preparedness, the paper begins by
comparing it to a different form of rationality for approaching potential dangers:
insurance. Preparedness becomes a salient approach to future threats when they reach the
limits of a rationality of insurance. These are threats that cannot be managed through a
logic of risk-calculation: preparedness approaches potential events whose probability is
incalculable but whose consequences could be catastrophic. The paper then describes the
basic elements of preparedness, tracing its history beginning in the early period of the
Cold War and following it to its current articulation in the Department of Homeland
Security. The analysis is framed by a discussion of the aftermath of Hurricane Katrina,
which revealed both the centrality of preparedness as a guiding logic for contemporary
security and its limits.
We Are Not Prepared
One evening the week after Hurricane Katrina struck, intrepid news anchor
Anderson Cooper was featured on the Charlie Rose show. Cooper was still on the scene
in New Orleans, the inundated city in the background and a look of harried concern on
his face. He told Rose that he had no intention of returning to his comfortable life in
New York City any time soon. Cooper had been among the reporters to challenge
official accounts that the situation was under control, based on the contradiction between
disturbing images on the ground and government officials’ claims of a competent
response effort. He seemed shocked and dismayed by what he had seen in New Orleans,
but was also moved, even transformed by his role as witness to domestic catastrophe. He
had covered disasters in Somalia, Sri Lanka and elsewhere, he said, but never expected to
see images like these in the United States: widespread looting, hungry refugees, corpses
left on the street to decompose. Toward the end of the interview, Rose asked him what
he had learned from the event. Cooper paused, reflected for a moment, and then
answered: “we are not as ready as we can be.”
Insofar as the hurricane and its aftermath could be said to have had a shared
moral, it was this: we are not prepared – whether for another major natural disaster, a
chemical or biological attack, an epidemic or some other type of emergency. This lesson
structured the response to the hurricane in terms of certain kinds of interventions and not
others. And the basic elements of possible response were already in place. This helps
explain why Katrina largely failed to be a politically transformative event; instead, it has
intensified and redirected processes that were already underway. To see this it is
necessary to analyze the emergence and extension of preparedness as a guiding
framework for domestic security in the United States.
Preparedness names both an ethos and a set of techniques for reflecting about and
intervening in an uncertain, potentially catastrophic future.2 Unlike other issues
potentially raised by the hurricane, such as racial inequality, concentrated urban poverty,
the social isolation of the elderly, or the short-sightedness of environmental planning, the
demand for preparedness is a matter that enjoys widespread political agreement on the
necessity of governmental intervention. In other words, in the imperative of
preparedness, we find a shared sense of what collective security problems involve today.
To be prepared is an injunction that must be followed. What can be a source of dispute is
not whether we need to be prepared, but how to prepare and what we need to prepare for.
In the aftermath of Katrina, it was common to see comparisons made between the
failed governmental response to the hurricane and the more successful response to the
attacks of 9/11. To an observer a decade before, it might have been surprising that a
natural disaster and a terrorist attack would be considered part of the same problematic.
And the image, three weeks after Katrina, of George W. Bush flying to the Northern
Command in Colorado – a military installation designed for use in national security crises
– to follow the progress of Hurricane Rita as it hurtled toward Texas might have been
even more perplexing. The aftermath of Katrina also pointed forward to other possible
emergencies, such as a novel and deadly infectious disease. In announcing its $7.1
billion influenza preparedness program the following month, the Bush administration
declared avian flu a matter of national security.3

2. Stephen J. Collier, Andrew Lakoff and Paul Rabinow, “Biosecurity: Towards an Anthropology of the Contemporary,” Anthropology Today 20:5 (October 2004).
3. As one bio-security expert said, the Washington security establishment was in a “post-Katrina, pre-pandemic moment.”
This grouping of various types of possible catastrophe under a shared rubric of
“security threats” is exemplary of the rationality of preparedness. Preparedness marks
out a limited, but agreed-upon terrain for the management of collective life along a
temporal axis. Its techniques focus on a certain set of possible future events, operating to
bring them into the present as potential catastrophes that point to current vulnerabilities.
The Probabilistic Future
In its mode of future orientation and in its way of approaching threats,
preparedness can be usefully contrasted with another form of rationality for dealing with
possible dangers – insurance. As Francois Ewald points out, insurance is an “abstract
technology” that can take actual form in a variety of institutions, including mutual
associations, private insurance firms, and state-based social welfare agencies. It is a
technology of risk. Here the term “risk” does not refer to a danger or peril, but rather to a
“specific mode of treatment of certain events capable of happening to a group of
individuals.”4 This treatment involves first, tracking the occurrence of such events over
time across a population; and second, applying probabilistic techniques to gauge the
likelihood of a given event occurring over a certain period of time. Insurance is thus a
way of reordering reality: what had been exceptional events that disrupted the normal
order become predictable occurrences.
Insurance takes up external danger and transforms it into manageable risk. It
removes accidents and other misfortunes from a moral-legal domain of personal
responsibility and places them in a technical frame of calculability. The events insurance
typically works on are dangers of relatively limited scope and statistically regular
occurrence: illness, injury, accident, fire. When taken individually, such events may
appear as misfortunes, but when their occurrence is plotted over a population, they show
a normal rate of incidence. Knowledge of this rate, gained through carefully plotted
actuarial tables, makes it possible to rationally distribute risk.
As an abstract technology, insurance can be linked up to diverse political
objectives. In the late nineteenth century in countries such as France, insurance
techniques were harnessed to a politics of solidarity in the development of state-based
social welfare, or “population security.” Population security aims to foster the health and
well-being of human beings understood as members of a national population. It works to
collectivize individual risk – of illness, accident, or infirmity. Through calculation of the
rates of such events across populations over an extended period, population security
seeks regularities – birth and death rates, illness prevalence, consumption patterns.
Planners can then target intervention into the social milieu that will increase and sustain
collective well-being.5 Examples of population security mechanisms include mass vaccination, urban water and sewage systems, guaranteed pensions, and health and safety regulations.

4. Francois Ewald, “Insurance and Risk,” in The Foucault Effect: Studies in Governmentality (Chicago, 1991).
As analysts of the European welfare state have argued, this “social” form of
security was based on the premise that technical rationality would be increasingly capable
of managing collective risk.6 By the mid-twentieth century, such risk management had
taken on a relatively stable form in the West in the various forms of collective security
provision associated with the welfare state. Developments in science and technology –
such as food production or industrial hazard mitigation – promised to further improve and
stabilize this condition of population security. Toward the end of the century, however,
this stability began to break down, and many of the security mechanisms associated with
social welfare were either dispersed outside of the state or were allowed to fall into
disrepair. Critics of social welfare argued that matters such as health insurance and
pensions were individual questions that should be handled by the private sector.
Meanwhile, a new series of environmental and health hazards emerged whose scale and
incalculability pushed them beyond the scope of insurability. In some cases, these new
vulnerabilities were generated by the extent, power and uncontrollability of the life-supporting systems that had been developed in the context of population security. These
new hazards were characterized by their catastrophic potential and by their
unpredictability.
5. As Foucault writes, “security mechanisms have to be installed around the random element inherent in a population of living beings so as to optimize a state of life.” Michel Foucault, “Society Must Be Defended”: Lectures at the College de France, 1975-76 (New York, 2003).
6. Ewald, “The Return of Descartes’ Malicious Demon,” in Baker and Simon, Embracing Risk (2001). Ulrich Beck, World Risk Society (1997).
The Limits of Insurance
Ulrich Beck contrasts the optimism associated with the development of the
European welfare state – that one could fully manage risk and plan for the future through
calculation – with current perceptions of these new forms of vulnerability: “The speeding
up of modernization has produced a gulf between the world of quantifiable risk in which
we think and act, and the world of non-quantifiable insecurities that we are creating.”7
According to Beck, society has entered a condition of “reflexive modernity,” in which the
very industrial and technical developments that were initially put in the service of
guaranteeing human welfare now generate new threats. Our very dependence on critical
infrastructures – systems of transportation, communications, energy, etc. – has become a
source of vulnerability. His examples include ecological catastrophes such as Bhopal and
Chernobyl, global financial crises, and mass casualty terrorist attacks. Such hazards can
cause global, irreparable damage, are of unlimited temporal duration, and their effects
may be invisible. These dangers shape a perception that “uncontrollable risk is now
irredeemable and deeply engineered into all the processes that sustain life in advanced
societies.” They “abolish old pillars of risk calculus,” outstripping our ability to calculate
their probability or to insure ourselves against them. According to Beck, the
uninsurability of these mega-hazards in the private sector is exemplary of a new social
world in which technical expertise cannot calculate and manage the risks it generates.8
7. Beck, “The Terrorist Threat,” Theory, Culture and Society (2002).
8. The criterion of uninsurability is a matter of some controversy, since solutions have in fact emerged to what initially appears as uninsurability – for example, reinsurance, catastrophe securities, or governmental “backstops” (Bougen 2003; Ericson and Doyle 2004). Nonetheless, it is clear that certain novel threats have posed challenges to an actuarial rationality of security.

Ewald suggests that this new sense of vulnerability has led to the rise of “precaution” as a new logic of political decision under conditions of uncertainty. From
the European vantage, environmental and health hazards such as global warming, mad
cow disease, and genetically modified food indicate that expertise has lost its certain
grasp on the future. These are cases in which the rationality of decision cannot satisfy
itself with the cost-benefit balance, which remains unknown. And further, catastrophic
and irreparable events cannot be adequately compensated. Ewald writes of the limits of
an insurantial approach in such cases: “one cannot foresee what one does not know, even
less what one cannot know.”9 If the possibility of the event is neither measurable nor
assessable, it is not a “risk” in the technical sense of a danger that has been brought into
the realm of calculative decision.
How do political actors assume responsibility for dealing with this new form of
threat? Catastrophic threats that cannot be mitigated may enjoin decision-makers to avoid
taking risks. The catastrophe, as Niklas Luhmann writes, is “the occurrence that no one
wants and for which neither probability calculations nor expert opinions are
acceptable.”10 Whereas in an insurantial regime, risk is normal and the question is how to
distribute it, under a regime of precaution potential catastrophe is to be strictly avoided
rather than seen as a risk that might be taken. In the context of possible catastrophe,
writes Ewald, one must take into account not what is probable or improbable, but what is
most feared: “I must, out of precaution, imagine the worst possible.”11 Thus, a principle
of precaution in the face of incalculable threat enjoins against risk-taking – for example,
the implementation of new and uncertain technologies.
9. Ewald 2001.
10. Luhmann, “Describing the Future,” in Observations on Modernity (Stanford, 1999).
11. Ewald 2001, 285 (check).
From Risk to Preparedness
The precautionary principle has been an influential response to certain novel
forms of threat in Europe, especially those linked to “the environment”.12 It should be
noted that although it is addressed to the limit point of insurance, precaution still operates
within a problematic of insurability – that is, it concerns the problem of calculability. In
contrast, a very different set of techniques for approaching uncertain but potentially
catastrophic threats has emerged and extended its reach first in the United States, and
increasingly transnationally. Like precaution, it is applicable to events whose regular
occurrence cannot be mapped through archival knowledge, and whose probability
therefore cannot be calculated. In contrast to precaution, however, preparedness does not
prescribe avoidance; rather, it enacts a vision of the dystopian future in order to develop a
set of operational criteria for response. In this manner, rather than seeking to constrain
action in the face of uncertainty, preparedness turns potentially catastrophic threats into
vulnerabilities to be mitigated.
Both insurance and preparedness are ways of making an uncertain future available
to intervention in the present. But they demand different types of expertise, and they call
forth different forms of response. Preparedness assumes the disruptive, potentially
catastrophic nature of certain events. Since the probability and severity of such events
cannot be predicted, the only way to avert catastrophe is to have plans to address them
already in place, and to have exercised for their eventuality; in other words, to maintain
an ongoing capability to respond appropriately. Although the probability and severity of a
given event is not known, one must behave as if the worst-case scenario were going to occur: that it is “not a question of if, but when.” The point is to reduce current vulnerabilities and put in place response measures that will keep a disastrous event from veering into unmitigated catastrophe. Moreover, the two rationalities differ in what they protect: insurance protects individuals and groups, whereas preparedness protects the ongoing operations of vital systems.

12. The 1992 Rio Declaration articulated the new relation between uncertainty and risk-avoidance: “In order to protect the environment, the precautionary approach shall be widely applied by States according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”
Preparedness organizes a set of techniques for maintaining economic and social
order in a time of emergency. First responders are trained, relief supplies stockpiled, the
logistics of distribution mapped out. During the event itself, real-time situational
awareness is critical to the coordination of crisis management practices. The duration of
direct intervention by a preparedness apparatus is limited to the immediate onset and
aftermath of crisis. But the requirement of vigilant attention to the prospect of crisis is
ongoing, permanent. Techniques such as early warning systems make possible such
sustained attention. An apparatus of preparedness comes to know its vulnerabilities
through enactment: tools such as scenario-planning and simulation exercises test the
response system and reveal gaps in readiness.
Here is a partial list of types of preparedness techniques:
• Ways of gauging current vulnerabilities and capabilities: scenarios and simulations
• Early warning, monitoring systems
• Stockpiling of supplies; training first responders
• Plans for the coordination of response among diverse entities
• Information sharing and information analysis
• Means of assessment, e.g. readiness metrics
It should be emphasized that these techniques are not unique to U.S. domestic
security: scenarios and simulations, early warning and detection systems, and plans for
coordinating response can also be found in humanitarian relief, environmental
monitoring, and international health organizations – in any field oriented toward
managing potential catastrophe. Thus, like insurance, preparedness is an abstract
technology that can be actualized in diverse ways, according to diverse political aims.
The table below summarizes some of the basic distinctions between insurance and
preparedness as forms of security rationality.
Table: Forms of Security Rationality

Type of event addressed
  Insurance: Calculable, relatively limited scope (one can predict how often it will occur, but not to whom)
  Preparedness: Not calculable, potentially catastrophic scope (one can say that it is likely to happen, but not when or where)

Knowledge required about event
  Insurance: Archival – actuarial tables of statistics
  Preparedness: Narrative, imagined

How possible event is transformed
  Insurance: From external danger to manageable risks
  Preparedness: From outside threat to vulnerability to be mitigated

Technical operation
  Insurance: Calculation of probability using tables of frequency
  Preparedness: Gauge current vulnerabilities through imaginative techniques (scenarios, simulations)

How to alleviate threat
  Insurance: Spread risks over population
  Preparedness: Build capabilities for response to multiple threats

Temporal orientation
  Insurance: Continuing, modulated attention
  Preparedness: Ongoing vigilant alertness; sporadic intervention, lasting only for duration of event and recovery

Initial site of application
  Insurance: 17th century shipping and navigation
  Preparedness: Cold War threat of atomic attack

Extension to new sites
  Insurance: Property (insurance against fire, flood), life, accident, old age
  Preparedness: Natural disaster, ecological catastrophe, humanitarian emergency, terrorism
A Continuous State of Readiness
While techniques of preparedness are now applied to a variety of potential
emergencies, they were initially assembled during the Cold War, in response to the threat
of a surprise attack by the Soviet Union. This was the context for the rise of the U.S.
national security state, in which a huge military build-up arguably took the place of what
in Europe became the welfare state. At this stage, preparedness meant massive military
mobilization in peacetime in order to deter or respond to an anticipated enemy attack.
The nation would have to be permanently ready for emergency, requiring ongoing crisis
planning in economic, political and military arenas. Civil defense was one aspect of
attack preparedness. Although its most ambitious elements were never fully
implemented, techniques linked to civil defense gradually came to take on broader
significance as they migrated into new domains of threat, such as natural disasters and
industrial accidents. Today, preparedness as an organizational principle marks out the
boundaries of what are agreed-upon tasks for the state in the protection of the national
economic and social order.
*
Post-war U.S. civil defense plans were developed in response to the rise of novel
forms of warfare in the mid-twentieth century: first, air attacks on major cities and
industrial centers in World War II, and then the prospect of nuclear attack during the
Cold War. As the war came to an end, U.S. military planners sought to ensure that the
country not demobilize after the war, as it had after World War I. They argued that the
lack of a strong military had invited the surprise attack on Pearl Harbor. Now the Soviet
Union presented a new existential threat. To meet it, the U.S. would have to remain in a
state of permanent mobilization. What historian Michael Sherry calls “an ideology of
preparedness” thus emerged even before the Cold War had begun.13
The U.S. Strategic Bombing Survey, conducted between 1944 and 1946, reported
on the consequences of air attacks in England, Germany and Japan, and the effectiveness
of these countries’ civil defense measures. It recommended shelters and evacuation
programs in the U.S., “to minimize the destructiveness of such attacks, and so organize
the economic and administrative life of the Nation that no single or small group of attacks
can paralyze the national organism.”14 The report pointed to the need to disperse key
industries outside of dense urban areas, and to ensure the continuity of government after
attack.15 As Peter Galison notes, the survey led military strategists to envision the United
States in terms of its key infrastructural vulnerabilities – to see the territory in terms of a
set of targets whose destruction would hamper future war efforts.16
Given their anxiety about U.S. vulnerability, military planners sought to ensure
that the nation could quickly generate an efficient war machine in the midst of
emergency.17 The 1947 National Security Act, perhaps the first explicit articulation of
the concept of “national security,” established the National Security Resources Board
(NSRB), which was dedicated to centralizing domestic preparedness for war. The NSRB
laid the foundation for much Cold War civil defense planning. The agency organized its
programs assuming the need to anticipate a massive, surprise attack by the Soviet Union:
“the national security requires continuous mobilization planning and, to maximum
possible degree, a continuous state of readiness.”18 Anxiety about the threat of attack intensified following the Soviet Union’s explosion of its first nuclear weapon in 1949. The U.S. responded with an immediate three-fold increase in its defense budget.

13. Michael Sherry, Preparing for the Next War (Yale, 1977).
14. Lawrence J. Vale, The Limits of Civil Defense in the U.S.A., Switzerland, Britain and the Soviet Union: The Evolution of Policies since 1945 (New York, 1987), 58.
15. Andrew Grossman, Neither Dead Nor Red: Civil Defense and American Political Development During the Early Cold War (Routledge, 2001), 145 n. 57.
16. Peter Galison, “War against the Center,” Grey Room 4 (2001).
17. As Sherry writes, “If attack were to come without warning, the war machine had to be ever ready.” Preparing for the Next War, 235.
National security in the face of the Soviet threat comprised three inter-related
strategies: containment, deterrence and preparedness. While containment and deterrence
focused on ways of confronting the enemy, preparedness referred to continuous war
mobilization on the home front. For the nation to be ready for a surprise attack, it would
be necessary to have an economic and military infrastructure for waging full-scale war
already in place. At this stage, the strategy of preparedness for attack had several
elements: gauging the vulnerability of U.S. forces to a first strike, putting attack
detection systems in place, and ensuring that the civilian infrastructure would be capable
of mobilizing for war even in the face of a massive first strike. The latter, civil defense,
would remain the least developed of these measures.
An early definition of civil defense was: “the mobilization of the entire population
for the preservation of civilian life and property from the results of enemy attacks, and
with the rapid restoration of normal conditions in any area that has been attacked.”19
Based on a proposal from the NSRB, Truman established the Federal Civil Defense
Administration in 1950. Congress then passed the Civil Defense Act, which distributed a
series of civil defense tasks to states and local offices: coordinate volunteers, hold
training exercises, mount public awareness campaigns. However, the most ambitious
civil defense program – a nation-wide shelter system – was not funded over the next
decade, as officials remained ambivalent about civil defense for a number of reasons.
First, insofar as the military’s nuclear deterrence strategy was based on mutually assured destruction, strategists saw civil defense as either ineffectual or as destabilizing to the strategic balance. Moreover, the military did not want its domain of responsibility extended to the protection of the civilian population. Further, civil defense was – and would continue to be – politically unpopular: the public saw it as an acknowledgement that the government was planning for nuclear war.

18. Cited in Grossman 2001, 36 (check).
19. Thomas J. Kerr, Civil Defense in the U.S.: Bandaid for a Holocaust? (Westview, 1983), 20.
In the early sixties, Kennedy made a strong case for the development of a nationwide fallout shelter system, arguing that it would provide a form of protection against
accidental nuclear exchange. The Berlin and Cuban Missile crises lent some urgency to
his proposal, but by the mid-sixties these plans had once again fallen flat.20 Thus efforts
to fully implement civil defense measures as an element of attack preparedness were
mostly blocked over the course of the Cold War. Nevertheless, the underlying logic of
civil defense – the need to anticipate a surprise attack – as well as the institutions and
forms of expertise that were developed as part of civil defense provided the basis for
what would eventually become a more fully articulated form of security rationality.
Techniques of Preparedness I: The Scenario
While mutually assured destruction was the structuring logic of nuclear strategy
within the Defense Department in the 1950s, experts outside of the military were
rethinking the basic tenets of deterrence. Civilian defense intellectuals with the RAND
Corporation, with backgrounds in technical fields such as physics, mathematics and
economics, honed methods for modeling the conduct of war in order to advise military planners on weapons systems, arms control issues, and other strategic questions.21 Advocates of this approach argued that in the case of thermonuclear war, modeling the stages of confrontation and conflict was critical, since no one had any actual experience with this new form of war. Such modeling activity generated techniques, such as scenarios and simulations, which would later become central to preparedness activities.

20. By this time the key debate in nuclear defense strategy concerned anti-ballistic missiles. An ABM system had two possible implications for civil defense: either it would render civil defense unnecessary, or civil defense would be a “back-up” to the ABM strategy, given that some incoming missiles would be missed, and that exploding them would generate fallout against which civilians would need shelter.
Using game theory, the RAND strategists tried to foresee the moves of the
adversary in the lead-up to super-power confrontation. On this basis, they argued that
mutually assured destruction was not credible as a strategy for deterrence against Soviet
aggression in Europe. The Soviets might well decide that even if their tanks rolled into
West Germany, the U.S. was unlikely to unleash global nuclear annihilation in response.
The strategists’ alternative was to develop plans for a limited nuclear engagement in
which there would be the opportunity for intra-war negotiation between stages of
escalation. From this vantage, civil defense measures such as rapid evacuation plans and
shelter systems became an important aspect of deterrence strategy: first, having these
measures in place would discourage enemy attack on an otherwise temptingly unprepared
target; second, it would reinforce the enemy’s belief in U.S. willingness to use strategic
retaliatory power.22

21. Fred M. Kaplan, The Wizards of Armageddon (Stanford, 1991).
22. A key moment in this was the 1957 Gaither Report, which recommended the construction of a large-scale fallout shelter system. See Kaplan 1991.

The problem civil defense approached was: how to maintain the nation’s war-fighting and post-war recuperation capacities even in the face of a devastating attack. For RAND strategists such as Herman Kahn, this question was imperative given U.S. military doctrine: in order for the strategy of deterrence to work, the enemy had to be convinced
that the U.S. was prepared to engage in a full scale nuclear war and had thus made
concrete plans both for conducting such a war and for rebuilding in its aftermath. Kahn
criticized military planners for their failure to concretely envision how a nuclear war
would unfold. If planners were serious about the strategy of deterrence, they had better be
prepared to actually wage nuclear war. It was irresponsible not to think concretely about
the consequences of such a war: what civil defense measures would lead to the loss of
only fifty million rather than a hundred million lives? What would human life be like
after a nuclear war? How could one plan for post-war reconstruction in a radiation-contaminated environment? Prewar preparation was necessary for postwar existence.
In the quest to be prepared for the eventuality of thermonuclear war, Kahn
counseled, every possibility should be pursued. “With sufficient preparation,” he wrote,
“we actually will be able to survive and recuperate if deterrence fails.”23 Kahn invented a
method for “thinking about the unthinkable” that would make such planning possible:
scenario development. Scenarios served two purposes. One was to assist in designing
role-playing games in which decision-makers would enact the lead-up to war with the
Soviet Union. In the absence of the actual experience of a nuclear standoff, these
exercises provided officials and military planners with something close to the sense of
urgency such a crisis would bring. The second use of scenarios was to force both planners
and the public to seriously face the prospect of nuclear catastrophe as something that
must be planned for in detail.
23. Cited in Ghamari-Tabrizi, 231.

Through the development of scenarios, Kahn envisioned a range of postwar conditions whose scale of catastrophe was a function of prewar preparations, especially civil defense measures. These scenarios generated knowledge of vulnerabilities and led
Kahn to proposals for mitigating them.24 For example, a radioactive environment could
hamper post-war reconstruction unless there was a way of determining individual levels
of exposure. Thus he recommended giving out radioactivity dosimeters to the entire
population in advance of war, so that post-war survivors would be able to gauge their
exposure level and act accordingly.
*
The general problem scenario planning addressed was how to approach an
unprecedented event. Scenarios were not predictions or forecasts, but opportunities for
exercising an agile response capability. They trained leaders to deal with the
unanticipated. “Imagination,” Kahn wrote, “has always been one of the principal means
for dealing in various ways with the future, and the scenario is simply one of the many
devices useful in stimulating and disciplining the imagination.”25 Scenario planning as a
form of thought liberated experts from the reliance on archival knowledge that structured
insurantial rationality, making it possible to plan for surprise. In the wake of Kahn’s
promotion of the technique, scenario planning radiated outside of defense strategy, and
began a prolific career in other arenas concerned with managing an uncertain future,
ranging from corporate strategy, to environmental protection, to international public
health.
24. “Kahn repeatedly reprised the valiant role played by a clever civilian, uniquely blessed with extraordinary powers of discernment and prognostication, who could smoke out the least visible clues of fatal vulnerabilities in the national defense.” Ghamari-Tabrizi, 248.
25. Kahn, “Some Strange Aids to Thought” (1962), 145.
All-Hazards Planning
Although costly civil defense measures such as a national fallout shelter system
were never successfully implemented, the state and local offices spawned by the Civil
Defense Act served as a springboard for the extension of preparedness to new domains.
Beginning in the early sixties, and with increasing momentum over the next decades, an
alternate variant of preparedness developed in parallel to the Federal government’s
efforts to mobilize for nuclear war. State and local agencies sought to use federal civil
defense resources to prepare for natural disasters, such as hurricanes, floods, and
earthquakes.26 Despite its different set of objects, the field of emergency preparedness
was structured by the underlying logic of civil defense: anticipatory mobilization for
disaster.
Some governmental measures for alleviating the damage caused by natural
disasters, especially floods and wildfires, were already in place. These programs included
prevention efforts like levee construction and forest management as well as recovery
mechanisms such as the declaration of federal disasters in order to release assistance
funds. But in the 1960s, state and local officials took up a number of the techniques
associated with attack preparedness and applied them to disaster planning. These
techniques included: monitoring and alert systems, evacuation plans, training first
responders, and holding drills to exercise the system.
26. As a leading figure in the development of emergency preparedness put it: “At the national level, a civil defense system developed earlier than any comparable disaster planning or emergency management system. However, at the local level, the prime concern after World War II became to prepare for and respond to disasters.” E. L. Quarantelli, “Disaster Planning, Emergency Management and Civil Protection: The Historical Development of Organized Efforts to Plan for and to Respond to Disasters,” p. 10. Manuscript: Disaster Research Center, University of Delaware.
The two forms of preparedness did not easily coexist. First, there were tensions
over whether locally-based emergency management programs should focus their
planning efforts more on nuclear war or on likely natural disasters. And in contrast to
civil defense, which operated according to the norms of hierarchical command-and-control associated with national security, emergency management had a distributed,
decentralized structure: while its broader vision was federally-coordinated, disaster
planning efforts took place at state and local levels, and involved loosely coupled
relations among private sector, state and philanthropic organizations.27
Despite these differences in mission and organization, civil defense and
emergency management shared a similar field of intervention – potential future
catastrophes – which made their techniques potentially transferable. Moreover, a
complementary set of interests was at play in the migration of civil defense techniques to
disaster planning. For local officials, Federally-funded civil defense programs presented
an opportunity to support local response to natural disasters. From the Federal vantage,
given that civil defense against nuclear attack was politically unpopular, natural disaster
planning developed capabilities that could also prove useful for attack preparedness.
In the late 1960s, this “dual-use” strategy was officially endorsed at the Federal
level by a National Security Council study. The strategy was institutionally implemented
in 1972 with the replacement of the Office of Civil Defense by the Office of Civil
Preparedness. Over the course of the 1970s, the forms of disaster to be addressed
through emergency planning expanded to include environmental catastrophes, such as
Love Canal and Three Mile Island, and humanitarian emergencies, such as the Cuban refugee crisis.

27. William L. Waugh, Jr., “Terrorism, Homeland Security and the National Emergency Management Network,” Public Organization Review 3 (2003). These tensions foreshadow some of the issues raised in the wake of Hurricane Katrina, such as the role of the military and the distribution of authority between federal and local agencies in emergency situations.
When the Federal Emergency Management Agency (FEMA) was founded in
1978, the new agency consolidated federal emergency management and civil defense
functions under the rubric of “all-hazards planning.” All-hazards planning assumed that
for the purposes of emergency preparedness, many kinds of catastrophe could be treated
in the same way: earthquakes, floods, major industrial accidents, and enemy attacks were
brought into the same operational space, given certain common characteristics. Needs
such as early warning, the coordination of response by multiple agencies, public
communication to assuage panic, and the efficient implementation of recovery processes
were shared across these various sorts of disaster. Thus all-hazards planning focused not
on assessing specific threats, but on building capabilities that could function across
multiple threat domains.28
National Security after the Cold War
The rationale for civil defense as an element of deterrence strategy, developed at
RAND in the 1950s, finally came into favor among high-level U.S. military planners in
the 1970s. This was in part because the Soviet Union had developed its own extensive
civil defense program, so that civil defense was now a variable in the strategic balance. In
1975 the Defense Civil Preparedness Office recommended “Crisis Relocation Planning”
as part of a capacity for flexible nuclear response. Similarly, Carter advisor Samuel
Huntington argued that the United States’ lack of a “population relocation capability”
was a strategic vulnerability that could be politically destabilizing. In the 1980s, defense strategists embraced the notion that without sufficient protection of the economy and government during a nuclear war, the rationale of deterrence was not credible.29

28. Quarantelli writes: “It is being more and more accepted that civil protection should take a generic rather than agent specific approach to disasters.” Quarantelli, n.d., 17.
One aspect of Cold War national security that retrospectively stands out is the
relative stability of the threat it sought to mitigate. The Soviet Union seemed to be
knowable and manageable through the logic of containment. With the end of the Cold
War, U.S. national security thinkers were almost nostalgic for a time when, however dire
the threat of nuclear catastrophe might have been, it was at least clear what one was
supposed to be preparing for. As Colin Powell said in 1991, “We no longer have the
luxury of having a threat to plan for.”30 New security formations have since consolidated
around this problem: what is the threat for which we must now plan?
The end of the Cold War undermined containment and deterrence as central
national security strategies, since there was no longer a rational enemy whose actions
could be predicted and managed. The key change in the nature of threat was from the
stable enemy to the non-specific adversary.31 This shift became even more palpable after
the attacks of 9/11. In a 2002 speech to the Council on Foreign Relations, Donald
Rumsfeld counseled that the U.S. must vigilantly prepare for the unexpected:
“September 11 taught us that the future holds many unknown dangers, and that we fail to
prepare for them at our peril.” He elaborated, using the language of the anticipation of
surprise familiar from scenario planning: “The Cold War is gone and with it the familiar
security environment. The challenges of the new century are not predictable. We will
probably be surprised again by new adversaries who may strike in unexpected ways. The challenge is to defend our nation against the unknown, the uncertain, the unseen, the unexpected.”32

29. Thus Reagan’s 1982 National Security Decision Directive-26 stated, “the task of maintaining the ‘continuity of government’ during a crisis and attack is increasingly a chief concern of American civil defense planning.” Cited in Vale (1987).
30. James Mann, Rise of the Vulcans: The History of Bush’s War (New York, 2004), 203.
31. See the 2001 Quadrennial Defense Review Report. www.defenselink.mil/pubs/qdr2001.pdf
In the speech, Rumsfeld described the military’s strategic shift, after the Cold
War, from a threat-based strategy to a capabilities-based approach. This strategy focused
less on who might threaten the U.S., and more on how the U.S. might be threatened. Such
an approach would allow the military to focus on the central new problem of
“asymmetric threats”: instead of building armed forces around plans to fight this or that
country, Rumsfeld argued, the U.S. needed to examine its own vulnerabilities – which
would enable the military to deal with multiple forms of threat.
Techniques of Preparedness II: The Simulation
In developing protocols for all-hazards planning, one security expert in the early
1980s described the importance of simulation exercises for training purposes: “Ideally,
when a real crisis hits, no difference should exist, either operationally or emotionally,
between the current reality and the previous training simulations.”33 To design such drills
requires information about the situation to be planned for: the speed of a toxic cloud
under given weather conditions; the pattern of outbreak of an epidemic; the scale of
impact of a large earthquake in a specific urban setting. Scenario-based simulations not
only exercise the system of emergency response and produce knowledge about needed capabilities, but also generate a sense of urgency among participants.34

32. Donald Rumsfeld, “Transforming the Military,” Foreign Affairs (May/June 2002).
33. Robert Kupperman, “Vulnerable America,” in James Woolsey, ed., Nuclear Arms: Ethics, Strategy, Politics (San Francisco, 1984).
In the 1980s techniques of preparedness extended to transnational health and
humanitarian organizations seeking ways to plan for possible emergencies. For example,
in her book The Coming Plague, journalist Laurie Garrett describes a 1989 conference in
which 800 tropical disease specialists participated in a “war games scenario” in which an
Ebola outbreak was simulated. Like such exercises in national security, the goal was to
expose vulnerabilities: “The hope was that such a role-playing scenario would reveal
weaknesses in the public health emergency system that could later be corrected.”35 The
actual performance of the exercise pointed to a key function of simulation as a
preparedness technique: its ability to produce anxiety in participants: “What the war
games revealed was an appalling state of nonreadiness,” writes Garrett. “Overall, the
mood in Honolulu after five hours was grim, even nervous. The failings, weaknesses,
and gaps in preparedness were enormous.”
In contemporary preparedness planning, the lesson of a successful simulation
based on a scenario is typically the same as the one that Anderson Cooper gleaned from
Hurricane Katrina: “we are not prepared.” However, such exercises are focused on
experts and leaders rather than the public. They are an incitement to action: hold
meetings, develop plans, release funds. Security simulations involve enactments of
scenarios of varying detail and scale, followed by reports on the performance of response.
Often former public officials play the roles of leaders – presumably because they are both
authoritative and available for several-day-long exercises.

34. This had been one of Kahn’s reasons for developing the scenario method. As Ghamari-Tabrizi writes: “This was Kahn’s problem: how to invest hypothetical vulnerabilities, particularly unknown and undetectable ones, with urgency.” Ghamari-Tabrizi, 233.
35. Laurie Garrett, The Coming Plague: Newly Emerging Diseases in a World Out of Balance (New York, 1995), 593.

In 2001, the scenario “Dark Winter,” depicting a covert smallpox attack in the US, was performed. This was an
“executive level simulation” set in the National Security Council over 14 days. Current
and former public officials played the roles of members of the NSC, and members of the
executive and legislative branches were briefed on the results. One outcome was the Bush
administration’s decision to produce 300 million doses of smallpox vaccine.
“Silent Vector” (2002) was an exercise in how to deal with the threat of an
impending terrorist attack when there is not enough information to provide protection
against the attack. The President, played by former Senator Sam Nunn, was told of
credible intelligence indicating an upcoming attack on the nation’s energy infrastructure,
but was not given any information on where or when the attack would take place. Other
examples include 2003’s simulated anthrax attack, “Scarlet Cloud,” and the biennial
“TOPOFF” exercises held by the Department of Homeland Security. TOPOFF 3 was
enacted in April 2005, and included a car bombing, a chemical attack, and the release of
an undisclosed biological agent in New Jersey and Connecticut. It was the largest
terrorism drill ever, costing $16 million and including 10,000 participants. The event also
included a simulated news organization, which was fully briefed on events as they
unfolded. The actual press was not invited.
Such simulations were not limited to the U.S. In the January 2005 “Atlantic
Storm,” former Secretary of State Madeleine Albright played the U.S. president in an
exercise simulating a smallpox attack on multiple nations of the transatlantic community.
Istanbul, Frankfurt, Rotterdam, and multiple US cities were hit. In a mock summit,
former prime ministers of European countries played the role of heads of state.
Questions of immediate response were posed: what kind of vaccination approach to use?
Which countries have enough supplies of vaccine, and will they share them? Will
quarantine be necessary? After the exercise, participants concluded that, first, there was
insufficient awareness of the possibility and consequences of a bio-terrorist attack; and
second, no organization or structure is currently agile enough to respond to the challenges
posed by such an attack. Structures of coordination and communication of response in
real-time must be put into place.
The conclusions were similar to those of other simulation exercises: governments
are not adequately aware or prepared. Secretary of Homeland Security Michael Chertoff
said of TOPOFF 3: “We expect failure because we are actually going to be seeking to
push to failure.” Indeed, failure is part of a preparedness strategy. In producing system failure, simulations generate knowledge of gaps, misconnections, unfulfilled needs.
These can then be the target of intervention. In so doing, they forge new links –
communicational, informational – among various agencies: local and national
government, public health, law enforcement, intelligence. These simulations, by making
infrastructural vulnerabilities visible, are part of an effort to develop a system for
assigning priorities and allocating resources.
The DHS National Preparedness Plan
In the decades after the founding of FEMA, the agency faced ongoing tension
between its civil defense function and its task of emergency management. While
Republican administrations tended to emphasize the former, Democratic presidents
focused on the latter.36 The demand for a coherent domestic security system that would
consolidate multiple governmental prevention and response systems crystallized, after
9/11 and the anthrax letters, in the formation of the Department of Homeland Security.
DHS brought together security functions from a number of areas of government: civil
defense, disaster response, border security, intelligence, and transportation security.
FEMA’s assimilation into DHS once again shifted its orientation more toward civil
defense – in this case, toward preparation for a terrorist attack. Nonetheless, it is
noteworthy that DHS characterized its overall mission in the terms of “all-hazards”
planning familiar from emergency management. As Michael Chertoff said in 2005 in
unveiling the Department’s new National Preparedness Guidance:
The Department of Homeland Security has sometimes been viewed as a terrorist-fighting entity, but of course, we’re an all-hazards Department. Our
responsibilities include not only fighting the forces of terrorism, but also fighting
the forces of natural disasters.37
The DHS plan elaborated a set of administrative mechanisms for making
preparedness a measurable condition. The plan was a guide for decision-making and
self-assessment across multiple governmental and non-governmental entities concerned
with problems of domestic security. It sought to bring disparate forms of threat into a
common security field, articulating the techniques that had been honed over the prior six
decades of planning for emergency. These included detection and early warning systems,
simulation exercises, coordinated response plans, and metrics for the assessment of the current state of readiness.38

36. Robert Ward, Gary Wamsley, Aaron Schroeder and David B. Robins, “Network Organizational Development in the Public Sector: A Case Study of the Federal Emergency Management Administration (FEMA),” Journal of the American Society for Information Science 51 (11) (2000).
37. “Secretary Michael Chertoff, U.S. Department of Homeland Security, Second Stage Review Remarks,” 13 July 2005, House Security Committee, www.hsc.house.gov.
The goal of DHS preparedness planning was to “attain the optimal state of
preparedness.”39 What is a state of preparedness, according to DHS? As the plan defined
it, “preparedness is a continuous process involving efforts at all levels of government and
between government and private-sector and nongovernmental organizations to identify
threats, determine vulnerabilities, and identify required resources.”40 In other words,
preparedness is the measurable relation of capabilities to vulnerabilities, given a selected
range of threats.
Preparedness targeted vulnerabilities in the critical infrastructure that supports the
nation’s vital activities.41 Here “critical infrastructure” refers to technological systems
for sustaining social and biological life, often built in the name of population security.
The sectors included in the National Infrastructure Protection Plan are: agriculture and
food, public health and healthcare, drinking water and waste water treatment, energy,
banking and finance, national monuments, defense industrial base, information
technology, telecommunications, chemical, transportation systems, emergency services,
postal and shipping.
38. Many of the technical elements of the National Preparedness plans were present in FEMA’s 1984 “Integrated Emergency Management System” (IEMS), which operationalized all-hazards planning at the federal level. The system defined a set of “common preparedness measures” that would make integrated emergency planning possible: these were functions that were critical to a response to any disaster, such as communications, alert, command and control, and providing food and shelter.
39. Michael Chertoff, quoted in Eric Lipton, “Homeland Security Chief Announces Overhaul,” New York Times, July 14, 2005.
40. Department of Homeland Security, “Interim National Preparedness Goal,” March 31, 2005. Contrast this with population security, which seeks to “optimize a state of life.”
41. The National Infrastructure Protection Plan (NIPP) “provides a roadmap for identifying assets, assessing vulnerabilities, prioritizing assets, and implementing protection measures in each infrastructure sector.” 85% of the Critical Infrastructure is in the private sector. INPG, p. 12.
In the plan, threats to these systems could come from a number of sources,
including outside enemies, natural disasters, and infectious diseases. Given the many
kinds of hazards to plan for, DHS approached threats through an emphasis on capabilities
that ranged across multiple types of events rather than a focus on specific threats.
“Capabilities-based planning” was based on Department of Defense methods in the wake
of the Cold War, but was also coherent with the central premise of all-hazards planning:
that one should not focus on specific threats but on a range of possible responses that work
across diverse threats:
“[Capabilities-based planning] addresses the growing uncertainty in the threat
environment... Target levels of capability will balance the potential threat and
magnitude of terrorist attacks, major disasters, and other emergencies, with the
resources required to prevent, respond to, and recover from them.”42
The plan did not claim to be able to prevent all disasters. As Chertoff
commented, “There’s risk everywhere; risk is a part of life. I think one thing I’ve tried to
be clear in saying is we will not eliminate every risk.”43 Since there was a multiplicity of risks and resources were finite, DHS would prioritize by focusing on the largest-scale
disasters: “DHS will concentrate first and foremost, most relentlessly, on addressing
threats that pose catastrophic consequences.”44 Among the many dire possibilities, what
were the criteria for selecting which threats are the most salient? “Risk-based”
prioritization would guide the allocation of federal resources. This meant distributing
funds according to the relative likelihood and catastrophic potential of a given attack or
disaster in a given place. However, exactly how to do this given the incalculability of salient threats remained both a technical and a political problem.

42. DHS 2004.
43. Chertoff, quoted in Eric Lipton, “U.S. Report Lists Possibilities for Terrorist Attacks and Likely Toll,” New York Times, March 16, 2005.
44. “Secretary Michael Chertoff, U.S. Department of Homeland Security, Second Stage Review Remarks,” 13 July 2005, House Security Committee, www.hsc.house.gov.
The achievement of optimal preparedness does not demand knowledge of the
norms of living beings; unlike population security, it does not develop epidemiological or
demographic knowledge. Rather, to assess and improve current preparedness requires
techniques for generating detailed knowledge of vulnerabilities in relation to capacities.
Here is where scenario planning proves useful. As we have seen, scenarios are not
predictions or forecasts: rather, they map readiness for a wide range of threats. In its
plan, DHS selected 15 disaster scenarios as “the foundation for a risk-based approach.”
These possible events – including an anthrax attack, a flu pandemic, a nuclear detonation,
and a major earthquake – were chosen on the basis of plausibility and catastrophic scale.
Notably, 12 of the 15 scenarios were for terrorist incidents.
The DHS scenarios made it possible to generate knowledge of current
vulnerabilities and the capabilities needed to mitigate them. As one expert commented:
“We have a great sense of vulnerability, but no sense of what it takes to be prepared.
These scenarios provide us with an opportunity to address that.”45 Using the scenarios,
DHS developed a menu of the “critical tasks” that would have to be performed in various
kinds of major events; these tasks, in turn, were to be assigned to specific governmental
and nongovernmental agencies.
The scenarios did not imply permanent agreement about the major threats; rather,
they were to be regularly evaluated and if necessary transformed: “DHS will maintain a
National Planning Scenario portfolio and update it periodically based on changes in the
homeland security strategic environment.”46 The plan envisioned ongoing reflexive self-transformation in relation to a changing threat environment: “Our enemy constantly changes and adapts, so we as a Department must be nimble and decisive.”47 National Preparedness had to continually pose itself the question: are we preparing for the right threats?

45. David Heyman, director of the homeland security program at the Center for Strategic and International Studies, quoted in Eric Lipton, “U.S. Report Lists Possibilities for Terrorist Attacks and Likely Toll,” New York Times, March 16, 2005.
While the above describes the rationality of National Preparedness, its actual
operation was far from stabilized. DHS was fraught with bureaucratic in-fighting,
budgetary struggles, and cronyism, leading to a widespread perception of its failure to
achieve its mission. It is worth emphasizing that such criticism presumes the normative
rationality of preparedness.
Technical Reform
Scenario 10 of the DHS Planning Scenarios was “Natural Disaster – Major
Hurricane.”48 As has been widely noted, the city of New Orleans had run a hurricane
simulation in 2004. Obviously such exercises do not in themselves ensure a state of
preparedness. Nonetheless, the perception of a massively failed response by DHS to the
actual hurricane one year later did not undermine the presumed utility of “all-hazards”
planning. Rather, it pointed to problems of implementation and coordination, of
command and control. Thus in response to the failure, we have seen the redirection and
intensification of already-developed preparedness techniques rather than a broad
rethinking of security questions.
46. INPG, 5.
47. Chertoff, quoted in Lipton, “Homeland Security Chief Announces Overhaul.”
48. See the following documents, available on-line: National Preparedness Guidance, Homeland Security Presidential Directive 8: National Preparedness, Office of Homeland Security, April 27, 2005; and Planning Scenarios: Executive Summaries, The Homeland Security Council, July 2004.
Reform proposals have been primarily technical: in the context of the Gulf Coast,
rebuild the flood protection infrastructure; in large cities, improve evacuation plans; for
preparedness planning in general, ensure that there are coherent systems in place for
communication and coordination in crisis. More broadly, scrutinize the relationship
between Federal, local and state responsibility for dealing with various aspects of disaster
preparedness. However, under the rubric of preparedness, questions surrounding the
social basis of vulnerability are not posed. This issue should be distinguished from the
debate between social welfare advocates and neoliberals over whether public services
should be privatized. Rather, it raises the question of what kind of governmental
techniques are most salient for looking after the well-being of citizens, and what the
object of knowledge and intervention in the name of security should be.
Here it is worth noting some of the differences between the objects and aims of
population security and those of preparedness. In contrast to population security-based
tasks like public health provision and poverty relief, preparedness is oriented to crisis
situations and to localized sites of disorder or disruption. These are typically events of
short duration that require urgent response.49 Their likelihood in a given place demands a
condition of readiness, rather than a long-term work of sustained intervention into the
welfare of the population. The object to be known and managed differs as well: for
preparedness the key site of vulnerability is not the health of a population but rather the
critical infrastructure that guarantees the continuity of political and economic order. If
population security builds infrastructure, preparedness catalogs it and monitors its
vulnerabilities.50 And while preparedness may emphasize saving the lives of “victims” in moments of duress, it does not consider the living conditions of human beings as members of a social collectivity.

49. Craig Calhoun, “A World of Emergencies: Fear, Intervention, and the Limits of Cosmopolitan Order,” The Canadian Review of Sociology and Anthropology 41 (2004).
To consider Katrina and its aftermath a problem of preparedness rather than one
of population security is to focus political questions about the failure around a fairly
circumscribed set of issues. For the purposes of disaster planning, whose key question is
“are we prepared?” the poverty rate or the percentage of people without health insurance
are not salient indicators of readiness or of the efficacy of response. Rather, preparedness
emphasizes questions such as hospital surge capacity, the coherence of evacuation plans,
the condition of the electrical grid, or ways of detecting the presence of E. coli in the water
supply. From the vantage of preparedness, the conditions of existence of members of the
population are not a political problem.
50. Paul Rabinow, “Diffusion of the Human Thing: Virtual Virulence, Preparedness, Dignity,” unpublished manuscript (2005).