
Ethical, Legal, and Societal Implications of New
Weapons Technologies
A Briefing Book for Presenters 1
Maj Gen (Ret) Robert Latiff, Ph.D.
Reilly Center for Science, Technology, and Values
University of Notre Dame
Don Howard, Ph.D.
Professor of Philosophy and
Reilly Center for Science, Technology, and Values
University of Notre Dame
This work is the product of a request by the National Research Council to create
educational materials to accompany the report entitled Emerging and Readily
Available Technologies and National Security: A Framework for Addressing
Ethical, Legal, and Societal Issues. 2 The report, published in 2014, is the result of
a request by the Defense Advanced Research Projects Agency to assemble a
committee to advise the agency on these issues.
Maj Gen (Ret) Robert Latiff is an adjunct professor with the Reilly Center for
Science, Technology, and Values at the University of Notre Dame. He was a
member of the DARPA-sponsored National Research Council Committee on
Emerging and Readily Available Technologies and National Security. Dr. Don
Howard is Professor of Philosophy at the University of Notre Dame and is the
past Director of the Reilly Center for Science, Technology, and Values. He
specializes in the history of physics and is a world-renowned scholar of the
physics and philosophy of Albert Einstein, and lectures and writes extensively on
questions of science and technology ethics.
This briefing book provides background and commentary for the accompanying
presentations and is intended to be used in conjunction with those presentations.
A companion bibliography has also been provided.
1 Copyright © 2015 National Academy of Sciences.
2 Committee on Ethical and Societal Implications of Advances in Militarily Significant
Technologies that are Rapidly Changing and Increasingly Globally Accessible; National Research
Council; National Academy of Engineering. Emerging and Readily Available Technologies and
National Security: A Framework for Addressing Ethical, Legal, and Societal Issues (Washington,
DC: National Academies Press, 2014).
Introduction
It is widely recognized that militaries, especially that of the U.S., depend heavily on
technology to achieve and maintain superiority over adversaries. While
technology has provided us with enormous benefits over the years, it has not
been without challenges – legal, societal, and ethical. Very often in the military
research, development, acquisition, and deployment of technologies and the
weapons that embody them, we ask a) can this be done technically, and b) is it
permissible legally?
Rarely do we hear anyone question whether we should, in fact, conduct such
research or employ such weapons. This presentation will examine those
questions.
The goal of this work is to explore the issues concerning the accelerating pace of
technology development and the strong interactions among technology,
weapons, and war. We will discuss the process of research, development,
acquisition, and deployment of new technologies and weapons and the culture
that drives behavior within the communities engaged in that work.
For centuries, concepts of Just War Theory and International Humanitarian Law
have evolved to modulate the violence of armed conflict between nations, and
these concepts have been underpinned by well-developed philosophical and
professional ethics theories. While there are no definitive answers to the question
of whether a particular technology or weapon is unethical, and no checklist or set of
guidelines could ever be complete, a framework has been proposed by which
researchers and military planners can analyze new technologies. Case studies
are presented that cover issues of privacy and surveillance, “inhumane” chemical
weapons, synthetic biology and genetics, and large-scale use of chemicals
against an entire country.
As we will see, technology has historically had a profound effect on war, both in
its immediate conduct and in the way in which certain technologies and weapons
fundamentally alter the institution of the military. Take for example nuclear
weapons, which resulted in an entirely new mode of war fighting and new military
structures to control them. Or consider night vision devices that allowed
operations to be conducted at night. New types of war, such as terrorism, guerilla
war, and civil war, have emerged, and not only do they demand new types of
weapons, they call into question both the primacy of the state and the ability of
the state to control violence or apply rules of conduct.
New technologies have dramatically altered what is considered the battlefield. In
the past, armies pitted against armies could understand boundaries and what
constituted the edges of the battle area or the theater of operations. Today, wars
are fought over immense distances, possibly in space, and at the speed of light.
Conflict occurs in cities and ungoverned areas. The sphere of battle has
changed. Autonomous weapons, cyber warfare, and various soldier
enhancements have so altered the way in which war can be fought – and what
defines the soldier – that long-held rules of war, which depend for their validity on
the understanding of the combatants, are severely challenged. With machines
perhaps thinking and deciding on courses of action for us, and machines and
highly modified humans perhaps participating actively in combat, serious and
cherished concepts that comprise the so-called warrior ethos are called into
question.
The goal of the DARPA/NRC effort and the purpose of this briefing are to inform
policy makers and government researchers, and to increase their understanding
of, and sensitivity to, the ethical issues arising from the development and
application of technology to military and intelligence operations. Our goal is to
delve into the details of some of the issues and to provide a framework for
thinking and decision making about them. In its request to the National Research
Council, the Defense Advanced Research Projects Agency (DARPA) asked that
the NRC develop a consensus report on the topic of ethical, legal, and societal
issues relating to research on, development, and use of increasingly globally
accessible and rapidly changing technologies with potential military application,
such as information technologies, synthetic biology, and nanotechnology. This
report would articulate a framework for policy makers, institutions, and individual
researchers to think about such issues as they relate to these technologies of
military relevance and, to the extent feasible, make recommendations for how
each of these groups should approach these considerations in their research
activities.
A committee, consisting of experts in law, science, military operations, weapons
procurement, psychology, and other pertinent fields, was established by the NRC
to conduct the study. The committee held five meetings over a period of twelve
months and heard from government, industry, and academic researchers. The
committee’s final report was entitled Emerging and Readily Available
Technologies and National Security: A Framework for Addressing Ethical, Legal,
and Societal Issues. The report provided background information on the military's
dependence on technology and the changing nature of conflict.
Readily available technologies also included so-called dual-use technologies.
The committee considered foundational technologies such as information
technology, synthetic biology, and neuroscience. It also considered applications
domains that incorporated some of the foundational technologies. The report
discussed where one might seek background and insight into the ethical
implications of a technology or a use of a weapon and provided a framework, or
a series of questions, whereby a researcher or decision maker might tease out
some answers about what to do.
The report also discussed what organizations might do on a continuing basis and
suggested ways in which they might establish unobtrusive processes for
considering these topics.
Following the publication of the report and its briefing to the DARPA sponsor, the
NRC (with DARPA concurrence) sought to disseminate the results of the
committee’s work broadly. The University of Notre Dame sponsored a major
conference, attended by DARPA’s Deputy Director, several Committee
Members, and experts in many of the technologies discussed in the report.
Briefings were provided to numerous organizations including government, think
tanks, and international audiences.
What Is the Issue?
In considering the responsibilities of scientists and engineers in weapons
development, it is useful to look back in history. One of the most important
developments of the modern era was, of course, the atomic bomb. In the early
1950s, the philosopher Jacob Bronowski, writing in response to the often-heard
argument that it was not the scientists but, rather, the elected officials who bore
responsibility for nuclear weapons, said:
Because we know how gunpowder works, we sigh for the days before
atomic bombs. But massacre is not prevented by gunpowder. The Thirty
Years War is proof of that. Massacre is prevented by the scientist’s ethic,
and the poet’s and every creator’s: that the end for which we work exists,
and is judged, only by the means which we use to reach it.
Such a cautionary statement is useful today, and it is important in developing not
only military technologies, but commercial or dual-use technologies as well, that
scientists and engineers consider the implications of their work. They are not
absolved of responsibility.
As we shall see, technology and war have long been inextricably linked. In many
cases, wars have highlighted the need for improved, or different, weapons to give
one side or another an advantage over the enemy. So war drives technology
development. In other cases, new technologies emerge from laboratories, and it
is only after they are available that military applications emerge. So technology
drives changes in weaponry and war. We shall return to this concept later. The
types and modes of violent conflict have changed in modern times. With those
changes come changes in the types of weapons needed. Tanks and armored
vehicles are not so useful in fighting guerilla wars in jungles. High altitude
bombers are not very useful in trying to find and engage insurgents in cities.
By any measure, technology advances at a phenomenal rate. In addition, these
advances in many cases converge with one another to create capabilities not
possible using one technology alone. Consider advanced prosthetic devices that
depend on advanced electronics, neuroscience, and biology. As we will see, past
technology advancements have resulted in unanticipated and unintended
consequences.
So, what is the issue? Science and technology speed ahead, and along with them
weapons development. As we will see, it is difficult, on the one hand, to keep up
with such a broad range of rapidly developing technology areas. On the other
hand, in such a large and complex organization as the Department of Defense,
the monitoring and control of technologies and the understanding of potential
ethical issues is made problematic by both the pressures of operations and the
excessive bureaucracy.
Let’s consider some of the hugely important technology developments of the past
and the impact they have had on the conduct of war. While underwater vehicles
(submarines) had been around for many years (during the U.S. Civil
War, for example), the German development and deployment of the U-boat
certainly changed the course of WWI, as did the introduction of the first armored
vehicles. The introduction of both contact mines and magnetic mines into naval
warfare drove the development of mine countermeasures in a deadly game of
technology one-upmanship.
The German development of the V-1, or so-called "Buzz Bomb," in WWII terrorized
England and presaged the development and introduction of cruise missiles. The
Norden bombsight, a then highly classified and strategically important technology
development, dramatically changed the accuracy of U.S. bombing missions and
altered the course of the war. Likewise, the British invention and U.S. development
of radar constituted a great leap in technology that fundamentally changed
warfare as well as civil and commercial operations to this day. The civil space
program, so exciting to the public in the 1960s, was, in fact, a cover for the highly
classified development of spy satellites to monitor the Soviet Union’s ballistic
missile program. The development and introduction of stealth technology in the
1980s seemed to many to be almost like magic, and it has enabled greater
survivability and lethality of weapons platforms.
War today could not be fought without the use of satellites. Constellations include
classified imagery and signals satellites, radar and communications satellites,
weather monitoring, and global positioning. Many military platforms could not
operate effectively in the absence of global positioning signals.
Increasingly, computers are critical to everything we build, and the realm of cyber
war is becoming a strategic option. In cyber operations, vulnerabilities of our
weapons platforms, as well as a nation’s infrastructure, are exploited, or the
systems they affect are possibly destroyed, by various computer techniques.
Unmanned aerial, terrestrial, and naval systems are now commonplace, with
new, potentially autonomous, and lethal systems being developed. Night vision
systems (not entirely new) continue to be improved and change the way wars are
fought, now allowing a full range of military operations at night. Advanced seeker
technology, including radar and infrared, has enabled fearsome and highly
lethal air-to-air, ground-to-air, and air-to-ground missiles. Hypersonic vehicles,
whose development is in its infancy, promise to revolutionize warfare with the
ability of a weapon or a sensor to get to a target at enormous speeds.
While the development of weapons technologies has sometimes been
accompanied by ethical reflection, too often the ethical perspectives are naïve.
Whether sincere in their intent or merely rationalizing their actions, the
developers of some of the most important and deadly weapons technologies in
history claimed that their motives were pure. According to the Austrian Countess
Bertha von Suttner, Alfred Nobel expressed his wish to produce a material or a
machine that would have such a devastating effect that war, from then on, would
be impossible. Likewise, in Richard Gatling's later years, his granddaughter, Mrs.
Albert Newcombe, remembered that "He was a most peace-loving soul, and I
remember that his reason for inventing that then-lethal gun was to make war so
horrible that it would end wars." As we now know, dynamite and its successors
have enabled an entire family of deadly munitions. Gatling's machine gun made
killing in all subsequent wars far more efficient and, like Nobel's dynamite,
spawned many successor automatic weapons.
But sometimes there are genuine moral gains. From WWII to Korea to Vietnam,
the development of bombs and bombing technologies dramatically decreased the
number of munitions and bombing aircraft needed, and reduced the circular error,
in attempts to destroy a single target. While the quest for more accurate bombs
was likely driven by a desire for greater efficiency and less danger for pilots, a
happy result was reduced collateral damage to surrounding areas and reduced
casualties. In other words, we achieved better discrimination — a fundamental
goal and tenet of Just War Theory and International Humanitarian Law. (To be
discussed later.)
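
As an illustrative aside (our own, not drawn from the NRC report), the link
between accuracy and discrimination can be made quantitative. A standard
first-order model treats aim error as circularly normal, so the single-shot
probability of destroying a point target with lethal radius R is
P = 1 - 2^(-(R/CEP)^2), where CEP is the circular error probable. The Python
sketch below, using entirely hypothetical numbers, shows why shrinking the CEP
collapses the number of weapons — and hence the collateral damage — required
for a given confidence of destroying one target.

```python
import math

def single_shot_kill_probability(lethal_radius_m: float, cep_m: float) -> float:
    """Circular-normal miss model: P = 1 - 2**(-(R/CEP)**2)."""
    return 1.0 - 2.0 ** (-((lethal_radius_m / cep_m) ** 2))

def weapons_needed(confidence: float, p_single: float) -> int:
    """Smallest n with 1 - (1 - p_single)**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_single))

# Hypothetical target with a 30 m lethal radius, attacked to 90% confidence,
# with WWII-era (1,000 m CEP) versus precision-era (10 m CEP) accuracy.
for cep in (1000.0, 10.0):
    p = single_shot_kill_probability(30.0, cep)
    print(f"CEP {cep:6.0f} m: single-shot P = {p:.4f}, "
          f"weapons needed = {weapons_needed(0.90, p)}")
```

With these hypothetical inputs, the 1,000 m CEP case requires thousands of
weapons for 90% confidence, while the 10 m CEP case requires one.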
Our aim in this work is to provide more sophisticated tools for ethical analysis
and reflection. Rapid technology growth poses, by itself, an ethical challenge. A
stark illustration of the speed and acceleration of technology advance is a simple
graphical picture of technology introduction plotted against world population. As
the population of the world remained low and grew slowly for thousands of
years, new “technology” introduction in agriculture, urban living, metallurgy, and
mathematics and engineering kept pace. With the emergence from the Middle
Ages, the Renaissance, and the Industrial Revolution, population soared, and
along with it new agricultural, manufacturing, and transportation technologies.
With the post-Industrial Revolution and the entry into the modern era came
better health care, increasing population, growing urban life, global war, and an
exponential increase in the introduction of technologies such as the steam engine,
the railroad, the telephone, the airplane, and penicillin. And the last century has
seen quantum theory, the computer, space travel, the sequencing of the human
genome, the internet, and frightening advances in synthetic biology and genetics
— accompanied by a world population of over seven billion, trending toward ten
billion people.
Why is such rapidly advancing technology an ethical challenge? Stewart Brand
wrote in Time magazine in the year 2000 (before even the latest round of new
technology developments):
Change that is too rapid can be deeply divisive; if only an elite can keep
up, the rest of us will grow increasingly mystified about how the world
works. We can understand natural biology, subtle as it is, because it holds
still. But how will we ever be able to understand quantum computing or
nanotechnology [or synthetic biology] if its subtlety keeps accelerating
away from us? Constant technological revolution makes planning difficult,
and a society that stops planning for the future is likely to become a brittle
society.
If technology is moving too fast for all but the most technologically literate, it
becomes even more critical that those responsible for technology carefully
consider its consequences. The rapid introduction of new technologies in multiple
fields is further complicated by the fact that advances in one field often both aid
and depend on progress in other fields. It is difficult enough to
track and understand the implications of progress in one area, much less several
of them simultaneously.
Take, for example, neuroscience and neuro-stimulation. It requires knowledge of
biology (life sciences), engineering, computing, microelectronics, and possibly,
telecommunications. In addition, knowledge of such diverse fields as systems
engineering and psychology would be important. Further compounding the
problem of managing or controlling technology, if indeed it is possible, is the fact
that predicting how a technology will be used is fraught with uncertainty.
Regulators and managers cannot really know the full range of the effects of a
technology until it has been in use for a long time and in many venues.
However, once a technology is deployed and becomes entrenched in society, it
is difficult to change or control. Consider the laser, first demonstrated in crude
form in the laboratory by Theodore H. Maiman in May of 1960. It didn’t take long
for power, stability, and accuracy to be improved to the point where lasers could
be used for very precise surgical techniques. Further research demonstrated that
lasers had value in carrying communications, and the field of fiber optics was
born. Finally, clever and unique methods of lasing and beam control allowed
extremely high power and flux densities to be employed in weapons and weapon
countermeasures. Who could have foreseen that the benchtop ruby laser would
evolve into a capability to shoot down missiles or blind satellites?
We have already discussed the extreme rapidity of technology growth and the
difficulty that even the technologically literate person, much less the average
citizen, has in keeping up. It is natural that, as a technology is introduced, it takes time for its
use to spread and its implications to emerge. From the introduction of the internet
(ARPANET) in the 1970s to the global adoption of the World Wide Web in the
1990s, it took decades for societal change to occur. Now, the internet is an
essential part of daily life. And if social change lags technology change, and the
operations of technology-based businesses depend on social acceptance, then
business change will lag social change. And similarly for political change, which
depends (at least in democratic countries) on a willing population, or the
consent of the governed, and an accepting business climate. Thus political
change lags social and business change. Finally, the laws of a society, which are
based partly on experience (precedent) and largely on the operations of the
society over time, tend to lag all the other changes. Law tends to be backward-looking,
and the legal system changes only very carefully.
Why are we so dependent on technology? We tend to be seduced by it. Unlike
changes to an institution or a process, which have multiple competing solutions
and may take a long time to show results, technological solutions most often show
immediate results. Turning to technology promises instant gratification. Like
seduction in the classic sense, technology promises something exciting and good.
Like seduction, however, the lure of technology can be dangerous. Not only can it
generate an addiction, but we also tend to overlook, downplay, or try to explain
away the potential dangers.
There may be unintended and unknown consequences that a more rational
consideration might identify. Senior military commanders worry about the
seductive nature of high-tech weapons. First, there is the simple question of
assured availability. As retired Marine General James Mattis notes, "In future
wars, technical systems will be under attack and will go down," so "we're
going to have to restore initiative" among small units and individual leaders. At
least as concerning for Mattis is that technology solutions are seductive and their
continued availability on the battlefield stifles creative thought and initiative on the
part of combat unit leaders. Why, the thinking seems to go, should I have to
come up with creative solutions to a combat situation if all I need to do is call for
more or better technology solutions?
Quite often, even if we recognize that a technology might not be fulfilling its
earlier promise, it has gained momentum and becomes difficult to jettison. The
technology itself becomes the primary actor. An example of this might be the
Space Based Infrared System (SBIRS). Very early in its development, it
became apparent that the technology goals were very aggressive. Years and
many billions of dollars later, the system still hasn’t fulfilled its original
expectations. Generally, outside forces play a large role in the continuation of a
project, once begun. There is always the fallacy of "sunk cost" wherein, so the
argument goes, we cannot abandon the project because we have already
invested so much. Self-preservation of organizations and bureaucracies plays a
role, as large government organizations attempt to justify their continued
existence. Politics certainly enters the picture when major defense programs
worth hundreds of billions of dollars are spread over many states and
constituencies. Finally, companies and corporate self-interest play a role in
pushing to keep a technology or project viable, even when shown to be
marginally useful, because not to do so might mean the demise of the company’s
business. Above all, we must be cautious that we don't allow our love affair with
technology to cloud our judgment or blind our thinking to the realities of war, or
lull military planners into the mistaken belief that the U.S. will always enjoy
technological superiority. Lt Gen H.R. McMaster has spoken about what many
other commentators have addressed. He noted that in the aftermath of the First
Gulf War and the easy victory over the forces of Saddam Hussein, U.S. military
planners believed that “American military technological advantages would shift
war fundamentally from the realm of uncertainty to the realm of certainty.” He
further noted, “the language was hubristic.” Gen McMaster’s concerns in a sense
echoed those of Gen Mattis, that the overdependence on technology would alter
the ways in which soldiers, themselves, would think about soldiering. He noted
that “The warrior ethos is at risk because some continue to advocate simple,
mainly technologically based solutions to the problem of future war, ignoring
war’s very nature as a human and political activity that is fundamentally a contest
of wills.”
So, what are some of the new, emerging, and readily available technologies of
concern? The NRC report described several, discussed in the categories of
foundational technologies and applications domains. Foundational technologies
include information technology, synthetic biology, and neuroscience. These
technologies are critical in applications such as robotics and autonomous
systems, soldier enhancements, cyber weapons, and non-lethal weapons. Before
discussing some of the issues and concerns with the new technologies, it is
important to note the advantages that accrue to the military with their application.
They are not insignificant. Information technology provides better security and
dramatically improved access to a wide range of services. Synthetic biology has
the potential to provide new physiological functions and improved performance.
Neuroscience promises the possibilities of new therapies for brain injury. Robots
and autonomous systems allow us to reduce the risk to our warfighters. Soldier
enhancements provide all-important tactical advantages. Cyber warfare can
potentially reduce harm and physical damage. Non-lethal weapons offer viable
non-kinetic, non-lethal options in appropriate scenarios. There are, however,
potential issues with each of these technologies, and likely others, that need to
be considered.
Continued advances in information technologies could render communications
security obsolete. Software systems are so complex that there could be a
tendency to push the limits of reliability. Data mining is a significant threat to
personal privacy and offers, perhaps, an incomplete picture of a situation. And, of
course, security of information is always a primary concern.
Synthetic biology has numerous potential dangers, among them: environmental
and safety risks; escape into the environment; the displacement of natural
microbes; and the crowding out of food production by the need for feedstock for
biofuels. Also of concern are new adversary threats, like the production of
neurotoxins or infectious viruses resistant to vaccines and antiviral medications.
There are many researchers who are concerned about the implications of some
experiments on the whole concept of humanity and the sanctity of life in that the
ability to create life may lead to diminished respect for existing life. Of further
concern are the increased numbers of people — of all motivations — able to
engage in synthetic biology work. Prediction of biological behavior is very hard,
biological complexity is not well understood, and the fundamental “laws” of
synthetic biology are not yet fully comprehended.
Among the potential issues for neuroscience are its possibilities for use as a tool
for unreasonable search or its potential for misuse in interrogations. Also, if
neural implants are used or if other neural enhancements are made to a soldier,
what happens when that soldier transitions to civilian life? Are promotions based
on enhanced capabilities? What effects do neurological enhancements have on
“human essence”? There is great concern over the privacy of neuroscience data
and about cultural effects, such as the danger of misuse of information.
There are a host of issues surrounding research on and the potential
employment of autonomous weapons, especially lethal autonomous weapons.
They include complexity and unpredictability, questions about legal or criminal
responsibility in the case of an accident, and whether or not they can
demonstrate better or worse discrimination of legitimate targets. There are many
who worry that the ease and low cost of use — and the fact that soldiers can
avoid being in harm's way — may lower the barrier to the resort to violence.
There are skeptics who worry that machines will be incapable of empathy and
question whether they will be more or less humane than humans who are, after
all, subject to extremes of emotion. Machines are, or will be, cheaper
than soldiers. Will the low cost of entry for this technology drive more widespread
use? What about moral agency? Can a machine actually function as a moral
agent? Military planners will need to be concerned with the age-old concept of
unit cohesion, a great indicator of unit performance. Will a human-robot team be
able to function smoothly? And how will we react if one of ours is captured and/or
the same technology is used against us?
There are numerous concerns with the possible future attempts to enhance
soldiers artificially. Among these are the possible effects on unit cohesion of
having enhanced and non-enhanced soldiers operating together, the status of
enhanced soldiers under international conventions, the effects on civilian society
of an enhanced soldier's return, the risk to the individual soldier, and whether
or not the enhancement is reversible or addictive. What effect does an individual
enhancement have on human essence? When is the soldier not acting on his or
her own volition? If a soldier has artificially altered levels of fear and aggression,
what implication does that have for the military ethic?
Cyber warfare promises to provide new, non-lethal means of war, but it potentially
replaces one kind of harm with others. There are new, nonphysical kinds of
harm, such as economic, digital, and informational damage, alongside the
destruction of physical infrastructure. Concerns about cyber conflict include the
degree of certainty in attribution needed for a cyber response, the responsibility
of a victim to take precautions to limit damage, and the possible loss of control
of weapons that escape and cause harm on a wider scale than intended. In cyber
war, because of the speed with which it will be conducted, there is great danger
in, and difficulty of, distinguishing between exploitation and attack, and a resulting
opportunity for dangerous misperception.
If a weapon is non-lethal, it has to be good, right? Yes, probably, but even here
there are concerns. Many non-lethal weapons are electromagnetic in nature,
utilizing lasers or RF energy. As such, they are zero-flight-time, and thus
instantaneous in their effects once fired. There is no time to respond. Since they
are non-lethal, there is a concern that military forces will be more apt to use
them, quicker to resort to the use of force. On the other hand, there is a concern
that given their non-lethality, they could embolden a determined adversary to
press the fight. Lastly, since non-lethal weapons leave no physical marks, the
undetectable harm may encourage their use as punishment by dictators or
despots who may employ them as a means of torture.
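
A back-of-the-envelope comparison (our own illustration, with a hypothetical
range and speeds) makes "zero-flight-time" concrete:

```python
# Hypothetical 2 km engagement: flight time of light-speed energy versus a
# fast projectile, illustrating why "zero-flight-time" leaves no time to react.
RANGE_M = 2_000.0
SPEED_OF_LIGHT_M_S = 3.0e8   # laser or RF energy
PROJECTILE_M_S = 1_000.0     # roughly a rifle round

print(f"directed energy: {RANGE_M / SPEED_OF_LIGHT_M_S * 1e6:.1f} microseconds")
print(f"projectile:      {RANGE_M / PROJECTILE_M_S:.1f} seconds")
```

Roughly seven microseconds versus two seconds: against the former, no warning
or evasion is possible.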
As long as there have been wars and weapons, ethical questions about their use
have been present. In 1139, Pope Innocent II forbade the use of the crossbow
when it was introduced because it was so much more accurate and lethal, and
provided an unfair advantage to its user. Chemical warfare saw widespread use
in WWI with the introduction of blister agents and asphyxiating gases and in
WWII with the mass gas chamber executions by the Nazis. More recently, the
widespread use by the U.S. of Agent Orange in Viet Nam was chemical warfare
on a national scale. Biological forms of warfare have been around since ancient
times. In 1155 the Emperor Barbarossa poisoned water wells with human bodies,
and in 1346 Tartar forces catapulted the bodies of plague victims over city walls
during a siege. In more recent times, the Japanese were significant users of
biological warfare in WWII. During the Cold War, both the U.S. and the Soviet
Union stockpiled tons of biological agents. Finally, the development and use of
nuclear weapons in WWII and the subsequent arms race represents one of the
most significant ethical issues in weapons research and employment in modern
times.
As we will see later, Just War Theory and International Humanitarian Law have
developed over the centuries. In the past century especially, there have been
numerous attempts to come to grips with, and to mitigate, the effects of war.
After WWI, an international agreement banning the use in war of
chemical weapons was signed. After the horrors and existential worries of WWII,
numerous efforts have been made to deal with issues of technology, weapons
research, and ethics. These include the 1946 Nuremberg Trials, the 1972
Biological Weapons Convention, the 1993 Chemical Weapons Convention, and
efforts by scientists to place restrictions on biomedical, genomic, and
nanotechnology research.
What is it about technology that makes it such a vital part of our military thinking?
Technology has been a cornerstone of U.S. military thinking for decades,
particularly during and since WWII, given the role it played in winning that war
(radar, bombsight technology, nuclear weapons). One recalls Dwight
Eisenhower’s oft quoted warning about the military-industrial complex. In more
recent years, one has only to look at the pronouncements of succeeding
presidential to see and understand the central role that technology plays in
military and national security planning. In the current administration, Frank
Kendall, the Undersecretary of Defense for Acquisition, Technology, and
Logistics, makes explicit the criticality to national defense of military technological
superiority and the assumption that the U.S. could and would maintain that
superiority.
The U.S. dependence on technology and its policy of military technological
superiority are not without ethical implications. For instance, weapons must
conform to the laws of
war, but since technology often outstrips the laws of war, the law may not be
much of a constraint. Also, technological superiority may provide transient rather
than long-lasting advantage as adversaries learn to counter innovations, leading
to a need to invest significant resources over long periods. Adversaries, both real
and potential, react to the introduction of new U.S. military technologies. For
example, our use of a particular technology may deter adversaries from taking
hostile actions against U.S. interests, or may cause adversaries to seek the
technologies for their own use, and/or cause them to seek to counter the
advantages conferred by U.S. use. The first user of a weapon or technology
often sets precedents that other nations follow. Such precedents may be the
initial seeds out of which international law and rules governing such use can
grow, and so the U.S. needs to consider seriously its role in this regard.
Presumptions of technological superiority may deflect attention from
consideration of ethical, legal, and societal issues. The prospect of reciprocal use
has historically been a spur for reflection on the ethical implications of military
technologies, while a strong asymmetric advantage has historically had the effect
of deferring and diffusing ethical deliberation.
Often, when faced with questions or concerns about ethics, the first reaction of
technologists, weapons developers, or decision makers is to find arguments to
stop the conversation without serious consideration of the issues. Frequently
heard are the following:
• Adversaries are unethical, so ethics should not be a constraint in using
advanced weaponry against them. But, the U.S. has publicly committed to
abide by certain treaties and by constraints embodied in domestic law that
criminalizes violations of the Geneva conventions by the U.S. armed forces.

• Adversaries will pursue all technological opportunities that serve their
interests. If we don't do likewise we'll be at a military disadvantage. But
there is a big difference between the possibility that a technology could
provide military advantages and a clear demonstration that it does.

• We should separate decisions about exploring a technology from
decisions based on demonstrating how it can be used to confer military
advantages. We don't know the significance of a technology, so we must
work on it to understand its implications, and we would be unwise to give
up on it too soon. But this argument poses a false choice between
cessation of all investigatory work and proceeding to work without any
constraints at all. Perhaps the best answer is "proceed, but carefully."

• Consideration of ethical, legal, and societal issues will slow the innovation
process to an unacceptable degree. But, although true in some cases, it is
not necessarily true in all. Also, a small slowdown up front may in fact be
worth the cost if it helps to prevent a subsequent more serious issue.

• Research on and development of defensive technologies and applications
is morally justified, whereas work on offensive technologies is morally
suspect. However, the categories of "offensive" and "defensive"
technologies are not conceptually clear, because offensive technology can
be used for defensive purposes, and vice versa.
As technology has accelerated in recent years, the types and kinds of
conflicts in which we have fought have changed significantly as well. From the
set-piece battles of the Middle Ages and the 17th and 18th centuries and the
army-versus-army engagements of our world wars, we have evolved with changing
geopolitical situations to smaller, regional, or country-specific conflicts and have had to
fight guerilla wars, insurgencies, humanitarian interventions, civil wars, and the
global war on terror, with its undefined battlefield and unpredictable timetable.
And, as the imbalance between the U.S. and its adversaries in terms of
technological prowess becomes enormous, those adversaries are driven to
asymmetric means of warfare such as terror bombings and the use of
commercial airliners as weapons.
While the Laws of Armed Conflict have been extraordinarily successful in the
past, that success has come because the conflicts were between states, or so-called
rational actors. Unfortunately, today's threats often come from failed states and
armed groups. Combined with the rapid evolution of technology and its
introduction into today's conflicts, countries and militaries are challenged in the
proper application of the Laws of Armed Conflict. And, since many of the newly
emergent technologies have their roots in the civilian world and many civilians
have no basis for making ethical judgments in a military context, their inputs may
be of little help. Examples of the changing nature of the military conflicts in
which the U.S. has been involved are:
• Viet Nam — a U.S. classic approach versus North Viet Nam's use of guerilla war.
• Bosnia — a humanitarian intervention to prevent genocide.
• The First Gulf War — a U.S. classic military action pitting ground forces against one another, but with overwhelming U.S. technological superiority.
• Iraq — no-fly zones.
• Iraq — counter-insurgency.
• Afghanistan — eliminating bases for terror operations, nation building, trans-border conflict.
• The War on Terror — worldwide special operations.
Case Study — Total Information Awareness
In the aftermath of 9/11, there was enormous pressure to "connect the dots" of
intelligence data collected from multiple sources by multiple agencies and
non-governmental entities. Post-mortem analysis indicated that much of the data
needed to prevent the attack was in multiple databases. It was realized that data,
standing alone, was not sensitive, but that, when combined, the data revealed a
pattern of a person's activity. This data was generally collected by
non-government, commercial entities, and thus was not protected by the Fourth
Amendment.
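
A minimal sketch, using entirely hypothetical records and field names rather than
anything from TIA itself, of how joining individually innocuous databases on a
shared identifier yields an inference that no single database supports:

```python
# Hypothetical illustration of data fusion: each record set is innocuous
# alone, but joining them on a shared identifier reveals a pattern.
flight_records = {"A100": {"route": "IAD->FRA", "date": "2001-08-20"}}
purchase_records = {"A100": {"item": "one-way ticket", "paid": "cash"}}
training_records = {"A100": {"school": "flight school", "status": "enrolled"}}

def fuse(identifier: str, *sources: dict) -> dict:
    """Combine all records that share one identifier into a single profile."""
    profile = {}
    for source in sources:
        profile.update(source.get(identifier, {}))
    return profile

profile = fuse("A100", flight_records, purchase_records, training_records)
# Only the combined profile triggers the (crude, hypothetical) pattern rule.
suspicious = (profile.get("paid") == "cash"
              and profile.get("school") == "flight school")
print(profile, "-> flagged" if suspicious else "-> not flagged")
```

The privacy concern follows directly: no one of these databases is sensitive, but
the joined profile is.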
The Defense Advanced Research Projects Agency (DARPA) embarked upon a
program to collect data from numerous databases (flight records, addresses,
phone records, immigration records, etc.) and to develop algorithms to combine
these data in useful and predictive ways. The program, ill-advisedly, was named
Total Information Awareness (TIA). There were several problems with TIA, the
first being the ill-considered choice of Rear Admiral John Poindexter, already a
controversial figure, as the manager of the TIA program. The program was legal
but crossed previously unencountered boundaries, yet there was little
information made available to the public about the program, since much of it was
considered classified. And the all-seeing eye logo was just plain creepy.
The program was killed by Congress in 2003 because of a public outcry. DARPA
recognized the privacy implications of TIA. Admiral Poindexter resigned. Some
elements of the program were moved into the classified intelligence budget, and
many of the elements of TIA exist in other forms today. Therefore, concerns
about misuse and inaccurate information remain. When TIA was killed, a
casualty was DARPA-sponsored privacy enhancement research. TIA raised the
level of public cynicism about and mistrust of government.
Surveillance laws and other protections, like the Fourth Amendment, limit only
data collection, and not inferencing power, which is the more serious threat to
privacy and civil liberties. Today, the government is constrained only by technical
limits on its inferencing capability.
Questions
• What civil liberties issues are raised by TIA?
• Do you think that TIA was an appropriate reaction or an overreaction to the events of 9/11?
• Who are the stakeholders?
• What is the nature of the harm?
• What would be the results of technological imperfections?
• Would there have been unintended consequences of the work?
Process and Culture
Up to now, we have spent time talking about the growth of technology and its use
in the military, as well as about how the wars we fight are impacting and are
impacted by technology and weapons development. It is worthwhile now to look
at some of the ways in which we do weapons development, why we do such
work, and where and how ethical reflection may be useful and appropriate. As
technologists and weapons developers, and with the stated U.S. strategy of
maintaining military technological superiority as our charge, our goals are to
advance the state of science and technology, to make the nation safe and
secure, to develop the tools necessary to ensure warfighter success, and to do all
of this under extreme pressure, with high priority, especially in time of war. There
is a moral impulse at work here. All of the military services have clearly stated,
and highly cherished, sets of core values to which each member is bound and in
which most members are emotionally invested. Such words as loyalty, duty,
respect, service, honor, courage, commitment, integrity, and excellence
provide service members and their civilian counterparts with a moral grounding
and way of thinking about the importance of their work for the country as well as
the individual soldier, sailor, airman, or marine. It is these concepts that impel
service members to continue to drive toward completion of new systems in the
face of numerous impediments — institutional, budgetary, bureaucratic, etc. —
and pressures.
The pressures on technologists and weapons developers include the interests of
many stakeholders which must be satisfied, the need for the warfighter to have
the best equipment to succeed in battle, the need for the U.S. to maintain
technological superiority, the need to advance the state of the art in technology,
and the pressure not to allow the program to fail, be over budget, or behind
schedule. In the case of the defense and intelligence industry, sometimes a
corporation’s existence depends on the success of a program. People’s jobs may
be at stake (an issue of importance to both industry and Congress). And, of
course, for many engaged in defense work, “Failure is not an option.”
A great example of the “moral impulse at work” is the way in which the entire
community came together when troops in Iraq and Afghanistan began suffering
heavy casualties as a result of roadside improvised explosive devices (IEDs). In
a demonstration of what is possible under extreme circumstances, funds were
made available, research was accelerated, production was fast-tracked, and
numerous systems were rushed to the field in an attempt to mitigate the IED
problem. It should be noted here that DARPA played a key role in anti-IED
research and in the delivery of rapid and effective health care to deployed troops.
But, when the system of weapons research, development, acquisition, and
deployment is operating normally, it is an institutional behemoth that demands
constant input and constant care. It is instructive to look at the history of the
military budget, and how it rises and falls precipitously with the beginning and
end of conflict. Several interesting points can be made. If one were to take an
average, smoothing out the peaks and valleys, the defense budget has
consistently been around $400B, adjusted for inflation. It is also interesting to
note that the period since September 11, 2001 appears to be one of the longest
sustained periods of growth.
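
The "smoothing" described here is simply a moving average. A minimal sketch,
with hypothetical inflation-adjusted figures in billions of dollars rather than
actual budget data:

```python
# Hypothetical inflation-adjusted defense budgets (in $B); the figures are
# invented purely to show the "smoothing out the peaks and valleys" step.
budgets = [310, 335, 420, 505, 560, 530, 455, 400, 380, 410]

def trailing_average(series, window):
    """Average over a trailing window, flattening short-lived peaks and valleys."""
    return [sum(series[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(series))]

print(trailing_average(budgets, 5))  # the averaged series varies far less
```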
What is problematic with high growth rates in weapons research is the intense
pressure to execute (spend) funds and achieve results in a relatively short period
of time. This kind of situation can produce unwanted behaviors and conditions
rife with ethical hazards. Where and when there is a lot of money, caution is
necessary.
In times of increasing budgets, opportunities are numerous and corporate growth
in the defense industry is encouraged. Conversely, in prolonged periods of
contraction, many companies do not survive and consolidation occurs. These
conditions also are times when ethical vigilance is needed, as companies (and
their employees) battle for their livelihoods.
There are two fundamental ways in which technology and the weapons it
engenders come into being, either through technology push or through
technology pull. In the second case, that of technology pull, there is a demand
signal from the warfighter to the developer that one assumes is based on real
threats and real operational need. Based on those requirements, technologists
set about working to develop and implement technology that a warfighter can
use. In the first case, that of technology push, researchers and laboratories find a
technology that might be useful in the future and “sell” it to the warfighter as a
way to improve current capability or to provide an as yet unavailable capability.
The weapons research, development, and acquisition process is a long and
complicated one that begins with research into possible military capabilities and
has long periods of technology development before a weapon actually emerges.
There are many points along the way at which anyone can raise concerns about
the ethics of a weapon or technology. Likewise, many different groups of people,
from university researchers, to government laboratory personnel, to defense
industry employees can make their inputs or concerns known. All participants in
the weapons development process share the responsibility to consider the ethical
implications of their work. This includes scientists, as noted earlier by Jacob
Bronowski.
As also mentioned earlier, it is hard to know the implications of a technology until
it is fielded. In the early stages of weapons development, likely issues are fuzzy
and unclear. However, it is important to consider them carefully as this is an ideal
time to make changes, when designs and possible employment scenarios are
fluid. As technology matures, possible new applications emerge, but it becomes
increasingly harder to change designs. Another potential source of ethical issues
is that the weapon may be used in unanticipated ways.
One very positive development is that, under Article 36 of Additional Protocol I of
1977, all signatories are required to submit new weapons for review. While
positive, such reviews
are only for major weapons and come very late in the development process. It is
unclear how useful these reviews are in stopping questionable weapons, but the
fact that they exist at all provides a measure of control. However, any such
review would only address legality in accordance with existing international
agreements. While satisfying the law is necessary, it is not sufficient. A new
weapon or technology may be legal, but it may not be ethically advisable. Law
and ethics are simply not the same. Many laws arose out of ethical prescriptions,
but there are some instances where legal acts can be unethical, and also times
when an ethical act is considered illegal. On the one hand, any legal act is
applicable to all people in a society that implements a particular set of laws,
while, on the other hand, many ethical acts are voluntary and personal, based
on an individual's beliefs about right and wrong.
There are those who would reply that in any weapon or technology development,
careful consideration is given and detailed risk analyses are conducted.
Unfortunately, such analysis is usually conducted on a purely monetary basis
and is considered merely as cost versus benefit. Risk and cost-benefit analysis
are not the same. Ethics is relevant to risk analysis with respect to which impacts
should be considered (e.g., does the environment have standing?), how each
impact should be measured, and how different outcomes should be weighted.
One should also consider the precautionary principle, as well as simple
prudence, or pragmatism. The precautionary principle asserts that, if there is a
chance of harmful, unintended consequences, then do not act.
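
A minimal sketch of how ethical choices enter a risk calculation: the impacts,
probabilities, severities, and weights below are hypothetical, but the point is
that which impacts appear on the list at all, and how heavily each is weighted,
are ethical judgments rather than technical ones. A cost-benefit analyst and a
precautionary analyst can run the same arithmetic and reach different conclusions:

```python
# Minimal sketch of a weighted risk calculation with hypothetical inputs.
impacts = [
    # (description, probability, severity on a 0-10 scale)
    ("program cost overrun",        0.30, 3),
    ("civilian harm from misuse",   0.05, 9),
    ("environmental contamination", 0.10, 7),  # does the environment have standing?
]

def expected_risk(items, weights):
    """Probability-weighted severity, scaled by the chosen ethical weights."""
    return sum(p * s * weights[desc] for desc, p, s in items)

# Two stakeholders weight the very same impacts differently.
cost_benefit_weights = {"program cost overrun": 1.0,
                        "civilian harm from misuse": 1.0,
                        "environmental contamination": 0.2}
precautionary_weights = {"program cost overrun": 0.5,
                         "civilian harm from misuse": 3.0,
                         "environmental contamination": 2.0}

for name, w in [("cost-benefit", cost_benefit_weights),
                ("precautionary", precautionary_weights)]:
    print(f"{name}: expected risk = {expected_risk(impacts, w):.2f}")
```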
We talked earlier about the numerous arguments presented to foreclose debates
about the ethics of weapons research and development. We should care about
ethics primarily because it's the right thing to do. We also need to consider public
perceptions of what we do, and the need to keep out of trouble with the law, the
media, and the public. And, frankly, the consideration of ethics enriches the
research agenda and ultimately makes our jobs easier. But, as noted by Prof.
George Lucas, “Many scientists and engineers (not to mention military
personnel) tend to view the wider public’s concern with ‘ethics’ as misplaced, and
regard the invocation of ‘ethics’ in these contexts as little more than a pretext to
impede the progress of science and technology.” One of the consequences of
the reluctance of the technology development community to engage on ethics
issues is a public perception of defense researchers as mad scientists
out to start global war or to control the minds and lives of the citizenry. This is a
real problem that can have bad consequences when the defense department is
seeking public support for its programs.
Case Study — Lewisite
Lewisite is an organoarsenic blister agent first synthesized in 1904 during Ph.D.
dissertation research by the future Notre Dame science professor, Father Julius
Nieuwland, at Catholic University. It was so toxic that Nieuwland himself had to
be hospitalized for several days after exposure. In response to German
deployment of chlorine gas at the Second Battle of Ypres in April 1915 and later
German experiments with other poison gases as weapons of war, the Allies
rushed to develop their own chemical warfare capabilities. After the identification
of Lewisite as a potential weapon late in 1917, a crash development program
was initiated first at Catholic University, in Washington, DC, under the direction of
Dr. Winford Lee Lewis, and later in the spring of 1918 at American University. A
facility for industrial scale production of Lewisite was quickly built in Willoughby,
Ohio, just outside of Cleveland, under the direction of the future Harvard president,
James Bryant Conant. WWI ended in November 1918 before the plant had
reached full production. The gas was nicknamed “The Dew of Death.”
Throughout the development and early production process, there was lax
oversight and supervision, partly because the command structure of what
became the Chemical Warfare Service was rapidly evolving and partly because
the nature and extent of the risks posed by Lewisite were still not well
understood, especially the risks deriving from accidental exposure in research
and manufacturing, and in handling.
The health effects of Lewisite exposure include severe blistering of the skin,
shock from dramatic blood pressure decrease, suffocation, and chronic
respiratory disease. But the full range of health effects is not well known due to a
lack of serious scientific study.
Stockpiles of Lewisite were mostly destroyed or discarded after WWI. But
production was resumed in WWII, with some 20,000 tons produced in the U.S. at
Rocky Mountain, Pine Bluff, and Huntsville Arsenals. Nearly 23,000 tons were
manufactured in the Soviet Union, and Lewisite was also manufactured in Japan
for use in China during WWII.
In its weaponized form, Lewisite is often mixed with mustard gas, partly to
mitigate the effects of hydrolysis and, thus, extend the operating parameters for
the combined gas.
Human trials were conducted in the U.S. during WWII with “volunteer” subjects
from the U.S. Army, but these tests remained classified until, many years later,
veterans of the tests sought information and compensation. A major study was
commissioned in the early 1990s.
There are unconfirmed reports of the use of Lewisite by Saddam Hussein against
Iran in the Iran-Iraq War in the 1980s, and against Kurdish insurgents. There are
recent reports of exposure of U.S. troops in Iraq in the wake of Operation Iraqi
Freedom.
A surprising and happy consequence of the weaponization of Lewisite was that
Sir Rudolph Peters developed an antidote, BAL (British anti-Lewisite), at Oxford in
1940, which later proved to be one of the most effective chelating agents for
removing heavy metal contamination.
There was serious contamination of the testing grounds at the American
University Experimental Station, with new contamination being discovered as late
as the 1990s, when the area was developed as the Spring Valley residential
neighborhood. Residents displayed an increased incidence of cancers and of
blood and thyroid diseases. There has also been continuing serious contamination
in formerly Japanese-occupied areas of China, with fatal exposure as recently as
2003. It is reported that there has been extensive contamination of unremediated
land and water sites in Russia.
After WWII, U.S. Lewisite stocks were once again disposed of, this time mainly in
deep-ocean sites. The Soviet Union is reported to have dumped most of its
stockpile in shallow-ocean sites.
The environmental consequences are unknown. There is still Lewisite waste at
U.S. arsenals, contained in earthen-capped basins. The Rocky Mountain Arsenal
alone has more than 1,000 tons of Lewisite waste.
The 1925 Geneva Protocol banned the use of asphyxiating, poisonous, or other
gases, and of bacteriological methods of warfare.
Questions
• What, if any, were Father Nieuwland’s responsibilities once it became clear
that his discovery was being weaponized?
• What should Conant have done?
• What would have been the opportunities for commanders and staff to raise
ethical concerns?
• Should there have been a role for congressional oversight or for some other
type of independent oversight authority?
• Compare the public mood in the U.S. in WWI, WWII, and post-9/11 from the
point of view of the felt urgency of weapons technology development. How does
this put pressure on our ability to reflect upon the ethical issues?
• What are the fundamental ethical questions about the use of incapacitating
chemical agents in warfare?
Just War Theory and International Humanitarian Law
Just war theory has ancient historical roots, important parts of which lie in
different religious and philosophical traditions, as expressed in works as diverse
as Deuteronomy, the Qur’an, and the Mahabharata. But in the form in which it
later came to shape the international law of armed conflict, just war theory is
mainly derived from the works of early and medieval Christian thinkers like
Augustine (354-430) and Aquinas (1225-1274), who were among the first to
formulate explicitly such principles as proper authority, just cause, and right
intention.
It was in the writings of the seventeenth-century theorist Hugo Grotius, in
particular his On the Law of War and Peace (1625), that the modern principles of
just war theory were first codified and that the fundamental distinction between
justice in the decision to go to war (jus ad bellum) and justice in the conduct of
war (jus in bello) was made clear.
Only in the nineteenth century did these principles first find expression in
such forms as explicit instructions for armies and explicit international
agreements. The first important handbook for troops in the field was the Lieber
Code of 1863, prepared by the attorney Francis Lieber at the request of President
Abraham Lincoln. Promulgated as General Orders No. 100, the Lieber Code was
distributed to all Union forces. It included detailed rules for such things as military
jurisdiction, the protection of property, the treatment of prisoners of war,
deserters, partisans, prowlers, and spies, and assassination and insurrection.
The first major international agreement was the St. Petersburg Declaration of
1868, which prohibited the use of exploding bullets on the grounds that they
inflicted suffering beyond what military necessity required, thus establishing the
principle that only the minimal force and suffering necessary for the
achievement of a legitimate military objective may be employed.
The modern framework for the international law of armed conflict and
international humanitarian law was established with the Hague and Geneva
Conventions. The Hague Conventions of 1899 and 1907 established general
principles for the resolution of international disputes, the conduct of war, and the
rights of neutral parties, along with specific provisions concerning poison gas,
expanding bullets, the aerial delivery of ordnance, submarine warfare, and mine
laying.
The first Geneva Convention of 1864 established rules for the treatment of sick
and wounded. It was superseded by the second, third, and fourth Geneva
Conventions of 1906, 1929, and 1949, which also extended protection to
prisoners of war and civilian non-combatants. Additional protocols were adopted
in 1977 and 2005 to extend protections for cases of internal conflict, such as
wars of liberation and civil war, and to require the wearing of distinctive emblems in
order to facilitate the distinguishing of combatants from non-combatants.
This body of law continues to grow, as with the adoption of five new protocols in
2001 to cover non-detectable fragments, mines, incendiary weapons, blinding
laser weapons, and explosive remnants.
Just War Theory, today, generally starts from the aforementioned distinction
between justice in the decision to go to war (jus ad bellum) and justice in the
conduct of war (jus in bello). Under each heading one finds specific principles.
Jus ad bellum
• Just cause: War may be initiated only for the purpose of correcting a grave
evil, and not for such casual reasons as simple conquest of territory.
• Comparative justice: The injustice suffered by one party to a potential
conflict must significantly outweigh the injustice suffered by the other party.
• Competent authority: The decision to initiate war must be made by a
properly constituted public authority. Thus, war may not be initiated by an
unauthorized subordinate, nor may it be initiated by an illegitimate public
authority, such as a dictator.
• Right intention: War may be initiated only for the purpose of righting a
wrong.
• Probability of success: War cannot be initiated if the effort is doomed to
failure or if means would be required that would be disproportionate to the
harm that is to be righted.
• Last resort: War may be initiated only if all other, peaceful means to
resolve the dispute have been exhausted.
• Proportionality: The benefits to be derived from going to war must be
proportionate with the suffering that will likely be caused.
Jus in bello
• Distinction: Force may be used only against legitimate targets, meaning
active combatants. Civilians or other non-combatants should not be targeted
under any circumstances, although non-combatant casualties are permitted if
they are unavoidable consequences of otherwise legitimate military action, but
even then not in excess. Likewise, combatants who have been rendered out of
combat through injury or surrender may not be targeted.
• Proportionality: Only appropriate levels of force may be applied, with harm
to civilians and civilian property not exceeding what is required by the pursuit
of a legitimate military objective.
• Military necessity: Force may be applied only as is necessary for the
achievement of a legitimate military objective. Thus, reprisal for its own sake
is not permitted, nor is the casual destruction of lives or property.
• Fair treatment of prisoners of war: All combatants who have surrendered or
been captured must be treated fairly in accordance with relevant humanitarian
law. It is this principle that bans torture, for example.
• No means malum in se: No methods or weapons may be used that are evil in
themselves, such as rape, forcing captured enemy combatants to fight against
their own side, or recruiting child soldiers.
Ethical Theories
In discussions of technology ethics, four frameworks for ethical judgment are
commonly employed, reflecting mainly the dominant Western traditions in
ethics.
Deontology is the view famously associated with the eighteenth-century German
philosopher Immanuel Kant (1724-1804). Sometimes referred to as the “rule-based”
approach to ethics, it asserts that all ethical judgments should be made
always and only in accordance with a rule that can be generalized to apply
equally to all rational, moral agents. Proper intention is another necessary
feature of ethical judgment on this view. It emphasizes the purely rational aspect
of ethical judgment. The most common objections to deontology are: (1) Not
every ethical situation can be covered by a rule without the risk of so
specializing the rules with restricting conditions as to render them trivial. (2) Deontology
eliminates any role for emotion and affect in ethical judgment, whereas many
people think that emotion plays an essential role in human morality.
Consequentialism, a more specialized version of which is utilitarianism, is widely
associated with the nineteenth-century English philosopher John Stuart Mill
(1806-1873). The distinguishing feature of consequentialism, as the name
implies, is that ethical actions are to be considered only from the point of view of
their consequences. The intention of the moral agent is not crucial, nor is the
presence of any affect or emotional state. More specifically, consequentialism
asks us to undertake a kind of ethical, cost-benefit analysis, leading to the choice
of the action that minimizes suffering and maximizes pleasure or happiness.
Important objections to consequentialism include: (1) It is not obvious whose
suffering or happiness is relevant. Only those close to us or everyone who might
be affected? Future generations? Is the suffering and happiness of everyone
included in the calculation to be weighed equally? (2) In all but the simplest
cases, the calculation is too difficult to perform, especially when urgent action is
required. And the calculation may be impossible if we lack critical information
about consequences in the distant future. (3) We lack objective measures of
suffering and happiness.
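To see why these objections have bite, consider a toy version of the calculation
consequentialism asks for. The Python sketch below is purely illustrative (all
parties, welfare numbers, and moral weights are hypothetical, not drawn from the
source report); it shows the recommended action flipping as the scope of whose
welfare counts changes.

```python
# Toy consequentialist cost-benefit calculation. The welfare numbers and
# moral weights are invented for illustration; the point is only that the
# "best" action depends on whose suffering and happiness is counted.

actions = {
    "deploy": {"soldiers": +5, "civilians": -8, "future_generations": -2},
    "hold":   {"soldiers": -3, "civilians": +1, "future_generations":  0},
}

def aggregate_utility(effects, weights):
    # Sum each party's welfare change, scaled by its moral weight.
    return sum(weights.get(party, 0.0) * value for party, value in effects.items())

equal_weights = {"soldiers": 1.0, "civilians": 1.0, "future_generations": 1.0}
near_only = {"soldiers": 1.0}  # counting only "those close to us"

for weights in (equal_weights, near_only):
    best = max(actions, key=lambda name: aggregate_utility(actions[name], weights))
    print(weights, "->", best)
# equal weights -> "hold"   (deploy scores -5, hold scores -2)
# soldiers only -> "deploy" (deploy scores +5, hold scores -3)
```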
Virtue ethics is a tradition going all the way back to Aristotle in the fourth century
B.C.E. and was the dominant tradition in ethics through the Middle Ages and
beyond. In the latter part of the twentieth century it was revived through the work
of philosophers like Alasdair MacIntyre, and today it stands as a major competitor
of consequentialism and deontology. In virtue ethics, questions about ethical
action are recast as being less about the rightness of an individual act and more
about the moral character of the ethical actor. Virtues are regarded as settled
habits of right action aiming toward the good, and examples include virtues such
as courage and loyalty, which are included among the core values of the U.S.
military services. The inculcation of virtuous habits is a chief goal of parenting and
education, on this view. When explicit moral deliberation is required, the virtue
ethics tradition understands that as being not so much about acting in
accordance with a rule as about modeling one’s actions on those of morally
exemplary individuals. This has important implications for the role of leaders in
most institutions, who should, on this view, see themselves as models. Among
the main objections to virtue ethics are these: (1) What counts as a virtue will be
context-dependent and might vary from one cultural setting to another. (2)
Explicit ethical deliberation need not always precede an action in order for the
action to count as ethical.
Religious beliefs also shape ethical judgment for many individuals. Most religious
traditions include specific moral principles or rules as fundamental, such as the
Decalogue in the Christian tradition or the precepts expressed in the Islamic
tradition in works such as the Sura Al Isra. Explicit prohibitions, such as “Thou
shalt not kill,” or injunctions, such as “Love thy neighbor as thyself,” can function
as a powerful force for good and offer helpful guidance to many believers. But
there are objections to grounding ethics in religion: (1) Religious beliefs are
dogmatic, allowing no criticism. (2) Many such beliefs are local to specific
religious traditions, a circumstance that might impede agreement or common
action.
It should be noted that, in practice, most ethical agents draw from more than one
of these ethical traditions, depending upon the situation. Thus, one might almost
always honor the rule that requires our telling the truth, and truthfulness is widely
recognized as an important virtue. Nonetheless, there can be situations in which
the consequences of truth-telling might cause more harm than good, as in
protecting a loved one from some especially hurtful fact or in the name of
national security.
The take-home point is that one should use some or all of these frameworks for
the critical assessment of new technologies. Self-awareness about the values
and assumptions that one brings to such assessment is just as important as
understanding the values and assumptions of other actors, which might not be
one’s own, and the values and assumptions implicit in a technology itself, or in
the structure and functioning of the institutions within which the technology is
developed or deployed. One should use these ethical frameworks for analyzing
the ethical implications of new technologies from every relevant perspective.
Case Study — Synthetic Biology
In 1974, the Stanford biochemist Paul Berg developed a technique for inserting bits
of SV40 monkey virus DNA into a bacteriophage, which would then be inserted
into an E. coli host. But worries about the carcinogenic potential of SV40 and the
viability of the lab strain of E. coli in human intestines led Berg to defer the final
insertion into the E. coli host and many other labs imposed voluntary moratoria
on such research. In February 1975, a conference of leading recombinant DNA
researchers was held at Asilomar to weigh the risks.
The Asilomar conference participants promulgated guidelines for future
recombinant DNA research that included principles such as:
• Containment is necessary, and the level of containment should match the
risk.
• Biological barriers should be employed, such as host organisms that are
not viable outside the lab and vectors viable only in the designated hosts.
• Education of personnel and strict adherence to best-practice standards.
The Asilomar example guides practice today well beyond the original area of
research.
A similar debate about risk and the possible need for a moratorium developed a
few years ago, when so-called “gain-of-function” (GoF) experiments raised new
concerns about the balance between risk and new knowledge in microbiological
research, as with debates about engineering mammalian transmissibility in the
H5N1 virus. The proponents of such research point to benefits for both public
health and national security from the understanding of either natural mutation or
deliberate manipulation producing new functional capabilities in viral hosts.
Critics point to the risk of infection, including even epidemics in populations
without immunity, from possible failures in containment. As of the fall of 2014, the
U.S. government had paused all funding for GoF experiments with influenza,
SARS, and MERS viruses pending a thorough review of procedures.
Questions
• Under what circumstances are moratoria on research warranted?
• How do we balance the value of new knowledge against the risks posed by
the research?
• Are alternative research paths available?
• Should decisions to pause research be made by the individual researchers,
the research community, or by the government?
• If the latter, should such decisions be left to relevant individual agencies or
should there be congressional, judicial, or executive involvement?
A Framework for Analysis
With an understanding of the difficulties of controlling technology, the changing
nature of conflict, the demands and pressures of the weapons research and
development process, and knowledge of Just War Theory, the Laws of Armed
Conflict, and fundamental ethical theories, a framework for how we might think
about new weapons technology emerges. Such a framework must include an
analysis of the effects of technologies on stakeholders, from individual users to
organizations, some generally applicable themes, such as the impacts of scale of
use or the effects of technology imperfections, and various sources of insight
from disciplinary ethics to the laws of armed conflict. One of the key stakeholders
in any research project is the subject of the research. During the Cold War,
soldiers were placed out in the open in the vicinity of above-ground nuclear
explosions. Many of these soldiers later suffered health effects from this testing.
Other stakeholders include the actual users of a technology. Consideration must
be given to the mental or psychological effects on them in using a particular
technology. Drone operators fighting a war and killing targets from thousands of
miles away are reported to experience severe psychological disorders as a result
of their continued exposure to graphic photographs of target scenes, combined
perhaps with the discontinuity between their living conditions and the seriousness
of their duties.
Other stakeholders who may need to be considered are our allies. Although
they are our allies, we have had considerable differences with some of them in
the past over such diverse issues as genetically modified foods, patent laws and
intellectual property rights, privacy laws, and air traffic control requirements. In
the use of certain technologies, it might be advisable to consider their views as
well. Other considerations might include different cultural traditions.
One of the crosscutting themes that should be considered in an emerging
weapon technology might be the scale of the application. How could a change in
the scale of deployment or use of a technology or application change an ethical
calculation? What about humanity? Does the research or application compromise
something essentially human? And, what about technological imperfections?
What, if any, are the tradeoffs between an application’s functionality or use and
the safety requirements imposed on it? On the subject of unanticipated military
use, what military uses are possible for the application or technology in question
that go beyond the stated concepts of operation? What are the ELSI implications
of such uses? If there are crossovers to civilian use, how fast should such
military-to-civilian transfers of applications be made? What safeguards should be
put into place before they are made? How should such safeguards vary with the
technology involved? In regard to changing ethical standards, how does a new
application create new ethical obligations to use it in preference to older
applications addressing similar problems that may raise ELSI concerns to a
greater extent? How can research in a classified environment be reviewed for
ELSI purposes? How and to what extent does U.S. military effort in a selected
R&D problem domain signal to adversaries that this domain may be a promising
one for military applications?
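Purely as an illustration, the elements of this framework can be captured as a
checklist that a review team works through for each proposed technology. The
Python sketch below uses hypothetical field names and item lists adapted from
the discussion above; it is not a process prescribed by the NRC report.

```python
# Minimal sketch of the framework as a reviewable checklist: stakeholders,
# crosscutting themes, and sources of insight become items to be addressed.
# All names here are illustrative placeholders.

from dataclasses import dataclass, field
from typing import Dict, List

STAKEHOLDERS = ["research subjects", "users/operators", "allies",
                "adversaries", "civilian populations", "organizations"]
THEMES = ["scale of use", "humanity", "technological imperfections",
          "unanticipated military use", "crossover to civilian use",
          "changing ethical standards", "classified-research review",
          "signaling to adversaries"]
SOURCES_OF_INSIGHT = ["disciplinary ethics", "international law",
                      "social and behavioral sciences",
                      "scientific and technological framing",
                      "precautionary principle / cost-benefit analysis",
                      "risk science and communication"]

@dataclass
class ElsiAssessment:
    technology: str
    notes: Dict[str, str] = field(default_factory=dict)  # item -> analysis note

    def unanswered(self) -> List[str]:
        """List framework items not yet addressed for this technology."""
        todo = STAKEHOLDERS + THEMES + SOURCES_OF_INSIGHT
        return [item for item in todo if item not in self.notes]

review = ElsiAssessment("autonomous sentry system")
review.notes["users/operators"] = "possible psychological effects on operators"
print(review.unanswered()[:3])  # remaining items to work through
```

The value of such an encoding is simply that unaddressed stakeholders, themes,
and sources of insight become visible at a glance.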
Serious questions arise about humanity — the implications of a technology on
what it means to be human — in research on autonomous systems and on
soldier enhancements. In one case, the goal of the research is to make machines
more human-like; in the other, the goal is to give humans more machine-like
characteristics. The question is, at what point along the spectrum does a human
cease to be “essentially” human, and at what point, if ever, does a machine take
on human essence? And how far should this research go?
Another aspect of humanity is the effect that the use of a particular weapon or
technology might have on the soldier’s sense of humanity. In discussing the
values of war fighters, Prof. Shannon Vallor notes that a soldier’s “warrior’s code”
— a code of values — is about what is right and wrong in combat and is the
shield that guards their humanity. She notes that “that sense of humanity is
endangered by excessive distancing in war, dehumanization of the enemy, and
the erosion of traditional warrior values.”
Concerns about technological imperfections might include such issues as the
ability to employ weapons designed to be non-lethal in lethal ways. It is difficult if
not impossible to predict, or control, how a weapon will ultimately be employed.
The very real possibility of flawed software, especially in large, software-intensive
systems, is a serious concern. Very large, software-intensive systems with
multiple components often exhibit emergent or adaptive behavior, and prediction
of such behavior is difficult if not impossible. Weapons, and technologies in
general, are notorious for being used in ways that were never
intended. The steam engine and the radio were not designed with military uses in
mind. The radio was designed as an aid to shipping, while the steam engine was
originally intended as a pump to aid in mining. Of a more contemporary nature,
the internet was designed as an aid to researchers and now combat operations
and weapons performance often depend on it. Non-lethal weapons are reportedly
used by authoritarian regimes to suppress dissidents.
The difficulty of considering ethical issues in classified environments must be
confronted, and every effort made to accommodate such ethical analysis,
regardless of classification. Basic research is largely unclassified. Most often it is
only when one begins to apply such research that the work can become
sensitive. And of course, specifics of performance and vulnerabilities and
countermeasures are highly sensitive. But classification of the work is an
inadequate argument against conducting ethical analysis and should not be used
as a rationale for declining to consider ethics. Classified research is not subject to open peer
review, thus those doing the research cannot benefit from input and criticism
from the broad scientific community. Limiting such input increases the likelihood
that erroneous or incomplete results obtained in classified research will not be
identified as quickly.
There are many sources of insight that researchers and decision makers can call
upon to inform deliberations about the ethics of weapons technologies. They
include philosophical and disciplinary ethics, international law, social and
behavioral sciences, scientific and technological framing, the precautionary
principle and cost-benefit analysis, and risk science and communication.
Individual disciplines each have codes of ethics by which they operate. Some
examples include civil engineers, genetics researchers, nanotechnologists and
neuroscientists. The social and behavioral sciences have important inputs to
offer in the analysis of emerging weapons technologies. Sociology and
anthropology can shed important light on cultural differences and how the use of
a technology will be received in particular countries. Psychology includes
decision sciences that will be increasingly critical in autonomous and cyber
warfare. It can also analyze the perception of risk, provide predictions of group
behavior, and aid in human systems integration. Technological framing can be
important in determining the utility of a particular technology and identifying
possible issues of employment. If systems demonstrate non-linear, or stochastic,
or emergent behavior, they must be identified as such, and the guidelines for
their use may differ from those of other, more standard systems. The adequacy
of our ability to model and simulate system behavior will be an important factor.
The essence of the precautionary principle is “first do no harm.” More specifically,
the principle places on the proposer of a technology or procedure the burden of
proving that it will have no negative effects. This is in direct opposition to an
approach under which opponents of a technology must prove that it is harmful in
order to prevent its adoption.
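The difference in where the burden of proof falls can be made concrete with a
small sketch (Python; the thresholds, field names, and numbers below are
hypothetical illustrations, not values from the report):

```python
# Contrasting the two burden-of-proof framings discussed above. With sparse
# evidence, the precautionary and permissive rules disagree even when a naive
# cost-benefit estimate favors proceeding.

from dataclasses import dataclass

@dataclass
class Evidence:
    demonstrated_safety: float  # strength of the proponent's safety case, 0..1
    demonstrated_harm: float    # strength of opponents' harm case, 0..1
    expected_benefit: float     # estimated benefit (arbitrary units)
    expected_cost: float        # estimated cost/harm (same units)

def precautionary_approval(e: Evidence, safety_bar: float = 0.9) -> bool:
    # Burden on the proposer: proceed only if safety is affirmatively shown.
    return e.demonstrated_safety >= safety_bar

def permissive_approval(e: Evidence, harm_bar: float = 0.9) -> bool:
    # Burden on opponents: proceed unless harm is affirmatively shown.
    return e.demonstrated_harm < harm_bar

def cost_benefit_approval(e: Evidence) -> bool:
    # The cost-benefit framing: proceed when estimated benefits exceed costs.
    return e.expected_benefit > e.expected_cost

sparse = Evidence(demonstrated_safety=0.2, demonstrated_harm=0.1,
                  expected_benefit=10.0, expected_cost=3.0)
assert not precautionary_approval(sparse)  # withheld: safety not yet shown
assert permissive_approval(sparse)         # allowed: harm not yet shown
assert cost_benefit_approval(sparse)       # allowed: benefit estimates dominate
```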
Ethical questions arising in this discussion include how, and to what extent, can
tensions between cost-benefit analysis and the precautionary principle be
reconciled in any given research effort? If a cost-benefit approach is adopted,
how will intangible costs and benefits of a research effort be taken into account?
If a precautionary approach is adopted, what level of risk must be posed by a
research effort before precautionary interventions are required? Regardless of what
approach is taken toward the analysis of benefit and risk, it is crucial in scientific
research, especially that supported by the government, that consideration be
given (where possible, given classification issues) to keeping the public informed.
While considered by some to be an annoyance, communicating with the public
can allay fears or generate enthusiasm for technology investment. Conversely,
loss of public support can doom a project, as in the case of Total Information
Awareness. Some key precepts of risk communication are:
• identifying the context and scientific background information most critical to
members of the audience for making the decisions that they face
• identifying audience members’ current beliefs, including the terms that they
use and their organizing mental models
• designing messages that close the critical gaps between what people know
and what they need to know
• evaluating those messages until audiences reach acceptable levels of
understanding
• developing in advance multiple channels of communication to the relevant
audiences
• disclosing problematic ethical, legal, and societal issues earlier rather than
later
• ensuring that messages reach intended audiences in prompt and timely
fashion
• persisting in such public engagements even over long periods of time
Conducting an initial analysis, perhaps using the framework offered above, is
important, but it is not enough. There are likely unanticipated impacts from a
given technology and, while it sounds oxymoronic, efforts must be made to
anticipate them. We know that technology forecasting is inexact and of limited
utility, but attempts to do so still have benefit. The activity itself allows the
researcher or organization to prepare for potential future issues. A quick
response to unanticipated issues is enhanced by addressing, in advance, a wide
variety of identified issues that provide building blocks upon which responses to
unanticipated problems can be crafted. Such an activity also allows researchers
to obtain a broad range of perspectives and allows possible stakeholders to get
more information, thereby enhancing legitimacy. Research organizations should therefore
broaden their work in predictive analysis and make extra efforts to manage the
important work of ethical analysis and response to ethical issues. To do so, the
use of deliberative processes, anticipatory governance, and adaptive planning
is recommended.
The deliberative planning process is a complement to the analytical, expert-driven
mode of inquiry. It includes broad participation of all stakeholders
(consistent with achievement of mission objectives, of course). Such an
approach makes available to the researcher relevant wisdom not limited to
scientific specialists and public officials. It also has the added benefit that broad
stakeholder participation may decrease conflict and increase acceptance and
trust.
Anticipatory governance is different from standard technology forecasting in that
it treats ethical and values issues as integral and not as a post-R&D
consideration. It presumes there are ethical and value issues that are resolved —
whether explicitly, implicitly, or by default — in the doing of the R&D. It includes
selecting a research direction and research procedures, deciding what counts as
a significant finding, and examining or ignoring what benefits or harms might
accrue and to whom. With anticipatory governance, one must identify value
dimensions in all phases of research to determine consequences, benefits or
harms, and social equities or inequities. Importantly, it does not require prediction
of R&D consequences to proceed ethically. Instead, R&D managers have a
responsibility to be aware of, and actively engage with, the ELSI dimensions of
their work.
It is highly unlikely that all relevant ethical, legal, and societal issues will be
identified before development begins. Indeed, some initially unforeseen ethical
concerns may arise over the course of development. An adaptive planning
process would thus involve plans to address issues known at the initiation of an
R&D effort, and would provide contingency plans tied to specific issues to be put
into action if and when those issues emerge as the R&D effort unfolds. It would
provide criteria for recognizing the emergence of these issues, an organizational
structure for receiving reports of such emergence, and a schedule for formally
determining whether new circumstances warrant midcourse corrections to the
original plan, perhaps tied to project milestones.
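A minimal sketch of such an adaptive plan follows, with hypothetical issue names
and triggers. A real plan would live in program documentation rather than code,
but the moving parts (known issues, trigger criteria, prepared responses, and
milestone reviews) are the same:

```python
# Illustrative structure for an adaptive ELSI plan: issues known at initiation,
# contingencies with recognition criteria and prepared responses, and a review
# schedule tied to project milestones. All names are invented for the sketch.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Contingency:
    issue: str                       # the ELSI issue this plan covers
    trigger: Callable[[dict], bool]  # criterion for recognizing its emergence
    response_plan: str               # prepared action if the trigger fires

@dataclass
class AdaptivePlan:
    initial_issues: List[str]        # issues identified at R&D initiation
    contingencies: List[Contingency] = field(default_factory=list)
    review_milestones: List[str] = field(default_factory=list)

    def review(self, project_state: dict) -> List[str]:
        """At each milestone, report which contingency responses are triggered."""
        return [c.response_plan for c in self.contingencies
                if c.trigger(project_state)]

# Hypothetical usage for an autonomous-systems effort:
plan = AdaptivePlan(
    initial_issues=["operator psychological burden", "civilian-harm risk"],
    contingencies=[
        Contingency(
            issue="unanticipated lethal use of a non-lethal design",
            trigger=lambda s: s.get("field_reports_of_lethal_use", 0) > 0,
            response_plan="halt deployment; convene ELSI review board",
        ),
    ],
    review_milestones=["preliminary design review", "critical design review"],
)
print(plan.review({"field_reports_of_lethal_use": 2}))
```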
With the foregoing discussions of technology, process, and background theories,
and the suggested framework for analysis of ethics-related research issues, how
should researchers and organizations proceed? Many mechanisms already exist
for the consideration and management of ethics-related issues in research. They
include self-regulation and self-awareness; codes of ethics and social
responsibility in science, engineering, and medicine; ELSI research; oversight
bodies (such as institutional review boards); advisory boards; research ethics
consultation services; and DOD Article 36 and treaty compliance reviews. Many
of these (except the Article 36 Reviews) have been developed for use in primarily
civilian research environments, but could also prove useful for the military. While
it may sound trite at first, it is important to note that good judgment is critical and
can often militate against taking actions that would create ethics issues.
Managers and decision makers in the military research and development
business should be aware of some of the key characteristics of ethics
mechanisms in a military context. Scientists and engineers engaged in R&D must
be ELSI-aware. Someone must have accountability and responsibility for ELSI
matters.
Both program officials and project personnel must have or obtain ELSI related
expertise. Likewise, access to relevant scientific and technical information for
ELSI researchers is critical. Personnel must be given time to consider ethical
issues and must be urged to consider a variety of perspectives. Most existing
mechanisms focus on specific ELSI issues, with no mechanism directed towards
comprehensive identification, so researchers must be aware and avoid the
tendency to think too narrowly. There must be active and sustained cooperation
between project and program managers before ELSI issues manifest themselves
or become troublesome.
We have previously mentioned the issue of classified research and reiterate that
classification cannot be an adequate reason to preclude ELSI considerations. In
many cases, there are significant elements of the research that can be discussed
at unclassified, or lesser-classified levels. Often, specific details of performance
are unimportant to ethics discussions. When necessary, cleared individuals who
have no relation to the project in question can be used for review. Many times,
the argument is used that time urgency precludes the consideration of ELSI
issues. However, not all relevant ELSI concerns are “gating” issues; some can be
handled in parallel. Also, time urgency is usually not relevant to foundational or
enabling research. Not all issues have to be addressed up front; policy makers
should be prepared for the possibility that actual operational use of a given
application will raise unanticipated ELSI issues.
To deal properly with potential ELSI issues in government research, the NRC
report offered the following recommendations for decision makers:
• Senior leadership should be engaged with ethical, legal, and societal issues
in an ongoing manner and declare publicly that they are concerned with such
issues.
• Such a public declaration should include a designation of functional
accountability for ethical, legal, and societal issues within their agencies.
• Each agency should develop and deploy five specific processes to enable
consideration of ethical, legal, and societal issues associated with its research
portfolio (see the sketch following this list):
1. Initial screening of proposed projects
2. Reviewing proposals that raise ELSI concerns
3. Monitoring R&D projects for the emergence of ethical, legal, and
societal issues and providing for periodic mid-course corrections when
necessary
4. Engaging with various segments of the public as needed
5. Periodically reviewing ELSI-related processes in an agency
• Agencies should undertake an effort to educate and sensitize program
managers to ethical, legal, and societal issues.
• Agencies should build external expertise in ethical, legal, and societal issues
to help address such issues.
• Research-performing institutions should provide assistance for researchers
attending to ethical, legal, and societal issues in their work on emerging and
readily available technologies of military significance.
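The five recommended processes can be pictured as stages a proposed project
passes through. In the Python sketch below, the stage names follow the report’s
list, while every function body, flag, and data field is a hypothetical placeholder
rather than actual agency practice:

```python
# Illustrative pipeline for the five agency-level ELSI processes recommended
# above. The logic inside each stage is invented for the sketch.

from enum import Enum, auto

class Decision(Enum):
    PROCEED = auto()
    PROCEED_WITH_MONITORING = auto()
    REVISE = auto()

def initial_screening(proposal: dict) -> bool:
    """Process 1: flag proposed projects that plausibly raise ELSI concerns."""
    return bool(proposal.get("elsi_flags"))

def elsi_review(proposal: dict) -> Decision:
    """Process 2: deeper review of proposals that raise ELSI concerns."""
    flags = proposal.get("elsi_flags", [])
    return Decision.REVISE if "unmitigated" in flags else Decision.PROCEED_WITH_MONITORING

def monitor(project: dict) -> list:
    """Process 3: watch for emergent issues that warrant mid-course correction."""
    return project.get("emergent_issues", [])

def engage_public(issues: list) -> None:
    """Process 4: engage relevant segments of the public as needed."""
    for issue in issues:
        print(f"plan public engagement on: {issue}")

def review_elsi_processes(action_log: list) -> None:
    """Process 5: periodically review the agency's own ELSI processes."""
    print(f"reviewed {len(action_log)} ELSI actions this cycle")

# Hypothetical walk-through of a single proposal:
proposal = {"elsi_flags": ["privacy concerns"]}
if initial_screening(proposal):
    decision = elsi_review(proposal)  # Decision.PROCEED_WITH_MONITORING here
    issues = monitor({"emergent_issues": ["unexpected dual-use finding"]})
    engage_public(issues)
review_elsi_processes(["screened", "reviewed", "monitored"])
```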
The NRC Committee found that:
• Emerging and readily available technologies in a military context are likely
to raise complex ethical, legal, and societal issues, some of which differ from
those associated with similar technologies in a civilian context.
• Sustainable policy regarding science and technology requires decision
makers to attend to the ELSI aspects of the S&T involved.
• Public reaction to a given science or technology effort or application is
sometimes an important influence on the degree of support it receives.
• Ethical, legal, and societal issues that may be associated with a given
technology development are very hard to anticipate accurately at the start.
• Any approach to promote consideration of ethical, legal, and societal issues
in R&D of military significance will have to address implementation at the
program and the project levels.
They also observed that:
• Approaches to addressing ELSI concerns can be systematic while posing
minimal additional burdens on agencies and on researchers.
• Addressing such issues may well be necessary for sustaining support for
projects.
• Consideration of ethical issues may also improve the quality of the research
by pointing to overlooked problems in the research or to opportunities for
improvement in the science or technology to be pursued.
Case Study — Agent Orange
In 1961, the South Vietnamese government requested U.S. assistance with
defoliants for use in perimeter defense and exposure of Viet Cong camps and
trails. There were brief, early debates about the ethics and policy of
environmental and gas weapons. But doubts were swept aside by operational
imperatives, and Operation “Ranch Hand” ran from 1961 to 1971, releasing some
20 million gallons of chemical herbicides.
The most commonly used defoliant was known as “Agent Orange” (so named
because it shipped in orange barrels), which was a mixture of
2,4-dichlorophenoxyacetic acid (2,4-D) and 2,4,5-trichlorophenoxyacetic acid (2,4,5-T).
Agent Orange was manufactured mainly by Monsanto and Dow, plus a few
smaller firms. The manufacturing process was not easy to control, mainly
because of the temperatures and pressures involved, with the result that there
was frequent contamination with highly toxic 2,3,7,8-tetrachlorodibenzodioxin
(TCDD), which is the chief suspected cause of health effects.
There was extensive research to find the optimal mixture, balancing effect with
applicability under a wide range of conditions. But there were lax rules and
oversight for handling and application. There were no serious studies at the time
of near-term or long-term health effects from either high-dose or prolonged,
low-dose direct exposure. Early estimates of health effects were based mainly on
industry data about health consequences of worker exposure. Nor were there
any serious studies at the time of residual effects on crops grown on affected
lands or of long-term environmental impacts. Moreover, to this day there has
been no scientifically rigorous study after the fact of health effects on civilians or
military personnel.
The consequences of the widespread use of Agent Orange in Vietnam include a
long list of illnesses now associated with exposure, including cancers, birth
defects, liver disease, and chloracne. Hundreds of thousands have been killed or
made ill by direct exposure, and possibly five hundred thousand children have
been born with birth defects, continuing today. There are possibly two million
total cases of illness. Moreover, there has been widespread environmental
damage, including massive habitat destruction and species reduction or
extinction, along with long-term soil and water contamination.
There have been many suspected deaths and thousands of illnesses among
U.S. personnel. As of 2012, the VA had paid $3.6B in compensation to nearly
230,000 claimants.
Questions
• What was the operational imperative?
• Was there a temporal urgency?
• What type of (risk) analysis or cost/benefit analysis was conducted, if any?
• What would have been the appropriate opportunities for raising questions
about the technology and its use?
• Where does responsibility lie for monitoring?
Conclusion
We conclude that the purpose and conduct of military research and development
are appropriately and correctly focused on the mission of the military.
It has been demonstrated and remains true that research in almost all areas is
important to our national security, but we have also shown that weapons
technologies have had, and will continue to have, ethical issues associated with
them. And, while the military has always paid strict attention to the legality of its
research and weapons developments, it must begin to pay more attention to the
ethical and societal impacts of its work.
Military technologies, new and old, have potentially significant impacts on others
and as a result, deliberative analysis is critical.
Many sources of ethical insight exist and are available to military researchers and
a framework by which to analyze the impacts of military technologies is now
available.
Processes to consider such issues do not have to be onerous. As a result of the
DARPA-sponsored NRC Committee’s work, military decision makers and military
technologists now have a reasonable process to implement in their
organizations.