
Killing with computers:
The ethics of remote-controlled and autonomous robotic weapons

Dr Alex Leveringhaus, Oxford Institute for Ethics, Law and Armed Conflict (ELAC) & Oxford Martin School, University of Oxford
with Dr Tjerk de Greef, Delft University of Technology, NL, Research Associate/ELAC
ROBOTS
Why do we want robots?

• Robots are often used for tasks that are:
1. DULL
2. DIRTY
3. DANGEROUS
4. DODGY
The Dragonrunner Robot

• Developed by Carnegie Mellon University and Automatika, Inc.
• For use in urban environments.
• Can fit into a backpack.
• Used by the British Army in Afghanistan for the detection of Improvised Explosive Devices and bomb disposal.
The Alpha Dog
• BigDog/AlphaDog robot is developed by Boston Dynamics, NASA, Foster-Miller, and Harvard.
• Robotic 'pack mule': Legged Squad Support System (LS3).
• Accompanies soldiers in difficult terrain.

Sentry Robot

• Sentry robot manufactured by Samsung.
• Deployed in the DMZ between North and South Korea.
• Stationary robot.
• Two machine guns and one gun that fires rubber bullets.
• Can track and identify targets that are up to 2.5 kilometres away.
• Has a microphone and speaker system.

Taranis (in study)
• Unmanned aerial vehicle manufactured by BAE Systems.
• Stealth plane.
• Can track and destroy radar stations without assistance from an operator.
Distinctions (1)

• Non-lethal military robots
– Alpha Dog.
• Non-lethal robots with lethal side-effects
– Electronic countermeasure systems.
• Robotic targeting systems (RTS)
– Designed in order to apply force to a target.
– Application of force is intentional.
=> Sentry Robot, Taranis
Distinctions (2)

1. Remote-controlled/semi-autonomous RTS (RTS_rc).
– Unmanned aerial vehicles/drones.
2. Autonomous RTS (RTS_auton).
– Sentry Robot.
– Taranis.
Remote-controlled robotic targeting systems

• RTS_rc are also known as tele-operated robots:
1. Sensors.
2. Transmit information/images via a video-satellite link.
3. Information relayed on a video screen to the operator (or group of operators).
4. Targeting decision made via chain of command.
5. Enactment of decision by operator via remote control.
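The five-step tele-operation chain above can be sketched in code. This is a purely illustrative toy, not any real system: the names `SensorFrame`, `relay_to_operator`, `chain_of_command_decision`, and `enact` are all invented for this example. The point it illustrates is structural: in an RTS_rc, the targeting decision (step 4) sits with humans, outside the machine.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """Steps 1-2: imagery captured on board and transmitted via satellite link."""
    image: str

def relay_to_operator(frame: SensorFrame) -> str:
    """Step 3: the frame is shown on the operator's video screen."""
    return f"displaying {frame.image}"

def chain_of_command_decision(observation: str) -> bool:
    """Step 4: the targeting decision is made up the human chain of
    command, never by the robot. Stubbed here to always decline."""
    return False

def enact(decision: bool) -> str:
    """Step 5: the operator enacts the decision via remote control."""
    return "engage" if decision else "hold"

frame = SensorFrame(image="frame_001")
observation = relay_to_operator(frame)
decision = chain_of_command_decision(observation)
print(enact(decision))  # the robot itself never decides; here: "hold"
```

Note the design: the machine only moves data (steps 1-3) and executes (step 5); the decision function is a stand-in for a human process.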
The Predator Drone (MQ-1)

• Unmanned aerial vehicle.
• Manufactured by General Atomics.
• Deployed by US military and intelligence services (CIA).
• Used in the War on Terror for 'targeted killings'.
• Can also be used for reconnaissance missions.
• Remotely operated.

Predator Drone: Communications

Predator drone command & control centre
Moral issues (1)

• Do RTS_rc raise morally distinctive issues compared to other systems?
• Are RTS_rc morally preferable to other systems? If so, why? If not, why not?
Targeted Killings: A cautionary note
007
MQ‐1
Autonomous robots???
Roomba
Immanuel Kant ???
Moral and Operational Autonomy

• Moral autonomy:
– Act for reasons that we give ourselves (Kant, Rousseau, Rawls).
• Operational autonomy:
– Carry out tasks independently from an operator.
– Technological Capacity: Machine (M) can take care of itself. M does not require an operator to carry out a specific task.
– Self-Direction: M is allowed to act within a certain domain. Restrictions are lifted.

No science fiction!
• RTS_auton are likely to be developed:
– Taranis.
• Some RTS are underutilised & precedents for autonomous technologies exist:
– An RTS could make a targeting decision, but is not allowed to do so (doesn't meet Self-Direction criterion).
– Example: Sentry Robot.
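The two criteria for operational autonomy (Technological Capacity and Self-Direction) can be made concrete in a small sketch. The class and field names below are hypothetical, chosen only to mirror the slide's terminology; no real system is modelled. It captures the "underutilised RTS" case: a sentry-style system that is technically capable but not permitted to decide.

```python
from dataclasses import dataclass

@dataclass
class RoboticTargetingSystem:
    name: str
    has_capacity: bool        # Technological Capacity: can act without an operator
    has_self_direction: bool  # Self-Direction: restrictions lifted, allowed to act

    def is_operationally_autonomous(self) -> bool:
        # Both criteria must hold; capacity alone does not make a system autonomous.
        return self.has_capacity and self.has_self_direction

# A sentry-style system: could make a targeting decision, but is not allowed to.
sentry = RoboticTargetingSystem("sentry", has_capacity=True, has_self_direction=False)
print(sentry.is_operationally_autonomous())  # False
```

Lifting the restriction (setting `has_self_direction=True`) is what would turn such an underutilised RTS into an RTS_auton, which is why the precedent is not science fiction.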
Moral Issues (2)

• Do RTS_auton raise distinctive moral issues not found in other systems?
– Responsibility?
• Are RTS_auton morally preferable to other systems?
– Robots are motivated by algorithms, not by hatred and fear (Ronald Arkin).
– RTS_auton cannot satisfy key legal and moral criteria: discrimination (Human Rights Watch).

The Military Enhancement Project @Delft and Oxford

• Look at RTS, both remote-controlled and autonomous.
• Check whether existing systems comply with law and ethics.
• Try to anticipate dilemmas and problems that may arise from the operation of RTS.
• Find ways to enhance targeting decisions.
Design Proposal: E-Partnerships

• Utilise the different perspectives of human and artificial agent.
– No point in cutting out a human operator entirely. Machines are just bad at a lot of stuff.
– Introduce autonomous technologies into remote-controlled systems in order to assist the operator in decision-making.
• Design for Responsibility.
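A minimal sketch of the e-partnership idea, with all function names invented for illustration: the autonomous component only recommends, and nothing is enacted without human approval, so responsibility stays with the operator by design.

```python
def machine_recommendation(sensor_score: float) -> str:
    # The artificial agent proposes an action based on its sensor analysis.
    return "engage" if sensor_score > 0.9 else "hold"

def partnered_decision(sensor_score: float, operator_approves: bool) -> str:
    recommendation = machine_recommendation(sensor_score)
    # Design for Responsibility: no engagement without human approval,
    # regardless of how confident the machine is.
    if recommendation == "engage" and operator_approves:
        return "engage"
    return "hold"

print(partnered_decision(0.95, operator_approves=False))  # hold
print(partnered_decision(0.95, operator_approves=True))   # engage
```

The asymmetry is deliberate: the machine can veto nothing and authorise nothing; it can only inform the human decision, which is the assistive role the design proposal argues for.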
Thank you!