Pros and Cons of Autonomous Weapons Systems

Amitai Etzioni, PhD
Oren Etzioni, PhD
As autonomous weapons systems move from concept to reality, military planners, roboticists, and ethicists debate the advantages, disadvantages, and morality of their use in current and future operating environments. (Image by Peggy Frierson)

According to Defense News, Gen. Robert Cone, former commander of the U.S. Army Training and Doctrine Command, suggested at the 2014 Army Aviation Symposium that by relying more on "support robots," the Army eventually could reduce the size of a brigade from four thousand to three thousand soldiers without a concomitant reduction in effectiveness.6

Air Force Maj. Jason S. DeSon, writing in the Air Force Law Review, notes the potential advantages of autonomous aerial weapons systems.7 According to DeSon, the physical strain of high-G maneuvers and the intense mental concentration and situational awareness required of fighter pilots make them very prone to fatigue and exhaustion; robot pilots, on the other hand, would not be subject to these physiological and mental constraints. Moreover, fully autonomous planes could be programmed to take genuinely random and unpredictable action that could confuse an opponent. More striking still, Air Force Capt. Michael Byrnes predicts that a single unmanned aerial vehicle with machine-controlled maneuvering and accuracy could, "with a few hundred rounds of ammunition and sufficient fuel reserves," take out an entire fleet of aircraft, presumably one with human pilots.8

In 2012, a report by the Defense Science Board, in support of the Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, identified "six key areas in which advances in autonomy would […]
could be tarnished, and a public backlash might curtail future benefits of AI. The letter has an impressive list of signatories, including Elon Musk (inventor and founder of Tesla), Steve Wozniak (cofounder of Apple), physicist Stephen Hawking (University of Cambridge), and Noam Chomsky (Massachusetts Institute of Technology), among others. Over three thousand AI and robotics researchers have also signed the letter. The open letter simply calls for "a ban on offensive autonomous weapons beyond meaningful human control."25

We note in passing that it is often unclear whether a weapon is offensive or defensive. Thus, many assume that an effective missile defense shield is strictly defensive, but it can be extremely destabilizing if it allows one nation to launch a nuclear strike against another without fear of retaliation.

In April 2013, the United Nations (UN) special rapporteur on extrajudicial, summary, or arbitrary executions presented a report to the UN Human Rights Council. The report recommended that member states should declare and implement moratoria on the testing, production, transfer, and deployment of lethal autonomous robotics (LARs) until an internationally agreed upon framework for LARs has been established.26

That same year, a group of engineers, AI and robotics experts, and other scientists and researchers from thirty-seven countries issued the "Scientists' Call to Ban Autonomous Lethal Robots." The statement notes the lack of scientific evidence that robots could, in the future, have "the functionality required for accurate target identification, situational awareness, or decisions regarding the proportional use of force."27 Hence, they may cause a high level of collateral damage. The statement ends by insisting that "decisions about the application of violent force must not be delegated to machines."28

Indeed, the delegation of life-or-death decision making to nonhuman agents is a recurring concern of those who oppose autonomous weapons systems. The most obvious manifestation of this concern relates to systems that are capable of choosing their own targets. Thus, highly regarded computer scientist Noel Sharkey has called for a ban on "lethal autonomous targeting" because it violates the Principle of Distinction, considered one of the most important rules of armed conflict—autonomous weapons systems will find it very hard to determine who is a civilian and who is a combatant, which is difficult even for humans.29 Allowing AI to make decisions about targeting will most likely result in civilian casualties and unacceptable collateral damage.

Another major concern is the problem of accountability when autonomous weapons systems are deployed. Ethicist Robert Sparrow highlights this ethical issue by noting that a fundamental condition of international humanitarian law, or jus in bello, requires that some person must be held responsible for civilian deaths. Any weapon or other means of war that makes it impossible to identify responsibility for the casualties it causes does not meet the requirements of jus in bello, and, therefore, should not be employed in war.30

This issue arises because AI-equipped machines make decisions on their own, so it is difficult to determine whether a flawed decision is due to flaws in the program or in the autonomous deliberations of the AI-equipped (so-called smart) machines. The nature of this problem was highlighted when a driverless car violated the speed limits by moving too slowly on a highway, and it was unclear to whom the ticket should be issued.31 In situations where a human being makes the decision to use force against a target, there is a clear chain of accountability, stretching from whoever actually "pulled the trigger" to the commander who gave the order. In the case of autonomous weapons systems, no such clarity exists. It is unclear who or what are to be blamed or held liable.

Amitai Etzioni is a professor of international relations at The George Washington University. He served as a senior advisor at the Carter White House and taught at Columbia University, Harvard Business School, and the University of California at Berkeley. A study by Richard Posner ranked him among the top one hundred American intellectuals. His most recent book is Foreign Policy: Thinking Outside the Box (2016).

Oren Etzioni is chief executive officer of the Allen Institute for Artificial Intelligence. He received a PhD from Carnegie Mellon University and a BA from Harvard University. He has been a professor at the University of Washington's computer science department since 1991. He was the founder or cofounder of several companies, including Farecast (later sold to Microsoft) and Decide (later sold to eBay), and the author of over one hundred technical papers that have garnered over twenty-five thousand citations.
In their review of the debate, legal scholars Gregory Noone and Diana Noone conclude that everyone is in agreement that any autonomous weapons system would have to comply with the Law of Armed Conflict (LOAC), and thus be able to distinguish between combatants and noncombatants. They write, "No academic or practitioner is stating anything to the contrary; therefore, this part of any argument from either side must be ignored as a red herring. Simply put, no one would agree to any weapon that ignores LOAC obligations."39

Soldiers from 2nd Battalion, 27th Infantry Regiment, 3rd Brigade Combat Team, 25th Infantry Division, move forward toward simulated opposing forces with a multipurpose unmanned tactical transport 22 July 2016 during the Pacific Manned-Unmanned Initiative at Marine Corps Training Area Bellows, Hawaii. (Photo by Staff Sgt. Christopher Hubenthal, U.S. Air Force)

Limits on Autonomous Weapons Systems and Definitions of Autonomy

The international community has agreed to limits on mines and chemical and biological weapons, but an agreement on limiting autonomous weapons systems would meet numerous challenges. One challenge is the lack of consensus on how to define the autonomy of weapons systems, even among members of the Department of Defense. A standard definition that accounts for levels of autonomy could help guide an incremental approach to proposing limits.

The U.S. Army Robotic and Autonomous Systems Strategy, published March 2017 by U.S. Army Training and Doctrine Command, describes how the Army intends to integrate new technologies into future organizations to help ensure overmatch against increasingly capable enemies. Five capability objectives are to increase situational awareness, lighten soldiers' workloads, sustain the force, facilitate movement and maneuver, and protect the force. To view the strategy, visit https://www.tradoc.army.mil/FrontPageContent/Docs/RAS_Strategy.pdf.

Limits on autonomous weapons systems. We take it for granted that no nation would agree to forswear the use of autonomous weapons systems unless its adversaries would do the same. At first blush, it may seem that it is not beyond the realm of possibility to obtain an international agreement to ban autonomous weapons systems or at least some kinds of them. Many bans exist in one category or another of weapons, and they have been quite well honored and enforced. These include the Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on their Destruction (known as the Ottawa Treaty, which became international law in 1999); the Chemical Weapons Convention (ratified in 1997); and the Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on their Destruction.

A comprehensive ban would have to cover not only deployment but also research, development, and testing of these machines. This may well be impossible, as autonomous weapons systems can be developed and tested in small workshops and do not leave a trail. Nor could one rely on satellites for inspection data, for the same reasons. We hence assume that if such a ban were possible, it would mainly focus on deployment and mass production.

Even so, such a ban would face considerable difficulties. While it is possible to determine what is a chemical weapon and what is not (despite some disagreements at the margin, for example, about law enforcement use of irritant chemical weapons), and to clearly define nuclear arms or land mines, autonomous weapons systems come with very different levels of autonomy.40 A ban on all autonomous weapons would require forgoing many modern weapons already mass produced and deployed.

Definitions of autonomy. Different definitions have been attached to the word "autonomy" in different Department of Defense documents, and the resulting […] "able to make a decision based on a set of rules and/or limitations. It is able to determine what information is important in making a decision."42 In this view, autonomous systems are less predictable than merely automated ones, as the AI not only is performing a specified action but also is making decisions and thus potentially taking an action that a human did not order. A human is still responsible for programming the behavior of the autonomous system, and the actions the system takes would have to be consistent with the laws and strategies provided by humans. However, no individual action would be completely predictable or preprogrammed.

It is easy to find still other definitions of autonomy. The International Committee of the Red Cross defines autonomous weapons as those able to "independently select and attack targets, i.e., with autonomy in the 'critical functions' of acquiring, tracking, selecting and attacking targets."43

A 2012 Human Rights Watch report by Bonnie Docherty, Losing Humanity: The Case against Killer Robots, defines three categories of autonomy. Based on the kind of human involvement, the categories are human-in-the-loop, human-on-the-loop, and human-out-of-the-loop weapons.44

"Human-in-the-loop weapons [are] robots that can select targets and deliver force only with a human command."45 Numerous examples of the first type already are in use. For example, Israel's Iron Dome system detects incoming rockets, predicts their trajectory, and then sends
this information to a human soldier who decides whether to launch an interceptor rocket.46

"Human-on-the-loop weapons [are] robots that can select targets and deliver force under the oversight of a human operator who can override the robots' actions."47 An example mentioned by Docherty is the SGR-A1 built by Samsung, a sentry robot used along the Korean Demilitarized Zone. It uses a low-light camera and pattern-recognition software to detect intruders and then issues a verbal warning. If the intruder does not surrender, the robot has a machine gun that can be fired remotely by a soldier the robot has alerted, or by the robot itself if it is in fully automatic mode.48

The United States also deploys human-on-the-loop weapons systems. For example, the MK 15 Phalanx Close-In Weapons System has been used on Navy ships since the 1980s, and it is capable of detecting, evaluating, tracking, engaging, and using force against antiship missiles and high-speed aircraft threats without any human commands.49 The Center for a New American Security published a white paper estimating that, as of 2015, at least thirty countries had deployed or were developing human-supervised systems.50

"Human-out-of-the-loop weapons [are] robots capable of selecting targets and delivering force without any human input or interaction."51 This kind of autonomous weapons system is the source of much concern about "killing machines." Military strategist Thomas K. Adams warned that, in the future, humans would be reduced to making only initial policy decisions about war, and they would have mere symbolic authority over automated systems.52 In the Human Rights Watch report, Docherty warns, "By eliminating human involvement in the decision to use lethal force in armed conflict, fully autonomous weapons would undermine other, nonlegal protections for civilians."53 For example, a repressive dictator could deploy emotionless robots to kill and instill fear among a population without having to worry about soldiers who might empathize with their victims (who might be neighbors, acquaintances, or even family members) and then turn against the dictator.

For the purposes of this paper, we take autonomy to mean a machine has the ability to make decisions based on information gathered by the machine and to act on the basis of its own deliberations, beyond the instructions and parameters its producers, programmers, and users provided to the machine.

A Way to Initiate an International Agreement Limiting Autonomous Weapons

We find it hard to imagine nations agreeing to return to a world in which weapons had no measure of autonomy. On the contrary, development in AI leads one to expect that more and more machines and instruments of all kinds will become more autonomous. Bombers and fighter aircraft having no human pilot seem inevitable. Although it is true that any level of autonomy entails, by definition, some loss of human control, this genie has left the bottle and we see no way to put it back again.

Where to begin. The most promising way to proceed is to determine whether one can obtain an international agreement to ban fully autonomous weapons with missions that cannot be aborted and that cannot be recalled once they are launched. If they malfunction and target civilian centers, there is no way to stop them. Like unexploded landmines placed without marks, these weapons will continue to kill even after the sides settle their differences and sue for peace.

One may argue that gaining such an agreement should not be arduous because no rational policymaker will favor such a weapon. Indeed, the Pentagon has directed that "autonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force."54

Why to begin. However, one should note that human-out-of-the-loop arms are very effective in reinforcing a red line. Declarations by representatives of one nation that if another nation engages in a certain kind of hostile behavior, swift and severe retaliation will follow, are open to misinterpretation by the other side, even if backed up with deployment of troops or other military assets. Leaders, drawing on considerable historical experience, may bet that they will be able to cross the red line and be spared for one reason or another. Hence, arms without a human in the loop make for much more credible red lines. (This is a form of the "precommitment strategy" discussed by Thomas Schelling in Arms and Influence, in which one party limits its own options so as to make its threats more credible.)
Notes

1. Gary E. Marchant et al., "International Governance of Autonomous Military Robots," Columbia Science and Technology Law Review 12 (June 2011): 272–76, accessed 27 March 2017, http://stlr.org/download/volumes/volume12/marchant.pdf.
2. James R. Clapper Jr. et al., Unmanned Systems Roadmap: 2007-2032 (Washington, DC: Department of Defense [DOD], 2007), 19, accessed 28 March 2017, http://www.globalsecurity.org/intell/library/reports/2007/dod-unmanned-systems-roadmap_2007-2032.pdf.
3. Jeffrey S. Thurnher, "Legal Implications of Fully Autonomous Targeting," Joint Force Quarterly 67 (4th Quarter, October 2012): 83, accessed 8 March 2017, http://ndupress.ndu.edu/Portals/68/Documents/jfq/jfq-67/JFQ-67_77-84_Thurnher.pdf.
4. David Francis, "How a New Army of Robots Can Cut the Defense Budget," Fiscal Times, 2 April 2013, accessed 8 March 2017, http://www.thefiscaltimes.com/Articles/2013/04/02/How-a-New-Army-of-Robots-Can-Cut-the-Defense-Budget. Francis attributes the $850,000 cost estimate to an unnamed DOD source, presumed from 2012 or 2013.
5. Ibid.
6. Quoted in Evan Ackerman, "U.S. Army Considers Replacing Thousands of Soldiers with Robots," IEEE Spectrum, 22 January 2014, accessed 28 March 2016, http://spectrum.ieee.org/automaton/robotics/military-robots/army-considers-replacing-thousands-of-soldiers-with-robots.
7. Jason S. DeSon, "Automating the Right Stuff? The Hidden Ramifications of Ensuring Autonomous Aerial Weapon Systems Comply with International Humanitarian Law," Air Force Law Review 72 (2015): 85–122, accessed 27 March 2017, http://www.afjag.af.mil/Portals/77/documents/AFD-150721-006.pdf.
8. Michael Byrnes, "Nightfall: Machine Autonomy in Air-to-Air Combat," Air & Space Power Journal 23, no. 3 (May-June 2014): 54, accessed 8 March 2017, http://www.au.af.mil/au/afri/aspj/digital/pdf/articles/2014-May-Jun/F-Byrnes.pdf?source=GovD.
9. Defense Science Board, Task Force Report: The Role of Autonomy in DoD Systems (Washington, DC: Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, July 2012), 31.
10. Ibid., 33.
11. Ibid., 38–39.
12. Ibid., 39.
13. Ibid., 41.
14. Ibid., 42.
15. Ibid., 44.
16. Ibid.
17. Ibid., 49.
18. Ibid.
19. Ibid., 50.
20. Ibid.
21. Ronald C. Arkin, "The Case for Ethical Autonomy in Unmanned Systems," Journal of Military Ethics 9, no. 4 (2010): 332–41.
22. Ibid.
23. Douglas A. Pryer, "The Rise of the Machines: Why Increasingly 'Perfect' Weapons Help Perpetuate Our Wars and Endanger Our Nation," Military Review 93, no. 2 (2013): 14–24.
24. "Autonomous Weapons: An Open Letter from AI [Artificial Intelligence] & Robotics Researchers," Future of Life Institute website, 28 July 2015, accessed 8 March 2017, http://futureoflife.org/open-letter-autonomous-weapons/.
25. Ibid.
26. Report of the Special Rapporteur on Extrajudicial, Summary, or Arbitrary Executions, Christof Heyns, September 2013, United Nations Human Rights Council, 23rd Session, Agenda Item 3, United Nations Document A/HRC/23/47.
27. International Committee for Robot Arms Control (ICRAC), "Scientists' Call to Ban Autonomous Lethal Robots," ICRAC website, October 2013, accessed 24 March 2017, icrac.net.
28. Ibid.
29. Noel Sharkey, "Saying 'No!' to Lethal Autonomous Targeting," Journal of Military Ethics 9, no. 4 (2010): 369–83, accessed 28 March 2017, doi:10.1080/15027570.2010.537903. For more on this subject, see Peter Asaro, "On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-making," International Review of the Red Cross 94, no. 886 (2012): 687–709.
30. Robert Sparrow, "Killer Robots," Journal of Applied Philosophy 24, no. 1 (2007): 62–77.
31. For more discussion on this topic, see Amitai Etzioni and Oren Etzioni, "Keeping AI Legal," Vanderbilt Journal of Entertainment & Technology Law 19, no. 1 (Fall 2016): 133–46, accessed 8 March 2017, http://www.jetlaw.org/wp-content/uploads/2016/12/Etzioni_Final.pdf.
32. Kenneth Anderson and Matthew C. Waxman, "Law and Ethics for Autonomous Weapon Systems: Why a Ban Won't Work and How the Laws of War Can," Stanford University, Hoover Institution Press, Jean Perkins Task Force on National Security and Law Essay Series, 9 April 2013.
33. Ibid.
34. Anderson and Waxman, "Law and Ethics for Robot Soldiers," Policy Review 176 (December 2012): 46.
44. Bonnie Docherty, Losing Humanity: The Case against Killer Robots (Human Rights Watch, November 2012), 2, accessed 10 March 2017, https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots.
45. Ibid.
46. Paul Marks, "Iron Dome Rocket Smasher Set to Change Gaza Conflict," New Scientist Daily News online, 20 November 2012, accessed 24 March 2017, https://www.newscientist.com/article/dn22518-iron-dome-rocket-smasher-set-to-change-gaza-conflict/.
47. Docherty, Losing Humanity, 2.
48. Ibid.; Patrick Lin, George Bekey, and Keith Abney, Autonomous Military Robotics: Risk, Ethics, and Design (Arlington, VA: Department of the Navy, Office of Naval Research, 20 December 2008), accessed 24 March 2017, http://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=1001&context=phil_fac.
49. "MK 15—Phalanx Close-In Weapons System (CIWS)," Navy Fact Sheet, 25 January 2017, accessed 10 March 2017, http://www.navy.mil/navydata/fact_print.asp?cid=2100&tid=487&ct=2&page=1.
50. Paul Scharre and Michael Horowitz, "An Introduction to Autonomy in Weapons Systems" (working paper, Center for a New American Security, February 2015), 18, accessed 24 March 2017, http://www.cnas.org/.
51. Docherty, Losing Humanity, 2.
52. Thomas K. Adams, "Future Warfare and the Decline of Human Decisionmaking," Parameters 31, no. 4 (Winter 2001–2002): 57–71.
53. Docherty, Losing Humanity, 4.
54. DOD Directive 3000.09, Autonomy in Weapon Systems (Washington, DC: U.S. GPO, 21 November 2012), 2, accessed 10 March 2017, http://www.dtic.mil/whs/directives/corres/pdf/300009p.pdf.
55. Thomas C. Schelling, Arms and Influence (New Haven: Yale University, 1966).