Military Review English Edition May-June 2014 | Page 23
ROBOTIC WARFARE
Conclusion
The use of autonomous artificial agents raises significant issues of responsibility and accountability,
and this is especially so when the artificial agents
are part of military operations. Whether and in what
ways humans are responsible for the behavior of
artificial agents is not just a matter of delegating
tasks to machines. Negotiations about responsibility
for the behavior of these agents are ongoing with the
development of the technologies. These negotiations
involve a variety of actors, including the scientists and
engineers designing the technologies and users such
as the military or the public. Although it is difficult to
predict where current negotiations will end up (and
that is not our goal), our analysis shows that different notions of autonomy are being used, and each
has distinctive implications for how we think about
responsibility. At the same time, issues of responsibility are not determined by the design of the artificial
agent. Decisions about responsibility, i.e., who is
responsible for what behavior, are also made in the
development and evolution of social practices that
constitute the operation of artificial agents.
None of this is to say we should stop being concerned about the tasks assigned to the nonhuman
components of military robotic systems. On the
contrary, concerns about responsibility should be an
important part of the negotiations. They should shape
the delegation of tasks to the human and nonhuman
components of these systems. The danger in concentrating on the technological side of autonomous
robots is that the development of responsibility
practices will be neglected. Instead of focusing on
whether robots or humans can be held responsible
for robots’ behavior, we should focus on the best
allocation of tasks and control among human and
nonhuman components and how best to develop
responsibility practices. MR
This paper is based on the research done for the National Science Foundation (Grant # 1058457)
and draws on several papers that have been written as part of this project. Any opinions,
findings, and conclusions or recommendations expressed in this material are those of the
authors and do not necessarily reflect the views of the National Science Foundation.
NOTES
1. Unmanned aerial vehicles (UAVs) are sometimes referred to as remotely
piloted aircraft, to emphasize the need for a pilot, or as unmanned systems, to underline that the vehicle is inextricably linked to a larger system of human actors and
technologies. For clarity, we use the terms drone and UAV. For different perspectives on the future of drones, see T. Adams, “Future Warfare and the Decline
of Human Decision-Making,” Parameters (Winter 2001-02): 55-71; Patrick Lin,
George Bekey, and Keith Abney, Autonomous Military Robots: Risk, Ethics, and
Design (California Polytechnic State University, 2008); Peter W. Singer, Wired for War: The Robotics Revolution and
Conflict in the 21st Century (New York, New York: Penguin, 2009); U.S. Air Force
Chief Scientist, Report on Technology Horizons: A Vision for Air Force Science &
Technology 2010–2030 (AF/ST-TR-10-01-PR, Maxwell Air Force Base, Alabama,
September 2011).
2. Peter M. Asaro, “How Just Could a Robot War Be?” Current Issues in Computing and Philosophy, edited by Philip Brey, Adam Briggle, and Katinka Waelbers
(Amsterdam, The Netherlands: IOS Press, 2008), 50-64.
3. Lin, Bekey, and Abney.
4. For an overview of science and technology studies, see Wiebe E. Bijker,
Thomas P. Hughes, and Trevor Pinch, The Social Construction of Technological
Systems: New Directions in the Sociology and History of Technology (London,
UK: The MIT Press, 1987).
5. Mark Anderson, “How Does a Terminator Know When to Not Terminate?”
Discover Magazine, 31(4) (May 2010): 36.
6. DOD, Defense Science Board, Task Force Report: The Role of Autonomy
in DOD Systems (Washington, DC: U.S. Government Printing Office [GPO], July
2012), 1.
7. See Thomas B. Sheridan, Telerobotics, Automation, and Human Supervisory
Control (Cambridge, MA: MIT Press, 1992).
8. See, for example, Bruce T. Clough, Metrics, Schmetrics: How the Heck Do
You Determine a UAV’s Autonomy Anyway? (Technical report, Air Force Research
Lab., Wright-Patterson AFB, Ohio, 2002).
9. DOD, Unmanned Systems Integrated Roadmap FY2011-2036, (Washington,
DC: GPO, 2011), 43.
10. Ibid., 43.
11. Ibid., 43.
12. U.S. Air Force Chief Scientist, Report on Technology Horizons.
13. DOD, Unmanned Systems Integrated Roadmap FY2009-2034 (Washington, DC: GPO, 20 April 2009), 27.
14. For a brief overview, see Matthew Johnson et al., “The Fundamental Principle of Coactive Design: Interdependence Must Shape Autonomy,” in COIN@
AAMAS’10, Proceedings of the 6th International Conference on Coordination,
Organizations, Institutions, and Norms in Agent Systems (2010), 172-191.
15. Robin R. Murphy and David D. Woods, “Beyond Asimov: The Three Laws of
Responsible Robotics,” IEEE Intelligent Systems, 24(4) (July-August, 2009):14-20.
16. Ibid., 18.
17. DOD, Defense Science Board, Task Force Report, 1.
18. U.S. Air Force Chief Scientist, Report on Technology Horizons, IX.
19. Ibid., 42.
20. Peter-Paul Verbeek, “Materializing Morality,” Science, Technology and
Human Values, 31(3) (2006): 361-380.
21. Hans Jonas, The Imperative of Responsibility: In Search of an Ethics for
the Technological Age (Chicago: University of Chicago Press, 1984); Mark
Coeckelbergh, “Moral Responsibility, Technology, and Experiences of the Tragic:
From Kierkegaard to Offshore Engineering,” Science and Engineering Ethics, 18
(2012): 35-48.
22. Deborah G. Johnson and Thomas M. Powers, “Computer Systems and
Responsibility: A Normative Look at Technological Complexity,” Ethics and Information Technology, 7(2) (2005): 99-107.
23. Peter M. Asaro, “The Labor of Surveillance and Bureaucratized Killing:
New Subjectivities of Military Drone Operators,” Journal of Social Semiotics,
23(2) (2013): 196-224.
24. Jo Becker and Scott Shane, “Secret ‘Kill List’ Proves a Test of
Obama’s Principles and Will,” New York Times (May 29, 2012).
25. See, for example, David S. Cloud, “Anatomy of an Afghan War Tragedy,” Los
Angeles Times (April 10, 2011); Sharon D. Manning, Clarence E. Rash, Patricia A.
LeDuc, Robert K. Noback and Joseph McKeon, The Role of Human Causal Factors
in U.S. Army Unmanned Aerial Vehicle Accidents (U.S. Army Aeromedical Research
Laboratory Report # 2004-11, 2004); Kevin W. Williams, A Summary of Unmanned
Aircraft Accident/Incident Data: Human Factors Implications (Washington, DC:
U.S. Department of Transportation, Federal Aviation Administration, Office of
Aerospace Medicine. Technical Report Publication No. DOT/FAA/AM-04/24, 2004).
26. DOD, Defense Science Board, Task Force Report.