Military Review English Edition May-June 2014 | Page 19
ROBOTIC WARFARE
Human Influence over Military Robots
None of the three approaches to the autonomy of
robots described above implies that humans are not
in control of the technology they create and deploy.
The Defense Science Board Task Force even argues
that “it should be made clear that all autonomous
systems are supervised by human operators at some
level, and autonomous systems’ software embodies the designed limits on the actions and decisions
delegated to the computer.”17 Robot (or machine) autonomy, then, does not mean the absence of human control; it means that humans exercise different kinds of control.
Humans exert their influence by defining the conditions for machine behavior. They choose the mathematical and probabilistic models that will guide the
behavior of the robotic system and determine the
margins of error on what the robot can and cannot
do. Designers, developers, managers, and operators
set constraints on the behavior that robotic systems
are allowed to exhibit.
As military robots become more autonomous,
it would seem that they should only be allowed to
operate autonomously if they exhibit predictable
and reliable behavior. For example, an unmanned helicopter would be allowed to fly into an unknown environment only if the software controlling it adhered to certain expectations and norms: the helicopter should not fly into trees, it should execute the instructions it is given, and it should fly between waypoints within a limited amount of time. If the helicopter did not perform as expected, it would be regarded as malfunctioning.
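The kind of operator-defined envelope described above can be sketched in code. This is an illustrative sketch only, not anything from the article or from an actual flight system; the class, function names, and numeric thresholds are all hypothetical assumptions chosen for illustration.

```python
# Illustrative sketch of "designed limits" on autonomous behavior.
# All names and thresholds below are hypothetical, for illustration only.

from dataclasses import dataclass


@dataclass
class WaypointLeg:
    """Telemetry for one flown leg between two waypoints."""
    distance_m: float      # length of the leg
    elapsed_s: float       # time actually taken to fly the leg
    min_obstacle_m: float  # closest recorded approach to any mapped obstacle


# Limits set in advance by designers, developers, and operators,
# in the spirit of the article's framing of human control.
MAX_LEG_TIME_S = 300.0   # each leg must finish within this time budget
MIN_CLEARANCE_M = 10.0   # never come closer than this to an obstacle


def within_designed_limits(leg: WaypointLeg) -> bool:
    """Return True if the flown leg stayed inside the operator-defined envelope."""
    return (leg.elapsed_s <= MAX_LEG_TIME_S
            and leg.min_obstacle_m >= MIN_CLEARANCE_M)


# A leg that violates either limit would be treated as a malfunction.
nominal = WaypointLeg(distance_m=800, elapsed_s=240, min_obstacle_m=25)
too_close = WaypointLeg(distance_m=800, elapsed_s=240, min_obstacle_m=4)
print(within_designed_limits(nominal))    # True
print(within_designed_limits(too_close))  # False
```

The point of the sketch is that the human influence lives in the constants and the predicate, which are fixed before the system ever flies; the software merely enforces the margins humans chose.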
It should not be surprising, then, that the idea
of more autonomous robotic systems comes with
an increased emphasis on reliability of and trust in
technology, along with the need to develop better
methods for verification and validation. In the
Report on Technology Horizons: A Vision for Air Force Science & Technology 2010–2030, the U.S. Air Force chief scientist argues that although it is
possible to develop systems with relatively high
levels of autonomy, the lack of suitable verification
and validation methods stands in the way of certifying these technologies for use.18 The report claims
that in the near- to mid-term future, developing methods for “certifiable trust in autonomous systems is the single greatest technical barrier that must be overcome to obtain the capability advantages that are achievable by increasing use of autonomous systems.”
[Photo: U.S. Army soldiers operate a pack robot at Forward Operating Base Hawk, Iraq, 18 September 2008. (U.S. Air Force, Staff Sgt. Manuel J. Martinez)]