assigned to which individuals and to which nonhuman components. Such practices might be promulgated through job descriptions, instruction manuals,
ethical codes, observation of past practices, training
before taking on a role, and so on. Backward-looking
responsibility involves practices of tracing back what
happened and identifying what went wrong. When
a failure occurs, humans will seek out the cause of
the failure, and humans operating in the system will
be asked to account for their behavior. Backward-looking responsibility generally relies on, or at least
presumes something about, forward-looking responsibility. That is, to understand what went wrong or
what is to blame, we have to understand how tasks
and responsibilities were assigned.
The extent to which individuals operating in the
system are perceived to have responsibility, or feel themselves to be in a position of responsibility, is not simply
a matter of tasks being delegated. It also depends on
how responsibility practices convey expectations
about the duties and obligations of the humans and
hold actors accountable for performing or failing to
perform as expected. Whether someone is considered
responsible depends, as well, on evolving notions of
what it means to be in control and able to think about
the consequences of certain actions.
In a given system, adoption of a new technology
may lead to negotiations about changes to existing
responsibility practices, creation of entirely new practices, or both. Established practices may not accommodate the changes produced from introducing the
new technology, e.g., changes in activities, attitudes,
and relationships between people. The real-time
stream of data that current UAVs produce is a good
example here. The role of pilots has changed insofar
as they now monitor video images and continuously
communicate with others who have access to the
same data and images (e.g., the sensor operator, the
mission intelligence coordinator, and the data analysts
miles away in an information fusion center). This has
transformed the way targeting decisions are made,
compared to manned operations. Decision making
has become more shared and less compartmentalized. As a result, established ideas about what various
human actors are supposed to do and what they have
to account for have had to be adjusted.23
New norms and rules have to be established to
govern the activities a new technology makes possible. Duties and obligations have to be reevaluated
and redefined. Mechanisms for evaluating actions and
holding others to account have to be adjusted or created. This will also be the case for future autonomous
technologies. Regardless of how machine autonomy
is interpreted, whether someone is responsible for the
behavior of the system will depend not only on what
the machine can and cannot do but also on the practices that prevail in the context.
Shared values and principles may shape the
establishment of new practices. In the case of UAVs,
organizational values and national and international
laws provide a moral framework for and set limits
on the new activities these technologies enable. Take
the principle of distinction, a key principle in international law that states that civilians should be distinguished from combatants. This principle is intertwined with established responsibility practices within military organizations, as it is embedded in their routines, protocols, and procedures.
Yet, these shared values and principles are subject
to negotiation. Achieving a shared interpretation of them can be challenging, not only because of the introduction of new technologies but also because of social, political, and economic developments. The current debates
about the use of drones provide a pertinent example.
One contentious issue is that, according to anonymous
government officials, the U.S. government regards all
military-age males killed in a drone strike as combatants unless proven otherwise.24 Such a controversial
and broad interpretation of a key principle of the law
of war significantly affects responsibility practices, at least in the sense that it shapes the standard of responsibility to which soldiers involved in deploying drones are held for harm to noncombatants.
Responsibility practices are continuously negotiated and renegotiated. This can often be seen when
something goes wrong with a new technology, and
investigators trace back the cause of the failure.