Technology
Rise of the machines
As robots become more common in the care sector,
critics are questioning their value and safety.
By Dallas Bastian
We’ve seen warnings in the pages of science fiction
novels on the potential for robots to switch from
helping humans to hurting them, but with their
increasing use in care environments, who would be responsible if
a robot did cause harm?
This is an example of the kinds of concerns unearthed through a
series of interviews between an Australian and international research
team and care providers, technology suppliers and experts.
The subsequent report unpacked the roles that robots should
and, importantly, shouldn’t play in care delivery.
Robots can do tasks that human carers would rather not do, the
report’s authors said. This includes being able to bear the brunt
of repetitive activities, freeing up time for carers to focus on other
tasks. They’re also immune to the psychological wear of a busy or
frustrating workload.
Still, the team from the University of Melbourne, UNSW Sydney
and Harvard noted that there is concern over robots not yet being
sufficiently tested in care settings.
“Robotics raises questions around accountability and surveillance
that have not yet been thought through,” the report said. “A host of
legal issues could emerge from the use of robotics where it is not
well established where responsibility ultimately lies.”
Writing in The Conversation, the authors said concerns have
been expressed about the use of robots potentially reducing
privacy, exposing people to data hacking, or even inflicting
physical harm.
On top of this, the team found that robots are being brought
into aged care facilities, schools, hospitals and government
agencies without much strategic thought about their use.
Study co-author Catherine Smith from the University of
Melbourne said technology was often brought into a sector on the
basis of marketing and partnerships rather than being driven by a
specific need for a robot.
“It was not very evident at all that it was the end user who was
defining the need that the robot was addressing,” Smith said.
The Conversation article said that, in this way, the sector is
largely being driven by the interests of technology suppliers.
“Providers in some cases are purchasing these technologies to
differentiate them in the market, but are also not always engaging
in critical analysis.”
Currently, many robotic devices are coming into caring sectors
through different entry points, Smith added. “Some of them are
considered toys, so they’re under legislation that protects people
in terms of how safe a toy should be.
“Others are considered medical devices … There’s no
overarching framework.”
The researchers said governments have a responsibility to
ensure that vulnerable populations are protected within the
context of new technologies.
This, they suggested, should be upheld through a responsive
regulatory approach, in which the sector is supported to self-
and peer-regulate, and escalate issues as they arise so that
governments can set up appropriate regulation.
As one interviewee put it: “The real question is, what are we
going to allow? Are we just going to be a big experiment, where
all the stuff is thrown upon us and we see what happens? Then
just say, ‘Oops, sorry if that was the wrong answer.’ Or are we
going to then end up overreacting and throw the baby out with
the bath water…
“So, the conversation is absolutely essential, there’s no doubt
about that. We need to even move beyond the conversation now
and start talking regulation and find frameworks through which
that can be done.”
Interviewees also highlighted a gap in leadership in the space.
The authors said a role for governments would be supporting
providers to understand the different technologies available and
generate an evidence base.
Smith said the team plans to hold a roundtable in the near
future to consider what should go into a more universal
framework that could address some of the issues interviewees
identified. ■