Masdar Smart City and Robotics - GineersNow Engineering Magazine Masdar: The Future of Sustainable City in Abu Dhabi | Page 17
THE MORAL AND ETHICAL
ISSUE WITH ROBOTS
Take a look at science fiction and you will find innumerable examples of robots gone evil, from the popular Skynet and its Terminators to HAL 9000 to Ultron: machines out to destroy humanity. Of course, such stories may be a bit of a stretch. Nonetheless, there is a genuine concern about artificially intelligent robots, and it is not that people worry about these robots taking over the world. The concern is whether the robots in question are capable of making the right moral choice at the right time.
Thousands of scientists and tech experts, including Stephen Hawking, Steve Wozniak and Elon Musk, have called for a ban on autonomous weapons such as armed drones. Such weapons are capable of identifying and destroying targets without human intervention. As alarming as that sounds, the actual applications may be more mundane. Even so, the technology may spark another arms race.
Jerry Kaplan, a scholar of artificial intelligence, believes that morality is essential for robots, because humans can make some extremely silly and bad choices when instructing them. A person might ask a robot to fetch something quickly, and that could turn out very badly if the robot ends up hurting people on its way to the desired item.
Another interesting test proposed by Jerry Kaplan involves self-driving cars, which may need to make a crucial decision to avoid a major accident. For example, should the car swerve to save pedestrians and harm its occupants, or the reverse? That is a moral conundrum that even humans find difficult to agree on. Kaplan declares that machines are, by their very nature, psychopaths. It therefore becomes crucial that robots are taught morality and ethics. A lofty goal? Perhaps, but without it, something like Skynet remains a possibility.
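Kaplan's driving dilemma can be made concrete with a toy sketch. Everything below is hypothetical and purely illustrative (the function name, inputs, and decision rule are invented for this example, not taken from any real autonomous-vehicle system): even a trivial "minimize harm" rule forces the programmer to rank one group's safety against another's in explicit code, which is exactly the moral choice the article describes.

```python
# Hypothetical illustration only: a toy "harm minimization" rule for the
# swerve-or-not dilemma. Real self-driving software is vastly more complex,
# and no real system reduces the problem to two integers like this.

def choose_action(pedestrians_at_risk: int, occupants_at_risk: int) -> str:
    """Pick the action that endangers fewer people.

    Writing even this trivial rule forces an explicit moral choice:
    the code must weigh the pedestrians' safety against the occupants'.
    """
    if pedestrians_at_risk > occupants_at_risk:
        return "swerve"      # protect the larger group outside the car
    elif pedestrians_at_risk < occupants_at_risk:
        return "stay"        # protect the larger group inside the car
    else:
        return "undecided"   # equal harm: the rule gives no answer

print(choose_action(3, 1))  # -> swerve
print(choose_action(1, 1))  # -> undecided
```

Note the `"undecided"` branch: when the harms are equal, a pure counting rule simply has no answer, which is one reason teaching machines ethics is harder than writing an if-statement.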
JULY 2016
Future Cities & Robotics
17