THE MOST
DANGEROUS MOMENT
in a self-driving car involves no immediate or obvious peril. It is not when, say, the computer must avoid a vehicle swerving into its lane or navigate some other recognizable hazard of the road — a patch of ice, or a clueless pedestrian stepping into traffic. It is when something much more routine takes place: the computer hands over control of the vehicle to a human being.

In that instant, the human must quickly rouse herself from whatever else she might have been doing while the computer handled the car and focus her attention on the road. As scientists now studying this moment have come to realize, the hand-off is laden with risks.
“People worry about the wrong thing when it comes to the safety of autonomous cars,” says Clifford Nass, a Stanford University professor and director of the Revs Program, an interdisciplinary research center. “There are going to be times where the driver has to take over. And that turns out to be by far the most dangerous and totally understudied issue.”
Thrust back into control while going full-speed on the freeway, the driver might be unable to take stock of all the obstacles on the road, or she might still be expecting her computer to do something it can’t. Her reaction speed might be slower than if she’d been driving all along, she might be distracted by the email she was writing, or she might choose not to take over at all, leaving a confused car in command. There’s also the worry that people’s driving skills will rapidly deteriorate as they come to rely on their robo-chauffeurs.
In the effort to engineer self-driving cars, the best and brightest minds have already mastered many of the technological questions, producing vehicles that can park themselves, navigate highways and handle stop-and-go