Do people drive more safely if their cars
speak to them, flash messages or, say, vibrate
the steering wheel? Should cars give an
update on road conditions just before
the human driver takes over at the wheel,
or are such details distracting?
Nass' research has long shown that individuals treat gadgets as if they are other
humans, expecting machines to be
sensitive to our moods and feelings.
As Nass sees it, driverless cars
should eventually be capable of
acting as our “wingmen,” proactive
and aware of our faults so they can
assist us in the best possible way.
We’re witnessing “the transition
of the car from being your slave to
being your teammate,” he explains.
“You can start to think about a
radical new way of designing cars
that starts from the premise that
[the car] and I are a team.”
Nass’ new simulator will give
him the most detailed view yet
into our relationships with our
cars. What’s special about this
setup, he explains excitedly, is
that it allows him to match up
exactly what’s happening in the
driver’s head with what’s happening, at that instant, inside the car.
His test subjects will be equipped
with high-tech gear that tracks
their emotional and mental states
throughout the courses they
drive. They can be outfitted with
EEG sensors that measure brain
activity, skin conductance sensors that track emotional arousal,
and eye-tracking glasses that follow their gaze. Nass will use data
from these tools, in conjunction
with questionnaires and logs of
the car’s activity, to see how automation affects drivers’ reaction
speeds, focus and ability to
avoid obstacles after driving a car
that’s been driving itself.
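The article doesn't say how Nass' team lines up these data streams, but the core idea is timestamp alignment: pairing each moment in the car's log with the nearest physiological reading. Here is a minimal sketch of that idea in Python; the column names, sampling rates, and the pandas-based approach are illustrative assumptions, not details from the study.

```python
# Hypothetical sketch: time-aligning driver sensor data with car logs.
# Column names, sampling rates, and event labels are assumptions for
# illustration, not details from Nass' actual setup.
import pandas as pd

# Simulated skin-conductance readings, sampled every 100 ms.
sensor = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2013-06-01 10:00:00.0", "2013-06-01 10:00:00.1",
         "2013-06-01 10:00:00.2", "2013-06-01 10:00:00.3"]),
    "skin_conductance": [2.1, 2.2, 3.8, 3.9],  # microsiemens
})

# Simulated car log: the moment the car hands control back.
car_log = pd.DataFrame({
    "timestamp": pd.to_datetime(["2013-06-01 10:00:00.15"]),
    "event": ["handoff_to_driver"],
})

# merge_asof pairs each car event with the nearest-in-time sensor
# reading, so a spike in arousal can be matched to what the car
# was doing at that instant.
aligned = pd.merge_asof(
    car_log.sort_values("timestamp"),
    sensor.sort_values("timestamp"),
    on="timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("100ms"),
)
print(aligned)
```

The same nearest-timestamp join would work for EEG or gaze data; the tolerance simply bounds how far apart two readings can be and still count as simultaneous.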
In one of Nass’ first studies, he
will try to determine how long
it takes drivers to “get their act
together” after the autonomous
car hands back control. Google’s
self-driving Lexus SUV offers one
current template for the handoff: When the car knows it needs
human help — often when approaching a construction zone
or merging onto a freeway — an
icon or message will flash on a