Deterrence argument

Researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.
The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse. At the end of the experiment, the students were asked once again about their views. Those who’d started out pro-capital punishment were now even more in favor of it; those who’d opposed it were even more hostile.
If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”
Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.
This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group. Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave. There was little advantage in reasoning clearly, while much was to be gained from winning arguments.
Among the many, many issues our forebears didn’t worry about were the deterrent effects of capital punishment and the ideal attributes of a firefighter. Nor did they have to contend with fabricated studies, or fake news, or Twitter. It’s no wonder, then, that today reason often seems to fail us. As Mercier and Sperber write, “This is one of many cases in which the environment changed too quickly for natural selection to catch up.”
Steven Sloman, a professor at Brown, and Philip Fernbach, a professor at the University of Colorado, are also cognitive scientists. They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions. They begin their book, “The Knowledge Illusion: Why We Never Think Alone” (Riverhead), with a look at toilets.
Virtually everyone in the United States, and indeed throughout the developed world, is familiar with toilets. A typical flush toilet has a ceramic bowl filled with water. When the handle is depressed, or the button pushed, the water—and everything that’s been deposited in it—gets sucked into a pipe and from there into the sewage system. But how does this actually happen?
In a study conducted at Yale, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cylinder locks. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. (Toilets, it turns out, are more complicated than they appear.)
Sloman and Fernbach see this effect, which they call the “illusion of explanatory depth,” just about everywhere. People believe that they know way more than they actually do. What allows us to persist in this belief is other people. In the case of my toilet, someone else designed it so that I can operate it easily. This is something humans are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins.
There is, as Sloman and Fernbach write, “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.
This borderlessness, or, if you prefer, confusion, is also crucial to what we consider progress. As people invented new tools for new ways of living, they simultaneously created new realms of ignorance; if everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering.
Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain. It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about. Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favor military intervention. (Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)

Surveys on many other issues have yielded similarly dismaying results. “As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem. If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.
“This is how a community of knowledge can become dangerous,” Sloman and Fernbach observe. The two have performed their own version of the toilet experiment, substituting public policy for household gadgets. In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system? Or merit-based pay for teachers? Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. Most people at this point ran into trouble. Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently.

Sloman and Fernbach see in this result a little candle for a dark world. If we—or our friends or the pundits on CNN—spent less time pontificating and more trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views. This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”
Elizabeth Kolbert

Elizabeth Kolbert has been a staff writer at The New Yorker since 1999. She won the 2015 Pulitzer Prize for general nonfiction for “The Sixth Extinction: An Unnatural History.”