Monday, June 30, 2008

New York Times: Your Brain Lies to You

By SAM WANG and SANDRA AAMODT

FALSE beliefs are everywhere. Eighteen percent of
Americans think the sun revolves around the earth, one
poll has found. Thus it seems slightly less egregious
that, according to another poll, 10 percent of us
think that Senator Barack Obama, a Christian, is
instead a Muslim. The Obama campaign has created a Web
site to dispel misinformation. But this effort may be
more difficult than it seems, thanks to the quirky way
in which our brains store memories — and mislead us
along the way.

The brain does not simply gather and stockpile
information as a computer’s hard drive does. Facts are
stored first in the hippocampus, a structure deep in
the brain about the size and shape of a fat man’s
curled pinkie finger. But the information does not
rest there. Every time we recall it, our brain writes
it down again, and during this re-storage, it is also
reprocessed. In time, the fact is gradually
transferred to the cerebral cortex and is separated
from the context in which it was originally learned.
For example, you know that the capital of California
is Sacramento, but you probably don’t remember how you
learned it.

This phenomenon, known as source amnesia, can also
lead people to forget whether a statement is true.
Even when a lie is presented with a disclaimer, people
often later remember it as true.

With time, this misremembering only gets worse. A
false statement from a noncredible source that is at
first not believed can gain credibility during the
months it takes to reprocess memories from short-term
hippocampal storage to longer-term cortical storage.
As the source is forgotten, the message and its
implications gain strength. This could explain why,
during the 2004 presidential campaign, it took some
weeks for the Swift Boat Veterans for Truth campaign
against Senator John Kerry to have an effect on his
standing in the polls.

Even if they do not understand the neuroscience behind
source amnesia, campaign strategists can exploit it to
spread misinformation. They know that if their message
is initially memorable, its impression will persist
long after it is debunked. In repeating a falsehood,
someone may back it up with an opening line like “I
think I read somewhere” or even with a reference to a
specific source.

In one study, a group of Stanford students was exposed
repeatedly to an unsubstantiated claim taken from a
Web site that Coca-Cola is an effective paint thinner.
Students who read the statement five times were nearly
one-third more likely than those who read it only
twice to attribute it to Consumer Reports (rather than
The National Enquirer, their other choice), giving it
a gloss of credibility.

Adding to this innate tendency to mold information we
recall is the way our brains fit facts into
established mental frameworks. We tend to remember
news that accords with our worldview, and discount
statements that contradict it.

In another Stanford study, 48 students, half of whom
said they favored capital punishment and half of whom
said they opposed it, were presented with two pieces
of evidence, one supporting and one contradicting the
claim that capital punishment deters crime. Both
groups were more convinced by the evidence that
supported their initial position.

Psychologists have suggested that legends propagate by
striking an emotional chord. In the same way, ideas
can spread by emotional selection, rather than by
their factual merits, encouraging the persistence of
falsehoods about Coke — or about a presidential
candidate.

Journalists and campaign workers may think they are
acting to counter misinformation by pointing out that
it is not true. But by repeating a false rumor, they
may inadvertently make it stronger. In its concerted
effort to “stop the smears,” the Obama campaign may
want to keep this in mind. Rather than emphasize that
Mr. Obama is not a Muslim, for instance, it may be
more effective to stress that he embraced Christianity
as a young man.

Consumers of news, for their part, are prone to
selectively accept and remember statements that
reinforce beliefs they already hold. In a replication
of the study of students’ impressions of evidence
about the death penalty, researchers found that even
when subjects were given a specific instruction to be
objective, they were still inclined to reject evidence
that disagreed with their beliefs.

In the same study, however, when subjects were asked
to imagine their reaction if the evidence had pointed
to the opposite conclusion, they were more open-minded
to information that contradicted their beliefs.
Apparently, it pays for consumers of controversial
news to take a moment and consider that the opposite
interpretation may be true.

In 1919, Justice Oliver Wendell Holmes of the Supreme
Court wrote that “the best test of truth is the power
of the thought to get itself accepted in the
competition of the market.” Holmes erroneously assumed
that ideas are more likely to spread if they are
honest. Our brains do not naturally obey this
admirable dictum, but by better understanding the
mechanisms of memory perhaps we can move closer to
Holmes’s ideal.

Sam Wang, an associate professor of molecular biology
and neuroscience at Princeton, and Sandra Aamodt, a
former editor in chief of Nature Neuroscience, are the
authors of “Welcome to Your Brain: Why You Lose Your
Car Keys but Never Forget How to Drive and Other
Puzzles of Everyday Life.”

New York Times, Op-Ed Contributor, June 27, 2008: Your Brain Lies to You