"Memories, pressed
between the pages of my mind
Memories, sweetened thru the ages
just like wine,
Memories, memories, sweet memories"
– Mac Davis
To me, there is no doubt that my feeble memory is failing. Whether my present condition is a reality of old age or a particular effect of the life I have chosen to live is uncertain; however, I know that remembering things is becoming increasingly difficult. I am now finding that the most difficult part of grocery shopping at Kroger's is remembering, upon exiting the store, where I parked my car.
A new study in Human Communication Research confirms that our brains tend to fail us when relaying information. Researchers found that people given accurate statistics on a controversial issue tended to misremember those numbers to fit commonly held beliefs.
For example, when people
are shown that the number of Mexican immigrants in the United States
declined recently – which is true but goes against many people’s
beliefs – they tend to remember the opposite.
Jason Coronel, lead author of the study and assistant professor of communication at The Ohio State University, says …
“And when people pass
along this misinformation they created, the numbers can get further
and further from the truth. People can self-generate their own
misinformation. It doesn't all come from external sources.”
Coronel confirms people may not be doing it purposely, but their own biases can lead them astray. He and his fellow researchers found that people usually got the numerical relationship right on issues where the statistics were consistent with the way many people view the world. But on issues where the numbers went against many people’s beliefs, participants were much more likely to remember the numbers in a way that agreed with their probable biases rather than with the truth.
People often resist
altering their perceptions, instead believing they are right in their
previous constructs. When they have an idea in their mind, they tend
to look for evidence that supports that idea and not pay attention to
evidence that says the idea isn’t accurate.
Why?
Karyn Hall, Ph.D., founder of www.DBTSkillsCoaching.com, says people do not like uncertainty and often respond emotionally to different ideas, situations, or people. Negative reactions to ambiguity – even quick first impressions – can anchor their later perceptions. They often base their unnecessary fears on unfamiliarity. And if they imagine something occurring, their sense of the likelihood of that thing actually occurring increases.
Solution aversion, as researchers at Duke University call it, seems to know no partisan bounds. “In any
issue where people’s cherished beliefs and identities are in play,
you’re probably going to see some amount of solution aversion,”
said Troy Campbell, a consumer behavior researcher at Duke
University’s business school. “We alter our view of reality to be
as flattering as possible.” He concludes …
“If you feel really
negatively about the solution, if you don’t want the solution to
happen, then you deny that the problem exists. Then there will be
coherence in your belief systems.”
Failing memories,
misremembering, solution aversion, politicized narratives – is it
any wonder human beings are fountains of biased interpretations with
faulty, leaking connections of verification? We can and we should be
more self-aware of bias … especially of our own partiality.
Columnist David Brooks made the same point when he wrote about what
he called the “mental virtues” of a willingness to challenge
oneself, humility about one’s own understanding, and openness to
the knowledge of others. Brooks writes …
“Thinking well means
pushing against the grain of our nature – against vanity, against
laziness, against the desire for certainty, against the desire to
avoid painful truths. Good thinking isn’t just adopting the right
technique. It’s a moral enterprise and requires good character, the
ability to go against our lesser impulses for the sake of our higher
ones …
“There is humility,
which is not letting your own desire for status get in the way of
accuracy. The humble person fights against vanity and
self-importance. The humble researcher doesn’t become arrogant
toward his subject, assuming he has mastered it. Such a person is
open to learning from anyone at any stage in life.”
Many people tend to have
flawed perspectives preventing them from moving forward in an optimal
way. Although their vision of accuracy is important to them, like the
wind and the rain, truth is beyond their control.
Of course, I'm talking
about others; I'm certainly not speaking about myself. My ironclad,
69-year-old mind is infallible. The trustworthiness of my opinions is
unquestioned. My beliefs are free of cognitive distortions, polarized
thinking, overgeneralizations, mislabeling, and jumping to
conclusions. You can take that to the bank.
After all, I remember
being the class valedictorian, a National Merit recipient, and a
Rhodes scholar. I also scored that winning touchdown in the State
Championship game of 1969, and then I became the youngest mayor of my
town at the tender age of twenty-one. That was just before I took the
cabinet position and President Ford sought my advice on ending the
war and finding a solution to world poverty. Since then, I've been
happy to work for think tanks at institutes including Cato,
Brookings, and Rand.
I can go on and on, but right
now I need your help for an immediate, pressing problem of the
highest personal priority. Do you think you can help me find my car?
I've got a buggy full of groceries, and I'm standing here in Kroger's
parking lot without a clue. I'd appreciate your humble assistance.