Sunday, April 11, 2010

People Believe What They Want To Believe

The truth of the matter is disturbing. Most people will believe what they want. Despite knowledge, circumstance, evidence, and other aids to sound discernment, most people will resist anything to the contrary and still believe what they want to believe. In this sense, not much about human nature has changed: people often believe without good reason. Support for this thesis is well documented.

When people hold a belief for certain reasons, and those reasons are refuted (taken away or weakened by other evidence), they often stand by their original point of view instead of making the needed adjustments. Even when the alternative is compelling, people tend to cling to scanty evidence rather than change their minds. Personal bias has strong roots that prove difficult to remove.

Gregory Koukl ("People Believe What They Want," standtoreason.org, 1996) states,

"So, people will say, 'Life on Mars! It's already been proven.' Well, it hasn't...The point is, the evidence for ancient life on Mars wasn't conclusive in any way, shape or form. Yet those who want to believe in life on other planets or in evolution-- and even if there was life on Mars, it wouldn't prove evolution, as I pointed out-- they seize on this scanty evidence... I think the Mars rock is much ado about nothing... All of a sudden this rock makes it into the news. They see a couple of forms one-hundredth the width of a human hair through an electron microscope. Scattered around it are some chemicals that are sometimes, but not necessarily, associated with life. As one person pointed out, if this had been found on earth, no one would have ever drawn the conclusion that this was life."

People are prone to confirmation bias (seeking and finding evidence that confirms what they already believe) and hindsight bias (tailoring after-the-fact explanations to what they already know happened). Arthur Goldwag, in his book Cults, Conspiracies, and Secret Societies (Vintage, 2009), finds both biases at work in this example:

"Even the most trivial detail seems to glow with significance," Goldwag explains, noting the JFK assassination as a prime example. "Knowing what we know now ... film footage of Dealey Plaza from November 22, 1963, seems pregnant with enigmas and ironies—from the oddly expectant expressions on the faces of the onlookers on the grassy knoll in the instants before the shots were fired (What were they thinking?) to the play of shadows in the background (Could that flash up there on the overpass have been a gun barrel gleaming in the sun?)

"Each odd excrescence, every random lump in the visual texture seems suspicious. Add to these factors how compellingly a good narrative story can tie it all together—think of Oliver Stone’s JFK or Dan Brown’s Angels and Demons, both equally fictional." (Michael Shermer, "Why People Believe In Conspiracies," Scientific American, September 2009)


Confirmation Bias

Confirmation bias is the tendency of decision makers to actively seek out and assign more weight to evidence that confirms their hypothesis, and to ignore or underweight evidence that could disconfirm it. Robert T. Carroll (The Skeptic's Dictionary, 2009) says, "For example, if you believe that during a full moon there is an increase in admissions to the emergency room where you work, you will take notice of admissions during a full moon, but be inattentive to the moon when admissions occur during other nights of the month. A tendency to do this over time unjustifiably strengthens your belief in the relationship between the full moon and accidents and other lunar effects."
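
Carroll's full-moon example lends itself to a quick simulation. The sketch below is a minimal illustration with invented numbers (the busy-night rate, recall rate, and moon schedule are all assumptions, not data): busy nights occur at the same rate whether or not the moon is full, but the observer remembers every busy full-moon night while forgetting most busy ordinary nights.

    import random

    random.seed(42)

    NIGHTS = 365
    full_moons = set(range(14, NIGHTS, 29))  # roughly one full moon per 29-night cycle

    BUSY_RATE = 0.30      # hypothetical true chance of a busy ER night, moon or not
    RECALL_OTHER = 0.20   # hypothetical chance a busy ordinary night is remembered

    busy_full = busy_other = recalled_full = recalled_other = 0

    for night in range(NIGHTS):
        if random.random() >= BUSY_RATE:
            continue                          # a quiet night; nothing to notice
        if night in full_moons:
            busy_full += 1
            recalled_full += 1                # full-moon "hits" are always noticed
        else:
            busy_other += 1
            if random.random() < RECALL_OTHER:
                recalled_other += 1           # ordinary busy nights are mostly forgotten

    n_full = len(full_moons)
    n_other = NIGHTS - n_full
    print(f"Actual busy rate:     full moon {busy_full / n_full:.2f}, "
          f"other {busy_other / n_other:.2f}")
    print(f"Remembered busy rate: full moon {recalled_full / n_full:.2f}, "
          f"other {recalled_other / n_other:.2f}")

Both actual rates hover near 0.30, yet the remembered rate for ordinary nights collapses to a small fraction of that, so the full moon appears several times busier than other nights. That divergence is exactly the unjustified strengthening of belief Carroll describes.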

Thomas Gilovich confirms that the "most likely reason for the excessive influence of confirmatory information is that it is easier to deal with cognitively" (How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life, 1993). It is much easier to see how a piece of data supports a position than to see how it might count against it.

Confirmation bias also appears in claims of extrasensory perception. In a typical ESP experiment or a seemingly clairvoyant dream, successes are often unambiguous, or the data are easily massaged to count as successes, while negative instances require intellectual effort even to recognize as negative or to consider significant. The tendency to give more attention and weight to the positive and the confirmatory has also been shown to influence memory: when digging through their memories for data relevant to a position, people are more likely to recall data that confirm it.


Hindsight Bias

Hindsight bias occurs when people who know the answer vastly overestimate its predictability or obviousness, compared to the estimates of subjects who must guess without advance knowledge. In other words, hindsight bias is the tendency to view events as more predictable than they really are. After an event, people often believe that they knew its outcome before it actually happened.

Even in experiments, people often recall their pre-event predictions as much stronger than they actually were. Hindsight bias is often referred to as the "I-knew-it-all-along phenomenon." College students experience it in their own studies: as they read their course texts, the information may seem easy. "Of course," they might think after reading the results of a study or experiment, "I knew that all along." When test time comes, however, the presence of many plausible answers on a multiple-choice exam often makes them realize that they did not know the material as well as they thought. By being aware of this problem, people can develop study habits that counter the tendency to assume they "knew it all along." (David G. Myers, Social Psychology, 8th ed., 2005)
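
A toy model can show how this memory shift plays out in aggregate. In the sketch below, every number is invented for illustration (the drift rate and forecast range are assumptions): each person honestly forecast an event at roughly 50 percent, and after the event occurs, each recollection drifts partway toward the known outcome.

    import random

    random.seed(0)

    PEOPLE = 1000
    DRIFT = 0.3   # assumed fraction of the forecast-to-outcome gap that memory closes

    forecasts, recalled = [], []
    for _ in range(PEOPLE):
        p = random.uniform(0.3, 0.7)                # honest pre-event estimate near 50%
        outcome = 1.0                               # the event did in fact happen
        forecasts.append(p)
        recalled.append(p + DRIFT * (outcome - p))  # recollection drifts toward outcome

    print(f"Mean forecast before the event:  {sum(forecasts) / PEOPLE:.2f}")
    print(f"Mean forecast as recalled after: {sum(recalled) / PEOPLE:.2f}")

The group's real average forecast sits near 0.50, but the recalled average lands near 0.65: the "I knew it all along" effect in miniature.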


A famous example of hindsight bias followed the 1986 Challenger explosion, which was traced to an O-ring losing flexibility at low temperature. There were warning signs of a problem with the O-rings. Still, according to Eliezer Yudkowsky (lesswrong.com, August 16, 2007), "...preventing the Challenger disaster would have required, not attending to the problem with the O-rings, but attending to every warning sign which seemed as severe as the O-ring problem, without benefit of hindsight. It could have been done, but it would have required a general policy much more expensive than just fixing the O-Rings."


Implications

Believing what you want to believe can prove dangerous, and it is certainly foolish. In matters of human interest and in affairs of human communication, many people want instinct to guide them through muddy waters. Not only do confirmation and hindsight bias push their way into argumentation, but the deliberate use of fallacies to confuse information exerts its own negative influence. Often in arguments, people ignore the facts completely.

"I don't like the looks of it."
"Something about it doesn't smell right." 
"I knew it all along."
"He or she is just like that."
"He (or she) did it once, so he (or she) will do it again."
"I could tell this was going to happen."
"He (or she) is certainly not one of us."
"He (or she) just doesn't have any faith."
"People with those dumb opinions just get on my nerves."


Do any of these sentences sound familiar? Have you encountered resistance because you have a different opinion or because someone refuses to think about an alternative? Do friends and even family prejudge your motives or your words? Do they simply wish you would "shut up"?


Without much doubt, people believe what they want to believe; however, we do not all hold the same ideals and values for the same reasons. Diversity can be a strong motive for change when it is allowed to develop. Without a voice, people are denied the opportunity to effect change. Without different voices, society stagnates.

Tonight, I'm feeling as if someone could listen to my point of view, even if it is unpopular, because I value expression and because I am not like everyone else. Judgments and sentences do not always change conditions for the better. Isolation and disregard leave scars that seldom heal. My sense of the truth is far from perfect, but I do believe the so-called truth is not as straight and narrow as many may believe. To most, truth depends upon what they want to believe. And, believe me, hypocrisy is present in the beliefs of a great number.

1 comment:

  1. Good article. People love to self-deceive. Thinking is painful, and correcting one's path to the way of truth requires work and effort, which most people expect to be paid for, or they won't do it!