How and why do people change their minds?

The answer to this crucial question doesn't have to be a matter of opinion. Although we are far from a complete answer, research in cognitive science can help us better understand what kinds of activism are likely to be effective.

Resisting belief revision

When we are faced with information that contradicts beliefs we hold, we tend to reject the information or interpret it in a way that allows us to keep our beliefs: "in everyday thinking, the mind is very good at brushing aside information that a logician would regard as being of the utmost importance to correct thinking" (Campbell, 1989:238).

This doesn't sound like a good thing. However, this information processing strategy can be seen as a reasonable way for an individual to cope with a vastly complex environment.

Resisting belief revision: benefits

We are, of course, continuously bombarded with information. Since much of that information is uncertain and inconsistent with other information, being able to tolerate inconsistency can have substantial adaptive value (Anderson, 1991:124).

How do we decide whether Advil or Nuprin is better on the basis of information in commercials? Dignified people in lab coats present pseudo-scientific facts in favor of both products. If we were too willing to revise our beliefs, it would take a long time to reach a decision on the most mundane of questions.

Using an old model to explain new facts is easier than revising the model (Campbell, 1989:245), and also more efficient. We may not always make the most logical decisions when faced with contradictory facts, but we resolve the uncertainty fairly quickly.

It seems likely that being able to judge incoming information has always been important, not just in the information age. Consider this scenario:

"...two hypothetical prehistoric hunters and gatherers are out in barren terrain, searching for berries or some small animal. One convinces the other that he would fare better on a distant hill, when in fact the best chances are where they stand. This lie, although unethical, has great genetic value, since if two people are searching for food in a place where there is barely enough for one of them alone, either one might raise his own chance of survival by persuading the other to search elsewhere" (Goleman 1985:242).

In this situation, a hunter-gatherer who is too willing to revise his beliefs reduces his chances of survival.

Ease of processing information is not the only reason to resist changing what we believe. Holding on to certain beliefs can be socially effective: goal-directed action often requires over-confidence for success. In Campbell’s words: "If the mind worked like a pocket calculator, computing probabilities, human beings might never embark on any enterprise that involves risk."

For instance, for every business in the United States that makes money, four others will only break even, lose money, or even go bankrupt. Still, new businesses are started at an amazing rate (Campbell, 1989:243). Such optimism is possible because the mind constructs models of successful businesses and happy marriages.

"The biases, obsolete opinions, and inaccuracies with which the self is riddled are normal and healthy. They preserve cognitive organization and make us steadfast in the pursuit of chosen goals" (Campbell, 1989:255).

Some beliefs serve a psychological function that may matter more than whether the belief is true. Suppose Colette believes that she is a competent individual, and Harold believes that little gray aliens will take over his mind if he ever takes his aluminum foil hat off. Now suppose they are both wrong: Colette is incompetent and nobody finds Harold's mind worth taking over. The key difference is that Colette's false belief allows her to function in her daily activities, while Harold's false belief gets in his way.

Resisting belief revision: disadvantages

Of course, examples abound of this potentially useful information-processing strategy being used in damaging ways.

Burger and Gochfeld (1991) interviewed over 50 people who fished for recreation or subsistence in areas of Puerto Rico where mercury contamination was suspected. The subjects were aware of the mercury threat, but either denied its importance, believed the contamination was restricted to a more distant area, or assumed that the area would be officially off limits if the threat were real. All of the subjects ate the fish they caught. Even in this case, where choosing a different fishing site would not have threatened the fishers' personal security or sense of self, it was more convenient to hold on to the old belief.

Even more tragic examples come from children of alcoholic or abusive parents, such as this one from Goleman (1985:16-17):

"I am very close to my family. They were always very demonstrative and caring. When I disagreed with my mother she threw whatever was nearest at hand at me. Once it happened to be a knife and I needed ten stitches in my leg. A few years later my father tried to choke me when I began dating a boy he didn’t like. They really are very concerned about me."

The individual’s need to believe that she lives in a secure, loving family overrides all evidence that the family is actually putting her in physical danger.

Here are two examples from World War II (Goleman, 1985:18):

"In World War II, Göring was told that an Allied fighter had been shot down over a German city, the first that had ever been seen that far behind the Axis lines. This meant the Allies had developed a long-range fighter that could escort bombers over Germany. Göring, a pilot himself, ‘knew’ such a development was impossible. His reply:’I officially assert that American fighter planes did not reach Aachen... I herewith give you an official order that they weren’t there.’

In the same war, on the day the Germans began their offensive against Russia, a Soviet frontline unit sent the message to headquarters, ‘We are being fired on. What shall we do?’ To which headquarters responded, ‘You must be insane.’"

Groups are also susceptible to self-deception. A small group of key decision-makers can ignore crucial information because it does not fit the collective view. And once a decision is adopted by the group, "individual members are likely to feel it must be right" (Goleman, 1985:186). Each member is then unwilling to draw attention to unpopular facts, because doing so might destroy the illusion of solidarity (a punishable act).

For instance, there is evidence that the Challenger space shuttle disaster of 1986 was not an accident but a completely preventable incident caused by groupthink (as discussed in "The Truth About Lies"). At Morton Thiokol, a supplier for NASA, engineers knew that the O-rings that acted as seals between booster segments were eroding in cold weather. Six months before the Challenger disaster, Roger Boisjoly, one of the engineers, wrote a memo warning that loss of life might occur if nothing were done. The memo was circulated at Morton Thiokol, but didn’t reach NASA.

The day before the launch, Morton Thiokol engineers heard that weather reports predicted freezing temperatures overnight. They recommended to NASA that the launch be postponed, but the space agency was under pressure to launch. NASA had come to believe that the shuttle was a glorified cargo truck that operated with regularity and economy, and that belief had taken on a life of its own; the agency was trying to convince Congress that it could fly about 25 missions a year. NASA managers asked Morton Thiokol to reverse its recommendation. Morton Thiokol did, and the launch went ahead.

Here is a great article, David Hadorn's "The Role of Cognitive Errors in the Drug Policy Debate":

http://www.pdxnorml.org/cog_err.html

The sections "Belief Perseveration" and "Discussion" have implications for our efforts: cognitive mechanisms for belief perseveration are so powerful that it may not be a realistic goal for us to convince people who are strongly or even moderately opposed to our views, "but rather to influence those who are truly undecided".

Potential implications for activists

It may increase effectiveness to...
  • reach out to undecided individuals
  • initially avoid reaching out to individuals who are actively opposed to our arguments (or at least be careful about what we say to them), since any arguments we present may intensify their opposition, no matter how valid and well presented
  • reach out to individuals who are socially valued, central figures in loosely knit social networks
  • have a trusted, known person present the argument (since there may be valid and ancient reasons to distrust information received from strangers); perhaps this is what we are doing when we get celebrities to spread the message
  • encourage people to incorporate new behaviors consistent with the new world view that we would like them to consider

References

Anderson, Norman H. 1991. "A cognitive theory of judgment and decision." In Anderson (ed.) Contributions to Information Integration Theory. Volume I: Cognition. Lawrence Erlbaum Associates, Hillsdale, NJ, 105-142.

Burger, Joanna and Michael Gochfeld. 1991. "Fishing a Superfund site: Dissonance and risk perception of environmental hazards by fishermen in Puerto Rico." Risk Analysis, June, 11(2), 269-277.

Campbell, Jeremy. 1989. The improbable machine: What the upheavals in artificial intelligence research reveal about how the mind really works. Simon and Schuster, New York.

Goleman, Daniel. 1985. Vital lies, simple truths: The psychology of self-deception. Simon and Schuster, New York.

"The truth about lies." Bill Moyers: The Public Mind. PBS. WTIU, Bloomington.