Belief perseverance

Belief perseverance (also known as conceptual conservatism[1]) is maintaining a belief despite new information that firmly contradicts it.[2] Such beliefs may even be strengthened when others attempt to present evidence debunking them, a phenomenon known as the backfire effect (compare boomerang effect).[3] For example, in a 2014 article in The Atlantic, journalist Cari Romm describes a study involving vaccine hesitancy. In the study, subjects who were concerned about the side effects of flu shots became less willing to receive them after being told that the vaccination was entirely safe.[4]

Three kinds of backfire effect have been described: the Familiarity Backfire Effect (from making a myth more familiar), the Overkill Backfire Effect (from providing too many arguments), and the Worldview Backfire Effect (from providing evidence that threatens someone's worldview). Cook and Lewandowsky (2011) suggest a number of techniques for debunking misinformation: emphasize the core facts rather than the myth; if the myth must be mentioned, precede it with an explicit warning that the upcoming information is false; and provide an alternative explanation to fill the gap left by debunking the misinformation.[5]

Since rationality involves conceptual flexibility,[6][7] belief perseverance is consistent with the view that human beings act at times in an irrational manner. Philosopher F.C.S. Schiller holds that belief perseverance "deserves to rank among the fundamental 'laws' of nature".[8]

Evidence from experimental psychology

According to Lee Ross and Craig A. Anderson, "beliefs are remarkably resilient in the face of empirical challenges that seem logically devastating".[9] The following experiments can be interpreted or re-interpreted with the aid of the belief perseverance concept.

The first study of belief perseverance was carried out by Festinger, Riecken, and Schachter. These psychologists spent time with a cult whose members were convinced that the world would end on December 21, 1954. After the prediction failed, most believers still clung to their faith.[10]

In another experiment, subjects asked to reappraise probability estimates in light of new information displayed a marked tendency to give the new evidence insufficient weight.[11]

In another study, mathematically competent teenagers and adults were given seven arithmetical problems and asked first for approximate answers estimated by hand. They were then asked for exact answers using a calculator rigged to produce increasingly erroneous results (e.g., yielding 252 × 1.2 = 452.4, when the correct result is 302.4). When reflecting on their estimation skills or techniques, about half the subjects went through all seven problems without once letting go of the conviction that calculators are infallible.[12]

Lee Ross and Craig A. Anderson led some subjects to the false belief that there existed a positive correlation between a firefighter's stated preference for taking risks and their occupational performance; other subjects were told that the correlation was negative. Subjects were then extensively debriefed and given to understand that no correlation existed between risk taking and performance. Post-debriefing interviews nevertheless pointed to significant levels of belief perseverance.[13]

In another study,[14] subjects spent about four hours following the instructions in a hands-on instructional manual. At a certain point, the manual introduced a formula that led them to believe spheres are 50% larger than they really are. Subjects were then given an actual sphere and asked to determine its volume, first by using the formula, and then by filling the sphere with water, transferring the water to a box, and directly measuring the volume of the water in the box. In the last experiment in this series, all 19 subjects held a Ph.D. degree in a natural science, were employed as researchers or professors at two major universities, and carried out the comparison between the two volume measurements a second time, with a larger sphere. All but one of these scientists clung to the spurious formula despite their empirical observations.
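
The study's rigged formula is not reproduced in this summary, so the exact expression below is an assumption; but any formula that overstates spherical volume by 50% stands in a simple relation to the true one:

    V_{\text{true}} = \tfrac{4}{3}\pi r^{3} \approx 4.19\, r^{3}, \qquad V_{\text{rigged}} = 1.5 \cdot \tfrac{4}{3}\pi r^{3} = 2\pi r^{3} \approx 6.28\, r^{3}

For a sphere of radius 10 cm, water transfer thus yields roughly 4,189 cm³, while an inflated formula of this kind predicts roughly 6,283 cm³, a discrepancy large enough to be unmistakable in the direct measurement described above.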

Taken together, such experiments lead to a surprising conclusion:

"Even when we deal with ideologically neutral conceptions of reality, when these conceptions have been recently acquired, when they came to us from unfamiliar sources, when they were assimilated for spurious reasons, when their abandonment entails little tangible risks or costs, and when they are sharply contradicted by subsequent events, we are, at least for a time, disinclined to doubt such conceptions on the verbal level and unlikely to let go of them in practice."[1]

In cultural innovations

Physicist Max Planck wrote that "the new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it".[15] For example, the heliocentric theory of the ancient Greek astronomer Aristarchus of Samos had to be rediscovered about 1,800 years later, and even then had to undergo a major struggle before astronomers took its veracity for granted.[16]

Belief perseverance often involves intrapersonal cognitive processes as well. "When the decisive facts did at length obtrude themselves upon my notice," says the great chemist Joseph Priestley, "it was very slowly, and with great hesitation, that I yielded to the evidence of my senses."[17] Arthur Koestler coined the term snowblindness to refer "to that remarkable form of blindness which often prevents the original thinker from perceiving the meaning and significance of his own discovery. Jealousy apart, the antibody reaction directed against new ideas seems to be much the same whether the idea was let loose by others–or oneself."[18]

In education

Students often "cling to ideas that form part of their world view even when confronted by information that does not coincide with this view."[19] For instance, students may study the solar system for months, do well on tests, and yet continue to believe that lunar phases are caused by Earth's shadow.[20]

Causes

The causes of belief perseverance remain unclear. Experiments in the 2010s suggest that neurochemical processes in the brain underlie the strong attentional bias of reward learning. Similar processes could underlie belief perseverance.[21]

Peter Marris suggests that the process of abandoning a conviction is similar to the working out of grief. "The impulse to defend the predictability of life is a fundamental and universal principle of human psychology." Human beings possess "a deep-rooted and insistent need for continuity".[22]

Thomas Kuhn points to the resemblance between conceptual change and Gestalt perceptual shifts (e.g., the difficulty encountered in seeing the hag as a young lady). Hence, the difficulty of switching from one conviction to another could be traced to the difficulty of rearranging one's perceptual or cognitive field.[23]

References

  1. Nissani, Moti (1990). "A cognitive reinterpretation of Stanley Milgram's observations on obedience to authority". American Psychologist. 45 (12): 1384–1385. doi:10.1037/0003-066x.45.12.1384.
  2. Baumeister, R. F.; et al., eds. (2007). Encyclopedia of Social Psychology. Thousand Oaks, CA: Sage. pp. 109–110. ISBN 9781412916707.
  3. Silverman, Craig (June 17, 2011). "The Backfire Effect: More on the press’s inability to debunk bad information". Columbia Journalism Review, Columbia University (New York City).
  4. Romm, Cari (December 12, 2014). "Vaccine Myth-Busting Can Backfire". The Atlantic.
  5. Cook, J.; Lewandowsky, S. (November 5, 2011). The Debunking Handbook. St. Lucia, Australia: University of Queensland. ISBN 978-0-646-56812-6.
  6. Voss, J. F.; et al., eds. (1991). Informal Reasoning and Education. Hillsdale: Erlbaum. p. 172.
  7. West, L.H.T.; et al., eds. (1985). Cognitive Structure and Conceptual Change. Orlando, FL: Academic Press. p. 211.
  8. Beveridge, W. I. B. (1950). The Art of Scientific Investigation. New York: Norton. p. 106.
  9. Kahneman, D., ed. (1982). Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press. p. 144.
  10. Festinger, Leon; et al. (1956). When Prophecy Fails. Minneapolis: University of Minnesota Press.
  11. Kleinmuntz, B., ed. (1968). Formal Representation of Human Judgment. New York: Wiley. pp. 17–52.
  12. Timnick, Lois (1982). "Electronic Bullies". Psychology Today. 16: 10–15.
  13. Anderson, C. A. (1983). "Abstract and Concrete Data in the Conservatism of Social Theories: When Weak Data Lead to Unshakeable Beliefs" (PDF). Journal of Experimental Social Psychology. 19 (2): 93–108. doi:10.1016/0022-1031(83)90031-8. Archived from the original (PDF) on 2016-10-05. Retrieved 2016-07-18.
  14. Nissani, M. and Hoefler-Nissani, D. M. (1992). "Experimental Studies of Belief-Dependence of Observations and of Resistance to Conceptual Change". Cognition and Instruction. 9 (2): 97–111. doi:10.1207/s1532690xci0902_1.
  15. Eysenck, Hans J. (1990). Rebel with a Cause. London: W. H. Allen. p. 67.
  16. Koestler, Arthur (1990). The Sleepwalkers: A History of Man's Changing Vision of the Universe. Penguin Books. ISBN 978-0140192469.
  17. Roberts, Royston M (1989). Serendipity. New York: Wiley. p. 28.
  18. Koestler, Arthur (1964). Act of Creation. London: Hutchinson. p. 216.
  19. Burbules, N.C.; et al. (1988). "Response to contradiction: scientific reasoning during adolescence". Journal of Educational Psychology. 80 (1): 67–75. doi:10.1037/0022-0663.80.1.67.
  20. Lightman, A.; et al. (1993). "Teacher predictions versus actual student gains". The Physics Teacher. 31 (3): 162–167. doi:10.1119/1.2343698.
  21. "Are Graded Lesson Observations the "Elephant" in Our Classrooms? An Exploration into the Views of In-Service Teacher Trainees on Lesson Observations". Teaching in Lifelong Learning. 7 (1). 2016-07-08. doi:10.5920/till.2016.712. ISSN 2040-0993.
  21. Anderson, Brian A.; et al. (2016). "The Role of Dopamine in Value-Based Attentional Orienting". Current Biology. 26 (4): 550–555. doi:10.1016/j.cub.2015.12.062. PMC 4767677. PMID 26877079.
  22. Marris, Peter (1986). Loss and Change. London: Routledge. p. 2.
  23. Kuhn, Thomas (1962). The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
Further reading
  • Anderson, Craig A. (2007). "Belief Perseverance". In Baumeister, Roy; Vohs, Kathleen (eds.). Encyclopedia of Social Psychology. pp. 109–110. doi:10.4135/9781412956253.n62. ISBN 9781412916707.
  • Nissani, M. (1994). "Conceptual conservatism: An understated variable in human affairs?". The Social Science Journal. 31 (3): 307–318. doi:10.1016/0362-3319(94)90026-4.