Why Facts Don’t Change Our Minds and Why Beliefs Are So Hard to Change

by Imed Bouchrika, PhD
Chief Data Scientist & Head of Content

Once people have formed beliefs, those beliefs tend to stick, and this holds true across a wide range of topics. Even when faced with facts and logical reasoning, many people persist in their views. In fact, social media has been weaponized to exploit this behavior, with a number of PR firms driving narratives built on propaganda rather than facts. Why do so many people think this way?

This article explores the various reasons why facts don’t change our minds. By recognizing our own biases and understanding how our brains respond to stimuli, we can avoid falling into faulty thinking patterns and assess information objectively. Students, in particular, can break the habit of denying facts, which enables them to genuinely learn in their general education courses.

Why Facts Don’t Change Our Minds: Table of Contents

  1. Science Denial: Some Harmful Examples
  2. Belief Perseverance
  3. Confirmation Bias
  4. Illusion of Explanatory Depth
  5. Avoidance of Complexity
  6. Causality and the Ignorance Gap
  7. Emotions and Assessing Risk
  8. Convincing Others to Change Their Minds
  9. Clinging to and Changing Beliefs

Science Denial: Some Harmful Examples

In 2014, misconceptions about the Ebola virus led to an Ebola hysteria in the United States. According to the Centers for Disease Control and Prevention, Ebola mainly spreads through direct contact with the blood or body fluids of an infected person, or with objects contaminated with those fluids (CDC, n.d.).

Yet, Americans reacted to the virus in ways that were not supported by science. For instance, a teacher in Maine was put on leave by the school board because she had stayed at a hotel in Dallas, Texas near the hospital where two nurses contracted Ebola. Ports in Mexico and Belize refused entry to a cruise ship because one of the passengers was a nurse who worked at the Texas hospital, even though the nurse had no direct contact with the first Ebola patient, who died there (Yuhas, 2014).

Another example is the anti-vaccine movement. Parents who refuse to have their children vaccinated believe that vaccines cause autism, among other medical risks. However, multiple studies with large sample sizes show that the vaccines used to prevent measles, mumps, and rubella do not cause autism. The CDC also confirms this, noting that vaccines are continuously monitored for safety and that their side effects are minor (CDC, n.d.).

In 2014, anti-vaccine sentiment proved to be a health hazard, as the U.S. reported nearly 600 cases of measles, the highest number the CDC had recorded in 20 years (CDC, n.d.). The outbreaks were caused by unvaccinated people who contracted the disease abroad and then spread it to communities with relatively low vaccination rates (Carroll, 2014). With COVID-19 vaccines being rolled out, anti-vaccine sentiment is on the rise once again and threatens to hamper vaccine distribution. For instance, a survey by The Vaccine Confidence Project found that willingness to accept a COVID-19 vaccine dropped by 6.4% among United Kingdom respondents and 2.4% among U.S. respondents after they were exposed to misinformation about a potential vaccine.

Moreover, denying science and facts can hurt students’ academic achievement. After all, school subjects center on knowing and applying facts rather than unsupported theories.

Source: The Vaccine Confidence Project

Why Facts Don’t Change Minds

Researchers have uncovered a number of phenomena that help explain why we cling steadfastly to our beliefs, even to false beliefs that facts cannot dislodge. These include belief perseverance, the well-known confirmation bias, the illusion of explanatory depth, avoidance of complexity, filling the ignorance gap with false causality, and a poor understanding of risk. One of the primary goals of psychology is to change unhealthy behaviors, and in the case of fact denial, understanding the causes is an important first step.

Belief Perseverance

One explanation of why facts don’t change our minds is the phenomenon of belief perseverance. This refers to people’s tendency to hold on to their initial beliefs even after they receive new information that contradicts or disaffirms the basis for those beliefs (Anderson, 2007). Anderson, Lepper, and Ross (1980) studied belief perseverance by asking participants to examine the relationship between firefighters’ risk preferences and their success on the job. One group was given responses, supposedly from firefighters, that were fabricated to establish a positive relationship between risk-taking and success as a firefighter. The other group was given made-up responses establishing a negative correlation. Some participants were then debriefed: they were told that the information given to them was fictitious and that the experimenters did not know whether there was a positive or negative correlation between risk-taking and success as a firefighter (Anderson et al., 1980).

Results showed that even for participants who received the debriefing, the presentation of discrediting evidence did little to make them abandon their position on the relationship between risk-taking and firefighting ability. According to the researchers, the subjects’ theories remained “virtually intact,” strongly supporting the hypothesis that people will not change their minds about a belief even after the evidence on which it was based has been discredited (Anderson et al., 1980).

Confirmation Bias

Confirmation bias is the tendency to accept information that confirms one’s views or prejudices while ignoring or rejecting contradictory information. It prevents people from seeing things objectively (Heshmat, 2015).

In their book The Enigma of Reason, Hugo Mercier and Dan Sperber refer to this as “myside bias.” According to them, rationality is less about making decisions based on logic than about providing justifications for decisions one has already made. The ability to justify one’s actions also has a social aspect: when one does so successfully, one gains status and prestige within one’s social group. This creates a motivation to prove that one’s belief is the correct one by downplaying evidence and arguments that support opposing beliefs (James, 2018).

Mercier and Sperber further argue that human cognition is made up of modules, one of which is the reasoning module. But again, the reasoning module is more about intuition than logic, which, according to them, plays only a “marginal role.” Instead of forming major premises, minor premises, and conclusions, we turn to stories to justify our reasons, which are really just after-the-fact rationalizations (James, 2018).

Illusion of Explanatory Depth

Another reason why beliefs are so hard to change is the illusion of explanatory depth. According to this concept, people think they understand an issue well enough to hold an opinion about it. They only become aware of their ignorance when they are asked to explain the issue and fumble in doing so.

One study on the illusion of explanatory depth was conducted by U.K. researcher Rebecca Lawson. Lawson asked a group of psychology students from the University of Liverpool to rate their knowledge of how bicycles work and to draw the missing pedals, chain, and frame parts onto an incomplete sketch of a bicycle. A multiple-choice task also required them to identify the usual position of the frame, pedals, and chain (Lawson, 2006).

The study found that over 40% of the participants, who were not bicycle experts, made at least one mistake in the drawing and multiple-choice tasks. This is despite the fact that almost all participants had learned how to ride a bike, that almost half of them owned a bicycle, and that bicycles are common, everyday objects. One striking comment from a participant was, “I never knew how little I knew about things until I had to draw them.” The results suggest that people have a vague, incomplete, and often inaccurate understanding of how everyday objects function (Lawson, 2006).

Another type of cognitive bias that leads to faulty thinking patterns is the so-called Dunning-Kruger Effect. In 1999, Cornell University psychologists David Dunning and Justin Kruger administered tests on logic, grammar, and sense of humor. The results showed that people who scored in the lowest percentiles tended to overestimate how well they had performed. For example, people whose actual test scores placed them in the 12th percentile (meaning they did better than only 12% of the test takers) estimated that their performance would put them in the 62nd percentile (Cherry, 2019). This is because poor performers lack metacognition, the ability to assess oneself objectively. As a result, they suffer from the double curse of the Dunning-Kruger Effect: not only do they perform poorly, but they also miss out on opportunities for growth because they lack the self-awareness to judge their skills accurately (Psychology Today, n.d.).

Avoidance of Complexity

In their book Denying to the Grave: Why We Ignore the Facts That Will Save Us, Sara and Jack Gorman point out that one cause of science denial is that making decisions based on science is complicated and requires a great deal of mental energy. Intimidated by difficult concepts they struggle to understand, people resort to simplistic explanations, even if those explanations are not very accurate.

The psychology of changing your mind has a lot to do with how the brain is structured. The amygdala, an almond-shaped cluster of cells near the base of the brain, governs our emotions, emotional behavior, and motivation (Wright, 2020). The prefrontal cortex (PFC), by contrast, handles relatively advanced belief processing, including doxastic inhibition, which allows us to deliberate as if something we believe were not true and to get an intuitive sense of being “right” or “wrong” about certain beliefs. Within the PFC is the dorsolateral prefrontal cortex, which governs executive function, reason, and logic (Gorman & Gorman, 2016).

According to behavioral researchers Daniel Kahneman and Amos Tversky, the more primitive parts of the brain like the amygdala cannot process complicated information. While the PFC can make rational decisions that take into account long-term consequences, making these decisions can be exhausting for that part of the brain. This is why people find it easier to make quick decisions rather than think long and hard about issues (Gorman & Gorman, 2016).

Moreover, scientists have found through functional magnetic resonance imaging that holding firm in one’s belief activates the nucleus accumbens, the pleasure center of the brain. On the other hand, changing one’s belief stimulates the insula, the same area that is triggered by anxiety, fear, or disgust (Gorman & Gorman, 2016). If you’re wondering why beliefs are so hard to change, it is because we’re hard-wired to feel great about standing our ground.

Causality and the Ignorance Gap

The Gormans also argue that humans are not comfortable with the “ignorance gap,” or not knowing why certain events happen. Thus, the human brain, which is hardwired to find and recognize patterns in our environment, tends to assume that when one event follows another, the first caused the second (Gorman & Gorman, 2016).

For example, suppose Student A uses a fountain pen to answer a math exam and aces it. Student A then uses the same fountain pen on an English exam and again gets good marks. Even though it is probably just coincidence, the thought that the fountain pen is lucky will cross Student A’s mind because of the brain’s tendency to establish causality. In reality, the student may simply have studied well for both exams, yet he credits the pen rather than his study habits for his good grades. This example shows how the intense desire to find causality can make people abandon rational thinking.

The Gormans argue that assuming causality helped primitive humans survive at a time when they had scant means to distinguish coincidence from causal connections. However, as humans evolved, scientific processes established more structured and rigorous ways of establishing causality such as empirical research (Gorman & Gorman, 2016).

To prove causality, one would need to observe the counterfactual: what would have happened under different conditions. Since the counterfactual is impossible to observe directly, scientists approximate it by setting up randomized controlled trials. This is why establishing causality can take years of research; for example, it took decades before scientists could conclude that cigarettes cause cancer. The difference between how scientists establish a cause and the intuitive way in which the human brain connects random events explains the disconnect between the average person’s and the scientist’s approach to causality, and why the public sometimes doubts scientists’ conclusions (Gorman & Gorman, 2016).

Emotions and Assessing Risk

Yet another reason why people fall into faulty thinking patterns is a poor understanding of risk and probability. People tend to dismiss the risks of everyday activities like taking a bath or driving while overestimating unfamiliar risks they cannot control, such as vaccines or nuclear radiation. Psychologists, behavioral economists, and neuroscientists call this tendency the nonlinear estimation of probability.

In an ideal setting, plotting perceived risk against the actual probability of harm would yield a straight line, which is why it is called the linear estimation of probability: the greater the chance of a bad event happening, the greater the perceived risk. Experiments have found, however, that in real life the way humans assess risk is nonlinear: small probabilities produce higher-than-expected risk perceptions, while high probabilities produce lower-than-expected ones. Risk theorists have noted that emotions and affective judgments are involved in calculating risk probabilities, and the way individuals assess risk differs greatly from the numerical approach used by professional risk assessors (Gorman & Gorman, 2016).
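To make “nonlinear estimation” concrete, behavioral economists often model it with a probability weighting function. The version below is the standard one-parameter form from Tversky and Kahneman’s prospect theory, offered here only as an illustration rather than as something drawn from the Gormans’ book. The perceived weight of an objective probability p is

w(p) = p^γ / (p^γ + (1 − p)^γ)^(1/γ)

where γ (estimated at roughly 0.6 to 0.7 in their experiments) controls the curvature. For small p, w(p) is larger than p, so rare dangers feel more likely than they really are; for large p, w(p) is smaller than p, so near-certain dangers feel less threatening than they should. Plotted, this produces an inverse-S curve instead of the straight line a purely linear estimator would give.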

People’s assessment of risk is also marred by what behavioral economists call the endowment effect. This phenomenon describes how people are more afraid of losing something than they are eager to gain something of equal value. In an experiment by Knutson et al. (2008), participants were asked to buy, sell, or express a preference for an item while researchers ran fMRI scans on three areas of the brain: the nucleus accumbens, the mesial prefrontal cortex (MPFC), and the insula. They observed that the MPFC was activated when the price was low for buying versus selling. During selling, however, greater activation and a stronger endowment effect were observed in the insula, an area associated with the prediction of monetary loss (Mozes, 2008). When the insula is strongly activated, more emotional negativity attaches to the thought of parting with something one owns. Applied to beliefs, people’s perspectives can be thought of as “possessions” that are hard to give up even when there is a “better offer,” that is, evidence that contradicts their point of view (Gorman & Gorman, 2016).

Convincing Others to Change Their Minds

Now that we know why people fall into faulty thinking patterns, how do we go about changing other people’s minds?

According to Tali Sharot, a professor of cognitive neuroscience at University College London and author of The Influential Mind: What the Brain Reveals About Our Power to Change Others, we must align ourselves with other people using seven core elements (Pandey and Gupta, 2019).

The first core element, prior beliefs, involves seeking common ground. Find the beliefs you and the other person agree on instead of bombarding him or her with facts and figures that support your side of a debatable topic (Pandey and Gupta, 2019).

The second core element, emotions, involves framing our views positively rather than negatively. Positive framing is easier to process and broadens one’s thoughts and actions (Pandey and Gupta, 2019).

The third element concerns incentives: it is more effective to offer an immediate positive reward than to follow up with a threat. The fourth is a sense of control, or “agency”: to influence others, they must feel in control; otherwise, they will feel angry and frustrated and will resist attempts at persuasion. One can likewise gain others’ trust by letting them take control of the choices presented to them (Pandey and Gupta, 2019).

The fifth core element is curiosity, or the desire to know. Before giving information to others, we must point out the gap in their knowledge and show them how they can benefit from the information (Pandey and Gupta, 2019). Responsibly disseminating information about health, well-being, and community safety, for example, can help bridge such knowledge gaps.

The sixth element is the other person’s state of mind: one must assess his or her mental and emotional condition. People who are calm and relaxed are more receptive to being influenced than people who are in a stressful or threatening situation (Pandey and Gupta, 2019).

Lastly, Sharot cautions against group conformity, or the “knowledge and acts of other people,” which constitutes the seventh element. She notes that in some situations we must be wary of being influenced by other people, especially on social media or in political campaigns where the truthfulness of information cannot be verified (Pandey and Gupta, 2019).

Clinging to and Changing Beliefs

There are a number of reasons why facts don’t change our minds. One is belief perseverance, the tendency to hold on to initial beliefs even after their basis has been discredited. Another is the well-documented confirmation bias, in which a person seeks out only information that affirms his or her beliefs while ignoring or downplaying contradictory information. There is also the illusion of explanatory depth, whereby people don’t realize how ill-informed they are about an issue until they are asked to explain it. People also tend to avoid complicated explanations, settling for simple ones at the expense of accuracy. Rigorous scientific processes mean that it takes multiple studies to establish causality; in the meantime, the human brain’s discomfort with not knowing leads it to infer causation where there is only coincidence. Lastly, people discount how much emotions shape their assessment of risk, which causes them to overestimate small risks and underestimate large ones.

Why do facts sometimes fail to change our minds? Because people with fixed beliefs are hard to convince. Despite these obstacles, it is still possible to change other people’s minds with some strategic persuasion skills: establishing common ground, framing perspectives in a positive light, and paying attention to the other person’s mental and emotional state.

 

References:

  • Anderson, C.A. (2007). Belief perseverance. In R. F. Baumeister & K. D. Vohs (Eds.), Encyclopedia of Social Psychology, pp. 109-110. Thousand Oaks, CA: Sage.
  • Anderson, C., Lepper, M., & Ross, L. (1980). Perseverance of social theories: The role of explanation in the persistence of discredited information. Journal of Personality and Social Psychology, 39(6), 1037-1049. https://psycnet.apa.org/doi/10.1037/h0077720
  • Carroll, R. (2014, September 7). US has seen nearly 600 measles cases this year, CDC says. The Guardian.
  • CDC (2015, January 29). Transcript for CDC Telebriefing: Measles in the United States, 2015. CDC Newsroom. Atlanta, GA: Centers for Disease Control and Prevention.
  • CDC (2019, November 5). Transmission. Ebola (Ebola Virus Disease). Atlanta, GA: Centers for Disease Control and Prevention.
  • CDC (2020, April 2). Possible side effects from vaccines. Vaccines & Immunization. Atlanta, GA: Centers for Disease Control and Prevention.
  • Cherry, K. (2019, June 14). The Dunning-Kruger Effect. Very Well Mind.
  • Dunning-Kruger Effect. (n.d.). Psychology Today.
  • Gorman, S. E., & Gorman, J. M. (2016). Denying to the Grave: Why We Ignore the Facts That Will Save Us. Oxford University Press.
  • Heshmat, S. (2015, April 23). What is confirmation bias? Psychology Today.
  • James, T. (2018, April 3). “The Enigma of Reason” by Dan Sperber and Hugo Mercier. Medium.
  • Lawson, R. (2006). The science of cycology: Failures to understand how everyday objects work. Memory & Cognition, 34 (8), 1667-1675. https://doi.org/10.3758/BF03195929
  • Mozes, A. (2008, June 12). Possession is nine-tenths the perceived value. HealthDay News.
  • Pandey, S., & Gupta, R. (2019). Book Review: The Influential Mind: What the Brain Reveals About Our Power to Change Others. Frontiers in Psychology, 10, 1210. https://doi.org/10.3389/fpsyg.2019.01210
  • Wright, A. (2020, October 10). Chapter 6: Limbic System: Amygdala. Neuroscience Online. Houston, TX: UTHealth.
  • Yuhas, A. (2014, October 20). Panic: the dangerous epidemic sweeping an Ebola-fearing US. The Guardian.
