Why Our Brains Won’t Let Us Change Our Minds (And What to Do About It)

The Belief Trap: Why We See What We Expect to See

We like to think that our beliefs and convictions rest on facts—that we gather evidence, weigh it fairly, and let reason guide us to the truth. But the reality is different. Most of us don’t form beliefs by neutrally assessing information. Instead, we start with a belief—often without realizing it—and then go looking for facts that make it feel right. This isn’t a sign of laziness or dishonesty; it’s how the human mind works.

Psychologists call this tendency confirmation bias: the subconscious habit of favoring information that supports our existing views while dismissing or downplaying what contradicts them (Nickerson, 1998; Plous, 1993). It’s not just selective attention—it shapes how we interpret, remember, and even search for information.

But confirmation bias is only part of the story. A deeper issue is the confirming evidence trap: once we hold a belief, we don’t just passively lean toward confirming data—we actively seek it, structure our attention around it, and avoid exposure to disconfirming information (Klayman, 1995). We become advocates for our beliefs, not impartial judges of the facts.

The Experiments That Expose the Trap

A foundational study by Lord, Ross, and Lepper (1979) at Stanford University provides a striking illustration. Participants who either supported or opposed capital punishment were shown the same two studies—one supporting and one challenging their belief. Instead of moderating their stance after reviewing both, people became more entrenched: they criticized the study that contradicted their view and praised the one that supported it—regardless of each study’s objective quality.

Another famous demonstration comes from Wason (1960), who gave participants the sequence “2, 4, 6” and asked them to discover the rule behind it by proposing new sequences, each of which the experimenter would confirm or reject. Most assumed the rule was “increasing by twos” and tested similar sequences such as “8, 10, 12.” Few tried sequences that could disprove their guess, such as “3, 6, 9,” which breaks the guessed pattern but would still fit the actual rule. The actual rule was simply “any ascending numbers”—yet people overwhelmingly sought confirmation rather than falsification. This tendency to seek only confirming examples, even in a simple task, shows how reluctant we are to challenge our own ideas.
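To make the logic of the task concrete, here is a minimal sketch in Python (an illustrative toy, not code from Wason’s study). It shows why a confirming test such as “8, 10, 12” cannot distinguish the guessed rule from the real one, while a test that breaks the guessed pattern can.

```python
# Toy model of the Wason 2-4-6 task (illustrative only, not from the original study).
# Hidden rule: any strictly ascending sequence. Guessed rule: each number rises by two.

def actual_rule(seq):
    return all(a < b for a, b in zip(seq, seq[1:]))

def guessed_rule(seq):
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

for label, seq in [("confirming test (8, 10, 12)", (8, 10, 12)),
                   ("disconfirming test (3, 6, 9)", (3, 6, 9))]:
    informative = actual_rule(seq) != guessed_rule(seq)
    print(f"{label}: fits actual rule={actual_rule(seq)}, "
          f"fits guess={guessed_rule(seq)}, exposes the guess as wrong: {informative}")
```

Only a sequence the participant expects to be rejected can reveal that the guess is too narrow, and that is precisely the test most participants never ran.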

These findings have been replicated and expanded in multiple domains, including decision-making (Tversky & Kahneman, 1974), reasoning (Evans, Barston, & Pollard, 1983), and legal judgment (Ask & Granhag, 2005).

When Politics Becomes a Belief Trap

One area where the belief trap is especially visible is political ideology. Political beliefs often become part of a person’s identity. Once that happens, contrary evidence is no longer processed as mere disagreement—it feels like a threat to the self-concept (Taber & Lodge, 2006).

Historically, this dynamic has been well documented. For instance, during the 1930s, Western intellectuals who supported the ideals of communism often dismissed or rationalized reports of Stalinist purges and famines (Courtois et al., 1999; Getty & Manning, 1993). Similarly, during the Cold War, strong anti-communist sentiment in the West sometimes led to the tolerance of civil liberties violations in the name of national security (Schrecker, 1998; Chomsky & Herman, 1988).
In both cases, ideological belief shaped what people were willing to see, accept, and believe. The facts didn’t change minds—the frame around the facts did.

The Social Media Amplifier

While confirmation bias is as old as human cognition, social media has dramatically intensified its effects. Algorithms now personalize what we see based on our past preferences—creating filter bubbles (Pariser, 2011) and echo chambers (Sunstein, 2001). This leads to ideological isolation, polarization, and mutual incomprehension.
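As a rough illustration of how such a feedback loop can form, consider the toy simulation below (a deliberately simplified model with invented numbers, not any platform’s actual ranking system): the feed shows more of whatever the user engaged with before, and a small initial lean hardens into a one-sided stream.

```python
# Toy filter-bubble loop (simplified illustration with made-up numbers,
# not a real recommender system).
import random

random.seed(0)
share_a = 0.55  # the feed's initial share of viewpoint "A", reflecting a slight lean

for item in range(1, 201):
    shows_a = random.random() < share_a          # the feed mirrors past engagement
    click_prob = 0.7 if shows_a else 0.3         # the user engages more with congenial items
    if random.random() < click_prob:
        share_a += 0.01 if shows_a else -0.01    # each click nudges future ranking
        share_a = min(max(share_a, 0.0), 1.0)
    if item % 50 == 0:
        print(f"after {item} items, share of viewpoint A in the feed: {share_a:.2f}")
```

The specific numbers do not matter; the direction does. A loop that rewards engagement with congenial content converges toward showing mostly one side.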

Research confirms that people are more likely to engage with—and believe—information that aligns with their existing views (Pennycook & Rand, 2018). Social media doesn’t just reflect preferences; it shapes and reinforces them, making extreme views seem more common and moderate ones disappear (Barberá et al., 2015).

A study by Bakshy et al. (2015) found that Facebook users are significantly more likely to see and share stories that support their ideological orientation. Over time, this skews our perception of what is true or “obvious.”

Why the Trap Matters

The confirming evidence trap isn’t just a psychological curiosity—it has real-world consequences:
• In medicine, doctors who form early diagnostic impressions may overlook contradictory symptoms, leading to misdiagnosis (Croskerry, 2002).
• In business, companies that fall in love with a strategy often ignore market shifts or early warning signs (Christensen, 1997).
• In politics, public opinion becomes polarized as people receive fundamentally different information streams (Iyengar & Hahn, 2009).
• Even in science, researchers may design studies that confirm their hypotheses while ignoring disconfirming results—a factor in the replication crisis (Ioannidis, 2005; Open Science Collaboration, 2015).

Escaping the Trap

We can’t eliminate cognitive bias—but we can learn to recognize it and reduce its influence. Here are some strategies:

1. Flip the question: Instead of asking, “What supports my view?” ask, “What would prove me wrong?” This is the essence of falsification—a cornerstone of scientific thinking (Popper, 1959).
2. Seek disagreement: Surround yourself with people who question your assumptions. Productive disagreement is key to sharpening ideas and avoiding groupthink (Nemeth, 1986; Janis, 1972).
3. Delay certainty: Be cautious of how quickly you settle on a belief. Certainty can feel satisfying, but it shuts down curiosity. Make “I might be wrong” your default position (Aronson, 1995).
4. Form beliefs based on evidence—and be willing to change them: The most effective mindset is to treat all beliefs as provisional—valid only as long as the evidence supports them (Stanovich & West, 2007). When better or conflicting evidence arises, we must be open to updating our views; a small numerical sketch of what such updating looks like follows this list.
5. Examine early-life beliefs: Many of our core assumptions come from childhood and cultural conditioning, absorbed long before we had the ability to reason critically (Haight & Tappan, 1997). These beliefs can feel “natural” or “obvious,” but they deserve special scrutiny.
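As a concrete picture of what point 4 means in practice, the sketch below applies basic probabilistic updating with invented numbers; it illustrates the mindset, not a method drawn from the cited research.

```python
# Toy illustration of revising a belief in the light of evidence (Bayes' rule
# with invented numbers, purely to show that confidence should move).

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the revised probability that the belief is true."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

confidence = 0.90  # start out quite sure
for i in range(1, 3):
    # two independent observations, each three times likelier if the belief is false
    confidence = update(confidence, 0.2, 0.6)
    print(f"confidence after observation {i}: {confidence:.2f}")  # 0.75, then 0.50
```

Holding a belief “provisionally” means letting the number move like this, rather than defending the 0.90 at all costs.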

The Bottom Line

The belief trap teaches us this: beliefs must be anchored in evidence, not identity or tradition. The more central a belief is to our self-image or upbringing, the more urgently it should be examined.

We can’t escape bias entirely. But we can become more self-aware, more evidence-driven, and more intellectually honest. That’s the only way to keep our thinking flexible, our minds open, and our decisions grounded in reality.

Author: A S Prasad – Critical Thinking Trainer and Visiting Faculty. Lead author of the textbook “Critical and Analytical Thinking” (Cengage) 

References
• Ask, K., & Granhag, P. A. (2005). Motivational bias in criminal investigators’ judgments of witness reliability. Journal of Applied Social Psychology, 35(1), 184–200.
• Aronson, E. (1995). The Social Animal (7th ed.). W.H. Freeman.
• Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132.
• Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1531–1542.
• Chomsky, N., & Herman, E. S. (1988). Manufacturing Consent: The Political Economy of the Mass Media. Pantheon.
• Christensen, C. M. (1997). The Innovator’s Dilemma. Harvard Business School Press.
• Courtois, S., Werth, N., Panné, J. L., Paczkowski, A., Bartosek, K., & Margolin, J. L. (1999). The Black Book of Communism. Harvard University Press.
• Croskerry, P. (2002). Achieving quality in clinical decision making: cognitive strategies and detection of bias. Academic Emergency Medicine, 9(11), 1184–1204.
• Evans, J. St. B. T., Barston, J. L., & Pollard, P. (1983). On the conflict between logic and belief in syllogistic reasoning. Memory & Cognition, 11(3), 295–306.
• Getty, J. A., & Manning, R. T. (1993). Stalinist Terror: New Perspectives. Cambridge University Press.
• Haight, W., & Tappan, M. (1997). Parenting as a moral discourse. In J. Tudge, M. Shanahan, & J. Valsiner (Eds.), Comparisons in Human Development (pp. 152–166). Cambridge University Press.
• Ioannidis, J. P. A. (2005). Why most published research findings are false. PLOS Medicine, 2(8), e124.
• Iyengar, S., & Hahn, K. S. (2009). Red media, blue media: Evidence of ideological selectivity in media use. Journal of Communication, 59(1), 19–39.
• Janis, I. L. (1972). Victims of Groupthink. Houghton Mifflin.
• Klayman, J. (1995). Varieties of confirmation bias. Psychology of Learning and Motivation, 32, 385–418.
• Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization. Journal of Personality and Social Psychology, 37(11), 2098–2109.
• Nemeth, C. J. (1986). Differential contributions of majority and minority influence. Psychological Review, 93(1), 23–32.
• Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
• Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716.
• Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Press.
• Pennycook, G., & Rand, D. G. (2018). The Implied Truth Effect: Attaching warnings to a subset of fake news stories increases perceived accuracy of stories without warnings. Management Science, 66(11), 4944–4957.
• Plous, S. (1993). The Psychology of Judgment and Decision Making. McGraw-Hill.
• Popper, K. (1959). The Logic of Scientific Discovery. Basic Books.
• Schrecker, E. (1998). Many Are the Crimes: McCarthyism in America. Princeton University Press.
• Stanovich, K. E., & West, R. F. (2007). Natural myside bias is independent of cognitive ability. Thinking & Reasoning, 13(3), 225–247.
• Sunstein, C. R. (2001). Republic.com. Princeton University Press.
• Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755–769.
• Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
• Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129–140.