
Why logic and evidence so often fail to change opinions and beliefs

Today, supporters and opponents of Mr. Donald Trump in the USA and Mr. Modi in India are highly polarized. Supporters and detractors try very hard to provide the other side with evidence supporting their views. But contrary evidence only seems to harden and polarize opinions further. Why is neither side able to convince the other? Why do discussion and evidence result in further polarization? Why do logic and evidence fail to convince?

Psychologists have studied this phenomenon and discovered that once we have formed a belief about something, we tend to seek out information that confirms our belief and filter out or reject any information that contradicts it. Psychologists initially dubbed this cognitive bias 'confirmation bias', and later researchers such as Gary Klein have termed it 'fixation'. For the purposes of this article, I will treat fixation and confirmation bias as meaning the same thing.

Death penalty study

Lord, Ross and Lepper (1979) conducted a study to see if people change their opinions in the face of contrary evidence. They recruited 48 students for the study. Half of them believed that the death penalty was a good deterrent to future crime, while the other half were against the death penalty and doubted its effectiveness as a deterrent.

Two carefully prepared studies were given to both groups. One study provided data that supported the view that the death penalty was a deterrent, while the other study supported the view that the death penalty was not a deterrent. The participants were also provided with critiques of each study listing out their deficiencies.

So a supporter of the death penalty saw one study supporting it and another opposing it, along with the critiques of both, and an opponent saw the same two studies and critiques.

Given that participants were presented with evidence opposing their viewpoints, one might have expected them to soften their views and move towards a middle position. Instead, the studies resulted in greater polarization of views.

Those who initially supported the death penalty ended up with even greater conviction in their position. Likewise, the group opposing the death penalty ended up with greater conviction that the death penalty was not a deterrent.

This tendency to look only for information that supports one's own belief, while rejecting or downplaying conflicting information, has been termed 'confirmation bias'. This cognitive bias results in many errors in decision making and also creates sharp divides amongst people. Today's social media platforms aren't helping either.

Confirmation bias, combined with AI-driven social media engines, fuels and strengthens divisions between sections of society. If I believe that Trump is a great leader, social media will flood me with a constant stream of news and articles confirming this view. A Trump detractor, on the other hand, will be fed similar volumes of news and articles that are uncomplimentary to Trump. This constant stream of confirming information only strengthens our existing beliefs and creates an unbridgeable divide. I believe this cognitive bias goes a long way towards explaining why we as a society are more sharply divided today, even over trivial issues, than we were before the advent of social media.

Confirmation bias in testing hypotheses

Let's take a hypothetical situation where I make up a rule for sequences of numbers. I write this rule down on a piece of paper and tell you that the numbers 2-4-6 satisfy it. You are then asked to work out the rule I am applying to generate the sequences. You do this by proposing sequences of numbers based on your hypothesis of what the rule is; each time you propose a sequence x-y-z, I give you feedback: 'Yes, conforms to the rule' or 'No, does not conform to the rule'. How would you go about trying to confirm your hypothesis of what the rule is?

You may want to pause here, think about your own hypotheses, and then read on.

This was an actual experiment conducted by Wason (1960) to understand how people go about testing hypotheses. He was testing Popper's claim that the general mistake people make in testing hypotheses is that they try to confirm a hypothesis rather than trying to falsify it. Participants in Wason's experiment followed this predicted path in testing their hypothesis about the number sequence.

They framed a hypothesis to start with and proposed further sets of three numbers satisfying it. In this case, they tried sequences such as 4-8-10, 6-8-12 and 20-22-24, and each time the feedback was that the sequence satisfied the rule. After several rounds of testing, the participants were satisfied that their hypothesis was correct and stopped testing.

However, when they stated the rule they had in mind (even numbers), they were told it was wrong. The rule was simply 'increasing numbers'. The sequences the participants proposed were a subset of all possible sequences satisfying the rule, so each time they received feedback that their sequence satisfied it. But their conclusion about what the rule was turned out to be wrong.

So what went wrong? According to Wason, once the participants had formed a hypothesis, they made no attempt to find sequences of numbers that would falsify it; they worked only towards confirming it. A bias towards seeking only confirming evidence has also been called the 'confirming-evidence trap', and yet another name for it is the 'positive testing' strategy.
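The contrast between positive testing and falsification can be made concrete with a small simulation. The sketch below is my own illustration of Wason's 2-4-6 task (the function names are mine, not Wason's): a participant who holds the 'even numbers' hypothesis proposes only sequences that fit it, and every one of them happens to satisfy the true rule, so the hypothesis is never challenged. A single falsifying test, a sequence the hypothesis predicts should fail, exposes the error immediately.

```python
# The experimenter's hidden rule: the numbers are strictly increasing.
def true_rule(seq):
    return all(a < b for a, b in zip(seq, seq[1:]))

# The participant's (wrong) hypothesis: all the numbers are even.
def hypothesis(seq):
    return all(n % 2 == 0 for n in seq)

# Positive tests: sequences chosen BECAUSE they fit the hypothesis.
# Each one also satisfies the true rule, so the feedback is always 'Yes'
# and the participant grows more confident in a wrong hypothesis.
for seq in [(4, 8, 10), (6, 8, 12), (20, 22, 24)]:
    print(seq, "->", "Yes" if true_rule(seq) else "No")

# A falsifying test: a sequence the hypothesis predicts should get 'No'.
# It gets 'Yes' instead, refuting the 'even numbers' hypothesis at once.
falsifier = (1, 3, 5)
print(falsifier, "->", "Yes" if true_rule(falsifier) else "No")
```

Every positive test prints 'Yes', which is exactly why confirming evidence alone is uninformative here; only the sequence that the hypothesis says should fail can reveal the mismatch between the hypothesis and the actual rule.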

This tendency to look only for confirming evidence is discussed by John S. Hammond, Ralph L. Keeney and Howard Raiffa in their article 'The Hidden Traps in Decision Making', published in Harvard Business Review (Sep-Oct 1998).

In their article, they take the hypothetical example of the President of a large manufacturing operation wondering whether to go ahead with a plant expansion or call it off. Specifically, the President is concerned that the firm won't be able to maintain the rapid growth of its exports. He fears that the US dollar will strengthen, making the firm's goods more expensive for overseas customers and dampening demand. Before deciding, he calls the chief executive of a smaller firm that had recently dropped plans to build a new factory, to ask why she abandoned her expansion. The CEO cites her apprehension that the US dollar will strengthen against other currencies. Her reasoning echoes the President's own fears.

The authors warn that the President should not treat this conversation as the deciding factor and drop his plans for expansion. If he decides on the basis of this one conversation, he will have fallen into what psychologists call the 'confirming-evidence trap'. This bias leads us to seek out only evidence that supports our existing instinct or point of view while avoiding information that contradicts it.

The confirming-evidence bias affects not only where we go to look for evidence, but also the weight we give to the evidence available. We give far too much weight to evidence that supports our viewpoint and too little to evidence that conflicts with it.

In their article, they also offer some suggestions on how we can protect ourselves from this bias:

  • Always check whether we are examining all evidence with equal vigour
  • Do not accept confirming evidence without question
  • Get someone whose opinion you value to play the Devil's advocate
  • Build your counter-arguments too
  • Evaluate what's the strongest reason to do something else
  • Be honest with yourself about your motives: are you just seeking confirming evidence, or do you really want to make an impartial, good choice?
  • In seeking the advice of others, don't ask leading questions

The confirmation bias is also a good explanation of why two different people can hold diametrically opposite views of the same employee, or why a superior may not rate an employee as a good performer even when hard evidence shows excellent performance. It may be wiser to rely on hard evidence rather than on memory when evaluating people, as memory will tend to pull out only confirming instances.