
How to get people to overcome their bias

One of the tricks our mind plays is to highlight evidence which confirms what we already believe. If we hear gossip about a rival we tend to think "I knew he was a nasty piece of work"; if we hear the same about our best friend we're more likely to say "that's just a rumour". If you don't trust the government then a change of policy is evidence of their weakness; if you do trust them the same change of policy can be evidence of their inherent reasonableness.

Once you learn about this mental habit – called confirmation bias – you start seeing it everywhere.

This matters when we want to make better decisions. Confirmation bias is OK as long as we're right, but all too often we’re wrong, and we only pay attention to the deciding evidence when it’s too late.

How we should protect our decisions from confirmation bias depends on why, psychologically, confirmation bias happens. There are, broadly, two possible accounts, and a classic experiment from researchers at Princeton University pits the two against each other, revealing in the process a method for overcoming bias.

One possibility is that we simply have a blind spot in our imagination for the ways the world could be different

The first theory of confirmation bias is the most common. It's the one you can detect in expressions like "You just believe what you want to believe", or "He would say that, wouldn't he?", or when someone is accused of seeing things a particular way because of who they are, what their job is or which friends they have. Let's call this the motivational theory of confirmation bias. It has a clear prescription for correcting the bias: change people's motivations and they'll stop being biased.

The alternative theory of confirmation bias is more subtle. The bias doesn't exist because we only believe what we want to believe, but instead because we fail to ask the correct questions about new information and our own beliefs. This is a less neat theory, because there could be a hundred reasons why we reason incorrectly – everything from limitations of memory to inherent faults of logic. One possibility is that we simply have a blind spot in our imagination for the ways the world could be different from how we first assume it is. Under this account the way to correct confirmation bias is to give people a strategy to adjust their thinking. We assume people are already motivated to find out the truth; they just need a better method. Let's call this the cognition theory of confirmation bias.

Confirmatory evidence strengthened people's views, as you'd expect, but so did disconfirmatory evidence

Thirty years ago, Charles Lord and colleagues published a classic experiment which pitted these two methods against each other. Their study built on a persuasion experiment which had previously shown a kind of confirmation bias they called 'biased assimilation'. Here, participants were recruited who had strong pro- or anti-death penalty views and were presented with evidence that seemed to support the continuation or abolition of the death penalty. Obviously, depending on what you already believe, this evidence is either confirmatory or disconfirmatory. Their original finding showed that the nature of the evidence didn't matter as much as what people started out believing. Confirmatory evidence strengthened people's views, as you'd expect, but so did disconfirmatory evidence. That's right, anti-death penalty people became more anti-death penalty when shown pro-death penalty evidence (and vice versa). A clear example of biased reasoning.

For their follow-up study, Lord and colleagues re-ran the biased assimilation experiment, but testing two types of instructions for assimilating evidence about the effectiveness of the death penalty as a deterrent for murder. The motivational instructions told participants to be "as objective and unbiased as possible", to consider themselves "as a judge or juror asked to weigh all of the evidence in a fair and impartial manner". The alternative, cognition-focused, instructions were silent on the desired outcome of the participants’ consideration, instead focusing only on the strategy to employ: "Ask yourself at each step whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue." So, for example, if presented with a piece of research that suggested the death penalty lowered murder rates, the participants were asked to analyse the study's methodology and imagine the results pointed the opposite way.

They called this the "consider the opposite" strategy, and the results were striking. Instructed to be fair and impartial, participants showed the exact same biases when weighing the evidence as in the original experiment. Pro-death penalty participants thought the evidence supported the death penalty. Anti-death penalty participants thought it supported abolition. Wanting to make unbiased decisions wasn't enough. The "consider the opposite" participants, on the other hand, completely overcame the biased assimilation effect – they weren't driven to rate the studies which agreed with their preconceptions as better than the ones that disagreed, and didn't become more extreme in their views regardless of which evidence they read.

The finding is good news for our faith in human nature. It isn't that we don't want to discover the truth, at least in the microcosm of reasoning tested in the experiment. All people needed was a strategy which helped them overcome the natural human short-sightedness to alternatives.

The moral for making better decisions is clear: wanting to be fair and objective alone isn't enough. What's needed are practical methods for correcting our limited reasoning – and a major limitation is our imagination for how else things might be. If we're lucky, someone else will point out these alternatives, but if we're on our own we can still take advantage of crutches for the mind like the "consider the opposite" strategy.
by Tom Stafford
Tom Stafford is the author of the ebook For argument's sake: Evidence that reason can change minds, which is out now. If you have an everyday psychological phenomenon you'd like to see written about in these columns, please get in touch with @tomstafford on Twitter.


