Saturday, September 23, 2017

Thoughts on Confirmation Bias

This post is part of a series on cognitive biases. You might also be interested in my thoughts on groupthink and the survivor bias.
_____

"People have no trouble turning any information into a coherent narrative."
Jason Dana

Zener Cards

J.B. Rhine, the founder of parapsychology, was convinced that ESP (extrasensory perception, i.e. receiving information with the bare mind instead of the known senses) exists. In the 1930s and 1940s he conducted hundreds of experiments to prove his claim. Among other things, Rhine invented a card deck called Zener Cards: a deck of 25 cards, each showing one of five different symbols. The "sender" (often Rhine) looked at a card, while the "receiver" (a person potentially capable of ESP) had to guess the symbol on the respective card. If people just guessed randomly, they would get an average of five correct answers. When a person scored way above five, Rhine saw this as proof of ESP. Over time, Rhine thought he had found people who were extraordinarily good at ESP, because they consistently scored above five.
Millions of people believed Rhine's research; he sold lots of books and got funding after funding. However, it later turned out that his methods and his data were completely flawed. Rhine was so desperate to prove his claim that he went through his data over and over again until he finally found what he was looking for. For instance, when a promising "receiver" did not seem to perform well enough in a series of tests, Rhine found what he called "forward displacement": the "receiver" did guess the correct answer, but his/her guesses were "displaced", so that he/she did not guess the actual card but the upcoming one. Another obscure phenomenon Rhine discovered was what he called the "decline effect": when a receiver started promisingly but then failed to continue his/her correct guesses, he/she must have been fatigued or bored (and therefore the results had to be discarded). Probably the most absurd claim Rhine made was that whenever people scored above average, they clearly were capable of ESP; and when they scored below average, this too was proof of ESP, because they had used their ESP skills to embarrass Rhine!
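With five symbols, a random guess is right one time in five, so a 25-card run averages five hits by pure chance, and runs well above that are rare but not impossible. As a quick sanity check, here is a small simulation of purely random guessing (a sketch of my own, not Rhine's actual protocol; the function names and the threshold of 10 hits are my choices):

```python
import random

def zener_session(num_symbols=5, copies_per_symbol=5):
    """Simulate one pass through a shuffled Zener deck with random guessing."""
    # The deck holds five copies of each of five symbols.
    deck = [s for s in range(num_symbols) for _ in range(copies_per_symbol)]
    random.shuffle(deck)
    # The "receiver" picks a symbol uniformly at random for every card.
    return sum(1 for card in deck if random.randrange(num_symbols) == card)

random.seed(42)
sessions = [zener_session() for _ in range(100_000)]
mean = sum(sessions) / len(sessions)
high_scores = sum(1 for s in sessions if s >= 10) / len(sessions)

print(f"average correct guesses: {mean:.2f}")              # hovers near 5
print(f"share of sessions with >= 10 hits: {high_scores:.3%}")
```

Under this model a score of ten or more still turns up in roughly one to two sessions per hundred by chance alone, so with enough participants and enough retesting, a few impressive-looking "receivers" are virtually guaranteed.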

J. B. Rhine with one of his "receivers", Hubert Pearce. Source: Wikipedia
I've learned about Rhine and the Zener Cards from reading the great book Standard Deviations by Gary Smith. The story is a striking, if extreme, example of confirmation bias, "the tendency to search for, interpret, favour, and recall information in a way that confirms one's preexisting beliefs or hypotheses" (Wikipedia). One interesting thing about Rhine is that there's no strong evidence that he intentionally cheated. He thought his methods were totally valid. And he was without any doubt a smart man.
You've probably seen lists of dozens of cognitive biases. While it turns out that some of them are so context-dependent that it's questionable how useful they really are (and others could not be replicated in subsequent experiments), it appears to me that there's no real doubt that confirmation bias exists and that it seriously impacts our behaviour and decision making. We are all affected by confirmation bias, even if we think we can deal with it better than others. For a vivid and astonishing example (there's a magician involved!), watch this video. You will get to know choice blindness, which is IMO very close to confirmation bias.



Why are we affected by confirmation bias?

It's not entirely clear why confirmation bias (and similar effects) seems to be hardwired into our brains, but there are some explanations that make sense to me:
  • In this article Jonah Lehrer links confirmation bias (and similar phenomena) to our ability to persuade others, which is a critical skill for humans as social beings. In that sense there's a lot of value in our brain finding more and more arguments for our point, so that the rest of the group follows or supports us: "the function of reasoning is rooted in communication, in the act of trying to persuade other people that what we believe is true."
  • David McRaney suggests that confirmation bias might have offered an adaptive advantage to humans as a species. He gives the example of a caveman being out in the woods. He has a hunch that there are cheetahs out there, which is obviously dangerous to him. He therefore looks for clues that confirm his hunch. In cases like this it's generally much better to go with confirming information instead of looking for disconfirming information. Or as McRaney puts it: "It's better to be wrong than to be dead." (You Are Not So Smart Podcast, episode 103)
  • Sarah Gimbel and Jonas Kaplan from the University of Southern California point out that relatedness is extremely valuable for humans. Whenever we change an important belief, we put this relatedness to our peer group at risk. So although it might make sense to change a belief from a rational, individual perspective, it might be too costly from a group perspective.
    Gimbel and Kaplan also explain that important beliefs can become part of our identity, and threatening these beliefs triggers reactions in our brains very similar to those triggered by, e.g., meeting a bear in the woods.
  • Kaplan adds two more interesting views on this: Firstly, mental models need some level of stability to be useful in the real world. Therefore we need to balance the influence of new information against this need for stability. Secondly, new information is not stored independently in our brains. We don't simply add another piece of information; instead, we connect it to other information we already have. So changing one piece of information could cause a "chain reaction" and might not be worth the effort. (Listen to this excellent episode 093 of the You Are Not So Smart Podcast with Gimbel and Kaplan.)
Whether or not these theories can be backed up by more research in the future, there's strong evidence that humans have a hard time dealing with cognitive dissonance: when our beliefs don't match our behaviours, we tend to become psychologically uncomfortable. Therefore we strive for internal consistency, either by changing our behaviour or by justifying it (or by avoiding situations in which the inconsistency becomes evident). Changing our behaviour seems extra hard, one reason being that when we do change it, we've created yet another inconsistency: our current behaviour now deviates from our past behaviour. Therefore we often justify our behaviour by looking for more and more confirming data and neglecting disconfirming data. Confirmation bias is not a big problem with minor or relatively unimportant beliefs, but it really kicks in with "cherished beliefs", those beliefs that are linked to our core identity (for many people, religion or politics).
It gets weirder: in this blog post David McRaney writes about a study from 1979, which suggests that "even in your memories you fall prey to confirmation bias, recalling those things which support your beliefs, forgetting those things which debunk them."
The internet does not really make things better. As Justin Owings points out in this blog post: "Thanks to Google, we can instantly seek out support for the most bizarre idea imaginable. If our initial search fails to turn up the results we want, we don’t give it a second thought, rather we just try out a different query and search again. [...] People have always been prone to confirmation bias, but the Internet amplifies the phenomenon since we need not look far to confirm our particular bias. It’s always a click away.” That's also the reason why it's almost impossible to win an argument online: the other party will always find websites that support their opinion.

What can we do about it?

The research on confirmation bias is not new, and most people have heard about at least some of it. Yet I am amazed by how little we talk about the impact of confirmation bias (as well as other biases) in our community. One reason for this might be a phenomenon called the bias blind spot: we tend to recognise the effect biases have on others while underestimating their effect on ourselves.
If even only a small part of the research about confirmation bias is valid, we are wasting tons of money by making flawed decisions in every part of our organizations: Should we hire this person? What product should we develop next? Should we invest in a new market? And so on.
So what to do about it? Obviously I don't have the silver bullet, and research shows that it's incredibly hard to overcome or even dampen cognitive bias. But I really think we should talk more about these things and share ideas. Here are some of my thoughts on how to fight confirmation bias (please share yours in the comments section):
  1. Whenever we hear someone (or ourselves) say things like "I knew it! Here's yet more evidence that XY is correct!", we should be cautious and look out for confirmation bias.
  2. I think it's important to institutionalise a process in which we come up with criteria for success and failure (or better: confirming or disconfirming data) in advance, i.e. before we start a project, initiative, etc. "How will we know we are successful?", "What are signs of failure?" and "At what point will we re-evaluate what we're doing or stop it altogether?" are good questions to discuss (I believe this is called preregistration in the scientific community). What happens instead (and I have been guilty of this many times) is this: We are very enthusiastic about a thing and really want it to succeed. So we start a project and invest time and money. After a while the data shows that the desired effect does not occur. But instead of questioning what we are doing, we start looking for other "great" effects we can observe: "Yes, sales did not go up after implementing this feature, but look how much this blogger liked it! So let's do more of it." In this case we are not only dealing with confirmation bias, but most likely also with the sunk cost fallacy.
  3. Writing things down often helps. By grabbing a piece of paper and writing down our assumptions, we create a commitment, which makes it harder to justify contradictory actions afterwards. The story goes that Warren Buffett admired Charles Darwin's approach to confirmation bias: "Charles Darwin used to say that whenever he ran into something that contradicted a conclusion he cherished, he was obliged to write the new finding down within 30 minutes. Otherwise his mind would work to reject the discordant information, much as the body rejects transplants. Man's natural inclination is to cling to his beliefs, particularly if they are reinforced by recent experience--a flaw in our makeup that bears on what happens during secular bull markets and extended periods of stagnation." (source)
  4. Having someone take the role of devil's advocate is always a good idea, especially with big decisions. In addition, it's often beneficial to invite known critics to voice their opinion. Warren Buffett, again, is known for inviting his critic Doug Kass to challenge his investment plans (source).
  5. On a related note, David Rock points out in this webinar that it's a good idea to fight bias as a team. Rarely do all members of a team need to be involved in actually making a decision. So some of them can instead observe how the decision is being made, share their observations and trigger a discussion about the biases at work.
  6. It's good to keep in mind that people are often not as rational as we think (or wish for). If we wonder why someone keeps insisting on his/her opinion despite all the evidence against his/her point, chances are that we won't convince this person by adding more arguments. In fact, this could result in this person insisting even more on his/her opinion. This is called the backfire effect.
  7. If someone seems to behave irrationally, we should not assume he/she is an idiot. It's almost always a good idea to take a break and let heated discussions cool down. From what I understand, the prefrontal cortex plays an important role in what we call rational behaviour. Unfortunately, this part of the brain tires easily. So give it time to "recharge" before making important decisions.
  8. Taber and Lodge have found evidence for what they call the "attitude strength effect": subjects voicing stronger attitudes are more prone to confirmation bias. (For a summary of the study, see this PDF document.) One conclusion from this might be that we should not expect someone to change his/her opinion right after he/she has voiced it with utter conviction. In this case it might be a better idea to take a break and continue the conversation in a smaller group.
  9. The way most people (me included) search for information seems to be problematic. When we do research, we often use a search engine to find what we are looking for, or so we think. But knowing a little bit about confirmation bias, I tend to believe that we are rather typing in what we already believe and/or what we want to be true. I remember searching for phrases like "Kanban success stories" or "Lean cycle time reduction". And of course I will find what I was looking for. But what if I searched for phrases like "Kanban does not work" or "Lean is a fad"? Wouldn't I find completely different results? I now try to make it a habit to also search for disconfirming results on Google. But what can I say? It's hard...
  10. We all know the term filter bubble and some of its implications. We might call social media a big confirmation bias machine. The first problem is that we are mostly connected with people who are like us, meaning they probably share most of our cherished beliefs. So we don't even get to hear disconfirming data. The next problem is that whenever we post something that confirms our group's beliefs, we are instantly rewarded in the form of likes, retweets and positive comments. So our beliefs are being re-affirmed and strengthened all the time. If we want to change this, we might consider following/friending people on social media who are different from us and hold very different opinions/beliefs. I've done that, and it's interesting, but it can also be very exhausting.
These are some ideas I have about mitigating confirmation bias. I will soon publish another blog post on debiasing strategies in general.

If you want to dig deeper, you might want to learn about biases that are closely connected to confirmation bias. Here's a list I've compiled:

backfire effect, belief bias, belief perseverance, belief overkill, choice blindness, desirability bias, frequency illusion, hypothesis locking, Morton's demon, motivated reasoning and motivated scepticism, myside bias, non-falsifiable hypothesis, polarization effect, positive bias, selective thinking, Tolstoy syndrome

P.S. Of course the belief that confirmation bias exists is itself subject to confirmation bias. I've at least done some googling and searched for phrases like "confirmation bias is not real", "confirmation bias is overrated", etc. I did not find any useful information.
