Confirmation bias is the tendency to seek out, favor, and better remember information that supports our existing beliefs. It is a simple but often overlooked process that seems intuitively obvious: you are more likely to notice and recall the facts you already agree with than those you don't.
Such facts can even feel self-evidently true, precisely because they fit what we expect. This tendency is a powerful force in our lives; it can lead us to ignore evidence that contradicts our beliefs, or to go on believing things that aren't true.
Here are some examples of confirmation bias:
• We remember feedback selectively. Praise that confirms our self-image tends to stick, while criticism is more easily forgotten or explained away. People tend to accept a flattering assessment at face value and to scrutinize an unflattering one for flaws, so given the choice, most of us gravitate toward the feedback that reinforces what we already believe about ourselves.
• We remember occasions when we chose to do well and were rewarded for it as more encouraging than occasions when we were pressured into performing ("If you don't do your best, there won't be any reward for it"). An opportunity we were pushed into feels less valuable and worthwhile than one that confirms our sense of ourselves as capable and motivated.
Finally, here is an example of confirmation bias at work:
• When asked whether they believe in God (or subscribe to any other religion), many people answer differently depending on who is asking. The same person may say yes to a fellow believer and hedge when questioned by an avowed atheist. In each case the answer is shaped less by fresh reasoning than by a desire to confirm the position the person already holds, whether that position comes from religious upbringing or from conclusions reached on their own, and to avoid appearing irrational or weak-minded to the questioner.
Confirmation Bias Examples
Confirmation bias is a general phenomenon in which we are biased toward accepting information that confirms what we already believe. For example, if you are religious and hear someone say they don't believe in God, your first reaction may be "You're crazy", a dismissal that protects your existing belief rather than engaging with the claim.
The pull of "us" and "them" can be demonstrated with a silly game. If you are an atheist and tell your friends that atheists do not believe in God, what they hear depends on what they already think. A friend who only wants their own view confirmed will respond with something that restates that view, such as "I knew atheists felt that way", rather than asking what led you to it.
In the same way, if you tell your friends what Muslims believe, or offer a harsh view of Islam, you can expect the same kind of response: listeners will filter the statement through whatever they already assume about Muslims rather than weigh it on its own terms.
The same dynamic appears when talking about rival products, for example Slack versus other chat products like Hipchat (see this post for an example). Partisans of one product tend to judge the validity and consistency of any source by whether it favors their side, which is confirmation bias applied to product debates.
Confirmation bias is not exclusive to the atheist community: it happens all the time among fans of any particular brand or industry. Something universal is at play here: our tendency to choose ideas which confirm what we already think. That is a useful reminder that it is absolutely essential to genuinely listen when people talk about their ideas, rather than dismissing them out of hand, even if we don't agree with them at all.
Confirmation Bias Effects
A remark by Nassim Nicholas Taleb comes to mind whenever a "confirmation bias" effect shows up while reading scientific papers or talking to other scientists. The effect itself has been documented repeatedly in the psychological literature on judgment and decision making.
That literature describes confirmation bias as follows: people's beliefs and emotions about a question influence which evidence they look at; they ignore the evidence that contradicts their beliefs and focus on the evidence that supports them. In particular, if you are in doubt about some issue, you will be more likely to seek out, recall, and remember information from sources that support your position, and less likely to recall information from sources that contradict it.
Taleb puts the danger bluntly: confirmation bias suppresses truth. It makes people believe what they want to believe, not what the evidence shows.
How to Avoid Confirmation Bias
What is it? The term refers to a cognitive bias that leads people to treat the information they receive as support for their existing beliefs. For example, on controversial topics such as climate change, people often feel more justified in their position because some personal experience, such as an unusually hot summer or a cold winter, seems to confirm it.
It's important to note that the term "confirmation bias" itself can be misleading; it implies that the person who thinks this way is being purposefully selective in how and when they look at information. In practice, almost no one, including me, ever sets out to cherry-pick evidence. The filtering happens automatically, without any conscious decision to be biased.
This isn't just an academic issue either: many of us do some form of science-related work. We design products and services and test them out. Whether we are scientists or not, most of us carry general biases (we tend to pay more attention to negative information than to positive) as well as biases rooted in our own backgrounds and experiences (our parents were engineers; we grew up in an environment where science was valued over other forms of knowledge; we were taught that science is important).
So what exactly is confirmation bias? It means that we take a certain position for granted (e.g., that climate change does not exist), ignore any opposing arguments that do not fit that position, and selectively attend to whatever evidence confirms it. In other words: under certain circumstances we see things through the lens of our existing beliefs before looking at anything else.
Here are some examples of confirmation bias:
– We make inferences when seeing something for the first time, even if there is no direct evidence for them (e.g., judging an article I glimpse on my phone while waiting in line)
– We make inferences about something written in another language simply because it sounds strange to us (e.g., dismissing a saying we don't understand)
– We make inferences when comparing two different things based on superficial similarities (e.g., assuming a popular YouTube video must be a high-quality one)
– We overlook real differences between products because one looks similar to another
– In general, we tend to accept patterns without considering their plausibility
– We look at details which confirm what we already believe
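The selective-memory pattern behind several of these examples can be sketched as a toy simulation. The function name and the recall probabilities below are invented for illustration, not taken from any study: an observer watches flips of a fair coin but "remembers" flips that match a prior belief far more often than flips that contradict it, so the remembered sample confirms the belief even though the underlying data does not.

```python
import random

def biased_recall_estimate(true_p=0.5, believed_outcome=1,
                           recall_confirm=0.9, recall_disconfirm=0.3,
                           n=10_000, seed=0):
    """Estimate a coin's heads-rate from a 'memory' that preferentially
    retains observations matching a prior belief (a toy model only)."""
    rng = random.Random(seed)
    remembered = []
    for _ in range(n):
        flip = 1 if rng.random() < true_p else 0  # 1 = heads
        # Confirming flips are recalled far more often than disconfirming ones.
        recall_p = recall_confirm if flip == believed_outcome else recall_disconfirm
        if rng.random() < recall_p:
            remembered.append(flip)
    return sum(remembered) / len(remembered)

# The coin is fair (true_p = 0.5), yet the remembered sample suggests
# heads comes up roughly three quarters of the time.
print(biased_recall_estimate())
```

Nothing in the sketch requires dishonesty: the observer never fabricates a flip, they merely forget disconfirming ones more often, and the distortion follows. Setting `recall_disconfirm` equal to `recall_confirm` recovers an estimate near the true 0.5.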
You've probably seen the term "confirmation bias" and wondered what it means for you. It refers to a well-studied cognitive phenomenon: the way our existing beliefs alter our interpretation of evidence.
Confirmation bias can be found in many forms: in your own experience, when you accept what you have been told without questioning it, and in the way your mind interprets new information.
The term "confirmation bias" is usually credited to the English psychologist Peter Wason, whose experiments in the 1960s showed people testing hypotheses by looking for confirming cases; it was later popularized by Daniel Kahneman in Thinking, Fast and Slow. It refers to the tendency of our minds to interpret data in the light of what we already believe. We can use this tendency to either support or attack ideas, especially those that we hold dear or wish to protect.
A good example of confirmation bias would be someone who wants to believe that obesity is harmless despite the evidence linking it to an increased incidence of diabetes and heart disease. Although these risks are certainly worth being aware of, such a person tends not to look into them when they hear about them. Instead they dismiss the evidence, or argue that obesity is no riskier than common habits like smoking or drinking alcohol.
The concept of confirmation bias also applies well outside of psychology: in everyday life we often find ourselves shutting out information that conflicts with our own beliefs because we have internalized the belief system around them, and that habit frequently results in poor decision making.