Thursday, April 16, 2020

The Main Barriers to Critical Thinking 2: Pro-Self Biases

The following group of cognitive biases makes us prone to overlook our own biases, to see biases (or flaws in thinking and character) in others, and to judge our own ideas as better than they are, or as unbiased, while judging the ideas of others (especially those who disagree with or oppose us) as worse than they are, or simply wrong. (Note: all definitions are quoted from 50 Cognitive Biases to Be Aware of So You Can Be the Very Best Version of You.)


  1. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
  2. Self-Serving Bias: Our failures are situational, but our successes are our responsibility.
  3. Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)
  4. Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.
  5. Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.
  6. Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.
  7. Confirmation Bias: We tend to find and remember information that confirms our perceptions.
  8. Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.
  9. Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.
  10. Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
Blind Spot Bias is an obvious one to start with. We are naturally blinded by our biases and usually only learn about them by observing them in others and realizing that we, as human beings, must have them too, without the benefit of directly observing the biases in ourselves. A large body of research on psychological priming and bias provides good evidence that we all have biases and are profoundly unaware of them in the direct observation of our daily lives. They are well hidden.

Regarding Self-Serving Bias, seeing OUR failures as caused by circumstances and the failures of OTHER PEOPLE as caused by their character biases us to see ourselves as both more moral and better at decision making than others. Seeing ourselves through rose-colored glasses and others through a dark tint gives us an undeserved sense of superiority and competence.

The Halo Effect serves as a one-two punch with the Self-Serving Bias because it lets us treat any alleged flaw in someone as "proof" that they are wrong, stupid, irrational, and so on in their claims and beliefs, conveniently whenever they disagree with us! This is irrational: a person who is usually or generally wrong can still have a valid claim or idea, and a person who is generally right, or intelligent, or who agrees with us, can have an incorrect one.

The Halo Effect is one of the most frequently abused biases, as politicians, media, and people online frequently comment, "That guy was SO WRONG about this other entirely unrelated topic! No one should EVER listen to anything he says!" or "That person complains about an issue and its consequences, but they are not perfect in their own past behavior and conduct! So they have no right to bring up the issue until they go back in time and live a life that cannot be criticized in any way, because a flawed person only creates flawed arguments and claims!" Well, I may have exaggerated the last one a little to show that demanding someone be perfect, or have impeccable character and conduct, before being listened to is just silly, since no human being has behavior that is beyond any criticism.

When it is convenient, we pivot to thinking of flaws in people, especially people who disagree with us, and of good traits in people who agree with us, as if these unrelated and irrelevant traits could support or disprove claims they have nothing to do with.

You can find thousands of comments online each day that follow this pattern. A politician proposes a policy, and many people attack or praise the policy based on their general attitude toward the politician or party.

Naive Realism has been written about a fair amount by psychologists and is well worth considering. It primes us to be overconfident in our own perception, memory, and thinking, and to arrogantly assume WE have a better grasp on reality than others do.

Naive Cynicism, together with Naive Realism, sets us up to see OTHER people as the ones who need to learn critical thinking, scientific method, and other subjects, and to miss that we ourselves are part of the problem.

The Dunning-Kruger Effect is a monster. Research has shown that we are not built to dig into many or most subjects in depth; we lack the time, and we work together as groups instead.

Consequently, we usually learn only a few subjects in depth, and we are prone to think we understand the others far better than we really do. Even if we understand one or several subjects well, we don't understand the rest, and we and others can mistake knowledge or expertise in one area for the same thing in another.

I have seen people with degrees, even master's degrees and PhDs, in some subjects give incredibly wrong and uneducated opinions on other subjects, and I have seen that most people assume a doctor or professor is an expert in everything, when a little digging into the subject can show that they don't know what they're talking about.

This doesn't mean experts are worthless and always wrong. It means they are just as vulnerable to the Dunning-Kruger Effect in areas outside their expertise as the rest of us. The Dunning-Kruger Effect is not about stupidity; it is about expertise.

I have seen thousands of comments that incorrectly summarize the Dunning-Kruger Effect as "stupid people don't know that they are stupid." That is not what it is, so if you say that, you are claiming something incorrect.

Confirmation Bias is easy to understand because we can see it in others, but we have to work hard to monitor, acknowledge, and fight it in ourselves. Fighting confirmation bias goes against the easiest way of dealing with information; it takes willpower, self-discipline, and a lot of work to go against our first nature.

The ideas from A Theory of Cognitive Dissonance by Leon Festinger and On Liberty by John Stuart Mill go a long way toward dealing with the basic problem. Thinking, Fast and Slow by Daniel Kahneman is a superb complement to those ideas.

The Backfire Effect is a neat little cognitive trick by which we convince ourselves we are right even when the evidence shows we are not. It debunks the information deficit model, which held that the availability of true information was the barrier to people knowing the truth. You can provide true information and good evidence all day, and it doesn't always help.

The Third-Person Effect is summed up by a little cartoon showing a half dozen or so stick figures together. One thought bubble hangs above them all with a thought like, "Look at those poor fools, robots controlled by propaganda. Poor devils, I am the only one thinking for myself." We tend to see ourselves as having unrestrained free will, regardless of the evidence, and others as influenced by all sorts of things we ourselves somehow escaped.

Belief Bias is very subtle and hard to get people to understand because it operates below conscious awareness; only a great deal of experimentation and research in psychology and social psychology has revealed the tremendous unnoticed influence the subconscious has on our beliefs and behavior. The book Subliminal by Leonard Mlodinow gives a terrific description of how this works, along with strong evidence to support the claims.

Beliefs that feel good, or that do not threaten us, our identity, and our prior beliefs, are much easier to perceive, accept, think about, and remember than beliefs that conflict with ours. This is especially true for beliefs with deep emotional impact and for our fundamental values about ourselves, our groups, and whatever deep underlying assumptions frame and define everything else for us. Those assumptions can be emotional, psychological, religious, or of other kinds, depending on what carries strong emotional associations for us as individuals.

Belief Bias is described by the logical fallacy of personal incredulity. If I refuse to consider the evidence and arguments for a claim because I cannot stand the idea of thinking about it, whether from a strong compulsion to reject it or from a need to protect an idea, belief, or value it could throw into doubt, then belief bias is at play. It can have a strong emotional or intellectual component, or both simultaneously, but one way or another, considering the idea is unacceptable. Yet sometimes unthinkable ideas are true.


This grouping is sadly not complete, but it is meant to give a taste of why the attitude "I think that I am not wrong and others are" exists, and how it unfortunately lives in each of us. This tendency is a primary obstacle to education in critical thinking: everyone who hears about the problem knows critical thinking is lacking in society, but they also "know" that it is someone else who needs to learn it, and that they themselves are doing just fine.
