Thursday, April 16, 2020

The Main Barriers to Critical Thinking

I have studied critical thinking and written about it for several years, and I have found that a few issues stand out as the primary obstacles when trying to encourage people to use it.

The simple fact is that if you give people a definition of critical thinking, the almost universal response is a variation of "I am smart / a naturally good critical thinker; other people need to study this. I don't have that problem, they do."

The fact is that critical thinking is a subject that requires a great deal of study, questioning, and hard work to even begin to learn. It is less like a trait such as strength or speed, which people possess in naturally varying degrees, and more like a martial art, which demands discipline, practice, concentration, devotion, and gradual development.

People usually treat it as a trait they naturally possess in abundance and that others, especially people they disagree with and groups that oppose their beliefs, possess in short supply.

But the obvious question is: why?

Several factors contribute to this result, and I want to take on a number of them, starting with the building blocks that combine to create the overall effect.

I think the fundamental grouping we need to take on is folk psychology. We have ideas and assumptions about human behavior and minds that we get from our parents, ourselves, our peer groups, and society overall, and many of these ideas are, frankly, wrong.

We all hear stories from people like our parents and teachers that describe personal character, and not all of them are accurate guides to understanding human beings. We usually don't closely inspect our fundamental assumptions or the metaphors that frame how we think about everything else. As children, our critical and independent thinking is so poorly developed that we are extremely vulnerable to indoctrination, so we unthinkingly accept the ideas our parents, teachers, and peers give us, and we usually don't realize the drawbacks of hanging onto these beliefs and unexamined assumptions as adults.

So, how do we correct this? In part, the answer is individual and shaped by our own life experiences.

I think a good starting point is to examine cognitive biases, because they affect all of us and are not well laid out in folk psychology. Several contradict folk psychology, so learning them can undo false ideas and replace them with true, relevant information.

I want to use quotes from an article to lay a foundation on cognitive biases to examine. A good understanding of the subject at the base level will serve you well when evaluating the topic in other contexts.

50 COGNITIVE BIASES TO BE AWARE OF SO YOU CAN BE THE VERY BEST VERSION OF YOU

The human brain is pretty tricky: While we think we know things, there’s a whole list of cognitive biases that can be gumming up the works. We’ve found 50 types of cognitive bias that come up nearly every day, in petty Facebook arguments, in horoscopes, and on the global stage. Along with their definitions, these are real-life examples of cognitive bias, from the subtle groupthink sabotaging your management meetings to the pull of anchoring making you spend way too much money at a store during a sale. Knowing about this list of biases can help you make more informed decisions and realize when you’re way off the mark.

WHAT IS COGNITIVE BIAS?
Let’s start off with a basic cognitive bias definition: It is a systematic error in cognitive processes (like thinking, perceiving, and memory) diverging from rationality, which can affect judgments. If we think of the human brain as a computer, cognitive bias basically is an error in the code, making us perceive the input differently or come up with an output that’s illogical.
But there are other types of bias as well that aren’t necessarily cognitive; for example, there’s the theory of social proofing, which is one of the more popular social psychological biases. Also, there can be cognitive theories that aren’t necessarily considered biases, or rather, they’re more like a network of common biases tangled together, like cognitive dissonance, which causes mental discomfort when we hold conflicting ideas or beliefs in our minds. Then, there’s the world-famous placebo effect, which can actually result in physiological changes.
End quote

So, let's focus on just the biases that directly make it hard for us to understand that we need to improve our own critical thinking and that other people are not entirely the problem.

I will quote the definitions from the article below and then comment. I am going to take on a category of biases and break it down into two groups. Here are the biases that I have chosen to start with.

The first group is ten biases that are essential to seeing my side as right, "my side" including my group, my peers, the celebrities and historical figures I admire, and the authorities I agree with.

These biases also help me to see people I dislike or don't admire, and people I see as different from me and my groups, as wrong.


  1. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
  2. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
  3. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
  4. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
  5. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
  6. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
  7. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
  8. False Consensus: We believe more people agree with us than is actually the case.
  9. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
  10. Authority Bias: We trust and are more often influenced by the opinions of authority figures.

Blind Spot Bias is an obvious one to start with. We are naturally blinded by our biases and usually only learn about them by observing them in others and realizing that we, as human beings, must have them too, without the benefit of directly observing the biases in ourselves. A large body of research on psychological priming and bias provides good evidence that we all have biases and are profoundly unaware of them in our direct observation of daily life. They are well hidden.

We have a vast array of biases that work to make us see ourselves and our peers as rational, correct individuals with real differences among us, and to see those outside our groups as wrong and as more similar to each other than they really are.

The Fundamental Attribution Error leads us to attribute the actions of others to general traits while seeing our own actions as differentiated responses to particular situations.

This is reinforced by Stereotyping and further bolstered by Outgroup Homogeneity Bias: we see our own group members as holding a variety of views and out-group members as all holding the same beliefs, which makes it easier to see ALL of them as having incorrect beliefs. It also makes it easier to dismiss evidence that members of our own group are wrong, because we think of our group as having variation, so some people in it can be wrong without lowering our opinion of the whole group.

In-Group Favoritism further helps this, as we give our individual group members and the group overall the benefit of the doubt whenever possible. In other words, we interpret ambiguous or gray-area information favorably for our group members while denying out-group members the benefit of the doubt; we interpret the same ambiguous information unfavorably for them.

The Bandwagon Effect compounds this, as we see ourselves and our group as better than outsiders, including better at evaluating the truth. So, many of our people believing an idea serves as rock-solid proof of our accuracy, while many people outside our group believing an idea we disagree with only proves to us how wrong, stupid, irrational, evil, or backward the others are. Widespread agreement among us functions as proof we must be right, while widespread disagreement from them shows how consistently wrong they are.

Groupthink encourages us to conform to group norms and to go along with the most accepted ideas in the group, regardless of their truth or importance. An idea being believed and strongly embraced by the group serves as a substitute for our examining and embracing it ourselves.

False Consensus can boost the effect of many other group-related biases, as we come to think that everyone sensible believes what we believe and that only the dead wrong disagree.

The Availability Cascade, especially with red feeds and blue feeds tailored by algorithms that select the content we are most likely to agree with, serves us memes, articles, programs, and videos that reinforce our beliefs rather than challenge them.
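
To make the algorithmic part concrete, here is a toy sketch in Python (my own illustration, not any real platform's code; every name in it is hypothetical) of how simply ranking posts by predicted agreement produces a feed that repeats our beliefs back to us:

# Toy sketch (hypothetical; not any real platform's algorithm) of how
# ranking content by predicted agreement yields a "red feed / blue feed":
# posts closest to the user's existing stance rank first, so the feed
# repeats the user's beliefs back -- fuel for an availability cascade.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    stance: float  # -1.0 = one political pole, +1.0 = the other

def rank_feed(user_stance, posts):
    # Smaller distance from the user's own stance = higher rank;
    # sorting this way buries disagreeable and nuanced content.
    return sorted(posts, key=lambda p: abs(user_stance - p.stance))

posts = [
    Post("Our side is obviously right", stance=0.9),
    Post("A nuanced take with trade-offs on both sides", stance=0.0),
    Post("The other side makes a fair point here", stance=-0.9),
]

for post in rank_feed(user_stance=0.9, posts=posts):
    print(post.text)  # agreeable first, challenging last

The point of the sketch is only that no malice is required: a ranking rule optimized for agreement is enough to ensure we rarely see the content that would challenge us.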

The Authority Bias helps us to see our beliefs and groups as proven right because we recognize authorities who agree with us and our peers as valid while dismissing authorities who disagree with us as invalid. Simply put, we use In-Group Favoritism to select the authorities that fit the needs of the group, based on agreement, and the other biases, such as Groupthink and the Bandwagon Effect, can work together when the authority and the group agree without dissent. In free groups that are allowed to be open-minded and to develop and express differing views this is blunted to a degree, but in high-control groups, also known as authoritarian groups or destructive cults, the power of ALL these biases can combine, as obedience to authority and conformity to group norms without dissent create a potent combination.

I wanted to zero in on these biases because they are crucial to understanding why, when critical thinking is brought up, we so easily see our group as right and other groups as wrong, or at least more wrong. Seeing the others as more wrong helps us to see them as having the problem and needing to improve their critical thinking first, while we usually never get around to improving our own.

So, we can start with Blind Spot Bias, since it hides all the other biases, and then see how the group of biases that includes (but is not limited to) the Fundamental Attribution Error, Stereotyping, Outgroup Homogeneity Bias, In-Group Favoritism, the Bandwagon Effect, Groupthink, False Consensus, the Availability Cascade, and the Authority Bias works together: first keeping us from seeing our biases, then leading us to see our side, our peers, our authorities, and ourselves as right because we are in our groups, and to see outsiders who disagree with us as wrong. It may seem complicated, but I hope the descriptions I gave help, and you can always think of fictional and real examples. This is usually easiest with people and groups you strongly and passionately disagree with.

Just seeing how biases work and "compound" in anyone is difficult and a good start, BUT if you don't carry it through to see how your authorities, your peers, and you yourself do the same, then you have just reinforced the biases with half-understood justifications.

It's easy to see that the "opposition" is wrong; most of us do. It is harder to see how WE are wrong too. If you cannot see that we are also wrong and dig into the details of when, how, and why, then frankly I don't think you are practicing critical thinking. In some lectures, critical thinking expert Richard Paul described pseudo critical thinkers who adopt a piece of critical thinking knowledge but lack enough understanding of the subject in general, or of the information they do have, to apply it correctly. This doesn't mean you need a PhD in critical thinking to apply it. It means some of the ideas and techniques are more complicated than a phrase or a sentence or a paragraph and require more knowledge to apply properly.

In one lecture, Richard Paul described applying a technique merely to support your own beliefs or argument as sophistry and pseudo critical thinking. Sometimes people think that winning arguments, or getting opponents to withdraw from debates in confusion, is critical thinking, but it isn't, not even close.

A person can learn about biases, logical fallacies, propaganda techniques, and rhetoric, then face an opponent less educated in these ideas and bury them in terms they have to struggle through and in unfamiliar aspects of debate, such as the burden of proof falling on the claimant. They can use flawed arguments and claims themselves while tearing apart the same flaws in their opponent's arguments and claims. This is sophistry, insincere debate, and attorneys are notorious for it. So are politicians.

It is a kind of intellectual cheating in which you insist that the opposition follow rules for carefully seeking the truth while you yourself only try to win, with no regard for honesty or the truth.

A few thousand years ago there was a split among schools of philosophy: the sophists sought victory in debate and persuasion at any cost, and victory was their only goal. Many other philosophers saw both what they were doing and how they were doing it and condemned the sophists as intellectual frauds. And they were right to do so.

It took a lot of work by a lot of people, probably an unimaginable number, to reach the understanding of our biases and of critical thinking that we can study today. I think it is well worth the time and effort, and worth being honest with you about, too.

50 COGNITIVE BIASES TO BE AWARE OF SO YOU CAN BE THE VERY BEST VERSION OF YOU

Let’s go into some common cognitive bias examples to really see how they work!

50 TYPES OF COMMON COGNITIVE BIASES

  1. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
  2. Self-Serving Bias: Our failures are situational, but our successes are our responsibility.
  3. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
  4. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
  5. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
  6. Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)
  7. Moral Luck: Better moral standing happens due to a positive outcome; worse moral standing happens due to a negative outcome.
  8. False Consensus: We believe more people agree with us than is actually the case.
  9. Curse of Knowledge: Once we know something, we assume everyone else knows it, too.
  10. Spotlight Effect: We overestimate how much people are paying attention to our behavior and appearance.
  11. Availability Heuristic: We rely on immediate examples that come to mind while making judgments.
  12. Defensive Attribution: As a witness who secretly fears being vulnerable to a serious mishap, we will blame the victim less if we relate to the victim.
  13. Just-World Hypothesis: We tend to believe the world is just; therefore, we assume acts of injustice are deserved.
  14. Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.
  15. Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.
  16. Forer Effect (aka Barnum Effect): We easily attribute our personalities to vague statements, even if they can apply to a wide range of people.
  17. Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.
  18. Anchoring: We rely heavily on the first piece of information introduced when making decisions.
  19. Automation Bias: We rely on automated systems, sometimes trusting too much in the automated correction of actually correct decisions.
  20. Google Effect (aka Digital Amnesia): We tend to forget information that’s easily looked up in search engines.
  21. Reactance: We do the opposite of what we’re told, especially when we perceive threats to personal freedoms.
  22. Confirmation Bias: We tend to find and remember information that confirms our perceptions.
  23. Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.
  24. Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.
  25. Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
  26. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
  27. Declinism: We tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.
  28. Status Quo Bias: We tend to prefer things to stay the same; changes from the baseline are considered to be a loss.
  29. Sunk Cost Fallacy (aka Escalation of Commitment): We invest more in things that have cost us something rather than altering our investments, even if we face negative outcomes.
  30. Gambler’s Fallacy: We think future possibilities are affected by past events.
  31. Zero-Risk Bias: We prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.
  32. Framing Effect: We often draw different conclusions from the same information depending on how it’s presented.
  33. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
  34. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
  35. Authority Bias: We trust and are more often influenced by the opinions of authority figures.
  36. Placebo Effect: If we believe a treatment will work, it often will have a small physiological effect.
  37. Survivorship Bias: We tend to focus on those things that survived a process and overlook ones that failed.
  38. Tachypsychia: Our perceptions of time shift depending on trauma, drug use, and physical exertion.
  39. Law of Triviality (aka “Bike-Shedding”): We give disproportionate weight to trivial issues, often while avoiding more complex issues.
  40. Zeigarnik Effect: We remember incomplete tasks more than completed ones.
  41. IKEA Effect: We place higher value on things we partially created ourselves.
  42. Ben Franklin Effect: We like doing favors; we are more likely to do another favor for someone if we’ve already done a favor for them than if we had received a favor from that person.
  43. Bystander Effect: The more other people are around, the less likely we are to help a victim.
  44. Suggestibility: We, especially children, sometimes mistake ideas suggested by a questioner for memories.
  45. False Memory: We mistake imagination for real memories.
  46. Cryptomnesia: We mistake real memories for imagination.
  47. Clustering Illusion: We find patterns and “clusters” in random data.
  48. Pessimism Bias: We sometimes overestimate the likelihood of bad outcomes.
  49. Optimism Bias: We sometimes are over-optimistic about good outcomes.
  50. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
End Quote
