The simple fact is that if you give people a definition of critical thinking, the almost universal response is some variation of "I am smart, a naturally good critical thinker; other people need to study this. I don't have that problem, they do."
In reality, critical thinking is a subject, and it takes a lot of study, questioning and hard work even to begin to learn it. It is less like a trait such as strength or speed, in which people have widely varying natural endowments, and more like a martial art, which demands discipline, practice, concentration, devotion and gradual development.
Yet people usually treat it as a trait they naturally possess in abundance and that others, especially people they disagree with and groups that oppose their beliefs, largely lack.
But the obvious question is: why?
Several factors contribute to this, and I want to take them on one by one, starting with the building blocks that combine to create the overall effect.
I think the fundamental grouping we need to take on is folk psychology: the ideas and assumptions about human behavior and minds that we pick up from parents, peers, our own experience and society at large. Many of these ideas are, frankly, wrong.
We all hear stories from people like our parents and teachers that describe personal character, and not all of them are accurate guides to understanding human beings. We usually don't closely inspect our fundamental assumptions or the metaphors that frame how we think about everything else. As children, our critical and independent thinking is so poorly developed that we are extremely vulnerable to indoctrination, so we unthinkingly accept the ideas our parents, teachers and peers give us, and as adults we usually don't realize the drawbacks of hanging onto these unexamined beliefs and assumptions.
So, how do we correct this? In part the answer is individual, shaped by our own life experiences.
I think a good starting point is to examine cognitive biases, because they affect all of us and are not well covered by folk psychology. Several flatly contradict it, so learning them can undo false ideas and replace them with accurate, relevant information.
I want to use quotes from an article to lay a foundation on cognitive biases. A good understanding of the subject at the base level will serve you well when evaluating it in other contexts.
50 COGNITIVE BIASES TO BE AWARE OF SO YOU CAN BE THE VERY BEST VERSION OF YOU
The human brain is pretty tricky: While we think we know things, there’s a whole list of cognitive biases that can be gumming up the works. We’ve found 50 types of cognitive bias that come up nearly every day, in petty Facebook arguments, in horoscopes, and on the global stage. Along with their definitions, these are real-life examples of cognitive bias, from the subtle groupthink sabotaging your management meetings to the pull of anchoring making you spend way too much money at a store during a sale. Knowing about this list of biases can help you make more informed decisions and realize when you’re way off the mark.
WHAT IS COGNITIVE BIAS?
Let’s start off with a basic cognitive bias definition: It is a systematic error in cognitive processes (like thinking, perceiving, and memory) diverging from rationality, which can affect judgments. If we think of the human brain as a computer, cognitive bias basically is an error in the code, making us perceive the input differently or come up with an output that’s illogical.
But there are other types of bias as well that aren’t necessarily cognitive; for example, there’s the theory of social proofing, which is one of the more popular social psychological biases. Also, there can be cognitive theories that aren’t necessarily considered biases, or rather, they’re more like a network of common biases tangled together, like cognitive dissonance, which causes mental discomfort when we hold conflicting ideas or beliefs in our minds. Then, there’s the world-famous placebo effect, which can actually result in physiological changes.
End quote
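Before going bias by bias, it is worth taking the article's brain-as-computer analogy literally for a moment. Below is a small, purely illustrative Python sketch, with an invented function and invented numbers, of what a bias as "an error in the code" might look like; it mimics belief bias, one of the fifty in the article's full list:

```python
# Purely illustrative: the function, thresholds and numbers are invented.
# The intended "spec" is to accept any argument stronger than 0.7, but a
# bug lets the plausibility of the conclusion leak into the judgment of
# the argument itself -- belief bias as a literal error in the code.

def judge_argument(strength: float, conclusion_feels_right: bool) -> bool:
    # BUG: the threshold should be a flat 0.7 no matter how the
    # conclusion feels; instead it shifts with our prior comfort.
    threshold = 0.5 if conclusion_feels_right else 0.9
    return strength > threshold

print(judge_argument(0.6, conclusion_feels_right=True))   # True: weak but agreeable
print(judge_argument(0.8, conclusion_feels_right=False))  # False: strong but unwelcome
```

With that picture in mind, here are the biases from the article's full list that matter most for this discussion: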
- Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
- Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
- Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
- Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
- In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
- Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
- Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
- False Consensus: We believe more people agree with us than is actually the case.
- Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
- Authority Bias: We trust and are more often influenced by the opinions of authority figures.
Blind Spot Bias is an obvious one to start with: we are naturally blind to our own biases, and we usually learn about bias by observing it in others and reasoning that we, as human beings, must have it too, since we never get to observe our own biases directly. A large body of research on psychological priming and bias supports the idea that we all have biases and remain profoundly unaware of them as we go about our daily lives. They are well hidden.
We carry a vast array of biases that work together to make us see ourselves and our peers as rational, correct and individually distinct, while making those who differ from us appear both wrong and more alike than they really are.
The Fundamental Attribution Error leads us to attribute the actions of others to general character traits while seeing our own actions as reasonable responses to particular situations.
This is reinforced by Stereotyping, which points in the same direction, and further bolstered by Outgroup Homogeneity Bias: we see our own group members as varied individuals and out-group members as all holding the same beliefs, which makes it easier to dismiss ALL of them as holding incorrect beliefs. It also makes it easier to dismiss evidence that members of our own group are wrong, because we think of our group as containing variation, so some people in it can be wrong without lowering our opinion of the whole group.
In-Group Favoritism helps this along further, as we give our individual group members, and the group overall, the benefit of the doubt whenever possible. In other words, we interpret ambiguous or gray-area information favorably for our group members, while withholding the benefit of the doubt from out-group members; we interpret the same ambiguous information unfavorably for them.
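To make that double standard concrete, here is a small toy simulation, not a validated psychological model; the bias parameter and the evidence stream are my own invented assumptions. It shows how applying even a modest "benefit of the doubt" in only one direction turns identical, genuinely ambiguous evidence into very different verdicts:

```python
# Toy model only: "lean" is an invented bias parameter, and the evidence
# stream is random numbers near 0.5 (i.e., genuinely ambiguous claims).
import random

def interpret(evidence_strength: float, is_in_group: bool, lean: float = 0.2) -> float:
    """Shift an ambiguous reading toward favorable (in-group) or
    unfavorable (out-group), clamped to the 0..1 range."""
    shift = lean if is_in_group else -lean
    return max(0.0, min(1.0, evidence_strength + shift))

random.seed(42)
evidence = [random.uniform(0.4, 0.6) for _ in range(1000)]  # same stream for both groups

in_group = sum(interpret(e, is_in_group=True) for e in evidence) / len(evidence)
out_group = sum(interpret(e, is_in_group=False) for e in evidence) / len(evidence)

print(f"perceived credibility of in-group claims:  {in_group:.2f}")   # ~0.70
print(f"perceived credibility of out-group claims: {out_group:.2f}")  # ~0.30
```

Identical inputs, opposite conclusions; and because the shift happens at the level of interpretation, it never feels like bias from the inside.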
The Bandwagon Effect compounds this, since we see ourselves and our group as better, including better at evaluating truth, than those outside it. Many of our people believing an idea feels like rock-solid proof of our accuracy, while many people outside our group believing an idea we reject only proves to us how wrong, stupid, irrational, evil or backward they are. Widespread agreement among us functions as proof that we must be right; widespread disagreement from them shows how consistently wrong they are.
Groupthink encourages us to conform to group norms and to go along with the most accepted ideas in the group, regardless of their truth or importance. An idea's being believed and strongly embraced by the group comes to substitute for our examining and believing it ourselves.
False Consensus can boost the effect of many other group-related biases, as we come to think that everyone sensible believes what we believe and that only the dead wrong disagree.
The Availability Cascade, especially with red feeds and blue feeds tailored by algorithms that select the content we are most likely to agree with, serves up memes, articles, programs and videos that reinforce our beliefs and rarely challenge them.
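As a sketch of how that selection pressure works, here is a deliberately naive ranking rule; it is hypothetical, not any real platform's algorithm, and the items and agreement scores are invented. Ranking purely by predicted agreement is enough to make challenging material vanish:

```python
# Hypothetical feed ranker: "agreement" is an assumed engagement proxy,
# not a real platform metric. Items and scores are invented examples.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    agreement: float  # predicted probability (0..1) that the reader agrees

def rank_feed(items: list[Item], top_n: int = 3) -> list[Item]:
    """Rank purely by predicted agreement and keep only the top few."""
    return sorted(items, key=lambda i: i.agreement, reverse=True)[:top_n]

inventory = [
    Item("Article echoing your view", 0.95),
    Item("Meme from your side", 0.90),
    Item("Sympathetic video", 0.85),
    Item("Balanced analysis", 0.50),
    Item("Strongest opposing argument", 0.10),
]

for item in rank_feed(inventory):
    print(item.title)
# The balanced and opposing pieces never surface, so the cascade of
# public repetition runs in one direction only.
```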
The Authority Bias helps us see our beliefs and groups as proven right, because we accept authorities who agree with us and our peers as valid while dismissing authorities who disagree with us as invalid. Simply put, we use In-Group Favoritism to select the authorities that fit the needs of the group, based on agreement, and the other biases, such as Groupthink and the Bandwagon Effect, can work together when the authority and the group agree without dissent. In free groups that are allowed to be open-minded and to develop and express differing views, this is blunted to a degree; but in high-control groups, also known as authoritarian groups, cults or destructive cults, the power of ALL these biases can combine, as obedience to authority plus conformity to group norms without dissent makes a potent combination.
I wanted to zero in on these particular biases because they are crucial to understanding why, when critical thinking is brought up, we so easily see our group as right and other groups as wrong, or at least as more wrong. Seeing the others as more wrong lets us see them as the ones with the problem, the ones who need to improve their critical thinking first, while we ourselves never quite get around to it.
So, we can start with Blind Spot Bias, since it hides all the other biases, and then see the group of biases that includes (but is not limited to) the Fundamental Attribution Error, Stereotyping, Outgroup Homogeneity Bias, In-Group Favoritism, the Bandwagon Effect, Groupthink, False Consensus, the Availability Cascade and the Authority Bias as working together: they keep us from seeing our own biases, lead us to see our side, our peers, our authorities and ourselves as right simply because they are ours, and lead us to see those outside our groups who disagree with us as wrong. It may seem complicated, but I hope the descriptions above help, and you can always think of fictional and real examples. That is usually easiest with people and groups you strongly and passionately disagree with.
Just seeing how biases work and "compound" in anyone is difficult and a good start, BUT if you don't carry it through to see how your authorities, your peers and you yourself also do it, then you have merely reinforced the biases with half-understood justifications.
It's easy to see that the "opposition" is wrong; most of us manage that. It is far harder to see how WE are wrong too. If you cannot see that we are also wrong, and dig into the details of when, how and why, then I frankly don't think you are going to be practicing critical thinking. Critical thinking expert Richard Paul, in some of his lectures, described pseudo critical thinkers who pick up a piece of critical thinking knowledge but lack enough understanding of the broader subject, or of the information they do have, to apply it correctly. This doesn't mean you need a PhD in critical thinking to apply it. It means some of the ideas and techniques are more complicated than a phrase, a sentence or a paragraph can convey, and require more knowledge to apply properly.
In one lecture Richard Paul described merely applying such material to support your own beliefs or arguments as sophistry and pseudo critical thinking. People sometimes think that winning arguments, or driving opponents to withdraw from debates in confusion, is critical thinking, but it isn't, not even close.
A person can learn about biases, logical fallacies, propaganda techniques and rhetoric, then face an opponent less educated in these ideas and bury them in terminology they have to struggle through and unfamiliar conventions of debate, such as the burden of proof falling on the claimant. They can use flawed arguments and claims themselves while tearing apart the same flaws in their opponent's arguments and claims. This is sophistry, insincere debate, and attorneys are notorious for it. So are politicians.
It is a kind of intellectual cheating in which you insist that the opposition follow the rules of carefully seeking the truth while you yourself simply try to win, with no regard for honesty or truth.
There was a split among the schools of philosophy in ancient Greece a few thousand years ago: the Sophists sought victory in debate and persuasion at any cost, and victory was their only goal. Many other philosophers saw both what they were doing and how they were doing it, and condemned the Sophists as intellectual frauds. They were right to do so.
It took a lot of work by a lot of people, probably an unimaginable number, to arrive at the understanding of our biases and of critical thinking that we can study today. I think it is well worth the time and effort, and worth being honest with you about that too.
For reference, here is the article's full list of all 50 biases:
Let’s go into some common cognitive bias examples to really see how they work!
50 TYPES OF COMMON COGNITIVE BIASES
- Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
- Self-Serving Bias: Our failures are situational, but our successes are our responsibility.
- In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
- Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
- Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
- Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)
- Moral Luck: Better moral standing happens due to a positive outcome; worse moral standing happens due to a negative outcome.
- False Consensus: We believe more people agree with us than is actually the case.
- Curse of Knowledge: Once we know something, we assume everyone else knows it, too.
- Spotlight Effect: We overestimate how much people are paying attention to our behavior and appearance.
- Availability Heuristic: We rely on immediate examples that come to mind while making judgments.
- Defensive Attribution: As a witness who secretly fears being vulnerable to a serious mishap, we will blame the victim less if we relate to the victim.
- Just-World Hypothesis: We tend to believe the world is just; therefore, we assume acts of injustice are deserved.
- Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.
- Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.
- Forer Effect (aka Barnum Effect): We easily attribute our personalities to vague statements, even if they can apply to a wide range of people.
- Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.
- Anchoring: We rely heavily on the first piece of information introduced when making decisions.
- Automation Bias: We rely on automated systems, sometimes trusting too much in the automated correction of actually correct decisions.
- Google Effect (aka Digital Amnesia): We tend to forget information that’s easily looked up in search engines.
- Reactance: We do the opposite of what we’re told, especially when we perceive threats to personal freedoms.
- Confirmation Bias: We tend to find and remember information that confirms our perceptions.
- Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.
- Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.
- Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
- Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
- Declinism: We tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.
- Status Quo Bias: We tend to prefer things to stay the same; changes from the baseline are considered to be a loss.
- Sunk Cost Fallacy (aka Escalation of Commitment): We invest more in things that have cost us something rather than altering our investments, even if we face negative outcomes.
- Gambler’s Fallacy: We think future possibilities are affected by past events.
- Zero-Risk Bias: We prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.
- Framing Effect: We often draw different conclusions from the same information depending on how it’s presented.
- Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
- Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
- Authority Bias: We trust and are more often influenced by the opinions of authority figures.
- Placebo Effect: If we believe a treatment will work, it often will have a small physiological effect.
- Survivorship Bias: We tend to focus on those things that survived a process and overlook ones that failed.
- Tachypsychia: Our perceptions of time shift depending on trauma, drug use, and physical exertion.
- Law of Triviality (aka “Bike-Shedding”): We give disproportionate weight to trivial issues, often while avoiding more complex issues.
- Zeigarnik Effect: We remember incomplete tasks more than completed ones.
- IKEA Effect: We place higher value on things we partially created ourselves.
- Ben Franklin Effect: We like doing favors; we are more likely to do another favor for someone if we’ve already done a favor for them than if we had received a favor from that person.
- Bystander Effect: The more other people are around, the less likely we are to help a victim.
- Suggestibility: We, especially children, sometimes mistake ideas suggested by a questioner for memories.
- False Memory: We mistake imagination for real memories.
- Cryptomnesia: We mistake real memories for imagination.
- Clustering Illusion: We find patterns and “clusters” in random data.
- Pessimism Bias: We sometimes overestimate the likelihood of bad outcomes.
- Optimism Bias: We sometimes are over-optimistic about good outcomes.
- Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
Use our cognitive bias infographic as inspiration for becoming better and knowing more! You can even print it out and use it as a cognitive bias poster to encourage others to do the same.
End quote