Tuesday, November 5, 2019

The Knowledge Illusion part 11

This post is on the book The Knowledge Illusion by Steven Sloman (cognitive scientist and professor at Brown University) and Philip Fernbach (cognitive scientist and professor of marketing at the University of Colorado's Leeds School of Business).

This post is the eleventh in a series of sixteen addressing The Knowledge Illusion; unless otherwise noted, all quotes are from the book. I recommend reading all sixteen posts in order.

I have already written on numerous other books on psychology, social psychology, critical thinking, cognitive dissonance theory, and related topics, but I discovered this one and feel it plays a complementary and much-needed role. It helps explain a huge number of the "hows" and "whys" behind all of those subjects.


In chapter eight, Thinking About Science, the authors take on our knowledge and understanding of science (gulp). They start by pointing out that people have long had a strain of apprehension and suspicion regarding science and technology.
There is a story of a man named Ned Ludd who allegedly smashed a knitting machine with a hammer. The term Luddite, or neo-Luddite, has since expanded to embody the attitude of opposing technology.

The authors point out that a reasonable skepticism towards science and technology is probably healthy for society, but antiscientific thinking can be dangerous when it goes too far. I tend to concur.

The authors give several topics as examples, including climate change, genetic engineering, and vaccination.

They then introduce a bit of the history of the scientific education of the public. For a long time the assumption was that people who didn't know the accepted science on a topic simply had not been exposed to the information. This is called the deficit model.

The model is based on the hypothesis that antiscientific thinking is due to a knowledge deficit and will disappear as soon as that deficit is filled.

The Bodmer Report called for promoting public understanding of science. To see how well scientific education has been doing, a group called the National Science Board came up with a quiz of twelve basic questions about science, such as whether the earth goes around the sun or the sun goes around the earth, along with several true-or-false questions on topics such as evolution and whether electrons are smaller than atoms.

People have also been asked how they feel about topics like genetic engineering and other scientific issues. Those with more positive attitudes toward those topics do a little better on the quiz on average.

Americans tend to get about 51% or less correct on six of the questions, 61-73% correct on three of the questions, and 80-84% correct on the remaining three. That is not great for very simple, basic science, especially since most are true-or-false questions, so just guessing would get about 50% correct on average. In case you think this is particularly damning for Americans, people from Russia, China, the EU, India, and Japan do no better.

 "But here's the real problem with the deficit model. Decades of attempts to educate people about science have been ineffective at achieving the aspirations of the Bodmer Report: to promote positive views about science throughout society by fostering scientific literacy. Despite all the effort and energy that has gone into promoting public understanding of science, the millions and millions of dollars spent on research, curriculum design, outreach, and communication, we just do not seem to be making headway on that goal. Antiscientific beliefs are still pervasive and strong, and education does not seem to be helping."
(Page 159)

The issue of vaccines was used as an example. Parents were given information on the terrible effects of failing to vaccinate children, including emotional descriptions, along with information debunking the supposed link between vaccines and autism.

 "None of the information made people more likely to say they would vaccinate. In fact, some of the information backfired. After seeing images of sick children, parents expressed an increased belief in the vaccine-autism connection, and after reading the emotional story, parents were more likely to believe in serious vaccine side effects." (Page 159)

People spent a lot of time and effort, including research across many fields, to find the answer to this. I think it is one of the most important answers yet found regarding human knowledge and behavior.

 "The answer that has dominated thinking recently is that nothing has gone wrong. Scientific attitudes are not based on rational evaluation of evidence, and therefore providing information does not change them. Attitudes are determined instead by a host of contextual and cultural factors that make them largely immune to change."

 "One of the leading voices in promoting this new perspective is Dan Kahan, a law professor from Yale. Our attitudes are not based on a rational, detached evaluation of the evidence, Kahan argues. This is because our beliefs are not isolated pieces of data that we can take and discard at will. Instead, beliefs are deeply intertwined with other beliefs, shared cultural values, and our identities. To discard a belief often means discarding a whole host of other beliefs, forsaking our communities, going against those we trust and love, and in short, challenging our identities."  (Page 160)

So the authors discovered WHY giving people a little information regarding GMOs, vaccines, evolution, or global warming does little to change their views.

Now, I can say I completely understand the difficulty of rejecting a pseudoscientific belief, or in my case a belief system, as I rejected Scientology after twenty-five years and found it one of the most difficult things I have ever had to do. It involved rejecting my justification for years of neglecting and pushing away the people in my life I should have been there for the most.

When I rejected the false justification Scientology provided, it was like living your life believing an autobiography about you would portray a good or even heroic person, then getting the big twist: realizing you had been the villain the whole time, that you had only fooled yourself in your deluded mind, and waking up to the dark truth in the third act. It has been called mind-shattering and heart-crushing for good reason. You get a great reversal and find out you have done almost incomprehensible evil while thinking you were righteous all along. It is understandable why people avoid that revelation and often never accept it, no matter how much evidence there is or how strong that evidence is.

The authors gave the example of a guy they call Science Mike. Mike McHargue is a podcaster and blogger who goes by that name. He grew up in Tallahassee, Florida, as a fundamentalist Christian. Like many fundamentalists he believed in Young Earth Creationism, denied evolution, and believed prayer could heal people.

In his thirties Mike read about evolution, paleontology, biology, the physics of the universe, and studies on the effectiveness of prayer. He went through a long journey, and today he is a Christian, but he has rejected the fundamentalist antiscientific beliefs of his old church.

Science Mike discusses both science and his faith on his podcast. In one episode he had to answer a caller who realized his beliefs no longer matched the fundamentalist Christian faith he was brought up in; Mike told him he would need to find people who share his current beliefs to worship with. Mike was honest about the hardship this creates and the relationships it can cost. Mike's own roots were in one community and faith, and when he began questioning that, his whole world was turned upside down. The relationships that were most important to him were dramatically changed.

"That is the power of culture. Our beliefs are not our own. They are shared with the community. And this makes them really hard to change.

Science Mike's experience gives us a feel for where the knowledge illusion comes from. We typically don't know enough individually to form knowledgeable, nuanced views about new technologies and scientific developments. We simply have no choice but to adopt the positions of those we trust. Our attitudes and those of the people around us thus become mutually reinforcing. And the fact that we have a strong opinion makes us think that there must be a firm basis for our opinion, so we think we know a lot, more than in fact we do." (Page 162)

This is chilling, but in my opinion it is supported by a lot of good research and evidence. I have written numerous blog posts on many books and studies that support this.

We tend to take on beliefs with only the slimmest, most superficial understanding of them, based almost entirely on how they fit with our prior beliefs and the beliefs the people in our social groups embrace. And we think we have knowledge far beyond what we actually do, mistaking our confidence in our beliefs, which is often high, for both our understanding of those beliefs, which is routinely low, and the soundness of the basis for our beliefs, which is also routinely poor. Gulp.

Regarding the science quiz mentioned earlier, people had high confidence that they did well, regardless of whether they actually did.

"There was no relationship at all between science literacy and people's evaluations of their own knowledge; those who got many answers wrong reported knowing just as much about the technologies as those who did well." (Page 162)


"All of this should sound familiar by now. People tend to have limited understanding of complex issues and they have trouble absorbing details (like answers to factual questions). They also tend not to have a good sense of how much they know, and they lean heavily on their community of knowledge as a basis for their beliefs. The outcome is passionate, polarized attitudes that are hard to change." (Page 163)

To complement the information presented here, a long list of experiments and research from many other books could be added, such as Influence by Robert Cialdini, A Theory of Cognitive Dissonance by Leon Festinger, Subliminal by Leonard Mlodinow, and many others. I have written numerous posts on my blog describing the very research I am referring to.




The authors pointed out that as humans we are poor at remembering details and instead think in causal models. These models often involve substituting things that seem similar, such as thinking of electricity like water or of drugs like fuel in a car.

These inaccurate folk understandings lead to incorrect beliefs about, for example, how a drug's effects are altered by vigorous activity or how electricity moves and is stored.

A false idea of what genetic modifications do to food can lead to scary but mistaken beliefs about what they can do to us. Many concerns about the genetic engineering of food don't hold up if you understand how it actually works.

The case is similar with many arguments against vaccines.

"Beliefs are hard to change because they are wrapped up with our values and identities, and they are shared with our community. Moreover, what is actually in our heads - our causal models - are sparse and often wrong. This explains why false beliefs are so hard to weed out. Sometimes communities get the science wrong, usually in ways that are supported by our causal models. And the knowledge illusion means that we don't check our understanding often or deeply enough. This is a recipe for antiscientific thinking. " (Page 169)


For several years Michael Ranney, a psychologist at the University of California, Berkeley, has been researching how to educate people so they will understand global warming.

He found the unsurprising fact that people have very little understanding of how it works. He asked a couple hundred people in parks in San Diego a series of questions and found that only 12 percent had even a partial understanding and that no one he asked could give a full and accurate description.

Next he tried to inform people by showing them a four-hundred-word primer on global warming. He is also creating a website with explanatory videos of varying lengths, from five minutes down to fifty-two seconds.

Preliminary results are promising, but only time will tell if this will work in the long term.

"The first step to correcting false beliefs is opening people's minds to the idea that they and their community might have the science wrong. No one wants to be wrong." (Page 170)

That is a huge barrier to overcome. To get someone to realize that they and their community, including religious leaders, parents, friends, the politicians they support, and on and on could all be wrong? That is a lot to ask. It's downright critical thinking!




