Thursday, February 25, 2021

Critical Thinking - a Simple and Concise Look 6

 "The fact is, though our experiences (and our judgements about those experiences) are reliable enough for most practical purposes, they often mislead us in the strangest, most unexpected ways - especially when the experiences are exceptional or mysterious. This is because our perceptual capacities, our memories, our states of consciousness, our information-processing abilities have perfectly natural but amazing powers and limits. Apparently, most people are unaware of these powers and limits. But these odd characteristics of our minds are very influential. Because of them, as several psychologists have pointed out, we should expect to have many natural experiences that seem for all the world like supernatural or paranormal events. So even if the supernatural or paranormal didn't exist, weird things would still happen to us.

The point is not that every strange experience must indicate a natural phenomenon - nor is it that every weird happening must be supernatural. The point is that some ways of thinking about personal experience help increase our chances of getting to the truth of the matter. If our minds have peculiar characteristics that influence our experience and how we judge that experience, we need to know about those characteristics and understand how to think our way through them - all the way through, to conclusions that make sense. This feat involves critical thinking. But it also requires creative thinking - a grand leap powered by an open mind past the obvious answer, beyond the will to believe or disbelieve, toward new perspectives, to the best solution among several possibilities." (Page 103)


"Just because something seems (feels, appears) real doesn't mean that it is." (Page 103)


"We can't infer what is from what seems. To draw such a conclusion is to commit an elementary fallacy of reasoning. It's clearly fallacious to say, " This event or phenomenon seems real; therefore, it is real. " What's more, the peculiar nature of our minds guarantees that what seems will frequently not correspond to what is.


Now, in our daily routines, we usually do assume that what we see is reality - that seeming is being. And we're generally not disappointed. But we're at much greater risk for being dead wrong with such assumptions when (1) our experience is uncorroborated (no one else has shared our experience), (2) our conclusions are at odds with all known previous experience, or (3) any of the peculiarities of our minds could be at work." (Page 104)


"PERCEIVING: WHY YOU CAN'T ALWAYS BELIEVE WHAT YOU SEE

The idea that our normal perceptions have a direct, one-to-one correspondence to external reality - that they are like photographs of the outer world - is wrong. Much research now suggests that perception is constructive, that it's in part something that our minds manufacture. Thus what we perceive is determined, not only by what our eyes and ears and other senses detect, but also by what we know, what we expect, what we believe, and what our physiological state is. This constructive tendency had survival value - it helps us make sense of the world. But it also means that seeing is often not believing - rather, the reverse is true." (Page 105)


"We sometimes perceive exactly what we expect to perceive, regardless of what's real.

Research has shown that when people expect to perceive a certain stimulus (for example, see a light or hear a tone), they often do perceive it - even when no stimulus is present. In one experiment, subjects were told to walk along a corridor until they saw a light flash. Sure enough, some of them stopped, saying they had seen a flash - but the light hadn't flashed at all. In other studies, subjects expected to experience an electric shock, or feel warmth, or smell certain odors, and many did experience what they expected even though none of the appropriate stimuli had been given. All that was really given was the suggestion that a stimulus might occur. The subjects had hallucinated (perceived, or apparently perceived, objects or events that have no objective existence). So if we're normal, expectancy or suggestion can cause us to perceive what simply isn't there. Studies show that this is especially true when the stimulus is vague or ambiguous or when clear observation is difficult." (Page 107)


"Claims that conflict with expert opinion cannot be known, unless it can be shown beyond a reasonable doubt that the experts are mistaken." (Page 117)


Now this is important and worth considering. If you find experts who disagree, so that a consensus or near consensus is unattainable, then the question is unsettled, at least as far as expert opinion goes. The experts can always simply be wrong, but if they do hold a consensus, then demonstrating that they are wrong becomes a difficult task, because the standard of proof becomes extremely high.


There are many topics on which experts in the past held a consensus or near consensus that ultimately gave way to a different idea because strong evidence was presented. That does not mean we should say, "Experts have been proven wrong time and again, so I can dismiss their best arguments and evidence without serious and careful examination!" It does mean we should not accept ideas without doubt or question merely because experts support them, even when they hold a consensus.

"all individuals are suggestible" (Page 120)


I am including this quote because I have read a bit about hypnosis, psychology, and suggestible subjects, and this may be the first time I have seen an author claim that ALL individuals are suggestible. It may or may not be true. I honestly don't know.

I do know that every school of hypnosis I have ever examined, even slightly, has some version of the idea that hypnosis "works" on some people, works extremely well on a few, and is not effective at all on others, no matter what method is used. Apparently, in the thousands of years that people have tried to hypnotize each other, plenty of practitioners have discovered subjects their techniques simply didn't work on, try as they might. But the authors may not be referring only to hypnosis as such, and other methods of suggestion may be capable of affecting these "hypnosis resistant" subjects. I honestly don't know.


"Hypnosis and sodium amytal administration ("truth serum") are unacceptable procedures for memory recovery. Courts reject hypnosis as a memory aid. Subjects receiving hypnosis or amytal as general memory aids (even in instances where there is no question of sexual abuse) will often generate false memories. Upon returning to their normal state of consciousness, subjects assume all their refreshed" memories " are equally true. " (Page 121)


"Psychologist Elizabeth Loftus, A prominent critic of the misguided therapy techniques that often result in False Memory Syndrome, says that the phenomenon has taken an enormous toll:" (Page 121)

I have written on the research of false memory expert Elizabeth Loftus before, in several blog posts that address our malleable memories, including my series on the book Subliminal by Leonard Mlodinow, which digs deep into the topic with extensive descriptions of experiments and research showing that we have imperfect and changeable memories.

I highly recommend that anyone who is skeptical about this, or simply interested, check out the numerous articles and interviews with Elizabeth Loftus, or even her books.


"REMEMBERING: WHY YOU CAN'T ALWAYS TRUST WHAT YOU RECALL" (Page 121)


"A lot of research now indicates that our memories aren't literal records or copies. Like our perceptual powers, our memories are constructive, or rather, creative. When we remember an experience, our brains reach for a representation of it; then, piece by piece, they reconstruct a memory based on this fragment. This reconstructive process is inherently inexact. It's also vulnerable to all kinds of influences that guarantee that our memories will frequently be inaccurate.

For an example of your memory's reconstructive powers, try this. Remember an instance when you were sitting today. Recall your surroundings, how you were dressed, how you positioned your legs and arms. Chances are, you see the scene from the perspective of someone looking at it, as though you were watching yourself on television. You actually remembered certain pieces of the experience, and your brain constructed everything else, television perspective and all.


For well over half a century, research has been showing that the memory of witnesses can be unreliable, and the constructive nature of memory helps explain why. Studies demonstrate that the recall of eyewitnesses is sometimes wrong because they reconstruct events from memory fragments and then draw conclusions from the reconstruction. Those fragments can be a far cry from what actually transpired. Further, if eyewitnesses are under stress at the time of their observations, they may not be able to remember crucial details, or their recall may be distorted. Stress can even distort the memory of expert witnesses, which is one of several reasons why reports of UFOs, seances, and ghosts must be examined carefully: The experiences are stressful. Because memory is constructive and liable to warping, people can sincerely believe that their recall is perfectly accurate - and be perfectly wrong. They may report their memory as honestly as they can, but alas, it's been worked over.


Like perception, memory can be dramatically affected by expectancy and belief. Several studies show this effect, but a classic experiment illustrates the point best. Researchers asked students to describe what they had seen in a picture. It portrayed a white man and a black man talking to each other on the subway. In the white man's hand was an open straight razor. When the students recalled the picture, one-half of them reported that the razor was in the hand of the black man. Memory reconstruction was tampered with by expectancy or belief.


The same thing can happen in our successful "predictions." After some event has occurred, we may say, "I knew that would happen; I predicted it." And we may truly believe that we foretold the future. But research suggests that our desire to believe that we accurately predicted the future can sometimes alter our memories of the prediction." (Page 121-122)


"Past Life Remembered or Cryptomnesia


If, under hypnosis, you recall living 200 years ago and can vividly remember doing and seeing things that you've never experienced in your present life, isn't that proof you lived a "past life"? Isn't this evidence of reincarnation? Some people would think so. There is, however, another possibility, explained by Ted Schultz:


Beatle George Harrison got sued for rewriting the Chiffons' "He's So Fine" into "My Sweet Lord." He was the innocent victim of the psychological phenomenon of cryptomnesia. So was Helen Keller, the famous blind and deaf woman, when she wrote a story called "The Frost King." After it was published in 1892, she was accused of plagiarizing Margaret Canby's "The Frost Fairies," though Helen had no conscious memory of ever reading it. But, sure enough, inquiries revealed that Canby's story had been read to her (by touch) in 1888. She was devastated...

Cryptomnesia, or "hidden memory," refers to thoughts and ideas that seem new and original, but which are actually memories of things that you've forgotten you knew. The cryptomnesic ideas may be variations on the original memories, with details switched around and changed, but still recognizable.

Cryptomnesia is a professional problem for artists; it also plays an important role in past-life regression. In the midst of the hoopla surrounding the Bridey Murphy [reincarnation] case the Denver Post decided to send newsman William J. Barker to Ireland to try to find evidence of Bridey's existence. [Bridey was the alleged past-life personality of Virginia Tighe.] Unfortunately for reincarnation enthusiasts, careful checking failed to turn up anything conclusive. Barker couldn't locate the street Bridey said she lived on, he couldn't find any essays by Bridey's husband in the Belfast News-Letter between 1843 and 1864 (during which time Bridey said he was a contributor), and he couldn't find anyone who had heard of the "Morning Jig" that Bridey danced.

Research by reporters from the Chicago American and later by writer Melvin Harris finally uncovered the surprising source of housewife Virginia Tighe's past-life memories. As a teenager in Chicago, Virginia had lived across the street from an Irish woman named Mrs. Anthony Corkell, who had regaled her with tales about the old country. Mrs. Corkell's maiden name was Bridie Murphy! Furthermore, Virginia had been active in high school dramatics, at one point memorizing several Irish monologues which she learned to deliver with a heavy Irish brogue. Finally, the 1893 World's Columbian Exposition, staged in Chicago, had featured a life-size Irish Village, with fifteen cottages, a castle tower, and a population of genuine Irish women who danced jigs, spun cloth, and made butter. No doubt Virginia had heard stories of this exhibition from many of her neighbors while growing up in Chicago in the '20s.

Almost every other case of "past-life memory" that has been objectively investigated has followed the same pattern: the memories, often seemingly quite alien to the life experiences of the regressed subject, simply cannot be verified by historical research; on the other hand, they frequently prove to be the result of cryptomnesia." (Page 123)


"Research also shows that our memory of an event can be drastically changed if we later encounter new information about the event - even if the information is brief, subtle, and dead wrong. Here's a classic example: In one experiment, people were asked to watch a film depicting a car accident. Afterward, they were asked to recall what they had seen. Some of the subjects were asked, "About how fast were the cars going when they smashed into each other?" The others were asked the same question with a subtle difference. The word smashed was replaced by hit. Strangely enough, those who were asked the "smashed" question estimated higher speeds than those asked the "hit" question. Then, A week later, all the subjects were asked to recall whether they had seen broken glass in the film. Compared to the subjects who got the "hit" question, more than twice as many of those who got the "smashed" question said they had seen broken glass. But the film showed no broken glass at all. In a similar study, subjects recalled that they had seen a stop sign in another film of a car accident even though no stop sign had appeared in the film. The subjects had simply been asked a question that presupposed as stop sign and thus created the memory of one in their minds.

These studies put in doubt any long-term memory that's subjected to leading questions or is evoked after exposure to a lot of new, seemingly pertinent information." (Page 124)


"CONCEIVING: WHY YOU SOMETIMES SEE WHAT YOU BELIEVE


Our success as a species is due in large part to our ability to organize things into categories and to recognize patterns in the behavior of things. By formulating and testing hypotheses, we learn to predict and control our environment. Once we have hit upon a hypothesis that works, however, it can be very difficult to give it up. Francis Bacon was well aware of this bias in our thinking:


The human understanding when it has adopted an opinion... draws all things else to support and agree with it. And though there be a great number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside, and rejects, in order that by this great and pernicious predetermination, the authority of its former conclusion may remain inviolate.


While this intellectual inertia can keep us from jumping to conclusions, it can also keep us from seeing the truth." 

(Page 126)


"Max Planck was well aware of how tenaciously we can cling to a hypothesis when we have invested a lot of time and effort in it. He once remarked, " A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." (Page 127)


"Our ability to make sense of things is one of our most important abilities. But we are so good at it that we sometimes fool ourselves into thinking something's there when it's not. Only by subjecting our views to critical scrutiny can we avoid such self-delusion." (Page 131)


"Confirmation Bias


Not only do we have a tendency to ignore and misinterpret evidence that conflicts with our own views; we also have a tendency to look for and recognize only evidence that confirms them. A number of psychological studies have established this confirmation bias." (Page 133)


"A wise man knows his own ignorance, A fool knows everything - Charles Simmons" (Page 133)


"Facts do not cease to exist because they are ignored."

- Aldous Huxley


"This experiment demonstrates that we tend to look for confirming rather than disconfirming evidence, even though the later can often be far more revealing. Disconfirming evidence can be decisive when confirming evidence is not.

Consider the hypothesis: All swans are white. Each swan we see tends to confirm that hypothesis. But even if we've seen a million white swans, we can't be absolutely sure that all swans are white because there could be black swans in places we haven't looked. In fact, it was widely believed all swans were white until black swans were discovered in Australia. Thus:


When evaluating a claim, look for disconfirming as well as confirming evidence.


Our tendency to confirm rather than disconfirm our beliefs is reflected in many areas of our lives. Members of political parties tend to read literature supporting their positions. Owners of automobiles tend to pay attention to advertisements touting their makes of car. And all of us tend to hang out with people who share our views about ourselves.

One way to cut down on confirmation bias is to keep a number of different hypotheses in mind when evaluating a claim. In one experiment, subjects were shown a sequence of numbers - 2, 4, 6 - and were informed that it follows a certain rule. Their task was to identify this rule by proposing other triplets of numbers. If a proposed triplet fit the rule - or if it did not - the subjects were informed. They were not supposed to state the rule until they were sure of it.

Most subjects picked sets of even numbers like 8, 10, 12 or 102, 104, 106. When told these too followed the rule, subjects often announced that they knew the rule: Any three consecutive even numbers. But that rule was incorrect. This fact led some people to try out other triplets such as 7, 9, 11 or 93, 95, 97. When told that these triplets fit the rule, some claimed that the rule was any three numbers ascending by two. But that rule, too, was incorrect. What was the correct rule? Any three numbers in ascending order.

Why was this rule so difficult to spot? Because of confirmation bias: Subjects tried only to confirm their hypotheses; they did not try to disconfirm them.

When subjects were asked to keep two hypotheses in mind - such as any three numbers in ascending order and any three numbers not in ascending order - they did much better. They picked a wider range of triplets, each of which confirmed or disconfirmed one of the rules. Thus, keeping a number of different hypotheses in mind can help you avoid confirmation bias." (Pages 135-136)
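
To make the 2, 4, 6 experiment concrete, here is a small Python sketch of my own - an illustration, not code from the book. The hidden rule and the sample triplets follow the experiment as described above; the point is how a confirmation-only strategy converges on the wrong rule, while deliberately testing disconfirming triplets exposes it.

```python
# A toy version of the 2, 4, 6 task described above (my own sketch,
# not the book's code). The experimenter's hidden rule:
def hidden_rule(triplet):
    a, b, c = triplet
    return a < b < c  # any three numbers in ascending order

# The subject's initial guess: three numbers ascending by two.
def my_guess(triplet):
    a, b, c = triplet
    return b == a + 2 and c == b + 2

# Confirmation-only strategy: test only triplets that fit the guess.
print("Confirming tests:")
for t in [(8, 10, 12), (102, 104, 106), (7, 9, 11)]:
    print(f"  {t} fits the rule: {hidden_rule(t)}")
# Every answer is True, so the subject wrongly concludes the guess is right.

# Disconfirming strategy: also test triplets that VIOLATE the guess.
print("Disconfirming tests:")
for t in [(1, 2, 3), (5, 20, 21), (3, 2, 1)]:
    print(f"  {t} fits my guess: {my_guess(t)}, fits the rule: {hidden_rule(t)}")
# (1, 2, 3) and (5, 20, 21) break the guess yet fit the rule, proving the
# guess too narrow; (3, 2, 1) shows that ascending order is what matters.
```

Notice that the confirming tests never produce any information that could distinguish the narrow guess from the true, broader rule - only the disconfirming tests do.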


"The Availability Error


Confirmation bias can be exacerbated by the availability error. The availability error occurs when people base their judgements on evidence that's vivid or memorable instead of reliable or trustworthy." (Page 136)

"Mankind, in the gross, is a gaping monster that loves to be deceived and has seldom been disappointed."

 - Harry Mackenzie


"Those who base their judgements on psychologically available information often commit the fallacy of hasty generalization. To make a hasty generalization is to make a judgement about a group of things on the basis of evidence concerning only a few members of that group. It is fallacious, for example, to argue like this: " I know one of those insurance salespeople. You can't trust any of them. " Statisticians refer to this error as the failure to consider sample size. Accurate judgements about a group can be made on the basis of a sample only if the sample is sufficiently large and every member of the group has an equal chance to be part of the sample.

The availability error also leads us to misjudge the probability of various things. For example, you may think that amusement parks are dangerous places. After all, they are full of rides that hurl people around at high speeds, and sometimes those rides break. But statistics show that riding the rides at an amusement park is less dangerous than riding a bicycle on main roads. We tend to think that amusement parks are dangerous places because amusement park disasters are psychologically available - they are dramatic, emotionally charged, and easy to visualize. Because they stick in our minds, we misjudge their frequency.

When confirming evidence is more psychologically compelling than disconfirming evidence, we are likely to exhibit confirmation bias." (Page 138)
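
Since the authors invoke the statisticians' point about sample size, here is a quick Python sketch of my own showing how wildly small samples can swing. The 5% "true" rate of untrustworthy salespeople is invented purely for illustration.

```python
# How sample size tames randomness - my own illustration, with a made-up
# "true" rate of untrustworthy salespeople of 5%.
import random

random.seed(1)
TRUE_RATE = 0.05

def estimates(sample_size, trials=5):
    """Estimate the rate from several random samples of the given size."""
    results = []
    for _ in range(trials):
        hits = sum(random.random() < TRUE_RATE for _ in range(sample_size))
        results.append(hits / sample_size)
    return results

print("n = 1:   ", estimates(1))     # every estimate is 0% or 100%
print("n = 10:  ", estimates(10))    # still very noisy
print("n = 1000:", estimates(1000))  # clusters tightly around the true 5%
# Judging the whole group from one salesperson is the n = 1 row:
# guaranteed to be wildly wrong in one direction or the other.
```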

"When evaluating a claim, look at all the relevant evidence, not just the psychologically available evidence." (Page 138)


"The availability error not only leads us to ignore relevant evidence; it also leads us to ignore relevant hypotheses. For any set of data, it is, in principle, possible to construct any number of different hypotheses to account for the data. In practice, however, it is often difficult to come up with many different hypotheses. As a result, we often end up choosing among only those hypotheses that come to mind - that are available.

In the case of unusual phenomena, the only explanations that come to mind are often supernatural or paranormal ones. Many people take the inability to come up with a natural or normal explanation for something as proof that it is supernatural or paranormal. "How else can you explain it?" they often ask.

This sort of reasoning is fallacious. It's an example of the appeal to ignorance. Just because you can't show that the supernatural or paranormal explanation is false doesn't mean that it is true. Unfortunately, although this reasoning is logically fallacious, it is psychologically compelling.

The extent to which the availability of alternate hypotheses can affect our judgements of probability was demonstrated in the following experiment. Subjects were presented with a list of possible causes of a car's failure to start. Their task was to estimate the probability of each of the possible causes listed. Included on every list was a catchall hypothesis labeled "all other problems [explanations]." Researchers discovered that the probability the subjects assigned to a hypothesis was determined by whether it was on the list - that is, by whether it was available. If more possibilities were added, subjects lowered the probability of the existing possibilities instead of changing the probability of the catchall hypothesis (which they should have done if they were acting rationally).

Although the unavailability of natural or normal explanations does not increase the probability of supernatural or paranormal ones, many people think that it does. To avoid this error, it's important to remember that just because you can't find a natural explanation for a phenomenon doesn't mean that the phenomenon is supernatural. Our inability to explain something may simply be due to our ignorance of the relevant laws or conditions." (Page 140)
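
The car-failure experiment suggests what rational bookkeeping should look like. Here is a small sketch of my own - all the causes and numbers are invented - showing that when a cause moves out of the catchall "all other problems" onto the explicit list, its probability should be carved out of the catchall, leaving the already-listed causes alone.

```python
# Rational updating when a new hypothesis is made explicit - my own
# illustration with made-up numbers, not data from the experiment.
explicit = {"dead battery": 0.30, "out of fuel": 0.20, "bad starter": 0.10}
catchall = 1.0 - sum(explicit.values())  # "all other problems" = 0.40

# Suppose we now list "clogged fuel line" and judge it worth 15%.
new_cause, new_prob = "clogged fuel line", 0.15

# Its mass should come out of the catchall, not the existing causes.
catchall -= new_prob
explicit[new_cause] = new_prob

print(explicit)                               # original entries unchanged
print(f"all other problems: {catchall:.2f}")  # 0.40 -> 0.25
# The subjects in the experiment did the opposite: they shaved
# probability off the listed causes and left the catchall untouched.
```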

"Although supernatural or paranormal claims can be undercut by providing a natural or normal explanation of the phenomenon in question, there are other ways to cast doubt on such claims. A hypothesis is only acceptable if it fits the data. If the data are not what you would expect if the hypothesis were true, there is reason to believe that the hypothesis is false.

Take the case of the infamous Israeli psychic Uri Geller. Geller claims to have psychokinetic ability: the ability to directly manipulate objects with his mind. But the data, psychologist Nicholas Humphrey says, do not fit this hypothesis:

If Geller has been able to bend a spoon merely by mind-power, without his exerting any other sort of normal mechanical force, then it would immediately be proper to ask: Why has this power of Geller's worked only when applied to metal objects of a certain shape and size? Why indeed only to objects anyone with a strong hand could have bent if they had the opportunity (spoons or keys, say, but not pencils or pokers or round coins)? Why has he not been able to do it unless he has been permitted, however briefly, to pick the object up and have sole control of it? Why has he needed to touch the object with his fingers, rather than his feet or with his nose? Etcetera, etcetera. If Geller really does have the power of mind over matter, rather than muscle over metal, none of this would fit.


Humphrey calls this sort of skeptical argument the argument from "unwarranted design" or "unnecessary restrictions," because the phenomena observed are more limited or restricted than one would expect if the hypothesis were true. To be acceptable, a hypothesis must fit the data: This means not only that the hypothesis must explain the data, but also that the data explained must be consistent with what the hypothesis predicts. If the hypothesis makes predictions that are not borne out by the data, there is reason to doubt the hypothesis." (Pages 140-141)


"The Representativeness Heuristic

Our attempt to comprehend the world is guided by certain rules of thumb known as heuristics. These heuristics speed up the decision-making process and allow us to deal with a massive amount of information in a short amount of time. But what we gain in speed we sometimes lose in accuracy. When the information we have to work with is inaccurate, incomplete, or irrelevant, the conclusions we draw from it can be mistaken.

One of the heuristics that governs both categorization and pattern recognition is this one: Like goes with like. Known as the representativeness heuristic, this rule encapsulates the principles that members of a category should resemble a prototype and that effects should resemble their causes. While these principles often lead to correct judgements, they can also lead us astray. A baseball game and a chess game are both games, but their dissimilarities may be greater than their similarities." (Page 141)


"He that is not aware of his ignorance will only be mislead by his knowledge." - Richard Whately (Page 141)


"Man's mind is so formed that it is far more susceptible to falsehood than truth." - Desiderius Erasmus (Page 142)

"Superstition, which is widespread among the nations, has taken advantage of human weakness to cast its spell over the mind of almost every man." - Cicero (Page 143)

"Man prefers to believe what he prefers to be true." -  Francis Bacon (Page 143)


"One problem is that most of us don't realize that because of ordinary statistical laws, incredible coincidences are common and must occur. An event that seems highly improbable can actually be highly probable - even virtually certain - given enough opportunities for it to occur. Drawing a royal flush in poker, getting heads five times in a row, winning the lottery - all these events seem incredibly unlikely in any instance. But they're virtually certain to happen sometime to someone. With enough chances for something to happen, it will happen." (Page 145)


"Rationalizing Homo Sapiens

People not only jump to conclusions, they frequently rationalize or defend whatever conclusions they jump to. Psychologist Barry Singer summarizes research findings that show just how good our rationalizing skills are: 

Numerous psychological experiments on problem solving and concept formation have shown that when people are given the task of selecting the right answer by being told whether particular guesses are right or wrong, they will tend to do the following:

1. They will immediately form a hypothesis and look only for examples to confirm it. They will not seek evidence to disprove their hypothesis, although this strategy would be just as effective, but will in fact try to ignore any evidence against it.

2. If the answer is secretly changed in the middle of the guessing process, they will be very slow to change the hypothesis that was once correct but has suddenly become wrong.

3. If one hypothesis fits the data fairly well, they will stick with it and not look for other hypotheses that might fit the data better.

4. If the information provided is too complex, people will cope by adopting overly simple hypotheses or strategies for solution, and by ignoring any evidence against them.

5. If there is no solution, if the problem is a trick and people are told "right" and "wrong" about their choices at random, people will nevertheless form all sorts of hypotheses about causal relationships they believe are inherent in the data, will believe their hypotheses through thick and thin, and will eventually convince themselves that their theories are absolutely correct. Causality will invariably be perceived even when it is not present.

It is astonishing that rats, pigeons, and small children are often better at solving these sorts of problems than are human adults. Pigeons and small children don't care so much whether they are always right, and they do not have such a developed capacity for convincing themselves they are right, no matter what the evidence is." (Page 147)

"It's reasonable to accept personal experience as reliable evidence only if there's no reason to doubt its reliability." (Page 147)

"Our beliefs may predispose us to misinterpret the facts, when ideally the facts should serve as the evidence upon which we base beliefs." - Alan M. MacRobert and Ted Schultz (Page148)


"When there's reason to think that any of these limitations or conditions may be present, our personal experience can't prove that something is true. In fact, when we're in situations where our subjective limitations could be operating, the experiences that are affected by those limitations not only can't give us proof that something is real or true; they can't even provide us with low-grade evidence. The reason is that at those moments, we can't tell where our experience begins and our limitations end. Is that an alien spacecraft in the night sky or Venus, embellished for us by our own high level of expectancy? Is that strange conjunction of events a case of cosmic synchronicity or just our inability to appreciate the true probabilities? If subjective limitations might be distorting our experience, our personal evidence is tainted and can't tell us much at all. That is why anecdotal evidence - evidence based on personal testimony - carries so little weight in scientific investigations. When we can't establish beyond a reasonable doubt that a person was not influenced by these limitations, we aren't justified in believing that what they report is real." (Page 148)


"Personal experience alone generally cannot establish the effectiveness of a treatment beyond a reasonable doubt.

There are three reasons why this principle is true: Many illnesses simply improve on their own; people sometimes improve even when given a treatment known to be ineffective; and other factors may cause the improvement in a person's condition." (Page 149)
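
The first of those three reasons is easy to see with a toy simulation. This is my own sketch; the 60% natural recovery rate is an assumption for illustration, not a figure from the book.

```python
# If an illness clears up on its own 60% of the time, a useless remedy
# will still appear to 'work' for 60% of the people who try it.
import random

random.seed(7)
NATURAL_RECOVERY = 0.60  # assumed rate of improvement with no treatment

trials = 10_000
# The remedy does nothing: recovery depends only on the natural rate.
improved = sum(random.random() < NATURAL_RECOVERY for _ in range(trials))

print(f"'It worked for me!': {improved / trials:.0%} of remedy takers improved")
# Without a control group, those testimonials are indistinguishable
# from the illness simply running its course.
```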


"The power of suggestion to alter body function is well established by research with hypnosis. Blisters have been induced and warts made to disappear through suggestion." - William T. Jarvis (Page 151)

"Case reports are accounts of a doctor's observations of individual patients." (Page 154)

"Case reports are also vulnerable to several serious biases that controlled research is better able to deal with. One is called social desirability bias. It refers to patients' tendency to strongly wish to respond to treatment in what they perceive as a correct way. People will sometimes report improvement in their condition after treatment simply because they think that's the proper response or because they want to please the doctor.

Another bias can come from doctors themselves. Called investigator bias, it refers to the well-documented fact that investigators or clinicians sometimes see an effect in a patient because they want or expect to see it." (Page 155)


 "Case studies alone generally cannot establish the effectiveness of a treatment beyond a reasonable doubt." (Page 155)

"SUMMARY

An important principle to use when evaluating weird phenomena is that just because something seems real doesn't mean that it is. Part of the reason for this caution is the constructive nature of our perceptions. We often perceive exactly what we expect to perceive, regardless of what's real, and we sometimes experience the misperception of seeing distinct forms in vague and formless stimuli. These constructive processes are notoriously active in UFO sightings, where under poor viewing conditions average people mentally transform lights in a dark sky into alien spacecraft.

Our memories are also constructive and easily influenced by all sorts of factors: stress, expectation, belief, and the introduction of new information. Added to this is the selectivity of memory - we selectively remember certain things and ignore others, setting up a recall bias. No wonder the recall of eyewitnesses is often so unreliable.

How we conceive the data we encounter is also problematic. We often refuse to accept contrary evidence, a reluctance that can be found in just about everyone, including scientists and trained investigators. We have a tendency to believe that a very general personality description applies uniquely to ourselves, a phenomenon known as the Forer effect. The Forer effect is at work in the readings of astrology, biorhythms, fortune-telling, tarot cards, palmistry (palm-reading), and psychic performances. We are often prey to confirmation bias, the tendency to look for and recognize only evidence that confirms our own views. We fall for the availability error and base our judgements on evidence that's vivid or memorable instead of reliable or trustworthy. We are sometimes led astray by the representativeness heuristic, the rule of thumb that like goes with like. And we are generally poor judges of probability and randomness, which leads us to erroneously believe that an event could not possibly be a mere coincidence.

All this points to the fact that anecdotal evidence is not a reliable guide to the truth. Our principle should be that it's reasonable to accept personal experience as reliable evidence only if there's no reason to doubt its reliability. The problems with this kind of evidence are illustrated well in people's personal attempts to judge the effectiveness of treatments and health regimens. The reality is that personal experience alone generally cannot establish the effectiveness of a treatment beyond a reasonable doubt - but controlled scientific studies can." (Page 156)


