You can get several PhDs in psychology and still not fully explore or understand this issue. A lot of research has been done on the subject.
Stanley Milgram is famous for his obedience to authority experiment, one of the most repeated experiments in social psychology. It has been recreated by numerous researchers under different circumstances, in different countries, over decades.
The shocking experiment found that when people are asked to administer a series of gradually increasing electric shocks to a stranger, even up to a level that could hurt or kill, about sixty percent will comply if an authority in a lab coat demands it. That means about sixty percent of us will keep going rather than stop or refuse. Before the experiment, psychologists assumed that only about 1% of people would comply, and that this 1% would be psychotic.
They were certain the experiment was a total waste of time; even psychologists with decades of experience believed this.
Stanley Milgram explored obedience to authority and wrote several books, including Obedience to Authority: An Experimental View.
His ideas are well worth exploring.
The reason I included his work is to point out that we are generally not independent, critical-thinking creatures; we tend to believe what our contemporaries, our peers, and the authorities we follow believe.
The Asch conformity experiments were carried out by Solomon Asch. In these experiments several students were asked to indicate which lines were the same length, answering one at a time in a set order. The real experiment put the actual participant last in that order, with the other people instructed beforehand to all give the same WRONG answer; the question was how many participants would go along and give that same answer, despite the truth being easy to see.
In the book Sway: The Irresistible Pull of Irrational Behavior, authors (and brothers) Ori Brafman (MBA, Stanford Business School) and Rom Brafman (PhD, Psychology) described experiments on dissent.
Solomon Asch did one of the most famous experiments in social psychology. In one experiment a subject was told they were being tested for visual acuity. They were placed in a group with several other people. The group was shown three straight lines of greatly varying lengths and a fourth line and asked which of the three lines the new one matched. The lines were intentionally different enough that the answer was meant to be obvious.
But there was a hidden element, as there usually is in a social psychology experiment: every person except one was an actor. The actors were all instructed to answer before the actual subject responded, and they all gave the same wrong answer.
There were several rounds of being shown lines and answering. When everyone else gave the same obviously wrong answer, 75% of subjects ALSO gave that answer in at least one of the rounds.
Asch found unanimity gave the experiment its full persuasive power. It's hard to be a lone dissenting voice.
He did something I have found people often do with good experiments: he repeated it with a slight variation to test an idea. The setup was the same, with one crucial alteration: one actor gave the right answer while the others gave the same wrong answer.
He found that having even one person give the true and easily observable answer left the test subjects feeling free and confident enough to also give the correct answer, almost every single time.
The authors wrote, "The really interesting thing, though, is that the dissenting actor didn't even need to give the correct response; all it took to break the sway was for someone to give an answer that was different from the majority." (Sway, page 155)
To really drive this point home with evidence, the authors describe another clever experiment, conducted by psychologist Vernon Allen.
In this one a subject was asked to do a self-assessment survey alone. After five minutes a researcher knocked on the door and asked the subject to share the room due to a lack of space.
The new subject was of course an actor. He wore special, extra-thick glasses intentionally designed to give the impression that he was nearly blind without them. Super Coke-bottle glasses.
To step it up a notch, the researcher and actor even had a script. The actor said, "Excuse me, but does this test require long-distance vision?" The researcher confirmed it did, and the actor responded, "I have very limited eyesight" and "I can only see up-close objects."
They even acted out a scene of the researcher asking the Coke-bottle-wearing actor to read an easily legible sign on the wall. The actor, of course, strained and found the sign impossible to make out, driving home the point that he was practically blind at long distances.
The researcher explained that he needed five people for the testing apparatus to work, so it was okay for the seemingly near-blind subject to, "Just sit in anyway, since you won't be able to see the questions, answer any way you want; randomly, maybe. I won't record your answers."
But even with the Coke-bottle glasses and the blind-as-a-bat routine, the actor was able to affect conformity significantly. 97% of participants conformed when the group's agreement was unanimous, but conformity dropped to 64% when the Coke-bottle-wearing actor dissented, even when his answer was also incorrect, as long as it differed from the majority's.
That is astounding. The pull of three people giving the same incorrect answer can be broken for 33% of people by even an obviously incorrect answer from an obviously unreliable source!
It's truly worth considering. Imagine being like the 97% of us who conform with the crowd and deny what is right before our eyes; yet one out of three of us actually will see and acknowledge the truth if anyone, no matter how unlikely or wrong or obviously unqualified, simply disagrees and breaks the unanimous opinion.
I think that for important decisions where time permits careful consideration, dissenting views shouldn't just be accepted or merely suggested; frankly, they should be required!
And looking at Mill's ideas makes me think those dissenting views should be the best-prepared, best-presented versions of those views possible. Put every effort into giving them the opportunity to be genuinely well thought out and persuasive, so that they take real effort to refute.
More evidence that dissent is actually useful is described in Sway, regarding a fascinating, unforeseen result of a study. The study was conducted by David Kantor, a Boston-based family therapist who was trying to see how schizophrenia manifested itself in families. He had cameras set up in people's homes and then pored over hours of footage of regular family life. He didn't learn much about schizophrenia, but he found a useful pattern that occurred over and over, family after family.
He discovered four roles people take turns assuming in families. We may assume these roles in other groups as well. It's an interesting hypothesis.
The roles are: 1) the initiator, someone who comes up with an idea or starts an activity; 2) the blocker, someone who brings up reasons not to do the idea, fears about doing it, or potential negative consequences.
In Sway the authors wrote, "Of course, it's easy to think of blockers as pure curmudgeons. But as we'll soon see, they play a vital role in maintaining balance within a group." (Page 158)
Initiators and blockers inevitably disagree, and then 3) the supporter comes in. Supporters take a side, going along with either the initiator or the blocker. If the initiator wants to go to the movies and the blocker thinks they should not, the supporter will encourage either going or not going; their position is very clear. Last is 4) the observer, who watches and tries not to take sides.
The initiators and blockers naturally bump heads. In polite discourse they disagree; in less polite situations they argue or even fight.
Initiators in this hypothesis have lots of ideas and are willing to do things, or at least come up with ideas or decisions for others. Blockers are cautious and less optimistic.
Many groups seek people with ideas, confidence, and the other qualities initiators hold. Groups also avoid blockers. I have even heard of people who see the key to success as steering clear of anyone negative or unsuccessful, in some ways describing blockers as too much of a downer.
But a blocker has the tendency to give the dissenting view, and even if the blocker is wrong, everything we have seen up to now should tell us we want and need to hear dissenting views. Too many yes-men can inspire false or unjustified confidence.
Some managers, to prevent conformity from influencing the evaluations of executives and advisors, instruct a group of staff to take a written proposal for a project, go home, read it, and write a one-page response with their impressions and opinions, without discussing it with each other while forming those opinions.
The next day each of them in turn reads their response aloud; this way no one's statement influences anyone else's. Additionally, if they are considering going forward, another assignment is used. The staff are instructed to separately imagine a hypothetical situation: it is six months in the future and the project has failed. Each writes a post mortem, an analysis of what went wrong, how, and why.
By each separately brainstorming the potential failure of the project, they each play the role of blocker and bring their intelligence, imagination, and knowledge to the question of what could fail. In this way potential weaknesses and obstacles can be considered.
A person with knowledge of aspects of the project that could be difficult might alert others in the meeting, who could then realize many things. Someone could realize the project is illegal, or that problems in one area will make the project unfeasible because of how it impacts another area, or could foresee an obstacle, along with a practical solution drawn from their own expertise, one that would have to be implemented from the beginning to avoid certain failure.
The possibilities are many, and they give the group a better chance of using each other's knowledge as they interact. The whole group, whether its members are aware of it or not, collectively has more knowledge of the potential factors that could create success and the factors that could create failure.
So, obedience to authority and conformity to group norms both influence our behavior and beliefs.
A Theory of Cognitive Dissonance by Leon Festinger also describes how we come to believe, and specifically what influences us to be true believers who are unwilling to consider disconfirming evidence regarding our beliefs.
To briefly reference A Theory of Cognitive Dissonance, Festinger went on to say:
The presence or absence of dissonance in some particular content area will have important effects on the degree of information seeking and on the selectivity of such information seeking. (Page 126)
Relative absence of dissonance. If little or no dissonance exists, there would be no motivation (considering this source of motivation alone) to seek out new and additional information. (Page 127)
The presence of moderate amounts of dissonance. The existence of appreciable dissonance and the consequent pressure to reduce it will lead to the seeking out of information which will introduce consonances and to the avoidance of information which will increase the already existing dissonance. (Page 128)
The presence of extremely large amounts of dissonance. Under such circumstances a person may actively seek out, and expose himself to, dissonance-increasing information. If he can increase the dissonance to the point where it is greater than the resistance to change of one or another cluster of cognitions, he will then change the cognitive elements involved, thus markedly reducing or perhaps even wholly eliminating the dissonance which now is so great. (Page 129) All quotes from A Theory of Cognitive Dissonance.
We have a wealth of references on what makes one a true believer, including the books The True Believer by Eric Hoffer and The Origins of Totalitarianism by Hannah Arendt.
A great modern take on this, one that integrates a remarkable number of new ideas including attachment theory, is the groundbreaking Terror, Love and Brainwashing by Alexandra Stein.
I should mention that The Discipling Dilemma: A Study of the Discipling Movement Among Churches of Christ by Flavil R. Yeakley Jr. has scientific evidence that no other sources have regarding high-control groups like Scientology. The research in this book shows that members of high-control groups tend to all take on one personality type: the personality of the founder or leader. Most groups include a variety of people with a variety of personality types, and those people are not molded into the same type. You can be a member of many groups and be more or less yourself, but in high-control groups you can only be an imitation of the leader! That profoundly affects your decision making!