Wednesday, October 21, 2015

Scientology Building The Prison Of The Mind Part 4: An Iron Fist In A Velvet Glove

Like all parts in the Building the Prison of the Mind series, this post compares the book A Theory of Cognitive Dissonance by Leon Festinger against my twenty-five years in the Scientology cult. I will quote excerpts and attempt to interpret the Scientology experience in this framework.

The book describes several experiments in social psychology in great detail. I won't repeat them here, and I do recommend the book to everyone. I will deal primarily with the hypotheses Festinger put forward and the points relevant to Scientology.

One experiment supports the idea that if a person goes against personal beliefs for a significant reward, or to avoid a significant punishment, little or no dissonance results, while small incentives to act against one's beliefs generate strong dissonance. Why? If you pretend to like someone you really don't like because you will get a million dollars for being pleasant to them for a day, the reward is accepted as the motivation, so no dissonance follows. If you are nice to someone you really don't like for no benefit, you may be confounded by it - that is high dissonance.

This principle is important in Scientology because a new recruit usually tries, bit by bit, slightly odd changes in behavior. An odd study method, a strange drill, an eccentric auditing procedure - all are off-putting at first, but they cleverly combine two effective tools for persuasion.

First, the slightly unusual new methods being requested are meant to be done immediately upon introduction, with no critical examination. The social environment acts as though the slight request is entirely reasonable and any delay at all is absurd. An auditor, registrar or course supervisor encourages instant compliance and rewards blind obedience. So it is easy to give in for seemingly no incentive.

Secondly, with no apparent reward for complying, the new cult member generates dissonance by doing things they really don't understand, without a clear motive.

To reduce this dissonance a simple adjustment occurs in belief: the unaware cult member shifts their beliefs to fit their behavior.

Festinger wrote:
If one wanted to obtain private change in addition to mere public compliance, the best way to do this would be to offer just enough reward to elicit the overt compliance. (Page 95)

He went on:
These studies lend support to the idea that attitude or opinion change is facilitated if a person finds himself in a situation where, by showing compliant behavior, he is engaged in actions which are dissonant with his private opinions. The changes in private opinion which ensue are the end result of a process of attempting to reduce or eliminate the dissonance. (Page 112)

Hubbard may have read this book, or articles based on it or on similar concepts. He certainly exploited the mental weaknesses outlined here quite well.

He had students redefine new words in their own words, so they participate in validating the very terms he is handing them. And in star-rate checkouts, students explain Scientology concepts and demo his ideas with examples they must enthusiastically create.

It seems like so little, but by creating examples and explanations in total agreement with Hubbard's doctrine over and over, the students unknowingly influence their own minds to hold beliefs in line with both their behavior and their statements. The new beliefs become identical to deeply seated personal convictions in the individual cult member's mind. They function like ideas arrived at after observation, testing and contemplation, but without any of those crucial elements of judgement.

Plainly, to the cult member these ideas from Hubbard feel like conclusions reached through deep introspection and independent analysis. Because they feel like deep personal convictions, they are vigorously protected by the ego's defenses.

Regarding these kinds of beliefs Festinger comments:
There are several areas of opinion where it is notoriously difficult to change people. (Page 120)

Festinger cites a study:
There is evidence in our data that once a change in behavior has occurred, a change in beliefs is likely to follow. (Page 121)

So Hubbard cleverly installed new ideas and tricked victims into holding them close, and defending them zealously.

Imagine developing a method that could get people to take ideas you create and adopt them as their own strongest faith without ever really testing or critically examining them. Granted, the method doesn't work on everybody, but dissenting voices are quickly silenced in Scientology.

So only obedient followers are allowed in the cult. If you ever wondered what a person would do if they could achieve such extreme covert influence - undue influence, in truth - then just look at the cult and you will surely see the terrible and tragic results.
