[ot] cult influence and power, 1988-2018

Undiscussed Groomed for Male Slavery, One Victim of Many gmkarl+brainwashingandfuckingupthehackerslaves at gmail.com
Sat Aug 27 15:16:43 PDT 2022


Chapter 4–Understanding Mind Control

				When I do trainings or lectures at colleges, I usually challenge
my audience with this question: “How would you _know_ if you were under
mind control?”

				After some reflection, most people realize that if they were under
mind control, it would be impossible to determine it without some help
from the outside. In addition, they would need to understand very
clearly what mind control is. When I was under mind control, I didn’t
understand what it was all about. I assumed that mind control would
involve being tortured in a dank basement somewhere, with a light bulb
shining in my face. Of course, that never happened to me while I was
in the Moonies. Whenever people yelled at me and called me a
“brainwashed robot,” I just took it as an expected persecution. It
made me feel more committed to the group.

				At that time, I didn’t have a frame of reference for the
phenomenon of mind control. It wasn’t until my deprogramming that I
was given a credible model of what it is and how it works. Since I was
a member of the Moonies and we regarded Communism as the enemy, I was
very interested in the techniques that the Chinese Communist Party
used to convert people to Communism during the 1950s. I didn’t
resist, then, when my counselors asked to read me parts of Dr. Robert
Jay Lifton’s book _Thought Reform and the Psychology of Totalism_.[74]
Since the book had been published in 1961, I could not accuse Lifton
of being anti-Moon.

				That book had a major impact on my understanding of what had
happened to me in the Moonies. Lifton identified eight basic elements
of the process of mind control as practiced by the Chinese Communists.
My counselors pointed out that no matter how wonderful the cause, or
how attractive the members, if any group employed all eight of Robert
Jay Lifton’s elements, then it was practicing mind control. I was
eventually able to see that the Unification Church used all eight of
those elements: milieu control, mystical manipulation or planned
spontaneity, the demand for purity, the cult of confession, sacred
science, loading of the language, doctrine over person, and dispensing
of existence. (In the Appendix of this book [included before the
endnotes at the end of this email], Lifton describes these eight elements in more
detail. Two video interviews with Lifton can be found on my website,
freedomofmind.com.)

				Before I could leave the Moonies, though, I had to wrestle with
several moral questions. Does the God I believe in need to use
deception and mind control? Do the ends truly justify the means? Do
the means determine the ends? How could the world become a paradise if
people’s free wills are subverted? What would the world truly look
like if Moon assumed total power? Through asking myself these
questions, I decided I could no longer participate in an organization
that used mind control practices. I left behind the fantasy world I
had lived in for years.

				Over the years, I have come to realize that millions of people
have actually been subjected to a mind control regimen but don’t even
know it. Hardly a week goes by that I don’t talk with several people
who are still experiencing negative side effects from their experience
of mind control. Often, it is a great relief for them to hear that
they are not alone and that their problems stem from their past
involvement with such a group.

				Perhaps the biggest problem faced by people who have left
destructive cults is the disruption of their own authentic identity.
There is a very good reason: they have lived for years inside an
“artificial” identity given to them by the cult. While cult mind
control can be talked about and defined in many different ways, I
believe it is best understood as _a system that disrupts an
individual’s healthy identity development_. An identity is made up of
elements such as beliefs, behavior, thought processes and emotions
that constitute a definite pattern. Under the influence of mind
control, a person’s authentic identity, the one given at birth and
later formed by family, education, friendships and, most importantly,
that person’s own free choices, is replaced with another identity,
often one that they would not have chosen for themselves without
tremendous social pressure.[75]

				Even if the person goes along through deliberate play-acting at
first, the act eventually becomes real. They take on a totalistic
ideology that, when internalized, supersedes their prior belief
system. Ultimately, the person usually experiences—and shows—a radical
personality change and a drastic interruption of their life course.

				The process can be initiated quickly, but usually requires days or
weeks to solidify. Those unfortunate enough to be born to members of a
destructive cult are deprived of a healthy psychological environment
in which to mature optimally. That said, children are remarkably
resilient, and I have met many who described never completely “buying
in” to the crazy beliefs and practices. Most ran away or found a way
to escape before they became adults. Yet, for others, it took decades
to find the strength and the courage to be “true to themselves.”
Family ties can enforce silence on disbelieving second-generation
members, for whom it is easier to go along with the cult than to
express their true opinions.

				It’s worth noting that a group can use mind control in positive
ways. For example, many drug rehabilitation and juvenile
rehabilitation programs use some of these same methods to re-integrate
a person’s old identity. But such programs, successful as they may be,
are fraught with danger. After the person is broken and given a new
identity, they must also have their autonomy and individuality
restored. Whether that happens depends entirely on the altruism and
responsible behavior of the group’s directors. As mentioned earlier,
one drug rehabilitation program, Synanon, drew repeated allegations
that it abused the most basic rights of its members and was actually a
full-fledged cult.[76]

				Of course, we are all subjected to various social pressures every
day, most noticeably in our families and our work. The pressure to
conform to certain standards of behavior exists in nearly every
institution. Many kinds of influence are at work on us all the time,
some of them obvious and benign (such as “Fasten Your Seat Belt”
billboards) and others subtle and destructive. I cannot emphasize too
strongly, then, that when I use the term “mind control,” I am
specifically referring to the destructive end of the spectrum.
Therefore, as I have stressed before, the term “mind control” in this
book will not apply to certain technologies (biofeedback, for example)
that are used to enhance personal control and promote choice. It will
refer to only those systems that _seek to undermine an individual’s
integrity in making independent decisions_. The essence of mind
control is that it encourages dependence and conformity, and
discourages autonomy and individuality.

				Mind Control Versus Brainwashing

				While it is important to have a basic understanding of mind
control, it is just as important to understand what mind control is
not. Unfortunately, in popular discussions of the subject, the term
_brainwashing_ is often used as a synonym for _mind control_ or _undue
influence_. On the influence continuum, however, brainwashing belongs
closer to the most negative, injurious and extreme end.

				The term _brainwashing_ was coined in 1951 by journalist and CIA
agent Edward Hunter. He used it to describe how American servicemen
captured in the Korean War suddenly reversed their values and
allegiances, and believed they had committed fictional war crimes.
Hunter translated the term from the Chinese _hsi nao_, which means
“wash brain.”

				I think of brainwashing as overtly coercive. The person being
brainwashed knows at the outset that they are in the hands of an
enemy. The process begins with a clear demarcation of the respective
roles—who is prisoner and who is jailer—and the prisoner experiences
an absolute minimum of choice. Abusive mistreatment, even torture, and
sometimes rape are involved.

				Perhaps one of the most famous cases of brainwashing and cult mind
control in the United States involved newspaper heiress Patty Hearst.
She was kidnapped by the Symbionese Liberation Army (SLA), a small
political terrorist cult, in 1974, the same month I was recruited
into the Moonies. She was locked in a dark closet for weeks and was
raped and starved. Later she became an active member of the group as
“Tania.” She passed up chances to escape and even participated in a
bank robbery, for which she was convicted and served a jail term.

				Unfortunately, Hearst was the victim of an ignorant judge and
jury. She was eventually granted a full pardon by President Clinton
in 2001.

				The SLA may have succeeded in brainwashing Patty Hearst, but, on
the whole, the coercive approach hasn’t had an outstanding success
rate. Once people are away from their controllers and back in familiar
surroundings, the effects tend to dissipate. The SLA succeeded with
Patty Hearst because they gave her a whole new identity. They
convinced her that the FBI was out to shoot her on sight. She was
convinced her safety lay in remaining with the group rather than
seeking rescue.

				Brainwashing is especially effective in producing compliance to
demands, such as signing a false confession or denouncing one’s
government. People are coerced into specific acts for
self-preservation; then, once they have acted, their beliefs change to
rationalize what they have done. But these beliefs are usually not
well internalized. If and when the prisoner escapes their field of
influence (and fear), they are usually able to throw off those
beliefs.

				Mind control is much more subtle and sophisticated. The victim
typically regards the controllers as friends or peers, and so is much
less on guard. They usually unwittingly participate by cooperating with
their controllers, and by giving them private information that they do
not realize will be used against them.

				Mind control involves little or no overt physical abuse. Instead,
_hypnotic processes_ are combined with _group dynamics_ to create a
potent indoctrination effect. The individual is deceived and
manipulated—but not directly threatened—into making the prescribed
choices. On the whole, the victim responds positively to what is done
to them.

				It is unfortunate that the word _brainwashing_ is used loosely.
People inside most cults aren’t physically tortured, so when critics
accuse them of having been brainwashed, the accusation doesn’t ring
true. When I was in the Moonies, I knew I hadn’t been brainwashed. I
do remember, however, Moon giving us a speech in which he said that a
popular magazine had accused him of brainwashing us. He declared,
“Americans’ minds are very dirty—full of selfish materialism and
drugs—and they _need_ a heavenly brainwashing!”[77] We all laughed.

				A Note On Hypnotism

				The term _hypnotism_ is also misused. In everyday speech, we
sometimes say things such as, “She hypnotized him with her smile.”
Actually, hypnosis is little understood by most
people. When the term is mentioned, the first image that may come to
mind is of a bearded doctor dangling an old pocket watch by its chain
in front of a droopy-eyed subject. While that image is certainly a
stereotype, it does point to the central feature of hypnotism: the
trance. People who are hypnotized enter a trance-like state that is
fundamentally different from normal consciousness. The difference is
this: whereas in normal consciousness the attention is focused
outwards through the five senses, in a trance one’s attention is
usually focused _inwards_. One is hearing, seeing and feeling
internally. Of course, there are various degrees of trance, ranging
from the mild and normal trance of daydreaming to deeper states in
which one is much less aware of the outside world and extremely
susceptible to suggestions that may be put into one’s mind.

				Hypnotism relates to the unethical mind control practices of
destructive cults in a variety of ways. In many cults which claim to
be religious, what is often called “meditation” is no more than a
process by which the cult members enter a trance, during which time
they may receive suggestions which make them more receptive to
following the cult’s doctrine. Non-religious cults use other forms of
group or individual induction. In addition, being in a trance is
usually a pleasant, relaxing experience, so that people wish to
re-enter the trance as often as possible. Most importantly, it has
been clinically established by psychological researchers that people’s
critical faculties are diminished in the trance state. One is less
able to evaluate information received in a trance than when in a
normal state of consciousness.

				The power of hypnosis to affect people can be considerable. People
who are “high hypnotizables” can be put into a trance very quickly and
made to perform remarkable feats. During stage hypnosis shows, subjects
have been directed to dance like Elvis Presley (to the audience’s
laughter); lie down between two chairs and assume a wooden, boardlike
rigidity; believe they are naked (when they are fully clothed); or
behave as though their hands were “glued” to their sides. If people
can be made to perform these acts in just a few minutes of influence,
getting hypnotic subjects to believe that they are part of a “chosen
few” with many hours, days or weeks of programming is very achievable.

				Destructive cults commonly induce trances in their members through
lengthy indoctrination sessions. Repetition, boredom and forced
attention are very conducive to the induction of a trance. Looking at
a group in such a setting, it is easy to see when the trance has set
in. The audience will exhibit slowed blink and swallow reflexes, and
their facial expressions will relax into a blank, neutral state. With
people in such a state, it is possible for unscrupulous leaders to
implant irrational beliefs. I have seen many strong-willed people
hypnotized and made to do things they would never normally do.

				Basic Principles Of Social Psychology And Group Dynamics

				The political experience of World War II, in which thousands of
apparently normal people operated concentration camps where millions
of Jews, Romanies, Slavs, blacks, gays and communists were killed,
provoked considerable interest among psychologists.[78] How
was it that people who had led ordinary lives prior to Adolf Hitler’s
rise to power became involved in a deliberate attempt to exterminate
whole groups of people?

				Thousands of social psychological experiments have been conducted
since World War II, yielding great insights into the various ways
people are influenced, both as groups and as individuals. The result
of these studies has been the consistent demonstration of the
remarkable power of _behavior modification techniques_, _group
conformity_ and _obedience to authority_. These three factors are
known in psychological terms as “influence processes,” and they
demonstrate that situations often determine human behavior more than
the values and beliefs of the individual do. One of the most remarkable
discoveries of social psychology is that people are hardwired to
unconsciously respond to social cues.

				For example, a class of psychology students once conspired to use
behavior modification techniques on their teacher. As the professor
lectured, the students would smile and seem attentive when he moved
toward the left of the room. When he moved to the right, the students
acted bored and listless. Before long, the professor began to drift to
the left, and after a few classes he spent each lecture leaning
against the left wall.

				But when the students let the professor in on the experiment, he
insisted that nothing of the sort had happened. He saw nothing odd
about leaning against the wall, and angrily insisted that it was
merely his personal lecturing style—something he had chosen to do of
his own free will. This psychology professor was completely
unconscious of how he had been influenced.

				Of course, under ordinary circumstances, the people around us are
not all secretly conspiring to make us do anything. They simply act
more or less as they have been culturally conditioned to act, which in
turn conditions us. This is the way in which a culture perpetuates
itself.

				In a destructive cult, however, the behavior modification process
is completely stage-managed around new recruits, who of course have no
idea of what is going on.

				If behavior modification techniques are powerful, so too are the
influences of conformity and obedience to authority. A famous
experiment in conformity by Dr. Solomon Asch demonstrated that most
people will conform—and even doubt their own perceptions—if they are
put in a social situation where the most confident people in the group
all give the same wrong answers.[79] Another social psychologist,
Stanley Milgram, tested people for obedience to authority and found
that over 90 percent of his subjects would obey orders, even if they
believed that doing so caused physical suffering to another person.
Milgram wrote, “The essence of obedience consists in the fact that a
person comes to view himself as the instrument for carrying out
another person’s wishes, and therefore no longer regards himself as
responsible for his own actions.”[80]

				Dr. Philip Zimbardo conducted a world-famous prison experiment in
the basement of the Psychology building at Stanford University in
1971. He demonstrated the “power of the situation,” which he described
in detail in his book _The Lucifer Effect_. Healthy, normal young men
were randomly divided into two groups: one of prisoners and one of
guards, who were to manage the prisoners. This was to be a two-week
experiment, but it had to be called off after only six days, because
some of the guards had become sadistic, and some of the prisoners had
broken down mentally.

				Good people started behaving badly when put in a bad situation and
were unaware of the mind control forces at work. We are unconsciously
wired to adapt and conform to promote our survival. When we are
confused or not sure what to do, we look to others in our environment
and especially to people we deem to be legitimate authority figures.
Most people conform to fit in. The groundbreaking work of Philip
Zimbardo and others has enormous implications. Zimbardo, who is
Professor Emeritus at Stanford University and former President of the
American Psychological Association, taught a course for 15 years
called The Psychology of Mind Control.[81]

				The BITE Model: The Four Components Of Mind Control[82]

				Clearly, one cannot begin to understand mind control without
realizing the power of behavior modification techniques, as well as
the influences of conformity and obedience to authority. If we take
these insights from social psychology as a foundation, we may be able
to identify the basic components of mind control.

				As I have come to see it, mind control can be largely understood
by analysis of the three components described by psychologist Leon
Festinger, in what has become known as the “cognitive dissonance
theory.”[83] These components are _control of behavior_, _control of
thoughts_ and _control of emotions_.

				Each component has a powerful effect on the other two: change one,
and the others will tend to follow. Succeed in changing all three, and
the individual will be swept away. However, from my experience in
researching destructive cults, I have added one more component that is
vital: _control of information_. If you control the information
someone receives, you restrict his ability to think for himself.

				These four components of mind control serve as the basic reference
points for understanding how mind control works.

				Cognitive dissonance theory is not as forbidding as its name might
sound. Festinger summarized its basic principle this way: “If
you change a person’s behavior, his thoughts and feelings will change
to minimize the dissonance.”[84] What did Festinger mean by
“dissonance”? In basic terms, he was referring to the conflict that
occurs when a thought, a feeling or a behavior is altered in
contradiction to the other two. A person can tolerate only a certain
amount of discrepancy between his thoughts, feelings and actions,
which after all make up the different components of his identity.
Festinger’s theory states—and a great deal of later research has
confirmed—that if any one of the three components changes, the other
two will shift to reduce the dissonance.

				How does this kind of shift apply to the behavior of people in
cults? Festinger looked for a place to examine his ideas in the real
world. In 1956 he published a book, _When Prophecy Fails_, about a
Wisconsin flying saucer cult, whose leader had predicted the end of
the world. The cult leader claimed to be in mental contact with aliens
from another planet. Followers sold their homes, gave away their
money, and stood at the appointed date on a mountainside, waiting all
night to be picked up by flying saucers before a flood destroyed the
world the next morning.

				When morning came with no saucers and no flood—just a spate of
satirical news stories about the group—the followers might have been
expected to become disillusioned and angry. And a few did—but they
were fringe members who had not invested much time or energy. Most
members, however, became more convinced than ever. Their leader
proclaimed that the aliens had witnessed their faithful vigil and
decided to spare the Earth. Members wound up feeling _more_ committed
to the leader, even after they took a dramatic public stance that
resulted in public humiliation. Most Jehovah’s Witnesses responded to
the failure of the group’s many prophecies of the end of the world
with renewed faith.

				Cognitive dissonance theory helps explain why this heightened
commitment occurred. According to Festinger, people need to maintain
order and meaning in their lives. They need to think they are acting
according to their self-image and their own values. If their behavior
changes for any reason, their self-image and values change to match.
The important thing to recognize about cult groups is that they
deliberately create dissonance in people this way and exploit it to
control them.

				To make it easier to remember, I call it the BITE model of
mind control: _B_ehavior, _I_nformation, _T_hought and _E_motional
Control. Let’s take a closer look at each one of these components of
mind control.

				Behavior Control

				Behavior control is the regulation of an individual’s physical
reality. It includes the control of their environment—where they live,
what clothes they wear, what food they eat, how much sleep they get,
and what jobs, rituals and other actions they perform.

				This need for behavior control is the reason most cults prescribe
a very rigid schedule for their members. Each day a significant amount
of time is devoted to cult rituals and indoctrination activities.
Members are also typically assigned to accomplish specific goals and
tasks, thus restricting their free time—and their behavior. In
destructive cults there is always something to do.

				In some of the more restrictive groups, members have to ask
permission from leaders to do almost anything. In other groups, a
person is made so financially dependent that their choices of behavior
are narrowed automatically. A member must ask for bus fare, clothing
money or permission to seek health care—things most of us take for
granted. Often the person must ask permission to call a friend or
relative not in the group. Every hour of the cult member’s day has to
be accounted for. In these ways the group can keep a tight rein on the
member’s behavior—and on their thoughts and feelings as well.

				Behavior is often controlled by the requirement that everyone act
as a group. In many cults, people eat together, work together, have
group meetings and sometimes sleep together in the same dormitory.
Individualism is fiercely discouraged. People may be assigned a
constant “buddy” or be placed in a small unit of a half dozen members.

				The chain of command in cults is usually authoritarian, flowing
from the leader, through their lieutenants, to their sub-leaders, down
to the rank and file. In such a well-regulated environment, all
behaviors can be either rewarded or punished. If a person performs
well, they will be given public praise from higher-ups, and sometimes
gifts or a promotion. If the person performs poorly, they may be
publicly singled out and criticized, or forced to do manual labor such
as cleaning toilets or polishing other members’ shoes. Other forms of
punishment may include prescribed fasting, cold showers, staying up
for an all-night vigil or doing remedial work. Those who actively
participate in their own punishment will eventually come to believe
they deserve it.

				Each particular group has its own distinctive set of ritual
behaviors that help bind it together. These typically include
mannerisms of speech, specific posture and facial expressions, as well
as the more traditional ways of representing group belief. In the
Moonies, for instance, we followed many Asian customs, such as taking
off our shoes when entering a Moonie center, kneeling and bowing when
greeting older members. Doing these little things helped make us feel
we were special and superior. Psychologists call this “social proof.”

				If a member is not behaving sufficiently enthusiastically, they
may be confronted by a leader and accused of being selfish or impure,
or of not trying hard enough. They will be urged to become like an
older group member, even to the extent of mimicking that person’s tone
of voice.

				Obedience to a leader’s command is the most important lesson to
learn. A cult’s leaders cannot command someone’s inner thoughts, but
they know that if they command _behavior_, hearts and minds will
follow.

				Information Control

				Information control is the second component of mind control.
Information provides the tools with which we think and understand
reality. Without accurate, up-to-date information, we can easily be
manipulated and controlled. Deny a person the information they require
to make sound judgments and they will become incapable of doing so.

				Deception is the biggest tool of information control, because it
robs people of the ability to make informed decisions. Outright lying,
withholding information and distorting information all become
essential strategies, especially when recruiting new members. By using
deception, cults rob their victims of “informed consent” and in the
case of religious cults, this lack of honest disclosure most certainly
violates people’s individual religious rights.

				In many totalistic cults, people have minimal access to non-cult
newspapers, magazines, TV, radio and online information. Certain
information may be forbidden and labeled as unhealthy: apostate
literature, entheta (negative information), satanic, bourgeoisie
propaganda, and so on. Members are also kept so busy that they don’t
have free time to think and seek outside answers to questions. When
they do read, it is primarily cult-generated propaganda or material
that has been censored to keep members focused.

				Information control also extends across all relationships. People
are not allowed to talk to each other about anything critical of the
leader, doctrine, or organization. Members must spy on each other and
report improper activities or comments to leaders, often in the form
of written reports (a technique pioneered by the Nazis, with the
Hitler Youth). New converts are discouraged from sharing doubts with
anyone other than a superior. Newbies are typically chaperoned, until
they prove their devotion and loyalty. Most importantly, people are
told to avoid contact with ex-members and critics. Those people who
could provide the most outside—that is, _real_—information are to be
completely shunned. Some groups even go so far as to screen members’
letters and phone calls.

				Information is usually compartmentalized, to keep members from
knowing the big picture. In larger groups, people are told only as
much as they “need to know” in order to perform their jobs. A member
in one city therefore does not necessarily know about an important
legal decision, media story, or internal dispute that is creating
turmoil in the group somewhere else. Cult members naturally feel they
know more about what’s going on in their group than outsiders, but in
counseling ex-members, I have found that they often know far less than
almost anyone else. Moonies are often ignorant of their cult’s
involvement in arms manufacture, and Scientologists are often unaware
that eleven of their leaders were imprisoned for the largest
infiltration of U.S. government agencies ever undertaken.

				Destructive organizations also control information by having many
levels of “truth.” Cult ideologies often have “outsider” doctrines and
“insider” doctrines. The outsider material is relatively bland stuff
for the general public or new converts. The inner doctrines are
unveiled gradually, as the person becomes more deeply involved, and
only when the person is deemed “ready” by superiors.

				For example, Moonies always said publicly that they were
pro-American, pro-democracy and pro-family. The Moonies _were_
pro-American, in that they wanted what they thought was best for
America, which was to become a theocracy under Moon’s rule. They
believed democracy was instituted by God to allow the Unification
Church the space to organize a theocratic dictatorship. They _were_
pro-family in believing that every human being’s true family was Moon,
his wife and his spiritual children. Yet the inner doctrine was—and
still is—that America is inferior to Korea and must become subservient
to it; that democracy is a foolish system that “God is phasing
out”;[85] and that people must be cut off from their “physical” (as
opposed to “spiritual”) families if they are at all critical of the
cult.

				A member can sincerely believe that the outer doctrines are not
lies, but just a different level of truth. By creating an environment
where truth is multileveled, cult directors make it nearly impossible
for a member to make definitive, objective assessments. If they have
problems, they are told that they are not mature or advanced enough to
know the whole truth yet. But they are assured that all will become
clear shortly. If they work hard, they’ll earn the right to understand
the higher levels of truth.

				But often there are many inner levels or layers of belief. An
advanced member who thinks they know a cult’s complete doctrine may
still be several layers away from what the higher-ups know.
Questioners who insist on knowing too much too fast, of course, are
redirected toward an external goal until they forget their objections;
those who object too loudly are kicked out and vilified.

				Thought Control

				Thought control, the third major component of mind control,
includes indoctrinating members so thoroughly that they internalize
the group doctrine, incorporate a new language system, and use
thought-stopping techniques to keep their mind “centered.” In order to
be a good member, a person must learn to manipulate their own thought
processes.

				In totalistic cults, the ideology is internalized as “the truth,”
the only map of reality. The doctrine not only serves to filter
incoming information, but also regulates how the information can be
thought about. Usually, the doctrine is absolutist, dividing
everything into black versus white, or us versus them. All that is
good is embodied in the leader and the group. All that is bad is on
the outside. The doctrine claims to answer all questions to all
problems and situations. Members need not think for themselves because
the doctrine does the thinking for them. The more totalistic groups
claim that their doctrine is scientific, but that is never truly the
case.

				A destructive cult inevitably has its own “loaded language” of
unique words and expressions. Since language provides the symbols we
use for thinking, using only certain words serves to control thoughts.
Cult language is totalistic and therefore condenses complex
situations, labels them, and reduces them to cult clichés. This
simplistic label then governs how members think in any situation.[86]
In the Moonies, for example, whenever a member had difficulty relating
to someone who was either above or below them in status, it was called
a _Cain-Abel problem_. It didn’t matter who was involved or what the
problem was—it was simply a Cain-Abel problem. The term itself
dictated how the problem had to be resolved. Cain needed to obey Abel
and follow him, rather than kill him (as Cain killed Abel in the Old
Testament). Case closed. To think otherwise would be to obey Satan’s
wish that evil Cain should prevail over righteous Abel. Clearly, a
critical thought about a leader’s misconduct cannot get past this
roadblock in a devout member’s mind.

				The cult’s clichés and loaded language also put up an invisible
wall between believers and outsiders. The language helps to make
members feel special, and separates them from the general public. It
also serves to confuse newcomers, who want to understand what members
are talking about. The newbies think they merely have to study harder
in order to understand the truth, which they believe is precisely
expressed in this new language. In reality, though, loaded language
helps them learn how _not_ to think or understand. They learn that
“understanding” means accepting and believing.

				Another key aspect of thought control involves training members to
block out any information that is critical of the group. A member’s
normal defense mechanisms often become so twisted that they defend
their own new cult identity against their old, former self. The first
line of defense includes denial—“What you say isn’t happening at all”;
rationalization—“This is happening for a good reason”;
justification—“This is happening because it ought to”; and wishful
thinking—“I’d like it to be true so maybe it really is.”

				If information transmitted to a cult member is perceived as an
attack on the leader, the doctrine or the group, a defensive
wall goes up. Members are trained to disbelieve any criticism.
Critical words have been explained away in advance—for instance, as
“the lies about us that Satan puts in people’s minds” or “the lies
that the World Conspiracy prints in the news media to discredit us,
because they know we’re onto them.” Paradoxically, criticism of the
group is used to confirm that the cult’s view of the world is correct.
Because of thought control, factual information that challenges the
cult worldview does not register properly.

				Perhaps the most widely used, and most effective, technique for
controlling cult members’ thoughts is _thought-stopping_.[87] Members
are taught to use thought-stopping on themselves. They are told it
will help them grow, stay “pure and true” or be more effective.
Whenever cult members experience a “bad” thought, they use
thought-stopping to halt the “negativity” and center themselves, thus
shutting out anything that threatens or challenges the cult’s version
of reality.

				Different groups use different thought-stopping techniques, which
can include concentrated praying, chanting aloud or silently,
meditating, speaking in tongues, singing or humming. These actions, at
times useful and valuable, thus become perverted in destructive cults.
They also become quite mechanical, because the person is programmed to
activate them at the first sign of doubt, anxiety or uncertainty. In a
matter of weeks, the technique becomes ingrained. It becomes so
automatic, in fact, that the person is usually not even aware that
they just had a “bad” thought. They are only aware that they are
suddenly chanting or ritualizing.

				Through the use of thought-stopping, members think they are
growing, when in reality they are just turning themselves into
thought-stopping addicts. After leaving a cult that employs extensive
thought-stopping techniques, a person normally goes through a
difficult withdrawal process before they can overcome this addiction.

				Thought-stopping is the most direct way to short-circuit a
person’s ability to test reality. Indeed, if people are able to think
only positive thoughts about their involvement with the group, they
are most certainly stuck. Since the doctrine is perfect and the leader
is perfect, any problem that crops up is assumed to be the fault of
the individual member. They learn always to blame themselves and
simply work harder.

				Thought control can effectively block out any feelings that do not
correspond with the group doctrine. It can also serve to keep a cult
member working as an obedient slave. In any event, when thought is
controlled, feelings and behaviors are usually controlled as well.

				Emotional Control

				Emotional control, the fourth component of the BITE model,
attempts to manipulate and narrow the range of a person’s feelings.
All or nothing. Either you feel wonderful as a “chosen” member of the
elite, someone really special and loved and part of a wonderful
movement; or you are broken, unspiritual, have bad karma, are guilty
of _overts_, are sinful and need to repent, try harder and become a
better, more devoted member. Guilt and fear figure mightily. However,
most cult members can’t see that guilt and fear are being used to
control them. They are both essential tools to keep people under
control.

				Guilt comes in many forms. Historical guilt (for instance, the
fact that the United States dropped the atomic bomb on Hiroshima),
identity guilt (a thought such as “I’m not living up to my
potential”), guilt over past actions (“I cheated on a test”) and
social guilt (“People are dying of starvation”) can all be exploited
by destructive cult leaders. Members are conditioned to always take
the blame, so that they respond gratefully whenever a leader points
out one of their “shortcomings.”

				Fear is used to bind the group members together in several ways.
The first is the creation of an outside enemy, who is persecuting the
group and its members. For example, the FBI will jail or kill you;
Satan will carry you off to Hell; psychiatrists will give you
electroshock therapy; armed members of rival sects will shoot or
torture you; and, of course, ex-members and critics will try to
persecute you. Second is the terror of discovery and punishment by
cult members and leaders. Fear of what can happen to you if you don’t
do your job well can be very potent. Some groups claim that nuclear
holocaust or other disasters will result if members are lax in their
commitment.

				In order to control someone through their emotions, feelings
themselves often have to be redefined. For example, everyone wants
happiness. However, if happiness is redefined as being closer to God,
and God is unhappy (as He apparently is in many religious cults), then
the way to be happy is to be unhappy. Happiness, therefore, consists
of suffering so you can grow closer to God. This idea also appears in
some non-cult theologies, but in a cult it is a tool for exploitation
and control.

				In some groups, happiness simply means following the leader’s
directions, recruiting a lot of new members, or bringing in a lot of
money. Or, happiness is defined as the sense of community provided by
the cult to those who enjoy high status within it.

				Loyalty and devotion are the most highly respected emotions of
all. Members are not allowed to feel or express negative emotions,
except toward outsiders. They are taught never to feel for themselves
or their own needs, but always to think of the group and never to
complain. They are never to criticize a leader, but to criticize
themselves instead.

				Many groups exercise complete control over interpersonal
relationships. Leaders can and do tell people to avoid certain members
or spend time with others. Some even tell members whom they can marry,
and control the entire relationship, including their sex lives. Some
groups require members to deny or suppress sexual feelings, which
become a source of bottled-up frustration that can be channeled into
other outlets such as harder work. Other groups _require_ sexuality,
and a member who hangs back is made to feel selfish. Either way, the
group is exercising emotional control.

				People are often kept off balance, praised one minute and
tongue-lashed the next. In some groups, one day you’ll be doing public
relations before TV cameras in a suit and tie; the next, you’ll be in
another state doing manual labor as punishment for some imagined sin.
This misuse of reward and punishment fosters dependency and
helplessness. Such double-bind behavior is commonplace in cults.

				Confession of past sins or wrong attitudes is also a powerful
device for emotional control. Of course, once someone has publicly
confessed, rarely is their old sin truly forgiven or forgotten. The
minute they get out of line, it will be hauled out and used to
manipulate them into obeying. Anyone who finds themselves in a cult
confession session needs to remember this warning: Anything you say
can _and will_ be used against you. This device can even extend to
blackmail, if you leave the cult. Even when it does not, former
members are often scared to speak out, just in case their embarrassing
secrets are made public.

				The most powerful technique for emotional control is phobia
indoctrination, which was described in Chapter 3. Members will have a
panic reaction at the thought of leaving the group. They are told that
if they leave they will be lost and defenseless in the face of dark
horrors. They’ll go insane, be killed, become drug addicts or commit
suicide. Such tales are repeated often, both in lectures and in hushed
tones through informal gossip. It becomes nearly impossible for
indoctrinated cult members to feel they can have any happiness,
security or fulfillment outside the group.

				When cult leaders tell the public, “Members are free to leave any
time they want; the door is open,” they give the impression that
members have free will and are simply choosing to stay. Actually,
members may not have a real choice, because they have been
indoctrinated to fear the outside world. If a person’s emotions are
successfully brought under the group’s control, their thoughts and
behavior will follow.

				Each component of the BITE model—behavior control, information
control, thought control, emotional control—has great influence on the
human mind. Together, they form a totalistic web, one that can be used
to manipulate even the most intelligent, creative, ambitious and
strong-willed person. In fact, it is often the strongest-minded
individuals who make the most involved and enthusiastic cult members.

				I have attempted to cover only the broadest and most common
practices within each component of mind control. No one group does
everything described in this section. Other practices are used by
certain cults but are not included here.

				Some practices could fall into more than one of these categories.
For example, some groups change people’s names in order to hasten the
formation of the new “cult” identity. This technique could fall under
all four categories. There are many variations between groups. For
example, some groups are overt in their phobia indoctrination; others
are extremely subtle. What matters most is the overall impact on the
individual. Are they truly in control of their life choices? The only
way to tell is to give them the opportunity to reflect, to gain free
access to all information and to know that they are free to leave the
group if they choose.

				
				Three Steps To Gaining Control Of The Mind

				It is one thing to identify the four components of mind control
but quite another to know how they are actually used to change the
behavior of unsuspecting people. On the surface, the process of
gaining control of someone else’s mind seems quite simple. There are
three steps: _unfreezing_, _changing_ and _refreezing_.

				This three-step model was derived in the late 1940s from the work
of Kurt Lewin,[88] and was described in Edgar Schein’s book _Coercive
Persuasion_.[89] Schein, like Lifton, studied the brainwashing
programs in Mao Tse Tung’s China in the late 1950s. His book,
based on interviews with former American prisoners, is a valuable
study of the process. Schein’s three steps apply just as well to other
forms of mind control as they do to brainwashing. As he described
them, unfreezing consists of breaking a person down; changing
constitutes the indoctrination process; and refreezing is the process
of building up and reinforcing the new identity.
				Destructive cults today have the added advantage of many decades
of psychological research and techniques, making their mind control
programs much more effective and dangerous than

I’m stopping transcribing this because it is too shitty. I want real
therapy around this.

Chapter 4 Appendix

Lifton’s Eight Criteria of Mind Control

				The following excerpt from Robert Jay Lifton’s _The Future of
Immortality and Other Essays for a Nuclear Age_ (New York, Basic
Books, 1987) is a concise explanation of Lifton’s eight criteria for
defining mind control. These are:

				1.  Milieu control

				2.  Mystical manipulation (or planned spontaneity)

				3.  The demand for purity

				4.  The cult of confession

				5.  Sacred science

				6.  Loading of the language

				7.  Doctrine over person

				8.  Dispensing of existence
				
				The essay from which this selection is taken is entitled “Cults:
Religious Totalism and Civil Liberties.” In it, Lifton frames his
comments in relation to what he calls _ideological totalism_. This was
the environment in which Chinese thought reform was practiced, as
Lifton came to know it from the Korean War and afterward.
				
				Ideological Totalism

				The phenomenology I used when writing about ideological totalism
in the past still seems useful to me, even though I wrote that book in
1960. The first characteristic is “milieu control,” which is
essentially the control of communication within an environment. If the
control is extremely intense, it becomes an internalized control—an
attempt to manage an individual’s inner communication. This can never
be fully achieved, but it can go rather far. It is what sometimes has
been called a “God’s-eye view”—a conviction that reality is the
group’s exclusive possession. Clearly this kind of process creates
conflicts in respect to individual autonomy: if sought or realized in
such an environment, autonomy becomes a threat to milieu control.
Milieu control within cults tends to be maintained and expressed in
several ways: group process, isolation from other people,
psychological pressure, geographical distance or unavailability of
transportation, and sometimes physical pressure. There is often a
sequence of events, such as seminars, lectures, and group encounters,
which becomes increasingly intense and increasingly isolated, making
it extremely difficult—both physically and psychologically—for one to
leave.

				These cults differ from patterns of totalism in other societies.
For instance, the centers that were used for reform in China were more
or less in keeping with the ethos of the society as it was evolving at
the time; and therefore when one was leaving them or moving in and out
of them, one would still find reinforcement from without. Cults, in
contrast, tend to become islands of totalism within a larger society
that is on the whole antagonistic to these islands. This situation can
create a dynamic of its own; and insofar as milieu control is to be
maintained, the requirements are magnified by that structural
situation. Cult leaders must often deepen their control and manage the
environment more systematically, and sometimes with greater intensity,
in order to maintain that island of totalism within the antagonistic
outer world.

				The imposition of intense milieu control is closely connected to
the process of change. (This partly explains why there can be a sudden
lifting of the cult identity when a young person who has been in a
cult for some time is abruptly exposed to outside, alternative
influences.) One can almost observe the process in some young people
who undergo a dramatic change in their prior identity, whatever it
was, to an intense embrace of a cult’s belief system and group
structure. I consider this a form of doubling: a second self is formed
that lives side by side with the prior self, somewhat autonomously
from it. Obviously there must be some connecting element to integrate
oneself with the other—otherwise, the overall person could not
function; but the autonomy of each is impressive. When the milieu
control is lifted by removing, by whatever means, the recruit from the
totalistic environment, something of the earlier self reasserts
itself. This leave-taking may occur voluntarily or through force (or
simply, as in one court case, by the cult member moving across to the
other side of the table, away from other members). The two selves can
exist simultaneously and confusedly for a considerable time, and it
may be that the transition periods are the most intense and
psychologically painful, as well as the most potentially harmful.

				A second general characteristic of totalistic environments is what
I call “mystical manipulation” or “planned spontaneity.” It is a
systematic process that is planned and managed from above (by the
leadership) but appears to have arisen spontaneously within the
environment. The process need not feel like manipulation, which raises
important philosophical questions. Some aspects—such as fasting,
chanting, and limited sleep—have a certain tradition and have been
practiced by religious groups over the centuries. There is a cult
pattern now in which a particular “chosen” human being is seen as a
savior or a source of salvation. Mystical manipulation can take on a
special quality in these cults because the leaders become mediators
for God. The God-centered principles can be put forcibly and claimed
exclusively, so that the cult and its beliefs become the only true
path to salvation. This can give intensity to the mystical
manipulation and justify those involved with promulgating it and, in
many cases, those who are its recipients from below.

				Insofar as there is a specific individual, a leader, who becomes
the center of the mystical manipulation (or the person in whose name
it is done), there is a twofold process at work. The leader can
sometimes be more real than an abstract god and therefore attractive
to cult members. On the other hand, that person can also be a source
of disillusionment. If one believes, as has been charged, that Sun
Myung Moon (founder of the Unification Church, whose members are
consequently referred to frequently as “Moonies”) has associations
with the Korean Central Intelligence Agency and this information is
made available to people in the Unification Church, their relationship
to the church can be threatened by disillusionment toward a leader. It
is never quite that simple a pattern of cause and effect—but I am
suggesting that this style of leadership has both advantages and
disadvantages in terms of cult loyalty.

				While mystical manipulation leads (in cult members) to what I have
called the psychology of the pawn, it can also include a legitimation
of deception (of outsiders)—the “heavenly deception” of the
Unification Church, although there are analogous patterns in other
cult environments. If one has not seen the light, and it is not in the
realm of the cult, one is in the realm of evil and therefore can be
justifiably deceived for the higher purpose. For instance, when
members of certain cults have collected funds, it has sometimes been
considered right for them to deny their affiliation when asked. Young
people have been at centers of a particular cult for some time without
being told that these were indeed run by it. The totalistic ideology
can and often does justify such deception.

				The next two characteristics of totalism, the “demand for purity”
and the “cult of confession,” are familiar. The demand for purity can
create a Manichean quality in cults, as in some other religious and
political groups. Such a demand calls for radical separation of pure
and impure, of good and evil, within an environment and within
oneself. Absolute purification is a continuing process. It is often
institutionalized; and, as a source of stimulation of guilt and shame,
it ties in with the confession process. Ideological movements, at
whatever level of intensity, take hold of an individual’s guilt and
shame mechanisms to achieve intense influence over the changes he or
she undergoes. This is done within a confession process that has its
own structure. Sessions in which one confesses to one’s sins are
accompanied by patterns of criticism and self-criticism, generally
transpiring within small groups and with an active and dynamic thrust
toward personal change.

				One could say more about the ambiguity and complexity of this
process, and Camus has observed that “authors of confessions write
especially to avoid confession, to tell nothing of what they know.”
Camus may have exaggerated, but he is correct in suggesting that
confessions contain varying mixtures of revelation and concealment. A
young person confessing to various sins of precultic or
pre-institutional existence can both believe in those sins and be
covering over other ideas and feelings that he or she is either
unaware of or reluctant to discuss. In some cases, these sins include
a continuing identification with one’s prior existence, if such
identification has not been successfully dishonored by the confession
process. Repetitious confession, then, is often an expression of
extreme arrogance in the name of apparent humility. Again Camus: “I
practice the profession of penitence, to be able to end up as a
judge,” and “the more I accuse myself, the more I have a right to
judge you.” That is a central theme in any continual confessional
process, particularly where it is required in an enclosed group
process.

				The next three patterns I describe in regard to ideological
totalism are “the sacred science,” the “loading of the language,” and
the principle of “doctrine over person.” The phrases are almost
self-explanatory. I would emphasize especially sacred science, for in
our age something must be scientific as well as spiritual to have a
substantial effect on people. Sacred science can offer considerable
security to young people because it greatly simplifies the world. The
Unification Church is a good example, but not the only one, of a
contemporary need to combine a sacred set of dogmatic principles with
a claim to a science embodying the truth about human behavior and
human psychology. In the case of the Unification Church, this claim to
a comprehensive human science is furthered by inviting prominent
scholars (who are paid unusually high honoraria) to large symposia
that stress unification of thought; participants express their views
freely, but nonetheless contribute to the desired aura of intellectual
legitimacy.

				The term “loading the language” refers to a literalization of
language—and to words or images becoming God. A greatly simplified
language may seem cliché-ridden, but can have enormous appeal and
psychological power in its very simplification. Because every issue in
one’s life—and these are often very complicated young lives—can be
reduced to a single set of principles that have an inner coherence,
one can claim the experience of truth and feel it. Answers are
available. Lionel Trilling has called this the “language of
nonthought” because there is a cliché and a simple slogan to which the
most complex and otherwise difficult questions can be reduced.

				The pattern of doctrine over person occurs when there is a
conflict between what one feels oneself experiencing and what the
doctrine or dogma says one should experience. The internalized message
in totalistic environments is that one must find the truth of the
dogma and subject one’s experiences to that truth. Often the
experience of contradiction, or the admission of that experience, can
be immediately associated with guilt; or else (in order to hold one to
that doctrine) condemned by others in a way that leads quickly to that
guilty association. One is made to feel that doubts are reflections of
one’s own evil. Yet doubts can arise; and when conflicts become
intense, people can leave. This is the most frequent difficulty of
many of the cults: membership may represent more of a problem than
money.

				Finally, the eighth, and perhaps the most general and significant
of these characteristics, is what I call the “dispensing of
existence.” This principle is usually metaphorical. But if one has an
absolute or totalistic vision of truth, then those who have not seen
the light—have not embraced that truth, are in some way in the
shadows—are bound up with evil, tainted, and do not have the right to
exist. There is a “being versus nothingness” dichotomy at work here.
Impediments to legitimate being must be pushed away or destroyed. One
placed in the second category of not having the right to exist can
experience psychologically a tremendous fear of inner extinction or
collapse. However, when one is accepted, there can be great
satisfaction of feeling oneself a part of the elite. Under more
malignant conditions, the dispensing of existence, the absence of the
right to exist, can be literalized; people can be put to death because
of their alleged doctrinal shortcomings, as has happened in all too
many places, including the Soviet Union and Nazi Germany. In the
Peoples Temple mass suicide/murder in Guyana, a single cult leader
could preside over the literal dispensing of existence—or more
precisely, nonexistence—by means of a suicidal mystique he himself had
made a part of the group’s ideology. (Subsequent reports based on the
results of autopsies reveal that there were probably as many murders
as suicides.) The totalistic impulse to draw a sharp line between
those who have a right to live and those who do not—though occurring
in varying degrees—can become a deadly approach to resolving
fundamental human problems. And all such approaches involving totalism
or fundamentalism are doubly dangerous in a nuclear age.

				I should say that, despite these problems, none of these processes
is airtight. One of my purposes in writing about them is to counter
the tendency in the culture to deny that such things exist; another
purpose is to demystify them, to see them as comprehensible in terms
of our understanding of human behavior.

				Dr. Lifton wrote _Witness to an Extreme Century: A Memoir_ (Free
Press, 2011). I was fortunate to sit down with him for two videotaped
interviews, which are on the freedomofmind.com website.


Chapter 4 Endnotes
				74.  Robert Jay Lifton, Thought Reform and the Psychology of
Totalism (New York: W.W. Norton & Company, 1961).
				75.  I, Louis Jolyon West, Jon Atack and others believe that the
personality is formed of a continuum of many identities, so the
authentic personality is overtaken by the cult identity. These “parts”
are also referred to by therapists who do “ego-state” therapy.
				76.  “Jury Indicts 9 Linked to Synanon,” The Cult Observer (Oct
1985), from The New York Times (Oct 2, 1985). “Point Reyes Light Wins
$100,000 Settlement from Synanon,” The Cult Observer (March/April
1987). Steve Allen, Beloved Son (Indianapolis, New York: The
Bobbs-Merrill Company, Inc., 1982), 187-194. Myrna Oliver, “Two
Synanon Members Get Year in Jail,” Los Angeles Times (November 22,
1980).
				77.  Moon made this speech to an audience of several hundred
people during the summer of 1975 in upstate New York.
				78.  See Adorno, Frenkel-Brunswik, Levinson, Sanford, The
Authoritarian Personality (New York: Harper & Brothers, 1950).
				79.  Solomon Asch, “Effects of Group Pressure Upon the
Modification and Distortion of Judgment,” in Groups, Leadership, and
Men, ed. M.H. Guetzkow (Pittsburgh: Carnegie, 1951). Solomon Asch,
“Studies of Independence and Conformity: A Minority of One Against a
Unanimous Majority,” Psychological Monographs, 70 (1956).
				80.  Stanley Milgram, Obedience to Authority (New York: Harper &
Row, 1974), xii.
				81.  He used this chapter and the preceding one of this book in
the course materials. I conducted a videotaped interview with Dr.
Zimbardo on mind control which is on the freedomofmind.com site.
				82.  The original edition of Combatting Cult Mind Control used the
four components, but it was Rev. Buddy Martin who suggested that I
change the order and use the acronym BITE instead. Many thanks!
				83.  Leon Festinger, Henry W. Riecken, and Stanley Schachter, When
Prophecy Fails (Harper & Row, 1964).
				84.  Ibid.
				85.  Fred Clarkson, “Moon’s Law: ‘God is Phasing Out Democracy,’”
Covert Action Information Bulletin No. 27 (Spring 1987), 38.
				86.  The appendix to George Orwell’s Nineteen Eighty-Four gives an
excellent description of the use of language to restrict thought.
				87.  Michael Mahoney and Carl Thoreson, Self-Control: Power to the
Person. (Monterey, California: Brooks/Cole, 1974).
				88.  Kurt Lewin, “Frontiers in Group Dynamics: Concept, Method,
and Reality in Social Science,” Human Relations, 1947.
				89.  Edgar H. Schein, Coercive Persuasion, 1961 (The Massachusetts
Institute of Technology, W.W. Norton, 1971).
				90.  One of the best books I’ve read on linguistic double binds is
Milton Erickson’s Hypnotic Realities (New York: Irvington Publishers,
1976).
