Overcoming The Bondage Of Victimization

Assumptions Of Mind Control

The theory of cult mind control is part of a contemporary adversarial approach to many cults, new religious movements, and non-traditional churches. In this approach, sociological and psychological terminology has been substituted for Christian terminology. Cult involvement is no longer described as religious conversion, but as mind control induction. Cult membership is not characterized as misplaced religious zeal but as programming. The cultist who leaves his group is no longer described as redeemed, but as returned to a neutral religious position. Rather than evangelism of cult members, we now have “intervention counseling.”

Biblical apologetics has been replaced by cognitive dissonance techniques. A parent’s plea has changed from “How can my adult child be saved?” to “How can my adult child revert to his/her pre-cult personality?” Biblical analysis and evangelism of the cults have become overshadowed by allegedly “value neutral” social science descriptions and therapy-oriented counseling.

The principal assumptions of the cult mind control model can be summarized under eight categories:

1. Cults’ ability to control the mind surpasses that of the best military “brainwashers.”
2. Cult recruits become unable to think or make decisions for themselves.
3. Cult recruits assume “cult” personalities and subsume their core personalities.
4. Cultists cannot decide to leave their cults.
5. A successful intervention must break the mind control, find the core personality, and return the individual to his/her pre-cult status.
6. Psychology and sociology are used to explain cult recruitment, membership, and disaffection.
7. Religious conversion and commitment may be termed “mind control” if they meet certain psychological and sociological criteria, regardless of their doctrinal or theological standards.
8. The psychological and sociological standards that define mind control are not absolute, but fall along a relative, subjective continuum from “acceptable” to “unacceptable” social and/or religious affiliation.

According to most cult mind control model advocates, no one is immune to the right mind control tactics used at the right time. Anyone is susceptible. For example, Steven Hassan, recognized as a premier source for the cult mind control model, writes in his book, Combatting Cult Mind Control, “Anyone, regardless of family background, can be recruited into a cult. The major variable is not the person’s family but the cult recruiter’s level of skill.” Dr. Paul Martin, evangelical director of a rehabilitation center for former cultists, writes,

“But the truth of the matter is, virtually anyone can get involved in a cult under the right circumstances. . . . Regardless of one’s spiritual or psychological health, whether one is weak or strong, cultic involvement can happen to anyone.”

Evangelical exit counselor Craig Branch told us in an interview that, even though he was extremely knowledgeable and experienced regarding cult mind control, he could still be caught by cult mind control administered at the right time by the right person.

The cult mind control model is based on the fundamental conviction that the cultist becomes unable to make responsible and rational choices or decisions (particularly the choice to leave the group), and that psychological techniques are the most effective means of freeing him or her to make decisions once more. This foundation is non-negotiable for the mind control model, and it is at the root of what we consider so flawed about the mind control concept.

We find this foundational conviction assumed in a 1977 article by evangelical sociologist Dr. Ronald Enroth describing recovery from cult mind control, in which he quotes Dr. Margaret Singer, an outspoken advocate of the cult mind control model:

In a situation removed from the reinforcing pressures of the cult, the ex-members are encouraged to think for themselves so that they are “once again in charge of their own volition and their own decision-making.”

Hassan asserts, on the basis of both his personal testimony and his field experience, that cult recruits cannot think for themselves or initiate decisions:

Members [of the Unification Church] . . . become totally dependent upon the group for financial and emotional support, and lose the ability to act independently of it.

Paul Martin asserts that cult mind control renders its victims virtually without responsibility for their actions or beliefs:

. . . the process whereby he or she was drawn into the cult was a subtle but powerful force over which he or she had little or no control and therefore they need not feel either guilt or shame because of their experience.

Cult mind control must be distinguished from “mere” deception, influence, or persuasion. At the core of what makes mind control distinctive is the idea that the individual becomes unable to make autonomous personal choices, not simply that his or her choices have been predicated on something false. British sociologist Eileen Barker, a critic of the mind control concept, points out this difference:

Recruitment that employs deception should, however, be distinguished from “brainwashing” or “mind control.” If people are the victims of mind control, they are rendered incapable of themselves making the decision as to whether or not to join a movement — the decision is made for them. If, on the other hand, it is just deception that is being practised, converts will be perfectly capable of making a decision — although they might make a different decision were they basing their choice on more accurate information.

Fundamentally, the mind control model assumes an inability to choose, while deception merely interferes with the accuracy of the knowledge one uses to make a choice.

Objection: The Brainwashing Connection

Representatives of the mind control model are inconsistent: they distance cult mind control from classic brainwashing, yet they also see continuity between cult mind control and the classic brainwashing attempts of the 1950s, carried out by North Korean and Chinese captors on American prisoners of war and by American CIA researchers. When critics of the mind control model point out the abysmal failures of classic brainwashing (discussed later in this article), advocates like Michael Langone reply that such skeptics have “misrepresented the critics’ [i.e., the cult critics who support the mind control model] position by portraying them as advocates of a robotization theory of cult conversion based on The Manchurian Candidate.”

However, there is also consensus among mind control model advocates that classic brainwashing is the precursor to contemporary cult mind control. Psychologist Dr. Margaret Singer underscores this connection in her preface to Langone’s book Recovery from Cults:

[M]y interest [in cult psychology and mind control] began during the Korean War era when I worked at the Walter Reed Army Institute of Research and studied thought-reform, influence, and intense indoctrination programs. Since then, I have continued the study of group influence.
In the 1960s I began to heed the appearance of cults and heard the descriptions of hundreds of parents who noticed certain changes in the personality, demeanor, and attitudes of their young-adult offspring who had become involved in cults. . . . The cults created programs of social and psychological influence that were effective for their goals. And I noticed especially that what had been added to the basic thought-reform programs seen in the world in the 1950s was the new cultic groups’ use of pop psychology techniques for further manipulating guilt, fear, and defenses.

This contradictory embracing and rejecting of the brainwashing connection is partially reconciled only by the nonsubstantive differences pointed out by mind control model supporters: (1) “brainwashing” is considered primitive and often ineffective; (2) “mind control” is claimed to be extremely powerful and compelling.

Hassan says, “Today, many techniques of mind control exist that are far more sophisticated than the brainwashing techniques used in World War II and the Korean War.”
