The History of Credibility Attacks Against Former Cult Members (Scientology Spies)

The History of Credibility Attacks Against Former Cult Members | Stephen A. Kent, PhD (2011)

Download as .pdf

This paper discusses the controversy within social science (Kent is a sociologist) regarding the value of the testimony of ex-members of “high-control groups” – including occasional references to Scientology.

In it, a prominent academic who has worked extensively with ex-members of these groups (and is an expert on Scientology) discusses,

[the problems] that have arisen with cult critics attempting to work with some former members, or at least people claiming to have left various groups.

and notes that,

A brief history of those problems, therefore, provides a cautionary tale worth telling in anti-cult
or counter-cult circles.

It is by keeping half an eye on the possibility that a few ex-members are not completely reliable that Kent has developed his formidable reputation as an accurate and objective scholar. This makes his criticisms of “high control groups” all the more effective – for example, his condemnation of Scientology’s labour camps (the RPF) for human rights abuses.

This is a good lesson for activists, who may find their own long-term credibility damaged by an encounter with any of the six types of potentially unreliable informants that Kent describes.

These are:

  1. Forced deconverts
  2. Returnees
  3. Delusional alleged former members
  4. Con artists, spies, and ex-members with ‘histories’
  5. Professional former member anti-cultists
  6. Former members who become professionals

Brainwashing and Deprogramming – a Historical Note

The first two types are (thankfully) more of a historical note than a present-day problem. We have discussed (1) “forced deconverts” and (2) “returnees” previously, in a post about ‘brainwashing’ and ‘deprogramming’.

‘Brainwashing’ is a flawed concept that led logically to the practice of ‘deprogramming’ – if you can change people’s beliefs by force, then it can be argued that it is morally defensible to use the same tactics to make them ‘normal’ again. This was the rationale behind ‘deprogramming’ – people were effectively kidnapped and subjected to coercive techniques.

Part of this process was to manipulate the subject into writing and signing a ‘statement’ describing all of the negative aspects of their involvement, supposedly to enable them to put their experience behind them.

This provided cults with two effective ways to attack the credibility of critical groups:

  1. The statements were often ‘atrocity stories’ designed to enhance the reputation of the (paid) ‘deprogrammer’ rather than to reveal the truth. Cults could defend themselves by pointing out the errors and exaggerations in these texts, and destroy the credibility of the groups opposing them.
  2. Some people returned to their cult after ‘deprogramming’. This is hardly surprising, as the process did not address the reasons that they had been attracted to membership in the first place. This completely invalidated the ‘statements’ upon which critics may have relied.

Trust and Verify

The remaining points have contemporary relevance, and activists would be well-advised to bear Kent’s advice in mind. Scientology’s practice of ‘training’ spies to infiltrate and report on the activities of critics is specifically discussed in point 4.

In conclusion, I should like to make it clear that Kent is not hostile to the use of testimony from ex-members. Quite the contrary – he acknowledges his debt to such testimony and argues that it should be taken far more seriously than it is among social scientists, some of whom will automatically reject it as ‘tainted’.

Kent strongly opposes such an attitude, noting that,

The total rejection of the ‘testimonies’ of former members is not social science, and future generations of scholars will look back on this rejection with incredulity. What should matter in the social sciences is that researchers obtain accurate information under ethical circumstances.

Regardless of who provides it, social scientists simply should attempt to verify its content by comparing it to information that others provide or that the researchers obtained in other ways—a process called triangulation. The more that independent sources point to the same facts, the higher the likelihood of the facts being accurate.

Rejecting former members’ accounts, therefore, without checking them is more than simply bad social science, it actually is ideology. It is a refusal to question one’s basic assumptions that privileges the controversial groups—the cults— themselves. It privileges these groups by categorically excluding from research the wealth of information that people have who have seen these groups from the inside.

