Guest post: Subordination of human experience to the claims of doctrine
Originally a comment by Bjarte Foshaug at Miscellany Room.
We have all heard trans activism, or wokeism in general, described as a cult. On the other hand there are people who seem to suggest that because these movements don’t exactly meet the formal definition of a cult, there is nothing to learn from the study of cults that’s at all relevant to the issue. Regardless of whether or not you think TRAs are a cult, I hope we can all agree that this is not a very compelling argument. Even if we accept the premise, the conclusion clearly doesn’t follow.
Robert Jay Lifton provided what still seems to be the most widely accepted definition of a cult in his article Cult Formation in 1981:
Cults can be identified by three characteristics:
1. a charismatic leader who increasingly becomes an object of worship as the general principles that may have originally sustained the group lose their power;
2. a process I call coercive persuasion or thought reform;
3. economic, sexual, and other exploitation of group members by the leader and the ruling coterie.
The most obvious way in which trans activism differs from this definition is that there is no clearly identifiable single leader with the authority of someone like Jim Jones or L. Ron Hubbard*. Also, while there is plenty of exploitation going on, I think there’s a fair case to be made that it’s less coordinated or organized than what we have seen in the case of, say, Scientology. Rather than one large confidence scheme, there are many smaller ones.
My focus in this analysis is on Lifton’s second characteristic. Perhaps the real issue isn’t whether or not TRAs are a “cult”, but whether or not they engage in thought reform (or “brainwashing”). The latter does not presuppose the former. After all, Lifton’s original work on the topic was about the re-education of political prisoners in Communist China rather than cult indoctrination. In Thought Reform and the Psychology of Totalism (1961) Lifton breaks down the thought reform process into 8 “themes”. So let’s see how well these apply to gender ideology.
1. Milieu Control
The most basic feature of the thought reform environment, the psychological current upon which all else depends, is the control of human communication. Through this milieu control the totalist environment seeks to establish domain over not only the individual’s communication with the outside (all that he sees and hears, reads and writes, experiences, and expresses), but also—in its penetration of his inner life—over what we may speak of as his communication with himself.
The applicability to gender ideology is obvious. What are the endless cancellations, dis-invitations, de-platformings, and book-bannings, not to mention mandatory DEI programs (all presupposing a gender ideology framework), demands for “safe spaces”, the endless days, weeks, or entire months dedicated to “trans something”, etc., other than milieu control on an industrial scale? If you can just coerce people into talking and acting as if they held the required beliefs and opinions, they have a stake in defending them, which creates an incentive to make themselves actually believe them (“I’m not the kind of sucker who would cave to external pressure, so if I did say those things, they have to be true!”). The justification spiral does the rest.
2. Mystical Manipulation
The inevitable next step after milieu control is extensive personal manipulation. […]. Initiated from above, it seeks to provoke specific patterns of behavior and emotion in such a way that these will appear to have arisen spontaneously from within the environment. This element of planned spontaneity, directed as it is by an ostensibly omniscient group, must assume, for the manipulated, a near-mystical quality. […] Included in this mystique is a sense of “higher purpose,” of having “directly perceived some imminent law of social development,” and of being themselves the vanguard of this development.
Among Lifton’s themes this is probably the hardest one to pin down since the manipulation, in order to make the planned behavior seem spontaneous, by necessity has to rely heavily on non-verbal social cues. Rainbow flags, gender-neutral restrooms, casual use of preferred pronouns or Genderspeak in general (#6) etc. may be among the less subtle ways of signaling what the officially “approved” beliefs and attitudes are. The part about “having ‘directly perceived some imminent law of social development,’ and of being themselves the vanguard of this development” seems to fit nicely with TRA rhetoric about being on the “right side of history”.
3. The Demand for Purity
In the thought reform milieu, as in all situations of ideological totalism, the experiential world is sharply divided into the pure and the impure, into the absolutely good and the absolutely evil. The good and the pure are of course those ideas, feelings, and actions which are consistent with the totalist ideology and policy; anything else is apt to be relegated to the bad and the impure.
Again the applicability to gender ideology is obvious. It doesn’t matter how far you go out of your way to stress that trans-identified people are entitled to the same rights, protections, dignity, and respect as everybody else. Unless you are prepared to uncritically accept the whole package of sex denialism, gender realism, the self-ID criterion of womanhood, the affirmation-only approach to gender dysphoria, the forced teaming of “LGB” and “TQ+” etc., not to mention tons of highly dubious postmodernist philosophy, Queer Theory, identity politics, social constructivism, standpoint epistemology, linguistic determinism etc. (#5), as part of the deal, you might as well be advocating for re-opening Auschwitz and resuming the mass-production of Zyklon B!
4. The Cult of Confession
Closely related to the demand for absolute purity is an obsession with personal confession. Confession is carried beyond its ordinary religious, legal, and therapeutic expressions to the point of becoming a cult in itself. There is the demand that one confess to crimes one has not committed, to sinfulness that is artificially induced, in the name of a cure that is arbitrarily imposed. Such demands are made possible not only by the ubiquitous human tendencies toward guilt and shame but also by the need to give expression to these tendencies. In totalist hands, confession becomes a means of exploiting, rather than offering solace for, these vulnerabilities.
This fits nicely with the obligatory public displays of owning up to your “cis privilege” and how you, as a member of the oppressor class, can never understand what trans people are going through, how nothing you can possibly do to compensate for this alleged shortcoming will ever be enough etc. The same goes for the obligatory groveling apologies and self-denunciations (very similar to the “self-criticisms” demanded in Mao’s China) required of anyone accused (rightly or not) of insufficient ideological purity (#3). Having conceded that the TRAs were right and you were wrong, you once again have a stake in proving your commitment to “doing better”.
5. The “Sacred Science”
The totalist milieu maintains an aura of sacredness around its basic dogma, holding it out as an ultimate moral vision for the ordering of human existence. […]. While thus transcending ordinary concerns of logic, however, the milieu at the same time makes an exaggerated claim of airtight logic, of absolute “scientific” precision. Thus the ultimate moral vision becomes an ultimate science; and the man who dares to criticize it, or to harbor even unspoken alternative ideas, becomes not only immoral and irreverent, but also “unscientific.”
Once again gender ideology gets full marks. Obvious examples of sacred science include the “sex spectrum”, the idea that a person’s “brain sex” (a.k.a. “gender”) can be different from the sex of their body, the alleged infallibility of a person’s ability to know his/her own brain sex, the idea that the only way to solve the supposed mismatch between gender identity and physical sex is medically “correcting” the latter, the idea that “changing sex” is even possible etc. Other examples might include the claim that the “sex binary” is a recent Western invention and remains alien to indigenous peoples uncontaminated by Western cultural imperialism. And, once again, all of it ultimately rests on tons of highly dubious postmodernist philosophy, Queer Theory, identity politics, standpoint epistemology, social constructivism, linguistic determinism etc.
6. Loading the Language
The language of the totalist environment is characterized by the thought-terminating cliché. The most far-reaching and complex of human problems are compressed into brief, highly reductive, definitive-sounding phrases, easily memorized and easily expressed. These become the start and finish of any ideological analysis. […] Totalist language, then, is repetitiously centered on all-encompassing jargon, prematurely abstract, highly categorical, relentlessly judging, and to anyone but its most devoted advocate, deadly dull: in Lionel Trilling’s phrase, “the language of nonthought.”
A complete analysis of the TRA use of loaded language would fill, not just a whole book, but an entire library. Not only are thought-terminating clichés (“Trans Women are Women!”, “Trans Men are Men!”, “Non-Binary Identities are Valid!”, “Trans Rights are Human Rights!” etc.) ubiquitous, but hardly anything they have to say makes sense except in the light of a million unstated (and very shaky) premises and impossibly sloppy inferences. Even the most central premises of their arguments, including such obviously relevant “details” as the definition of “woman”, what is meant by “trans rights”, and how said “rights” are supposedly violated by, say, women’s right to female-only spaces, are best left unspecified. Apparently simple words and phrases like “trans”, “cis”, “gender”, “gender dysphoria”, “man”, “woman”, “(non-)binary”, “trans rights”, “transphobia”, “trans medicine/healthcare”, “LGBTQ+” etc. are all shorthands and Trojan horses for tons of extremely dubious truth claims, value judgements, tortured inferences, circular definitions, “bad puns” etc. that have to be accepted unconditionally and without asking for specifics. E.g. it is always framed (without the analysis to back it up) as a matter of being for or against “trans rights”, the right of “trans children” to “healthcare” etc., as if the specific content of said “rights” had already been more firmly established than the laws of thermodynamics, when, in fact, this is very much a point of contention. Woke standpoint epistemology and the obligatory admonishments to “educate yourself” (which is “not the responsibility of marginalized people”, remember!) provide a blanket excuse for not bothering with evidence and placing the burden of proof squarely on your opponents.
7. Doctrine Over Person
This sterile language reflects another characteristic feature of ideological totalism: the subordination of human experience to the claims of doctrine. This primacy of doctrine over person is evident in the continual shift between experience itself and the highly abstract interpretation of such experience—between genuine feelings and spurious cataloguing of feelings. It has much to do with the peculiar aura of half-reality which a totalist environment seems, at least to the outsider, to possess.
Once again, the applicability to gender ideology is only too obvious. If all your senses, as well as instincts evolved over millions of years to identify potential threats or mates, are telling you that that seven-foot-tall, broad-shouldered, square-jawed, bearded person in the pink dress and blond wig is an obvious male, but the doctrine requires you to see “her” (with her “lady-cock”) as a “woman”, you have to reject the evidence of your senses and not only pretend to see a woman, but work to make yourself honestly mean it.
8. The Dispensing of Existence
The totalist environment draws a sharp line between those whose right to existence can be recognized, and those who possess no such right. […] Surely this is a flagrant expression of what the Greeks called hubris, of arrogant man making himself God. Yet one underlying assumption makes this arrogance mandatory: the conviction that there is just one path to true existence, just one valid mode of being, and that all others are perforce invalid and false. Totalists thus feel themselves compelled to destroy all possibilities of false existence as a means of furthering the great plan of true existence to which they are committed.
Labeling opponents as “TERFs” and “transphobes” (#6) does not just offer a convenient excuse for dismissing anything they might have to say in advance, without addressing the actual substance of their arguments, but serves to mark them as beyond the pale, even undeserving of life (“Kill TERFs!”, “Die Cis Scum!”, “Die in a Grease Fire!” etc.). Any concern about, say, allowing biological males to self-identify into women’s changing rooms or medical experimentation on vulnerable children and teenagers is not seen as sincere, but only as an arbitrary excuse for blind, genocidal hatred and evil for the sake of evil. Anyone who fails to take on board all the baggage supposedly implied (for reasons best left unspecified, #6) by “trans rights” is not just seen as misguided or wrong, but as a naked existential threat with whom no negotiations, compromise, or even peaceful co-existence is possible, someone who simply has to be crushed and defeated by any means necessary.
Lifton goes on to write:
The more clearly an environment expresses these eight psychological themes, the greater its resemblance to ideological totalism; and the more it utilizes such totalist devices to change people, the greater its resemblance to thought reform (or “brainwashing”).
I seem to remember cult expert Rick Ross arguing that at least 6 of Lifton’s 8 themes need to be present for a coercive persuasion effort to be considered thought reform or “brainwashing”. If so, I think there’s a strong case to be made that gender ideology qualifies. I strongly suspect that theme 2 (Mystical Manipulation) is applicable to some degree, although it is hard to demonstrate conclusively for reasons I have already mentioned. As far as the other themes are concerned, I say TRAs have earned a perfect score.
* Not all cult experts see this as a necessary criterion for being a cult, however. E.g. Steven Hassan seems to argue that any group that engages in coercive persuasion qualifies as a cult.
Great summary and analysis. It’s like they’ve used Lifton as a playbook.
My thoughts exactly! Another way in which studying cults, or high control groups and movements in general, could be useful might be to look into, not just how people are sucked into such groups, but how they leave. I strongly suspect that parents of trans-identified children could benefit greatly from studying the work of experienced cult “de-programmers” like Rick Alan Ross and Steven Hassan.
While I’m at it, here’s another thing I wrote, that I might as well share here. This one is more specifically about cult indoctrination, but most of it can be applied to trans activism, or wokeism in general, without too much effort.
Coercive Persuasion and Undue Influence 101
As any cult expert will tell you, nobody who’s joining a cult thinks they’re joining a cult, and nobody who’s being brainwashed thinks they’re being brainwashed. I suspect a major part of the reason is that Hollywood has been feeding us the wrong idea of what “brainwashing” looks like, and so we don’t recognize the real thing when we see it. Talk of “brainwashing” and “mind control” tends to conjure up images of all sorts of science fiction-like techniques for “reprogramming” people’s brains and turning them into mindless robots or zombies. In reality, of course, when a cult is working on you, it’s hardly ever like that. Most likely, no one is going to…
…attach any electrodes to your head.
…dangle a pocket watch before your eyes.
…make you stare into a rotating spiral till your brain turns to mud.
…tell you to “look deeply into my eyes”.
…tell you to “shut out everything else and focus on the sound of my voice”.
…etc. etc.
In fact several cult experts now reject terms like “brainwashing” and “mind control” for this exact reason, preferring terms that focus on the essentials: “coercive persuasion” and “undue influence”. I have previously described the real thought reform process as “almost disappointingly mundane”. Indeed, the mechanisms involved are all familiar from everyday life. Think “sleazy sales techniques”, not “the Winter Soldier”.
Deference to Authority
This one hardly needs any explanation. It’s the old “just following orders” excuse. The most famous study on deference to authority was performed by Stanley Milgram in 1961. In the Milgram study volunteers were supposed to assist a scientist in a white lab coat (the “Experimenter” – the study’s authority figure) in conducting an experiment. They were told that the purpose of the experiment was to see if people learn better when they’re punished for mistakes. The volunteer (the “Teacher”) was going to operate an electroshock device connected to an unseen person (the “Learner”) in an adjacent room. The Learner was going to memorize certain words and repeat them back to the Teacher. Whenever the Learner gave a wrong answer, the Teacher (the actual test subject) was to give him or her a shock, which would increase in strength with each mistake, from moderately painful to a dangerous, potentially lethal, voltage.
In reality, of course, the person in the adjacent room was an actor, and no one was actually getting shocked (that would have been a grossly unethical experiment!). The volunteers didn’t know that, however. As far as they could tell, they were giving increasingly painful and even dangerous shocks to total strangers who had done them no wrong. Soon the Learner was crying out in pain and complaining about heart conditions, breathing difficulties etc. By now most of the volunteers were getting very uncomfortable and looking desperately at the Experimenter for cues to end the experiment. The Experimenter, on the other hand, would prompt them to continue by saying things like “The experiment requires that you continue”. Although the volunteers were free to pull out at any moment without financial loss or other negative consequences to themselves, the majority couldn’t bring themselves to disobey the guy in the white lab coat, and in the end 65% went all the way to 450 V, a potentially lethal voltage! By then the Learner had seemingly gone quiet for good.
Group Conformity and Peer Pressure
Group conformity resembles deference to authority except that there is no formal difference in rank or power (hence the expression “peer pressure”), and everything feels voluntary in a way that obedience to authority does not. The most famous study of group conformity was performed by Solomon Asch in the 1950s: A group of volunteers were told that the purpose of the experiment was to look into something related to visual perception. The volunteers were then shown 12 pairs of cards. The first card in each pair displayed a single straight line. The second card displayed three straight lines, one of which was identical to the one on the first card, while the other two were either obviously longer or obviously shorter. The volunteers were then asked, one at a time, to state which line on the second card was identical to the one on the first card.
In reality all the “volunteers” except one were, once again, actors who had been instructed in advance to give the identical wrong answer 75% of the time. The actual purpose of the study was to see how the real test subject would react to a direct contradiction between the consensus of the group and what he or she could clearly see with his or her own eyes. In the end, 74% of the test subjects went with the group consensus at least once out of the 12 trials, and about 12% conformed in practically every case. But even the 26% who consistently gave the right answer tended to become very uncomfortable and unsure of themselves when their answers contradicted the group consensus. In follow-up interviews some of the test subjects confessed to knowingly giving the wrong answer out of embarrassment, fear of standing out etc. Others, however, were genuinely unsure what to think and wondered if something was wrong with them (“Wait, what?! Are my eyes deceiving me, or am I just losing my mind? What the Hell is going on here?!”). Yet others were absolutely convinced that the group consensus was correct (“It has to be me! Everybody else cannot be wrong!”).
Most People Think They’re Not Like Most People
Typical of both the Milgram study and the Asch study (as well as every other finding that reveals uncomfortable truths about human psychology) is that almost everyone thinks they’re an exception to the rule (nobody thinks they’re influenced by advertising, nobody thinks they would ever join a cult, nobody thinks they would ever have cooperated with the Nazis etc.). And yet, when you look at how people actually behave in a controlled study, the percentage who give in to authority and peer pressure vastly exceeds the percentage who think (or say) they would. After all, the latter percentage is practically zero.
Bottom line, our own self-assessment of how we would react in hypothetical situations has very little predictive power. Our self-image is like a heavily redacted and sanitized “authorized biography”, specifically framed to put us in the most flattering light possible. Most of us most of the time are not as rational, independent, or autonomous as we might like to think. Indeed, as we will see, this whole idea that “I could never fall for anything like that” gives cult recruiters perhaps their most powerful weapon against us. It’s a mindset that only works to the cult’s advantage and our detriment.
Inducing Shame and Guilt
Although different cult experts dissect the thought reform (or “brainwashing”) process in different ways, practically all of them include some version of what Robert Jay Lifton has called the “Cult of Confession”. Whether the pretext is getting you to confess your sins, confront your inner demons, or overcome the “excuses” that are “holding you back” and preventing you from “taking control” of your life, practically every cult requires you to share all your darkest secrets, with a special emphasis on everything you fear, everything you regret, everything you are ashamed of, all the stuff you probably wouldn’t share on Facebook. This serves several purposes [1], but by far the most important is identifying your weaknesses and hence which buttons to push to be better able to manipulate and control you. In the end, the specifics are pretty much irrelevant, since anything other than total surrender and blind obedience to the leader will be turned back against you and re-framed as sin, pathology, bigotry or bias, excuses, negativity, the wrong attitude, a loser mindset, weakness, lack of commitment, inconsistency, hypocrisy, arrogance etc. etc.
“Us vs. Them” Thinking
Most people most of the time don’t necessarily have a very clear idea of what they’re in favor of. On the other hand, everyone always has a perfectly clear idea of what – not to mention who – they are against. This can be used to manipulate you by re-framing anything other than total submission to the cult as agreeing with “them”. We all remember George W. Bush’s infamous “Either you are with us, or you are with the terrorists”. Every cult spouts some version of the same false dichotomy.
Somewhat ironically, one common perception is that people join cults because they’re too “trusting”, “credulous”, “naive”, “gullible” etc. Yet when you look at a typical cult there is no shortage of distrust, suspicion, cynicism, or even outright paranoia going on. Indeed, most cults are into all sorts of crazy conspiracy theories, and often see themselves as the only people on the planet who have not been “brainwashed”, “swallowed the blue pill”, “drunk the Kool-Aid” etc. It’s the “sheeple” and the “systemites” outside the cult who are living in the Matrix, while the enlightened few on the inside are the ones who have taken the red pill, had their eyes opened, and see the world “as it really is”. Apparently extreme distrust (especially of the selective kind!) can be manipulated as easily as naivety and gullibility [2].
Physical and Mental Exhaustion/Sleep-Deprivation
As we have seen, most of us most of the time are not as rational, independent, and autonomous as we might like to think. But let’s say, for the sake of argument, that you really are among the tiny minority who cannot normally be swayed by authority, peer pressure, emotional blackmail, or even the desire to distance yourself from some hated outgroup. Not normally, that is. How about at four o’clock in the morning after being engaged in non-stop exhausting activities all day and all night?
As someone who works regular night shifts, I can definitely say that I don’t do my best thinking in such a situation, and could probably be talked into making concessions that I wouldn’t normally make. I’m pretty sure it’s not just me either. There is a reason why one of the most popular methods for forcing a confession out of a suspected criminal (especially in regimes we don’t like being compared to) is to keep them awake for days on end. And one of the reasons it’s such a terrible idea is that if you keep it up for long enough, people will eventually confess to things they haven’t done just to make the torment stop.
The Boiling Frog Effect
It is often said (rightly or not) that if you put a frog into a bowl of scorching hot water, the frog will instantly jump out and save its own life. If you put that same frog into a bowl of comfortably lukewarm water and gradually heat the water to that very same scorching hot temperature, the frog will look in vain for the “line” where the temperature goes from “definitely acceptable” to “definitely unacceptable”, and, not finding a sharp line, it will remain passive and indecisive while boiling to death. We observe a similar behavior in humans. The human brain evolved to react strongly to a sudden crisis or emergency. It did not evolve to react strongly to a gradual worsening of conditions over time.
Joining a cult is kind of like stepping into a pool of water that’s slowly getting hotter. Your first encounter with the cult is usually a positive experience (the “comfortably lukewarm” stage), and you don’t want to give it up lightly. Many of the people who recruited you may also become your friends, and you don’t want to disappoint or alienate them. So when the first red flags start popping up (usually sooner rather than later), your first inclination might be to give the group the benefit of the doubt.
Cults are counting on you to think that way since it buys them time to work on you. As hard as it may be to walk out today, you can be damn sure it’s going to be harder tomorrow, and even harder the day after that. By the time it really does get “so bad” (the “boiling” stage), you are already caught and can’t get out.
“All Questions Will Be Answered in Due Time”
Another way in which the cult buys itself time to work on you is by playing on your curiosity. When you join a cult, you are always given the impression that some cosmic revelation is waiting for you right around the next corner (and then the next, and then the next etc. etc.), and that if you stick around for just a little while longer, all questions will finally be answered. When former cult members look back, they find that those questions were never answered [3]. They just learned to stop asking questions and to parrot back the same slogans and thought-terminating clichés as everybody else.
Cognitive Dissonance and Rationalization
As we have seen, vulnerability to cult influence has very little to do with general brain power. Things like intelligence, logical and analytical skills etc. are tools for achieving goals. Bringing our beliefs into alignment with reality is one such goal, but far from the only one. Other examples might include psychological comfort, winning arguments, getting along with our peers, siding with our own tribe, opposing the people on the other side etc. Arguably the most potent of all human motives is the need to protect our own self-image as intelligent, rational, autonomous, moral agents. The discomfort we feel when confronted with information that threatens this self-image has been called “cognitive dissonance”. In such situations our intelligence turns into an apologist, excusing and explaining away our follies, and making up reasons why we were, in fact, right all along [4].
Leon Festinger presented his theory of cognitive dissonance in When Prophecy Fails. In 1954 Festinger and some of his colleagues infiltrated a small UFO cult in Chicago. The cult leader had predicted the end of the world on December 21st that same year – an ideal opportunity to test Cognitive Dissonance Theory’s predictions about what to expect when the prophecy (presumably) failed to come true. Festinger predicted that – far from losing faith – the most committed [5] cult members would reinterpret the failure as success [6] and proselytize for the cult with renewed enthusiasm. In this Festinger was basically proven right. A few hours after midnight of December 21st had come and gone without incident, the cult leader claimed to have received a new revelation: The cult members’ strong faith had prevented the catastrophe [7]. Encouraged by the “miracle” (i.e. that nothing happened) the cult members went out into the world to spread the good news!
I think the same need to protect our egos plays a major role in how people are recruited in the first place. And this is where the idea that only weak, stupid, or crazy people can be taken in by cults really works against us. I seem to remember former cult member turned anti-cult activist and “deprogrammer” Steven Hassan once saying that when he was recruiting for the Moonies back in the 1970s, the ideal recruit was someone who thought they were too clever to be recruited by a cult. So at the risk of sounding like a tinfoil-hat-wearing conspiracy nut, that’s exactly what “they” (i.e. cult recruiters) want you to think. It’s a mindset that only works to the cult’s advantage and your detriment.
It’s not just that being that overconfident means your guard is down and you’re less vigilant (although that is definitely also a problem!). But, perhaps more importantly, if you’re that invested in your self-image as too clever to be recruited by a cult, then, if the cult can just get you to make some small concessions (perhaps in your sleep-deprived state at four o’clock in the morning after hours of relentless guilt-tripping and peer pressure), you’re that much more prone to rationalize them. The idea that perhaps you weren’t too clever after all simply doesn’t compute (think “Syntax error”), so if you did make those concessions it had to be for a good reason.
So you rationalize, i.e. you come up with some spurious after-the-fact justification for why making those concessions was indeed the clever thing to do. And, of course, once made, those rationalizations don’t just exist in a vacuum: they now become part of the lens through which you look at every other question. The same rationalizations used to justify concessions a, b, and c make it very hard to resist concessions d, e, and f without looking inconsistent or hypocritical even to yourself (practically the definition of cognitive dissonance!). And if the inconsistency or hypocrisy is not immediately obvious to you, you can be damn sure the cult leader (as well as every other person in the room) is going to point it out to you.
So you get caught in the logic of your own rationalizations (in what some have called a “justification spiral”): Going back and admitting that you should never have made concessions a, b, and c leads to cognitive dissonance (you’re way too clever to make a mistake like that, remember!), but staying where you are and going no further leads to charges of inconsistency and hypocrisy and hence even more cognitive dissonance (after all, you’re not an inconsistent hypocrite!). The only way to go is forward. So you make concessions d, e, and f, which, by the same logic, make it very hard to resist concessions g, h, and i, etc. etc. In the end, by the time you get to x, y, and z, even forcing Kool-Aid mixed with cyanide down your children’s throats with syringes before drinking it yourself may appear less unacceptable than admitting to yourself and the world that you were wrong all along.
How do normal people end up drinking the Kool-Aid, crashing planes into buildings, sending people to the gas chamber? One concession at a time. There is obviously more to it, but ultimately I think this is how you get caught. This is when the cult owns you! I think highly educated and intelligent people are often the easiest people to recruit, in part precisely because they think they’re too clever to be recruited, but also because their intelligence makes them even better at rationalization [8]. Self-identified “intellectual” types also tend to be more heavily invested in their self-image as too clever to be conned and therefore more vulnerable to the kind of justification spirals outlined above.
________________________________________________________________________________________
[1] One such purpose might be to blackmail you into silence if you should ever decide to leave the cult. Or you might be required to share your secrets with a (cult-affiliated) third party so the cult leader can appear psychic or omniscient by feeding them back to you later (i.e. an act of “hot reading”).
[2] In my experience conspiracy theorists are among the most gullible people on the planet. Throw out any accusation against the people they already hate, and all standards of plausibility and reasonableness go out the window!
[3] Or, if they were, the answer turned out to be something utterly silly and anticlimactic like the Xenu story.
[4] Mistakes Were Made (But Not by Me) by Carol Tavris and Elliot Aronson is essential reading in this regard.
[5] The importance of “saving face” can hardly be overstated. The more we have invested in a cause, a belief system, a group affiliation etc., the more we have a stake in defending it. The more public the commitment, the harder it gets to take it all back without public humiliation and loss of face.
[6] There is a plausible argument to be made that the central doctrine of Christianity originated in a similar manner: The long awaited Messiah was expected to be a warrior king who was going to destroy Judea’s enemies and restore David’s mighty empire. Instead the early Christians saw their chosen Messiah executed and the dream of sovereignty crushed. Reinterpreting the crucifixion as an act of atonement was a convenient way to explain away the failure and claim victory after all.
[7] Probably the most common rationalization when end-time prophecies fail is to claim that the apocalypse really did happen, but on a spiritual level. Luke 17:20-21 can plausibly be understood in this way.
[8] According to Joachim C. Fest, the one truly brilliant mind among the ruling elite of the Third Reich was Joseph Goebbels, the great spin doctor.
“All Questions Will Be Answered in Due Time”
This is why I seldom click on links to some video ad claiming to be about something that sounds interesting.
Too often the video goes on about something (at best) tangential to the alleged subject and may or may not get to the point.
If the link is to a text I can skim through it, look for information related to the thing I was originally curious about, and decide whether it is worth reading the whole piece.
Another reason I generally prefer text is that it goes at exactly the speed at which I comprehend it.
Oh yes, the Click-Bait and Switch… I have even seen it from scientific sources, e.g. several years ago there was a lot of hype around a study allegedly showing that “Black Holes Don’t Exist!”, when all it really said was that the event horizon probably didn’t work quite the way physicists used to think.
To bring it back to the topic of coercive persuasion and undue influence, one major weakness of the literature is that most of it (well, the parts I have read anyway…) mainly deals with the way things used to work back in the pre-internet era. There are now cults and other high control groups and movements that operate entirely online. In a world where people are increasingly spending most of their lives online, the algorithms that decide what kind of material we are exposed to have taken milieu control to levels the Chinese Communists of the 1950s could never have imagined, as phrases like “echo chambers”, “information bubbles”, and “information silos” attest. We all have a tendency to selectively seek out sources that say the things we already think. Social media algorithms have put this confirmation bias on steroids. Suddenly we no longer even have to seek out material that confirms our own views: the more we click on a certain kind of material, the more the algorithm keeps feeding us more of the same.
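Just to make that feedback loop concrete, here is a minimal toy simulation (the topics, numbers, and update rule are all invented for illustration; real recommender systems are vastly more complicated). A feed starts out neutral between three topics, the simulated user has only a mild preference for one of them, and every click makes the feed more likely to serve that topic again:

```python
import random

# Toy model of a click-driven feed (hypothetical parameters throughout).
# The feed starts out neutral between three topics; every click on a topic
# makes the feed more likely to serve that topic again.

TOPICS = ["A", "B", "C"]

def run_feed(steps=1000, bias=0.6, seed=42):
    rng = random.Random(seed)
    weights = {t: 1.0 for t in TOPICS}          # neutral starting point
    for _ in range(steps):
        # The feed picks a topic in proportion to its current weight.
        r = rng.random() * sum(weights.values())
        for topic in TOPICS:
            r -= weights[topic]
            if r <= 0:
                break
        # The user is only mildly biased: they click on "A" 60% of the
        # time it appears, and on anything else 40% of the time.
        if rng.random() < (bias if topic == "A" else 1 - bias):
            weights[topic] += 1.0               # each click reinforces the topic
    total = sum(weights.values())
    return {t: w / total for t, w in weights.items()}

print(run_feed())   # topic "A" ends up taking over most of the feed
```

Run long enough, the mild preference snowballs until “A” crowds out nearly everything else. Nobody had to seek out a bubble; the reinforcement loop builds it on its own.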
There are studies that appear to show that when groups of like-minded individuals get together, whether online or in “meat space”, they tend to make each other more certain of their views and more extreme in their positions than any of them are as individuals. In part this can probably be explained by the effect of constantly having your own beliefs and opinions confirmed from the outside. But I also suspect that a situation like that tends to encourage a certain competition to be more “on the right side” than everybody else, or at the very least not less so. In Mao’s China the main purpose of the endless purges and show-trials wasn’t to smoke out any real dissenters, but rather to convey the message that the next one accused could be you.
Of course, the way to make sure it wasn’t you was by making sure it was somebody else. It wasn’t enough to be innocent of any heretical tendencies. Insufficient eagerness to accuse and inform on others could be enough to get you into trouble yourself. It didn’t matter how actively complicit you were in pursuing heretics and thought criminals if most of your neighbors were even more complicit. That way everyone was forced to compete to stay out of trouble, and someone was inevitably going to lose. There was no way to be safe.
I think much of the same logic applies to online purity spirals: there is no fixed line such that, as long as you’re on the right side of it, you’re OK. If you are, say, on the woke left, you have to be further to the woke left than at least some of your peers, no matter how far to the woke left that happens to be. What matters is not your absolute level of wokeness, but your level of wokeness relative to the rest of the group. In any group, the least woke member (even if more woke than everyone outside the group) might as well be Adolf Hitler! To avoid ending up last in line, you have to keep running as fast as you can in the same direction, and so the Overton window keeps moving further and further to the woke left.
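That ratchet is easy to see in a toy simulation too (a minimal sketch; the group size, starting positions, and overshoot are all made-up numbers). Each simulated member only cares about not being below the group average, yet the average climbs every round:

```python
import random

# Toy model of a purity spiral (hypothetical parameters throughout).
# Each member only cares about their position *relative* to the group:
# being below the group average is dangerous, so laggards overshoot it.

def purity_spiral(n_members=20, rounds=10, seed=1):
    rng = random.Random(seed)
    positions = [rng.uniform(0.0, 1.0) for _ in range(n_members)]
    for rnd in range(1, rounds + 1):
        mean = sum(positions) / n_members
        for i, p in enumerate(positions):
            if p < mean:
                # Merely matching the average still leaves you last in
                # line, so anyone below it overshoots it by a little.
                positions[i] = mean + rng.uniform(0.0, 0.2)
        print(f"round {rnd}: group average = {sum(positions) / n_members:.2f}")

purity_spiral()
```

The group average rises without bound even though no individual ever chooses to be “extreme” in absolute terms; every move is just an attempt not to end up last in line.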
Well, this is interesting. Although I have been trying not to read too much into it, it hasn’t escaped my notice that Ross and Hassan (the two most famous living cult “deprogrammers” on my radar) never seem to cite each other’s work. I’m not in a position to take any sides here (Hassan has definitely been saying some things that made me uncomfortable, but then again, so has Ross), but it goes to show, once again, that being against some of the same things (religion, the MAGA crowd, gender ideology, cults…) doesn’t mean we’re all on the same side, or “in it together”…