In just a month, some 15,000 users have joined a Reddit community to share their stories of how the QAnon conspiracy is destroying their personal relationships.
“No longer speaking with my mother,” one user wrote. “Thanks a lot, Q.”
“My wife was arrested as a result of Q,” another posted.
“I lost my friends because of Wayfair,” wrote another user in July, referring to an iteration of QAnon that holds that the eponymous furniture store is actually a cover for child trafficking. “It’s just a Facebook group of friends but being a military wife they’re all I have.”
The circumstances of the stories posted to r/QAnonCasualties differ, but they share a common thread: The sprawling, complex, and entirely invented QAnon conspiracy theory has effectively brainwashed people close to the posters. These users just want them back.
The toxic influence of the conspiracy theory is no small matter. A QAnon supporter, Marjorie Taylor Greene, has just won her Georgia Republican primary and is almost certain to be elected to the U.S. Congress this November. NBC reported this week that QAnon’s Facebook followers can be counted in the millions, to say nothing of its adherents on 4chan, Gab, YouTube, and other platforms.
Its reach is global. Posters to the subreddit hail from the United Kingdom, Poland, and farther afield. As Foreign Policy has reported previously, the conspiracy theory has become married to Donald Trump’s political movement, it has infiltrated an Iranian dissident group, and it may have even inspired a would-be assassin in Canada.
But on r/QAnonCasualties, the threat is not general or abstract but real and personal. Posters speak of families split apart, relationships ended, friendships canceled. The subreddit offers a painfully instructive window into how conspiracy theories manifest in everyday lives and how social media has become an incredibly powerful diffuser of even the most outlandish and foolish conspiracies.
As with all online communities, the stories are nearly impossible to verify, and it is possible these testimonials are invented or embellished, although they track closely with reported stories from the Q orbit.
For those unfamiliar with the Q mythology, a primer: In 2016, an allegation emerged that Hillary Clinton and the Democratic Party were embroiled in a long-running plot to abduct and traffic children, in part to harvest a chemical called adrenochrome, supposedly used to keep aging politicians young. According to a pseudonymous poster to 4chan and 8chan who claimed to hold “Q” clearance—in reality, a not particularly unusual or high-ranking security clearance within the U.S. Energy Department—Trump was part of a crusade to rid the world of this child-trafficking deep state.
Since then, every manner of conspiracy has been bolted and welded on, from David Icke’s belief that lizard people run our governments to the hypothesis that 9/11 was an inside job, and whole new conspiracies have been invented, like the allegation that hollow Wayfair furniture contains trafficked children. Amid the COVID-19 pandemic, the theory has gained steam by subsuming all manner of fringe theorizing, from Bill Gates orchestrating the pandemic to mask usage being a government plot to pacify the masses.
These stories are fictional: They have been consistently debunked and can often be easily disproved. Adrenochrome confers no youth, Wayfair furniture does not contain children, and lizard people aren’t real. Yet Q’s followers are taught to ignore all sources of information that do not comport with the reality they’ve accepted. Theirs is a war against the deep state—the Democratic Party, RINOs (Republicans in Name Only), the media, Hollywood actors. The enemies of the people will do whatever it takes to stop their quest for justice and therefore can’t be trusted.
Big names have lent the movement their credibility—or whatever is left of it. Michael Flynn, Trump’s short-lived national security advisor, and trickster-cum-advisor Roger Stone have both signaled support for the conspiracy.
But Q doesn’t depend on high-level adherents. It’s a memetic invasion created, fed, and propelled by the social media ecosystem. Where once there may have been a single Q, there are now thousands. The followers of Q are, themselves, Q, propelling the conspiracy forward with their own misguided research, elaborately crafted hypotheses, and (sometimes) doctored evidence. It is a movement of collective leadership.
“Your voice and your vote matters,” one Q post promises. “Are you ready to serve once again?” another post asks its followers. The informal motto of the movement is ubiquitous in Q circles: WWG1WGA—Where We Go One We Go All. There is even a quasi-Christian element to the movement, pitting the religious crusaders of Q against the satanists of the deep state.
So, how do you beat it?
While r/QAnonCasualties has become a support group of sorts, it has also become a hub to swap stories of how QAnon adherents have been successfully deprogrammed and how their families are fighting back against the grip of the compelling conspiracy.
“Every time my dad leaves his ipad sitting alone, i go to his youtube app and unsubscribe from the [QAnon] channels he’s been watching,” one Reddit user wrote. “I know this is a breach of privacy, but fuck it. I don’t want him to become too far gone.”
Another user reported how they showed their QAnon-following mother just how easy it can be to change a webpage, by using Google Chrome’s inspector tool. “She’s mentioned it in a couple of our calls since, saying things like, ‘I know this could be fake because of that thing you showed me…’” the user wrote. “It’s small, but it’s one of the few bits of progress I’ve seen in getting her to realize she might be in over her head and doesn’t know the extent to which she could be manipulated with seemingly hard-cut evidence.”
The subreddit also offers an encyclopedic guide for dealing with those in the grips of the worldview-defining conspiracy theory and debunking the outlandish claims of Q. The guide includes testimony from ex-followers, debunking resources, and actual advice on how to fight sex trafficking.
Getting through to those strung along by Q can be difficult, however. One user recounted how her husband had fallen deep into the QAnon rabbit hole. “I can’t talk to him because he just shouts over me,” she wrote. “I miss him so much and am so lonely now.”
One poster’s father no longer believes in QAnon but now believes, they wrote, that “Q was created by the deep state as controlled opposition in order to make conspiracy theorists look stupid.” They confessed: “I don’t know if this is good or bad news.”
There are markers that seem to make people particularly susceptible to the conspiracy. Many users report that their loved ones struggle with mental illness or depression, or are on the autism spectrum. Some have been longtime conspiracy theorists. While a number of followers are younger, many are older—technological illiteracy may be a significant factor.
Social media and the internet have magnified this effect. By flattening the field and allowing laypeople to do an enormous amount of research themselves, they have drafted the whole movement to help construct the church of Q. Formats like Twitter and 4chan prioritize pumping out information, not providing context or credibility. For many users, there is no substantive difference between receiving information from a Twitter account with a blue checkmark and receiving information from an account with a handful of followers. The same goes for news sources: It can be difficult to tell from a landing page whether a website employs a thousand professional journalists or a lone writer.
Apps and stand-alone social media platforms allow Q followers to decamp to a space where their worldview is only ever affirmed, never challenged. A raft of QAnon apps were removed from the Play Store by Google in May, while one Q-linked video platform is available from Apple’s App Store.
A study published last year by three Australian researchers found that general psychological traits, like impulsive decision-making and “intensified perceptions of life stress,” tended to correlate with a belief in conspiracies. That largely echoed similar research that has been published over the years.
In a soon-to-be-published follow-up study, the same three researchers interviewed more than 600 people on their beliefs in conspiracies, especially around COVID-19. Those who bought into these theories, the study found, are not necessarily stupid or gullible—but they “are likely to have a pre-existing tendency to interpret information in a way that finds patterns, connections and causal relationships in events.” This, coupled with a strong confirmation bias, further means “that all new information will tend to be explained by existing belief systems when facing unexplainable events.” (Stress, however, appeared to be less of a factor than they anticipated.)
“In particular, [conspiracy theory] believers may find it hard to believe that a virus could originate randomly from the natural world because it does not fit with their preconceived view that events have a reason and usually a human or government influence behind it,” they write.
To that end, conspiracy theories may be a sort of modern twist on fears of witchcraft. Across eras, and across civilizations, human beings have long sought out supernatural or outlandishly complicated explanations for unpleasant truths. The witch craze of Salem, Massachusetts, offered a demonic explanation for local strife that came as a flood of refugees arrived from the wars of eastern Canada. The Azande of North Central Africa believed, according to the English anthropologist E.E. Evans-Pritchard, that mishaps could generally be blamed on witchcraft, lest there be strong evidence that human error was to blame. “Witchcraft participates in all misfortunes,” he wrote in 1937. Children often figure into these collective delusions, such as the mass hysteria that racked North America throughout the 1980s and 1990s, leading to accusations, often baseless, against various teachers and child care workers of satanic influence and sexual abuse.
But, as r/QAnonCasualties makes abundantly clear, QAnon marries those moral panics with the trappings of a cult. Given the seriousness of what it alleges, QAnon consumes its followers, demanding they live their lives in service of protecting children and vanquishing the deep state. Family members on Reddit describe their loved ones spending hours a day watching Q-affiliated videos and spending hours more lecturing others on their research. Some may even be driven to violence. A noted mob boss was allegedly gunned down by a man who believed he was doing Q’s work, while another well-armed Q follower was arrested near Canadian Prime Minister Justin Trudeau’s home and charged with threatening the leader’s life.
If we can liken Q to a cult—many users of the subreddit would certainly do so—we do have some experience to draw on.
In 1977, Montreal-born Benji Carroll was brought to a hotel near the San Francisco Airport by his family. He had become an enthusiastic adherent to the Unification Church, a New Age Christian movement headquartered in South Korea known as the “Moonies.” (The movement is still around today and owns the conspiratorial Washington Times.) A network of his friends, who had flown down from Quebec, met him at the hotel and basically held him captive for days as they attempted to “deprogram” him. Essentially, it worked.
“What I went through,” Carroll would later say of the Moonies, “is nothing short of mind control.”
The ones who brought Carroll back to reality would later go on to found an organization mandated to exfiltrate and deprogram cult members.
In his 1989 dissertation, the psychologist Steve K.D. Eichel seemed to endorse the effectiveness of deprogramming but underscored that a personal rapport was crucial. And while it required rigor and planning, deprogramming “contained many of the elements typically associated with casual conversation,” he wrote.
Eichel posited that there were three parts to deprogramming: persuasive conversation, where the deprogrammers make a clear case that remaining with the cult is a mistake; teaching, where they lay out facts around cults and “mind control”; and a moral discourse, wherein the cult leadership is exposed as hypocrites.
Deprogramming has fallen out of fashion—and for good reason. Beyond the legal and human rights implications of abducting cult members, deprogramming itself became, in some hands, a tool of abuse or state oppression.
And while there have been successes, things are rarely as simple as sitting someone in a room and holding a conspiracy intervention. That’s especially true today, when access to the cult does not exist exclusively through a commune or a temple but through every smartphone and laptop.
The West has had to grapple with the power of the internet on its most disaffected as it has worked to detangle its citizens from the slick propaganda of the Islamic State.
A core tenet of efforts to combat online radicalization is meeting online users where they are, on platforms like Twitter and Telegram. A team of researchers from the United States and Canada argued in a 2016 paper that empathy and “civil, sustained conversations” can go a long way in breaking the hold on radicalized people. Humor—like photoshopping rubber ducks into Islamic State propaganda—may neutralize dangerous speech and help expand the reach of the countermessaging, they argued.
Yet QAnon, perhaps even more so than many cults or online radical movements, has built-in antibodies. It teaches its followers that everyone else needs to be deprogrammed—“red-pilled,” as in The Matrix. Fact-checkers are just part of the media conspiracy. Opponents of the conspiracy are just taking the mainstream media at face value, unwilling to do their own research. Through this lens, the verified Twitter user is actually less trustworthy than the independent researcher with 12 followers; the professional media website is less credible than the fringe blog.
Most perniciously, and something the users of r/QAnonCasualties know all too well, the movement tells adherents to ditch their loved ones if they can’t be recruited. “We are your family now,” QAnon followers tell each other.
To that end, Q does exactly what other cults have done: It offers family and faith to those most looking for it. That’s why families and religious leaders will ultimately be instrumental in breaking its hold.
QAnon, perversely, has even offered followers strategies to, in their minds, deprogram their friends and family. One QAnon guide offers tactics like “ask open-ended questions,” “let the other person be the expert,” and “when you sense that they’re starting to push back, quickly change the subject back to small talk and wait for a better time.”
How do you deprogram a deprogrammer?
There have been attempts that are unlikely to do much good. Last Week Tonight host John Oliver tried his hand at producing videos of prominent celebrities—from Alex Trebek to Billy Porter—to help combat conspiracy theories in the era of COVID-19, but the effort seems ill-designed, given that so much of QAnon revolves around claims that its opponents, especially celebrities, are in on it.
Indeed, a paper published in the journal Frontiers in Psychology in June found that when counterradicalization efforts—especially those from governments and NGOs—were applied to a random sample of individuals, “attempts to shape their perspective have the unintended effect of strengthening their ideological positions.” Clear attempts to change individuals’ minds from a source perceived as untrustworthy are doomed to fail.
The research found that a compelling driver for radicalization was a “need for closure”—a desire for certainty and an abhorrence of unpredictability. “[I]t reflects the degree to which people want to preserve their belief systems to avoid uncertainty,” the researchers found. “The greater such need, the greater the reactance to persuasive appeals.”
r/QAnonCasualties offers a tale of hope. And it may offer some interesting insights into how online radicalization and conspiracy movements can be countered.
After initially posting about her inability to get through to her husband, the one who would shout down any attempts to counter Q, the Redditor provided an update: “It might not work but I don’t think my husband is completely gone yet,” she wrote.
She had spent the better part of a day reading through her husband’s Twitter feed and meticulously researching QAnon. (“Unless you are ABSOLUTELY CERTAIN that your own mental health is strong, I would not recommend doing this,” she cautioned.)
When her husband came home from work, she wrote, she confronted him. “I was determined to remain calm, not slip into either belittling or patronising him and listen to what he had to say. … I just kept reminding myself that I was not angry at him. I was angry at Q.”
She posted, in some detail, the back-and-forth that followed. She clearly and patiently explained to him the logical fallacies and factual inaccuracies at the core of Q. “[T]hey have told you to restrict your information to only what they sanction as being true and to turn your back on and dismiss anything and everything that disproves or denies their ‘truth,’” she explained, per her retelling. “In what way is this NOT a cult?”
Days later, she posted another update. The confrontation about Q had spiraled into a larger conversation about their marriage and his mental health, but she had to finally lay down an ultimatum: It’s either me or Q. It seemed as if he was ready to pick the latter.
Just as she packed her bags, she wrote, he came up and apologized. He had deleted his Twitter account and gotten rid of the Q app. “I still think that we’ve got a long way to go,” she wrote, but signed off on a note of optimism.
It may ultimately fall to the family and friends of these Q boosters to convince them that, as the deprogrammers of decades prior did, their belief system has become skewed; to lay out how conspiracy theories take root, as Q has; and to showcase why figures like Trump and online Q proselytizers are, in fact, not the brave truth-tellers the conspiracy has led them to believe.
For now, however, Q keeps growing.