Unwelcome On Facebook And Twitter, QAnon Followers Flock To Fringe Sites

NEW YORK, NY - OCTOBER 03: A person wears a QAnon sweatshirt during a pro-Trump rally on October 3, 2020 in the borough of Staten Island in New York City. The event, which was organized weeks ago, encouraged people to vote Republican and to pray for the health of President Trump who fell ill with Covid-19. (Photo by Stephanie Keith/Getty Images)

January brought a one-two punch that should have knocked out the fantastical, false QAnon conspiracy theory.

After the Jan. 6 attack on the U.S. Capitol, the social media platforms that had long allowed the falsehoods to spread like wildfire — namely Twitter, Facebook and YouTube — got more aggressive in cracking down on accounts promoting QAnon.

Just two weeks later, Joe Biden was inaugurated president. That stunned those adherents who believed, among other things, that Donald Trump would stay in office for another term and that he would arrest and execute his political enemies.

“There’s no one cohesive narrative that’s really emerged yet. And I pin that on [QAnon] not really having a leader right now,” said Mike Rothschild, a conspiracy-theory researcher who is writing a book about QAnon.

The QAnon universe has two stars. There’s Q, the mysterious figure whose cryptic, evidence-free posts on anonymous online message boards spawned the baseless claim that a Satanic cabal of pedophiles runs rampant in government and Hollywood. The other star is Trump, who was supposed to expose and defeat that cabal.

But both figures have gone silent online. Whoever Q is, the account has pretty much stopped posting on those message boards since the election. Trump was kicked off Facebook, Twitter and Google’s YouTube after urging his supporters to go to the Capitol.

And yet, even as the big social media platforms try to squash harmful misinformation and hate speech, the conspiracy has survived in the darker corners of the Internet. As QAnon believers splinter onto different fringe platforms, experts warn they could absorb even more dangerous conspiracy theories and ideologies. What’s more, QAnon has gained defenders in conservative media and among Republicans in Congress.

The events of Jan. 6 underscore how QAnon has leapt from the fringes of the Internet to the real world. Some rioters in the pro-Trump mob that stormed the Capitol openly expressed support for QAnon. That prompted Twitter, which along with Facebook and YouTube had started limiting QAnon content last year, to clamp down even more.

In the days after the Capitol insurrection, Twitter banned 70,000 QAnon-linked accounts for spreading the conspiracy. Some were influencers with large followings — including high-profile Trump supporters Sidney Powell and Michael Flynn, who had also spread false claims of election fraud and tried to get the results overturned.

The result? “There isn’t one central place that people are finding information in terms of influential accounts, and it’s kind of become more disparate,” said Melanie Smith of the research firm Graphika.

Graphika found that among a dense network of 14,000 QAnon-promoting Twitter accounts it has been tracking, 60% are now inactive. That splintering makes it harder for harmful, even violent, ideas to gain traction — and less likely that unsuspecting Twitter users will stumble across them.

“For me [this] is about not exposing new communities to that type of content,” Smith said. “So in my mind, that’s a pretty big success.”

It’s hard to quantify just how many people follow QAnon. When NPR and Ipsos polled people about whether they believe QAnon’s core false claim — that “a group of Satan-worshipping elites who run a child sex ring are trying to control our politics and media” — 17% said it was true, and another 37% said they didn’t know.

While some people who may be susceptible to believing the falsehoods may never see them, Smith and other researchers warn there is a cost to that success: As QAnon influencers and their followers are pushed off mainstream platforms, some are migrating to apps with fewer rules, like the alternative social network Gab and the messaging app Telegram. There, they may be exposed to more extremist content, like white supremacist and neo-Nazi groups.

“What they’re essentially doing is walking straight into an incubator for radicalization,” said Jared Holt, a visiting research fellow at the Atlantic Council’s Digital Forensic Research Lab, where he studies disinformation and extremism.

QAnon is already incorporating new, unfounded theories from other extremists. Some accounts are latching onto obscure legal fictions promoted by sovereign-citizen groups, which deny the legitimacy of the U.S. government; one example is the bogus idea that Trump will be inaugurated on March 4.

Holt said such adoption is “just another example of something that QAnon has done repeatedly, if not constantly, which is crowdsourcing their idea of reality.”

QAnon’s constant evolution makes it harder for platforms like Twitter and Facebook to enforce their bans, since they must keep stamping out new theories and hashtags appropriated by QAnon believers.

But even if effective, the platforms’ actions can only go so far.

There are QAnon podcasts available through Apple and Google. Fox News host Tucker Carlson recently defended QAnon adherents. QAnon has even gained a foothold in the halls of Congress, where two Republican members have openly supported some of the movement’s baseless ideas.

“When you’ve got people like Tucker Carlson or sitting members of the House of Representatives talking about something, it’s hard to ban it,” said Rothschild, the researcher.

“I think this movement is now so mainstream and has pulled in so many people that it seems inconceivable that it will go away completely,” he said.
