Far-right using COVID-19 theories to grow reach, study shows
New research indicates that far-right extremists and white supremacists are gaining new followers and influence by co-opting conspiracy theories about COVID-19
The mugshot-style photos are posted on online message boards in black and white and look a little like old-fashioned “wanted” posters.
“The Jews own COVID just like all of Hollywood,” the accompanying text says. “Wake up people.”
The post is one of many that white supremacists and far-right extremists are using to expand their reach and recruit followers on the social media platform Telegram, according to researchers who sifted through nearly half a million comments posted from January 2020 to June 2021 on pages — called channels on Telegram — that they categorized as far-right.
The tactic has been successful: Nine of the 10 most viewed posts in the sample examined by the researchers contained misleading claims about the safety of vaccines or the pharmaceutical companies manufacturing them. One Telegram channel saw its total subscribers jump tenfold after it leaned into COVID-19 conspiracy theories.
“COVID-19 has served as a catalyst for radicalization,” said the study's author, Ciaran O’Connor, an analyst at the London-based Institute for Strategic Dialogue. “It allows conspiracy theorists or extremists to create simple narratives, framing it as us versus them, good versus evil.”
Other posts downplayed the severity of the coronavirus or pushed conspiracy theories about its origins. Many of the posts contain hate speech directed at Jews, Asians, women or other groups, or violent rhetoric that would be automatically removed from Facebook or Twitter for violating those sites' standards.
Telegram, based in the United Arab Emirates, has many different kinds of users around the world, but it has become a favorite tool of some on the far-right in part because the platform lacks the content moderation of Facebook, Twitter and other platforms.
The company did not respond to repeated messages seeking comment.
O’Connor said he believes the people behind these posts are trying to exploit fear and anxiety over COVID-19 to attract new recruits, whose loyalty may outlast the pandemic.
Indeed, mixed in with the COVID-19 conspiracy posts are some direct recruitment pitches. For example, a Long Island, New York, chapter of the far-right Proud Boys group posted a link to a news story about a local synagogue and added a message urging followers to join them. “Embrace who you were called to be,” read the post, which was accompanied by a swastika.
The researchers found suggestions that far-right groups on Telegram are working together. ISD researchers linked two usernames involved in running one Telegram channel to two prominent members of the American far-right. One was a scheduled speaker at the 2017 Unite the Right rally in Charlottesville, Virginia, where a white supremacist deliberately drove into a crowd of counterdemonstrators, killing one person and injuring 35.
That channel has grown steadily since the pandemic began and now reaches around 400,000 views each day, according to Telegram Analytics, a service on the site TGStat that tracks statistics for about 150,000 Telegram channels. In May 2020, the channel had 5,000 subscribers; it now has 50,000.
The data is especially concerning given a rash of incidents around the world that indicate some extremists are moving from online rhetoric to offline action.
Gavin Yamey, a physician and public health professor at Duke University, has written about the rise of threats against health care workers during the pandemic. He said the harassment is even worse for those who are women, people of color, members of religious minorities or LGBTQ.
Yamey, who is Jewish, has received threats and anti-Semitic messages, including one on Twitter calling for his family to be “executed.” He fears racist conspiracy theories and scapegoating may persist even after the pandemic eases.
“I worry that in some ways the genie is out of the bottle,” Yamey said.
The pandemic and the unrest it has caused have been linked to a wave of harassment and attacks on Asian-Americans. In Italy, far-right opponents of vaccine mandates rampaged through a union headquarters and a hospital. In August in Hawaii, some of those who harassed that state's Jewish lieutenant governor at his home during a vaccine protest brandished fliers with his photo and the word “Jew.”
Elsewhere, people have died after taking sham cures, pharmacists have destroyed vaccine vials, and others have damaged 5G telecommunication towers since the pandemic began nearly two years ago.
Events such as the pandemic leave many people feeling anxious and looking for explanations, according to Cynthia Miller-Idriss, director of the Polarization and Extremism Research and Innovation Lab at American University, which studies far-right extremism. Conspiracy theories can provide an artificial sense of control, she said.
“COVID-19 has created fertile ground for recruitment because so many people around the world feel unsettled,” Miller-Idriss said. “These racist conspiracy theories give people a sense of control, a sense of power over events that make people feel powerless.”
Policing extremism online has challenged tech companies that say they must balance protecting free speech with removing hate speech. They also must contend with increasingly sophisticated tactics by groups that have learned to evade platform rules.
Facebook this month announced that it had removed a network of accounts based in Italy and France that had spread conspiracy theories about vaccines and carried out coordinated harassment campaigns against journalists, doctors and public health officials.
The network, called V_V, used both real and fake accounts and was overseen by a group of users who coordinated their activities on Telegram in an effort to hide their tracks from Facebook, company investigators found.
“They sought to mass-harass individuals with pro-vaccination views into making their posts private or deleting them, essentially suppressing their voices,” said Mike Dvilyanski, head of cyber espionage investigations at Meta, Facebook’s parent company.
O'Connor, the ISD researcher, said sites like Telegram will continue to serve as a refuge for extremists as long as they lack the moderation policies of the larger platforms.
“The guardrails that you see on other platforms, they don't exist on Telegram,” O'Connor said. “That makes it a very attractive place for extremists.”
___
Klepper reported from Providence, R.I.