Facebook asks if you know someone ‘becoming an extremist’ in new prompt test
‘Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others’, one version of the message reads
Facebook is testing a prompt that asks users whether they are “concerned that someone you know is becoming an extremist”.
The new message says: “We care about preventing extremism on Facebook. Others in your situation have received confidential support.
“Hear stories and get advice from people who escaped violent extremist groups”. Underneath that message is a blue “Get Support” button.
Another version of the message reads: "Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others."
Speaking to CNN, Facebook said that this is part of a test the social media company is running as part of its Redirect Initiative, aimed at fighting extremism.
"This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,"Facebook spokesperson Andy Stone said.
"We are partnering with NGOs and academic experts in this space and hope to have more to share in the future." Facebook shared the same statement with The Independent but attributed it to an unnamed “Facebook company spokesperson”.
Facebook has often been criticised over claims of facilitating extremism on its platforms. A report by Avaaz, a nonprofit advocacy group that says it seeks to protect democracies from misinformation, claimed that Facebook allowed groups to glorify violence during the 2020 election and in the weeks leading up to the Capitol Hill insurrection attempt on 6 January.
Facebook’s algorithm also exacerbated divisiveness, according to leaked research from inside the social media company, as reported by the Wall Street Journal. Facebook reportedly ended research into stopping the platform from being so polarising over fears that the changes would unfairly target right-wing users. “Our recommendation systems grow the problem,” one presentation said.
In response to that report, Facebook published a blog post saying that the newspaper "wilfully ignored critical facts that undermined its narrative" which, the company says, includes changes to the News Feed, limiting the reach of Pages and Groups that breach Facebook’s standards or share fake news, combating hate speech and misinformation, and "building a robust Integrity Team."