Facebook prepares for Chauvin verdict by enforcing its rules
Facebook is stepping up the enforcement of its rules ahead of the verdict in former Minneapolis Police Officer Derek Chauvin’s murder trial in George Floyd’s death
Facebook is stepping up the enforcement of its rules ahead of the verdict in former Minneapolis Police Officer Derek Chauvin's murder trial in George Floyd's death.
The social media giant is tightening its content-moderation efforts, saying it wants to “protect peaceful protests and limit content that could lead to civil unrest or violence."
The steps that Facebook is taking include identifying and removing calls to bring arms to areas in Minneapolis, which it has temporarily deemed to be a high-risk location. It says it is also removing material that “praises, celebrates or mocks George Floyd’s death."
The company enacted similar measures to stem the flow of misinformation and calls to violence in the aftermath of the 2020 presidential election as the world awaited results. Though the measures helped reduce misinformation, they were not made permanent.
Facebook said Monday it will continue to remove posts that violate its community standards. These include hate speech, bullying and harassment, and incitement to violence. And it said it “may also limit" the spread of material that its systems predict may “likely" violate its rules.
The company did not say why it doesn't make such emergency measures permanent, as many critics have called for. Facebook representatives did not immediately respond to a message seeking comment.