
Oversight Board reverses Meta's initial decisions to remove 2 videos of Israel-Hamas war

A quasi-independent review board is recommending that Facebook parent company Meta overturn two decisions it made this fall to remove posts “informing the world about human suffering on both sides” of the Israel-Hamas war

Via AP news wire
Tuesday 19 December 2023 06:00 EST


In both cases, Meta had already reinstated the posts on its own: one showing Palestinian casualties and the other an Israeli hostage. It added warning screens to both because of their violent content. Because the posts were restored before the ruling, the company is not obligated to take any further action in response to the board's decisions.

The board also said, however, that it disagrees with Meta's decision to bar the posts in question from being recommended on Facebook and Instagram, “even in cases where it had determined posts intended to raise awareness.” And it said Meta's use of automated tools to remove “potentially harmful” content increased the likelihood of taking down “valuable posts” that not only raise awareness about the conflict but may contain evidence of human rights violations. It urged the company to preserve such content.

The Oversight Board, established three years ago by Meta, issued its decisions on Tuesday in what it said was its first “expedited” ruling — taking 12 days instead of the usual 90.

In one case, the board said, Instagram removed a video showing what appears to be the aftermath of a strike on or near Al-Shifa Hospital in Gaza City. The post shows Palestinians, including children, injured or killed. Meta's automated systems removed the post, saying it violated the company's rules against “violent and graphic content.” Although Meta eventually reversed its decision, the board said, it placed a warning screen on the post and demoted it, meaning it was not recommended to users and fewer people saw it. The board said it disagrees with the decision to demote the video.

The other case concerns a video posted to Facebook of an Israeli woman begging her kidnappers not to kill her as she was taken hostage during the Hamas raids on Israel on Oct. 7.

Users appealed Meta's decisions to remove the posts, and the cases went to the Oversight Board. The board said it saw an almost threefold increase in the daily average of appeals marked by users as related to the Middle East and North Africa region in the weeks following Oct. 7.

Meta said it welcomes the board's decision.

“Both expression and safety are important to us and the people who use our services. The board overturned Meta’s original decision to take this content down but approved of the subsequent decision to restore the content with a warning screen. Meta previously reinstated this content so no further action will be taken on it,” the company said in a statement. “There will be no further updates to this case, as the board did not make any recommendations as part of their decision.”

In a briefing on the cases, the board said Meta confirmed it had temporarily lowered thresholds for automated tools to detect and remove potentially violating content.

“While reducing the risk of harmful content, it also increased the likelihood of mistakenly removing valuable, non-violating content from its platforms,” the Oversight Board said in a statement, adding that as of Dec. 11, Meta had not restored the thresholds to pre-Oct. 7 levels.

Meta, then called Facebook, launched the Oversight Board in 2020 in response to criticism that it wasn’t moving fast enough to remove misinformation, hate speech and influence campaigns from its platforms. It has 22 members, a multinational group that includes legal scholars, human rights experts and journalists.

The board's rulings on individual pieces of content, such as in these two cases, are binding, but its broader policy recommendations are advisory and Meta is not obligated to follow them.
