Meta asks Oversight Board for guidance on Covid-19 misinformation policy

The social media giant has asked its Oversight Board to advise on whether its current Covid misinformation policy is ‘still appropriate’.

Martyn Landi
Wednesday 27 July 2022 08:44 EDT
The Oversight Board was set up in 2020 and is able to make binding decisions about Facebook’s content removal actions and policies (PA)


Facebook parent company Meta has asked its Oversight Board to advise on whether its current Covid-19 misinformation policy is still appropriate now that it says the pandemic’s status has “evolved”.

Sir Nick Clegg, Meta’s president of global affairs, said the company wanted guidance on whether the broad measures it introduced in the early days of the pandemic to remove misinformation linked to the virus were still relevant and proportionate as many places “seek to return to more normal life”.

The Oversight Board was set up in 2020 and is able to make binding decisions about Facebook’s content removal actions and policies, even overruling the platform and executives.

The former Liberal Democrat leader and deputy prime minister said the tighter measures to stop the spread of misinformation were vital earlier in the Covid-19 outbreak, but the social networking firm felt the time was now right to ask whether it “remains the right approach for the months and years ahead”.

Resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic

Sir Nick Clegg, Meta

“The world has changed considerably since 2020,” Sir Nick wrote in a blog post, adding that a number of countries had high vaccination rates, while online tools and resources to identify and remove misinformation, as well as educate people on its dangers, were now widespread.

But he acknowledged that this was not the case everywhere, and the company now wanted guidance on how to best approach keeping people safe from harmful content while still protecting freedom of expression.

“It is important that any policy Meta implements be appropriate for the full range of circumstances countries find themselves in,” he said.

“Meta is fundamentally committed to free expression and we believe our apps are an important way for people to make their voices heard.

“But some misinformation can lead to an imminent risk of physical harm, and we have a responsibility not to let this content proliferate. The policies in our Community Standards seek to protect free expression while preventing this dangerous content.

“But resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic.

“That’s why we are seeking the advice of the Oversight Board in this case. Its guidance will also help us respond to future public health emergencies.”

It’s time people in the UK and elsewhere are given democratic oversight of life-changing decisions made thousands of miles away in Silicon Valley

Callum Hood, Centre for Countering Digital Hate

However, online safety campaigners have accused Meta of trying to deflect from what they said was its failure to prevent large amounts of misinformation from spreading during the pandemic.

Callum Hood, head of research at the Centre for Countering Digital Hate (CCDH), said: “This move is designed to distract from Meta’s failure to act on a flood of anti-vaccine conspiracy theories spread by opportunistic liars during the pandemic – many of whom made millions of dollars by exploiting social media’s massive audience and algorithmic amplification.

“CCDH’s research, as well as Meta’s own internal analysis, shows that the majority of anti-vaccine misinformation originates from a tiny number of highly prolific bad actors.

“But Meta has failed to act on key figures who are still reaching millions of followers on Facebook and Instagram.

“Platforms like Meta should not have absolute power over life-and-death issues like this that affect billions of people. It’s time people in the UK and elsewhere are given democratic oversight of life-changing decisions made thousands of miles away in Silicon Valley.”

Sir Nick said Meta’s policies had helped remove Covid-19 misinformation on an “unprecedented scale”, saying more than 25 million pieces of content had been removed globally since the start of the pandemic.
