‘Massive ranking failure’ meant Facebook showed users nudity, violence, and Russian misinformation

Misinformation and other harmful posts rose in visibility by up to 30 per cent globally because of the failure

Adam Smith
Friday 01 April 2022 06:26 EDT

A “massive ranking failure” of Facebook’s news feed meant that half of all views were exposed to “integrity risks” over the past six months.

A new report from The Verge, based on internal documents from Facebook, suggests that since October 2021 a huge amount of misinformation has been surfacing in the news feed. Instead of suppressing posts from accounts that repeatedly shared misinformation, the system was giving them a boost – by up to 30 per cent globally, it is claimed.

Facebook engineers were unable to find the cause of the issue, and misinformation continued to surge on the site until 11 March, despite the problem being treated as a high-priority case, according to the report.

During this time, Facebook also allegedly failed to properly moderate nudity, violence, and Russian state media. This is despite the company “prohibiting Russian state media from running ads or monetising on our platform anywhere in the world”, as the firm’s head of security policy, Nathaniel Gleicher, said in late February. In response, Russia designated Meta, the parent company of Facebook and Instagram, an ‘extremist’ organisation.

A spokesperson for the company said that it “detected inconsistencies in downranking on five separate occasions, which correlated with small, temporary increases to internal metrics.” The issue was apparently first introduced two years before it made a noticeable impact on the site.

“We traced the root cause to a software bug and applied needed fixes,” the spokesperson said, adding that the bug “has not had any meaningful, long-term impact on our metrics”.

Facebook posts that do not violate its Community Standards, and so avoid being removed entirely, but are still deemed to be low quality are demoted within the news feed.

The categories of posts that trigger those demotions are detailed in a document the company released in September last year, known as the Content Distribution Guidelines.

They include posts that contain spam or false information, and posts that put people at personal risk.

“We’ll continue to update the Content Distribution Guidelines to provide people with information about how we define and treat problematic or low-quality content that doesn’t otherwise violate our Community Standards,” Facebook said.
