These are the questions Facebook refused to answer about Donald Trump’s ban

Facebook refused to provide information about how its news feed promoted the visibility of Mr Trump’s content

Adam Smith
Wednesday 05 May 2021 12:08 EDT


Donald Trump will remain banned from Facebook and Instagram, per the ruling of Facebook’s Oversight Board, but important questions remain unanswered about how the social media company’s features promoted the former president’s rhetoric.

The Oversight Board, a committee funded by Facebook to review the social media giant’s content moderation decisions, said that Facebook had acted inappropriately in imposing an indefinite suspension on the former president.

Within six months, Facebook must make a new decision: either restore Mr Trump’s account, make the suspension permanent, or suspend the account for a defined period of time, Michael McConnell, co-chair of the Board, said in a press call, arguing that there was a lack of clarity around the original decision.

The Board asked Facebook 46 questions before coming to its decision, but Facebook declined to answer seven. These included “how Facebook’s news feed and other features impacted the visibility of Mr. Trump’s content; whether Facebook has researched, or plans to research, those design decisions in relation to the events of January 6, 2021”.

According to the Oversight Board’s decision, Facebook said that some information “was not reasonably required for decision-making ... was not technically feasible to provide”, or “could not or should not be provided because of legal, privacy, safety, or data protection concerns.”

Neither Facebook nor the Oversight Board responded to The Independent’s requests for comment before publication to explain which of these reasons applied to the unanswered questions, but the company has a clear history of harmful content spreading virally on its platforms.

Two of Donald Trump’s posts alleging election fraud, published on 5 November, were among Facebook’s most popular posts over a 24-hour period, performing 8.7 times better than his average message; his posts have consistently outperformed similar posts from Joe Biden on the platform.

Internal data from the company suggests that Facebook’s fact-checking labels, which were applied to posts containing misinformation that were nonetheless left on the site, have been inadequate.

“We have evidence that applying these informs to posts decreases their reshares by [approximately] eight per cent … However given that Trump has SO many shares on any given post, the decrease is not going to change shares by orders of magnitude”, one data scientist at the company said.

In 2018, chief executive Mark Zuckerberg shared a graph showing that posts gained more engagement the closer they came to violating the company’s community standards. Facebook had to consciously change its algorithm to stop that happening.

Facebook executives also reportedly decided to end research into making the social media site less polarising, over fears that the changes would disproportionately affect right-wing users.

“Our algorithms exploit the human brain’s attraction to divisiveness,” a 2018 presentation warned, adding that if action was not taken Facebook would provide users “more and more divisive content in an effort to gain user attention and increase time on the platform.” Facebook published a lengthy blog post in 2020 in response, claiming reports “ignored the significant efforts [it] did make”.

Facebook is not the only social media site to be criticised over these issues; Twitter’s and YouTube’s algorithms have faced similar allegations.

The Oversight Board recommends that Facebook “undertake a comprehensive review of Facebook’s potential contribution to the narrative of electoral fraud and the exacerbated tensions that culminated in … violence”.

Facebook suffers from a “lack of transparency” regarding its decisions, which “appears to contribute to perceptions that the company may be unduly influenced by political or commercial considerations”, the Board said. These allegations include the claim that Facebook executives intervene to the benefit of high-profile right-wing users.

“It is important that Facebook address this lack of transparency and the confusion it has caused”, it added.
