Facebook's 'fake news' policy poses more questions than it answers
It’s been two years since a US presidential campaign in which the company was a primary vector for state-sponsored political interference
Facebook was once the most nimble company of its generation. The speed at which it adapted to every challenge was legendary. It needed only about a decade to go from a dorm-room startup to the largest and most influential communications platform in the world.
But it’s been two years since a US presidential campaign in which the company was a primary vector for misinformation and state-sponsored political interference – and Facebook still seems paralysed over how to respond.
In exchanges with reporters and lawmakers over the past week, its leaders – including Mark Zuckerberg, Facebook’s chief executive – have been comically tripped up by some of the most basic questions the site faces. Mr Zuckerberg, in an interview with journalist Kara Swisher that was published Wednesday, argued that Facebook would not ban Holocaust denialism on the site because “there are things that different people get wrong”. He later explained there were many other ways that Holocaust deniers could be penalised by Facebook – yet lucidity remained elusive.
Mr Zuckerberg’s comments fit a larger pattern. Presented with straightforward queries about real-world harm caused by misinformation on their service, Facebook’s executives express their pain, ask for patience, proclaim their unwavering commitment to political neutrality and insist they are as surprised as anyone that they are even in the position of having to come up with speech rules for billions of people.
Testifying before Congress in the spring, Mr Zuckerberg vowed to address lawmakers’ concerns about the site. And though his business has continued to prosper, he has repeatedly warned investors that he would take actions to address Facebook’s social impact that could hurt its bottom line.
Yet Facebook executives’ tortured musings in recent weeks suggest that the task ahead remains difficult. The company that was once pilloried for its heedlessness – its motto was “move fast and break things” and it put up posters around its offices asking employees “What would you do if you weren’t afraid?” – is now moving slowly, fixing little and taking no stand.
“I think Facebook is trying to thread this needle of trying to claim they’re not a publisher with responsibilities here, when they clearly are,” said Sarah Szalavitz, chief executive of the design agency 7 Robot, who has closely followed social media companies’ efforts to address their shortcomings. “But they need to have a perspective.”
The present dust-up began last week, when Oliver Darcy, a reporter for CNN, asked an obvious question at a press event that Facebook had convened to explain its new plan for fighting misinformation: why allow Infowars, a site that traffics in conspiracy theories – including the claim that the Sandy Hook school shooting was fake – to maintain a page on Facebook?
In Mr Darcy’s telling, Facebook officials seemed taken aback by the question. John Hegeman, Facebook’s head of News Feed, stumbled through an answer about how “just being false doesn’t violate the community standards,” and how Infowars was a publisher with a “different point of view”.
Later, the social network argued that banning organisations that repeatedly peddle misinformation would be “contrary to the basic principles of free speech”. The company insisted that even if Infowars and other sites that push misinformation are not banned, they might still be penalised. Facebook has contracted with dozens of fact-checking organisations around the world; if its fact checkers determined that a specific Infowars story was false, people would still be allowed to share it with their friends, but Facebook would push it so far down in everyone’s feeds that most of them would not see it.
Part of the reason Facebook defended Infowars seemed to become evident this week on Capitol Hill. That was when Monika Bickert, Facebook’s vice president of global policy management, showed up at a congressional hearing along with other social media executives to answer questions about whether their platforms are biased against conservatives. In the hearing, Ms Bickert apologised to Diamond and Silk, two pro-Trump social media stars who had claimed they were treated unfairly by Facebook.
Then came Mr Zuckerberg’s comments to Ms Swisher on Wednesday about Holocaust denialism – and the question of what Facebook would or would not allow on its site became even more confusing.
Even setting aside Mr Zuckerberg’s bizarre idea that there are good-faith Holocaust deniers who are merely misinformed about the past, his argument raised several other issues, including hate speech. Facebook’s code of conduct prohibits hate speech, which it defines as attacks on people based on “protected characteristics” like race, ethnicity or religion. Wouldn’t Holocaust denialism fall into that category?
A Facebook spokeswoman explained that it would be possible, theoretically, to deny the Holocaust without triggering Facebook’s hate speech clause.
That wasn’t all. On Wednesday, Facebook also rolled out a new policy on misinformation that complicated matters some more. The company said it had decided that, actually, it would remove – and not just downrank – certain false posts if it determined that they might lead to imminent violence.
The policy is global, but so far it is operating only in Myanmar and Sri Lanka, where Facebook posts have been linked to ethnic cleansing and genocide. And what exactly constitutes imminent violence is a shifting line, the company said – it is still “iterating on” its policy, and the rules may change.
So to recap: Facebook is deeply committed to free expression and will allow people to post just about anything, including even denying the Holocaust. Unless, that is, a Holocaust denial constitutes hate speech, in which case the company may take it down. If a post contains a factual inaccuracy, it will not be removed, but it may be shown to very few people, reducing its impact.
On the other hand, if the misinformation has been determined to be inciting imminent violence, Facebook will remove it – even if it’s not hate speech. On the other other hand, if a site lies repeatedly, spouts conspiracy theories or even incites violence, it can maintain a presence on the site, because ultimately, there’s no falsehood that will get you kicked off Facebook.
All of this fails a basic test: It’s not coherent. It is a hodgepodge of declarations and exceptions and exceptions to the exceptions.
“Personally, I think you can have straightforward boundaries around hateful content that don’t have to be this complicated,” Ms Szalavitz said. “Holocaust denial is hate. That’s not hard.”
The New York Times