Facebook’s security chief warns fake news is more dangerous and complex than people think
Alex Stamos, who is handling the company’s investigation into Russia’s use of the social media platform ahead of the 2016 US presidential election, says technical fixes risk introducing ideological bias
Facebook’s chief security officer warned that fake news is a more complicated and dangerous problem to solve than the public thinks.
Alex Stamos, who is handling the company’s investigation into Russia’s use of the social media platform ahead of the 2016 US presidential election, cautioned against hoping for technical solutions that, he says, could have the unintended consequence of introducing ideological bias.
It’s very difficult to spot fake news and propaganda using just computer programs, Mr Stamos said in a series of Twitter posts on Saturday.
“Nobody of substance at the big companies thinks of algorithms as neutral,” Mr Stamos wrote, adding that the media is simplifying the matter. “Nobody is not aware of the risks.”
The easy technical solutions would boil down to silencing topics that Facebook is aware are being spread by bots — which should only be done “if you don’t worry about becoming the Ministry of Truth” with machine learning systems “trained on your personal biases,” he said.
Mr Stamos’s comments shed light on why Facebook added 1,000 more people to review its advertising, rather than attempting an automated solution.
The company sent a note to advertisers telling them it would start to manually review ads targeted to people based on politics, religion, ethnicity or social issues. The company is trying to figure out how to monitor use of its system without censoring ideas, after the Russian government used fake accounts to spread political discord in the US ahead of the election.
“A lot of people aren’t thinking hard about the world they are asking [Silicon Valley] to build,” Mr Stamos wrote. “When the gods wish to punish us they answer our prayers.”
Facebook has turned over more than 3,000 ads purchased by Russian entities to congressional investigators looking into Russian influence on the election. Twitter has said it gave the panels a roundup of advertisements by RT, formerly known as Russia Today, a TV network funded by the Russian government.
Officials from Facebook, Twitter and Alphabet’s Google are set to testify to Congress on the matter on 1 November.
Bloomberg