Facebook is secretly ranking how trustworthy people are

Users are being judged on their behaviour – without knowing it, or how they are being ranked

Andrew Griffin
Tuesday 21 August 2018 10:33 EDT
Protesters from the pressure group Avaaz demonstrate against Facebook outside Portcullis House in Westminster (Reuters)


Facebook has revealed that it is secretly ranking its users according to how trustworthy it considers them to be.

The site is using an array of information to decide whether its users should be believed when they report that something wrong is happening on the site.

Users will not know that their behaviour is being ranked, the company appeared to suggest. But the rankings themselves are important: they determine whether an account is reliable enough to be believed when it reports a page or a story, which helps decide whether that content should be taken down straight away.

Facebook has been looking to improve how it reacts to problem accounts and to pages that publish fake stories or otherwise break its rules. But it still largely relies on its users to notify it of those problems, and then decides whether to take the accounts down.

It has run into problems because users will often report pages they disagree with, or simply want taken down, even when those pages have not actually broken its rules. That is why it is ranking users to decide whether their reports should be listened to, according to a report in the Washington Post.

The company said that it could not reveal details of the tool for fear that those with an understanding of its workings would try to game the system. But it is running in the background, ranking users from zero to one, reports claimed.

Facebook did not reveal what markers it looks for to suggest people are untrustworthy. But it did say that accurately reporting content, for instance, might give people more clout.

“One of the signals we use is how people interact with articles,” Tessa Lyons, a Facebook product manager, told the Washington Post. “For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true.”
