Facebook to allow staff to look through people's nude photos in attempt to stop 'revenge porn'

The site has made clear the images will only be seen by 'specially trained representatives' and that they will be deleted as quickly as they can be

Andrew Griffin
Friday 10 November 2017 07:39 EST
Facebook's vice president for Latin America, Diego Dzodan, poses for a photograph at Estacao Hack on Paulista Avenue in Sao Paulo's financial centre, Brazil August 25, 2017 (REUTERS/Nacho Doce)


Facebook is allowing its staff to look at nude photos of its users in an attempt to combat revenge porn.

The site has told its users to send in any photos that they fear might be circulated on the site. Those images will then be viewed by Facebook's own staff to verify them, and if they are judged to be genuine they will be blocked from being shared on the site.

In a post aimed at clarifying the facts around the new initiative, Facebook makes clear that those images will only be seen by "a specially trained representative from our Community Operations team". But it does confirm that all of the images will be looked at by its own staff.

"We don’t want Facebook to be a place where people fear their intimate images will be shared without their consent," the post, written by Facebook's global head of safety Antigone Davis, reads. "We’re constantly working to prevent this kind of abuse and keep this content out of our community."

It then goes on to lay out the details of the plan to stop the sharing of non-consensual intimate images, many of which were reported in the press. The article, titled 'The Facts', appears to be an attempt to address some of the misconceptions around the story – but does in fact confirm many of its most controversial details.

Most of the process is automated. Once a photo is verified, it will be converted into a numerical fingerprint called a hash, from which the original image cannot be reconstructed – and any picture that is uploaded will be checked against these hashes to ensure that it isn't on the list of Facebook's banned pictures.
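The hash-matching described above can be sketched in a few lines. This is a simplified illustration only: it uses an exact-match cryptographic hash (SHA-256), whereas Facebook has not published the details of its matching technology, and real photo-matching systems typically use perceptual hashes that also catch resized or re-encoded copies. All names below are hypothetical.

```python
import hashlib

# Hypothetical registry of banned hashes; only the fingerprints are
# stored, never the images themselves.
banned_hashes = set()

def hash_image(image_bytes: bytes) -> str:
    """Convert an image into a fixed-length digest (the 'hash').

    The original picture cannot be recovered from this value.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def register_banned(image_bytes: bytes) -> None:
    # After a reported photo is verified, its hash joins the ban list
    # and the photo itself can be deleted.
    banned_hashes.add(hash_image(image_bytes))

def is_banned(upload_bytes: bytes) -> bool:
    # Every new upload is checked against the ban list automatically.
    return hash_image(upload_bytes) in banned_hashes

reported = b"...reported image bytes..."
register_banned(reported)
print(is_banned(reported))        # True: an exact copy is blocked
print(is_banned(b"other photo"))  # False: unrelated images pass
```

The design choice matters here: because only the hash is retained, the automated check never needs access to the original image, which is why the one human review step at submission time drew so much attention.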

But the site will only allow that to happen if someone sends a photo to Facebook first, at which point it will be checked over by one of the site's own staff.

The feature is currently being rolled out in Australia, in partnership with the country's eSafety Commissioner's Office and a number of experts and survivors. Those experts have praised the work to identify and ban non-consensual naked images – but the decision to only do so after a person has looked at the photos has drawn intense criticism.
