Facebook wants you to upload nude pictures of yourself for artificial intelligence to analyse

'We’re using image matching technology to prevent non-consensual intimate images from being shared'

Aatif Sulleyman
Wednesday 08 November 2017 09:29 EST
A 3D-printed Facebook like button is seen in front of the Facebook logo, in this illustration taken October 25, 2017 (REUTERS/Dado Ruvic)


Facebook wants users to upload nude pictures of themselves to Messenger.

The company believes the best way to combat revenge porn could be for you to submit intimate pictures of yourself before anyone else manages to post them.

It’s a highly unusual measure, which is likely to split opinion.

The social network has developed an anti-revenge porn system that uses artificial intelligence to recognise and block specific images, and is testing it in the UK, US, Canada and Australia.

“The safety and well-being of the Facebook community is our top priority,” said Antigone Davis, Facebook’s head of global safety.

“As part of our continued efforts to better detect and remove content that violates our community standards, we’re using image matching technology to prevent non-consensual intimate images from being shared on Facebook, Instagram, Facebook Groups and Messenger.”

Facebook will create a digital fingerprint, or hash, of any nude picture you flag to it through Messenger, and automatically block anyone from uploading the same image to the site at a later date.

The company says it won’t store the pictures and only Facebook’s AI is supposed to access them, but the system still requires an enormous amount of trust from users.

Also, if you’re worried about more than one explicit picture of you being posted to the site, you’d have to upload all of them to Messenger.

Furthermore, the system will only protect you from revenge porn on Facebook. People would still be able to post the images elsewhere.

“It would be like sending yourself your image in email, but obviously this is a much safer, secure end-to-end way of sending the image without sending it through the ether,” Australian e-Safety Commissioner Julie Inman Grant told ABC.

“They're not storing the image, they're storing the link and using artificial intelligence and other photo-matching technologies. So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded.”
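The idea behind a "hash value" matching system like the one described above can be illustrated with a toy example. The sketch below is not Facebook's actual technology (which is unpublished and far more robust); it shows a simple "average hash", a basic kind of perceptual fingerprint that lets near-identical copies of an image produce matching values even after minor re-encoding, so only the fingerprint, never the image itself, needs to be stored:

```python
# Illustrative sketch only, NOT Facebook's actual system: a simple
# "average hash" computed over a downscaled 8x8 grayscale image.
# Each bit of the 64-bit fingerprint records whether one pixel is
# brighter than the image's average brightness.

def average_hash(pixels):
    """pixels: 64 grayscale values (0-255) from a downscaled 8x8 image."""
    avg = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > avg)

def hamming_distance(h1, h2):
    """Number of differing bits; small distance suggests the same image."""
    return bin(h1 ^ h2).count("1")

# A stand-in for a downscaled image, and a slightly brightened copy
# (as might result from re-saving or re-compressing the original).
original = [10 * i % 256 for i in range(64)]
copy = [min(255, p + 3) for p in original]

h1, h2 = average_hash(original), average_hash(copy)
print(hamming_distance(h1, h2))  # prints 0: the copy still matches
```

Because a uniform brightness shift raises every pixel and the average by the same amount, the fingerprint is unchanged, so a re-upload of the flagged image would be caught by comparing hashes alone.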
