App which used algorithm to ‘undress’ women and create fake nudes shut down

‘Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public,’ says campaigner

Maya Oppenheim
Women's Correspondent
Saturday 29 June 2019 13:55 EDT
The $50 (£40) DeepNude app has faced heavy criticism and has been accused of objectifying women (Getty Images)

An app which used a machine learning algorithm to digitally “undress” images of women wearing clothes to create fake nudes has been taken offline.

The $50 (£40) DeepNude app faced heavy criticism and was accused of objectifying women.

DeepNude used artificial intelligence to create the “deepfake” images – showing realistic estimates of how a woman might look if she were naked. The app was not designed to work on men.

Deepfake images and clips often appear credible to the average viewer – with many raising the alarm about their potential to mislead members of the public.

The controversial app’s developers have now removed the software from the web – saying “the world is not yet ready”.

“The probability that people will misuse it is too high. We don’t want to make money this way,” DeepNude said in a message on its Twitter feed.

The developers said those who bought the app, which was available for Windows and Linux, would receive a refund.

They also asked people who had a copy not to share it. However, the app will still work for anyone who already possesses it.

One campaigner against “revenge porn” – defined as the sharing of private, sexual photos or videos of another person, without their consent and with the purpose of causing embarrassment or distress – branded the app “terrifying”.

“Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public,” Katelyn Bowden, founder of anti-revenge porn campaign group Badass, told tech news site Motherboard.

After the outlet published a story on the app, the DeepNude server crashed, prompting the developers to announce that it was offline because “we didn’t expect this traffic”.

The app later tweeted: “Here is the brief history, and the end of DeepNude. We created this project for users’ entertainment a few months ago. We thought we were selling a few sales every month in a controlled manner.

“Honestly, the app is not that great, it only works with particular photos. We never thought it would become viral and we would not be able to control the traffic.”

California is considering a bill which would make pornographic deepfake images illegal – making it the first state to take legislative action against them.
