
‘I discovered deepfake porn of myself online’

‘We have hit blurry lines between perception and reality because perception is reality now,’ victim says

Maya Oppenheim
Women’s Correspondent
Friday 25 November 2022 12:08 EST
Campaigner Kate Isaacs says part of their attack was to turn her into a deepfake, describing the ordeal as ‘an act of sexual violence’ (Kate Isaacs)


A woman has revealed what it was like to discover deepfake porn of herself on the internet.

Kate Isaacs said it was “violating” to fall victim to “deepfakes” – explicit images or videos which have been manipulated to look like someone without their consent – as she accused her attacker of trying to “silence” and “scare” her.

Her comments come as the government announced that distributing pornographic deepfakes, as well as sharing “downblouse” images – non-consensual photos taken down a woman’s top – will be made illegal.

Police and prosecutors will be given more power to hold perpetrators to account under an amendment to the Online Safety Bill, with the distributors of deepfakes now potentially facing prison time under the suggested measures.

Ms Isaacs, who is 30, told The Independent she believes she was targeted with a deepfake because she set up a campaign called #NotYourPorn, which seeks to get non-consensual content removed from the popular adult entertainment platform Pornhub, after footage of a friend ended up on the site without her consent.

In 2020, the campaign helped pressure Pornhub into deleting an estimated 10 million unverified videos on the site in order to strip the platform of non-consensual and child porn videos.

Ms Isaacs said: “That was a great day. It spread a message. Unfortunately, after that incident, I became a target on Twitter of a very small but loud group of men. They felt they were entitled to non-consensual porn.

“They were against the campaign and they were upset I had ridded Pornhub of their porn. After that, they started an attack on me online, they posted my work and home address on Twitter.

“They commented underneath they were going to find me, follow me home, rape me, film it and upload it to Pornhub. That was completely terrifying, I’d never experienced fear like that in my life.”


The campaigner added that part of their attack was to turn her into a deepfake, describing the ordeal as “an act of sexual violence”.

She said: “They took a BBC TV interview and doctored my face onto a porn film of someone having sex. It was so violating. For a few short seconds, I didn’t know it was a deepfake and I was terrified looking at that as I couldn’t remember that moment or the man.”

Ms Isaacs, who campaigns on image-based sexual abuse, added that upon closer inspection she realised it was a deepfake.

She said: “Someone was using my identity, my profile without my consent in a sexual manner. I appreciate some people don’t feel like it is that big of a deal.

“But we have hit blurry lines between perception and reality because perception is reality now. I find it abhorrent that they used my image to silence me, to scare me, or for sexual gratification without my consent.”

Many have sounded the alarm about how deepfakes can mislead the public, and previous research by cybersecurity firm Deeptrace indicated around 96 per cent of all deepfake videos are non-consensual porn, while women are targets in 96 per cent of cases.

The newly unveiled measures will also see ministers introduce laws to address other abusive acts, such as installing hidden cameras to capture photos or footage of a person without their consent.

Nicole Jacobs, domestic abuse commissioner, said: “I welcome these moves by the government which aim to make victims and survivors safer online, on the streets and in their own homes.”
