More than 250 UK celebrities become victims of deepfake porn
Cathy Newman watched the deepfake footage of herself and said: ‘It feels like a violation’
More than 250 British celebrities have been victims of deepfake porn, according to a new investigation.
Among them is news presenter Cathy Newman, who said she felt violated after watching digitally altered footage in which her face had been superimposed on to pornography using artificial intelligence (AI).
Channel 4 aired its investigation on Thursday evening, saying it had analysed the five most-visited deepfake websites and found that 255 of the almost 4,000 famous individuals listed were British, all but two of them women.
In her report, Newman watched the deepfake footage of herself and said: “It feels like a violation.
“It just feels really sinister that someone out there who’s put this together, I can’t see them, and they can see this kind of imaginary version of me, this fake version of me.
“You can’t unsee that. That’s something that I’ll keep returning to.
“And just the idea that thousands of women have been manipulated in this way, it feels like an absolutely gross intrusion and violation.
“It’s really disturbing that you can, at a click of a button, find this stuff, and people can make this grotesque parody of reality with absolute ease.”
Channel 4 News said it contacted more than 40 celebrities for the investigation, all of whom were unwilling to comment publicly.
The broadcaster also said it found that more than 70% of visitors arrived at deepfake websites using search engines like Google.
Advances in AI have made it easier to create digitally altered and fake images.
Industry experts have warned of the danger posed by AI-generated deepfakes and their potential to spread misinformation, particularly in a year that will see major elections in many countries, including the UK and the US.
Earlier in the year, deepfake images of pop star Taylor Swift were posted to X, formerly Twitter, and the platform blocked searches linked to the singer after fans lobbied the Elon Musk-owned platform to take action.
The Online Safety Act makes it a criminal offence to share, or threaten to share, a manufactured or deepfake intimate image or video of another person without their consent, but it does not criminalise the creation of such deepfake content.
In its investigation, Channel 4 News claimed the most targeted individuals of deepfake pornography are women who are not in the public eye.
Newman spoke to Sophie Parrish, who started a petition before the law was changed, after police detained the person who created digitally altered pornography of her but took no further legal action.
She told the PA news agency in January that she was sent Facebook messages from an unknown user, which included a video of a man masturbating over her and using a shoe to pleasure himself.
“I felt very, I still do, dirty – that’s one of the only ways I can describe it – and I’m very ashamed of the fact that the images are out there,” she said.
Tory MP Caroline Nokes, who is chairwoman of the Women and Equalities Committee, told Channel 4 News: “It’s horrific… this is women being targeted.
“We need to be protecting people from this sort of deepfake imagery that can destroy lives.”
In a statement to the news channel, a Google spokesperson said: “We understand how distressing this content can be, and we’re committed to building on our existing protections to help people who are affected.
“Under our policies, people can have pages that feature this content and include their likeness removed from Search.
“And while this is a technical challenge for search engines, we’re actively developing additional safeguards on Google Search – including tools to help people protect themselves at scale, along with ranking improvements to address this content broadly.”
Ryan Daniels, from Meta, said in a statement to the broadcaster: “Meta strictly prohibits child nudity, content that sexualises children, and services offering AI-generated non-consensual nude images.”