Fake porn videos featuring celebrities deleted from the internet in attempt to stop 'deepfake' footage
Very convincing videos can be made with just a simple piece of software
Fake porn videos claiming to show celebrities are being deleted from the internet.
The footage – which has become infamous in recent weeks – can be made relatively easily, using just a basic application and some artificial intelligence. But the results are entirely convincing, almost indistinguishable from real videos, allowing celebrities' faces to be seamlessly spliced into other footage.
In large part, that technology is being used to put celebrities' faces into adult videos, allowing people to claim that the footage shows famous people making pornographic films.
Now Gfycat, the San Francisco tech company that has hosted many of the videos, has said the posts are "objectionable" and that it will be removing them from its platform. "Our terms of service allow us to remove content that we find objectionable. We are actively removing this content," it said.
Much of the footage is created, uploaded to Gfycat and then shared on Reddit, where a lot of it remains online, though it is expected to be removed soon.
Reddit hasn't yet commented on the phenomenon. But its rules ban "involuntary pornography", a stipulation originally used to keep so-called revenge porn off the site, though enforcement requires the person depicted to file a complaint.
The legality of such fake videos is still a matter for debate. Some have been taken down on copyright grounds, but it is not clear that the mere use of a famous person's image entitles them to have a video removed on that basis.
Videos posted online show just how convincing many of those swaps can be.
They are made using a single tool known as FakeApp, whose developer claims it has been downloaded more than 100,000 times. The app gives easy access to artificial-intelligence software that can detect a person's face and swap it out for another – allowing one person to be convincingly placed in an entirely different scene.
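To illustrate the general idea, the detect-and-blend step that such tools automate can be sketched with off-the-shelf computer-vision libraries. The Python example below is a minimal sketch using OpenCV, not FakeApp's actual method (which reportedly relies on deep neural networks trained on many images of each face); the file names are placeholders.

```python
# Illustrative sketch only: a crude face swap with classical computer
# vision (OpenCV). Real "deepfake" tools train neural networks instead;
# this just shows the basic detect-and-blend idea.
import cv2
import numpy as np

# Haar cascade shipped with OpenCV for detecting frontal faces.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def first_face(image):
    """Return the bounding box (x, y, w, h) of the first detected face."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError("no face found")
    return faces[0]

def swap_face(source, target):
    """Paste the face found in `source` onto the face region of `target`."""
    sx, sy, sw, sh = first_face(source)
    tx, ty, tw, th = first_face(target)
    # Resize the source face to fit the target's face region.
    face = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))
    mask = 255 * np.ones(face.shape[:2], dtype=np.uint8)
    centre = (tx + tw // 2, ty + th // 2)
    # Poisson blending smooths the seam between the pasted face and the scene.
    return cv2.seamlessClone(face, target, mask, centre, cv2.NORMAL_CLONE)

# Hypothetical input files, for illustration only.
result = swap_face(cv2.imread("source.jpg"), cv2.imread("target.jpg"))
cv2.imwrite("swapped.jpg", result)
```

Even this crude approach produces a passable still image; what distinguishes the new wave of tools is that machine learning makes the swap consistent and convincing across every frame of a video.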