Fear and Loathing in Las Vegas run through Google's Deep Dream neural network is pure nightmare fuel
Because you've always wanted to watch a chicken huff ether
The world has been delighting in turning benign scenes and images into trippy, swirling acid trips this week using Google's artificial neural network Deep Dream, but what happens when you run an acid trip through an acid trip generator?
Terry Gilliam's already warped vision of Hunter S. Thompson's novel was given an extra frisson of terror this week, with Johnny Depp's Raoul Duke describing the 'Great San Francisco Acid Wave' whilst metamorphosing into a dog and a chicken.
The colours swirl as he lurches toward the camera, in a video almost as hard to watch as that one with all the episodes of Friends laid on top of each other.
Google opened up its image-recognising robots to everybody last week, also releasing the half-horrifying, half-amazing pictures it had created itself, including a knight made of dogs.
The company made the “Deep Dream” software available on the code-sharing website GitHub, where anyone can download it and run their own images through it.
The software works by turning the image-recognising computers on themselves: told to over-interpret an image, the systems pick out otherwise meaningless features and exaggerate them, turning clouds into bizarre llamas, for instance.
As with Google’s own images, the pictures tend to transform things into animals, dogs in particular, and into eyes.
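For the technically curious, the trick amounts to running the network in reverse: rather than adjusting the network to match the image, you adjust the image's pixels to excite the network. Google's released code is built on the Caffe framework, so the sketch below is only an illustration of that idea, using PyTorch and an off-the-shelf VGG16 classifier as stand-ins; the layer index, step count and file names are arbitrary choices, not anything from the official release.

```python
import torch
from torchvision import models, transforms
from PIL import Image


def deep_dream(image_path, layer_index=28, steps=20, lr=0.05):
    # Off-the-shelf image classifier; only its convolutional layers are used.
    model = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()
    for p in model.parameters():
        p.requires_grad_(False)

    preprocess = transforms.Compose([
        transforms.Resize(512),
        transforms.ToTensor(),
    ])
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    img.requires_grad_(True)

    for _ in range(steps):
        # Forward pass up to the chosen layer.
        activations = img
        for i, layer in enumerate(model):
            activations = layer(activations)
            if i == layer_index:
                break

        # "Over-interpretation": maximise whatever this layer responds to,
        # so faint patterns the network half-sees get exaggerated.
        loss = activations.norm()
        loss.backward()

        with torch.no_grad():
            # Normalised gradient-ascent step on the pixels themselves.
            img += lr * img.grad / (img.grad.abs().mean() + 1e-8)
            img.grad.zero_()
            img.clamp_(0.0, 1.0)

    return transforms.ToPILImage()(img.detach().squeeze(0))


# Hypothetical usage:
# deep_dream("film_still.jpg").save("nightmare_fuel.jpg")
```

Each step nudges the pixels in whichever direction makes the chosen layer fire harder, which is why faint cloud shapes and film grain end up blooming into dogs and eyes.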