In recent days, we’ve seen Google’s dream robot reimagine everything from internet memes to Hunter S. Thompson’s acid trips. But exactly how the artificial neural network creates these trippy images is hard to grasp… until you watch it all happening in real time.
Johan Nordberg, an artist-developer with a penchant for visual fun, created a video that takes you on “a journey through all the layers of an artificial neural network.” Starting with random noise, each frame of the video is run through Google’s Deep Dream, which recognises and enhances details like edges or shapes. Nordberg further explains:
Each frame is recursively fed back to the network starting with a frame of random noise. Every 100 frames (4 seconds) the next layer is targeted until the lowest layer is reached.
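The loop Nordberg describes can be sketched roughly as follows. This is a schematic illustration only, not his actual code: the `enhance` function here is a hypothetical stand-in for Deep Dream's real gradient-ascent step (which runs the frame through a trained network and amplifies what a chosen layer responds to), and the frame rate and layer numbering are assumptions based on his "100 frames (4 seconds)" note, which implies 25 fps.

```python
import numpy as np

def enhance(frame, layer):
    """Hypothetical stand-in for Deep Dream's gradient-ascent step.
    The real step amplifies whatever features the targeted network
    layer responds to; here we just add layer-dependent structure
    so the feedback loop is runnable."""
    rng = np.random.default_rng(layer)
    pattern = rng.standard_normal(frame.shape)
    return np.clip(frame + 0.05 * pattern, 0.0, 1.0)

def dream_video(layers, frames_per_layer=100, size=(64, 64)):
    """Start from random noise, then recursively feed each output
    frame back in as the next input, moving to the next layer every
    `frames_per_layer` frames (100 frames = 4 s at 25 fps)."""
    frame = np.random.default_rng(0).random(size)  # random-noise seed frame
    video = []
    for layer in layers:
        for _ in range(frames_per_layer):
            frame = enhance(frame, layer)  # each output becomes the next input
            video.append(frame)
    return video

# Walk down through five (hypothetical) layers toward the lowest one.
frames = dream_video(layers=range(5, 0, -1), frames_per_layer=100)
print(len(frames))  # 5 layers x 100 frames = 500
```

The key point is the recursion: nothing new is ever fed in after the initial noise, so every structure you see in the video is the network amplifying its own previous output.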
For five minutes, you can watch a bunch of static turn into the trippy dog-fish creatures that have now become emblematic of Deep Dream’s electric fantasies. You’ll also hopefully gain a better understanding of how this futuristic technology works. And if you want to try a cool trick, pause the video and watch your own brain flip out: even though the frame isn’t moving, the image appears to keep moving.