Jordan Peele Uses Machine Learning Tools To Make A Fake Obama Warn Us About ‘F***ed-Up Dystopia’

As America’s descent into national madness continues, some have been sounding warning bells about a possible “fake news apocalypse” – the idea that technology is making it easier than ever to generate disinformation and propaganda and quickly disseminate it to legions of people online, with few checks on the process. Particularly worrying are machine learning algorithms that are quickly making it possible to generate fake videos of public figures saying things they never said, with uncanny audiovisual accuracy.

GIF: BuzzFeed Video (YouTube)

Last year, University of Washington researchers used the technology to take footage of things that former US President Barack Obama had already said, then generate fake, machine-synthesised video of him delivering those same lines verbatim.

The research team stopped short of putting new words in Obama’s mouth, but Get Out director Jordan Peele and BuzzFeed have done just that in a PSA warning that malicious actors could soon generate videos of anyone saying just about anything.

Using technology similar to the University of Washington study and Peele’s (fairly good!) imitation of Obama’s voice, here’s a clip of the former POTUS saying, “So, for instance, they could have me say things like, I don’t know, Killmonger was right. Or uh, Ben Carson is in the sunken place. Or how about this, simply, President Trump is a total and complete dipshit.”

“Now see, I would never say these things, at least not in a public address, but someone else would,” the fake Obama added.

“We’ve covered counterfeit news websites that say the pope endorsed Trump that look kinda like real news, but because it’s text people have started to become more wary,” BuzzFeed CEO Jonah Peretti wrote. “And now we’re starting to see tech that allows people to put words into the mouths of public figures that look like they must be real because it’s video and video doesn’t lie.”

Peele’s fake clip isn’t perfect; it hitches in places, and sometimes it’s pretty obvious that something is off with the fake Obama’s mouth. Some of the hand motions are stiff, and that isn’t even counting when he keeps doing the same ones over and over. But the video component is already at the level where someone paying half attention – perhaps scanning past the embed in an article, or turning on the kitchen TV while they’re making breakfast – might not notice the uncanny valley aspects of the simulation.

This PSA was made using off-the-shelf consumer software, namely Adobe After Effects and FakeApp, which is built on Google’s TensorFlow machine learning framework. The latter tool is the same one that spread rapidly on Reddit as the website’s horniest posters realised they could use it to swap porn actors’ faces with those of celebrities.
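For the curious, the core trick behind FakeApp-style face swapping is, in outline, a pair of autoencoders that share one encoder: each decoder learns to reconstruct one person’s face, and at swap time you encode person A’s frame and decode it with person B’s decoder, so B’s face appears in A’s pose and expression. Below is a minimal, illustrative sketch in Python using TensorFlow’s Keras API, assuming you already have aligned 64x64 face crops for both people; the layer sizes, variable names and training details here are placeholders for explanation only, not FakeApp’s actual code.

```python
# Minimal sketch of the shared-encoder / two-decoder autoencoder idea
# behind consumer face-swap tools. Illustrative only; not FakeApp's code.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_encoder():
    # Shared encoder: compresses any face into a 256-dim latent vector.
    inp = layers.Input(shape=(64, 64, 3))
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    x = layers.Dense(256, activation="relu")(x)
    return Model(inp, x, name="shared_encoder")

def build_decoder(name):
    # Per-identity decoder: turns a latent vector back into that person's face.
    inp = layers.Input(shape=(256,))
    x = layers.Dense(16 * 16 * 128, activation="relu")(inp)
    x = layers.Reshape((16, 16, 128))(x)
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(inp, out, name=name)

encoder = build_encoder()
decoder_a = build_decoder("decoder_person_a")  # reconstructs person A
decoder_b = build_decoder("decoder_person_b")  # reconstructs person B

face = layers.Input(shape=(64, 64, 3))
autoencoder_a = Model(face, decoder_a(encoder(face)))
autoencoder_b = Model(face, decoder_b(encoder(face)))
autoencoder_a.compile(optimizer="adam", loss="mae")
autoencoder_b.compile(optimizer="adam", loss="mae")

# Training (sketch): each autoencoder learns to reconstruct its own person.
# faces_a and faces_b are float32 arrays in [0, 1] with shape (N, 64, 64, 3).
# autoencoder_a.fit(faces_a, faces_a, epochs=..., batch_size=...)
# autoencoder_b.fit(faces_b, faces_b, epochs=..., batch_size=...)

# The "swap": encode a frame of person A with the shared encoder, then decode
# it with person B's decoder to render B's face in A's pose and expression.
# swapped = decoder_b.predict(encoder.predict(frames_of_person_a))
```

Because the encoder is shared across both identities, it is forced to learn a generic representation of pose, lighting and expression, while each decoder memorises one person’s appearance; that division of labour is what makes the swap work, and it is also why the results improve with more training footage of each face.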

As colleague Adam Clark Smith noted before, there are countless potential uses of this technology that would qualify as mundane, such as improving the image quality of video chat apps, or recreating mind-blowing facsimiles of historic speeches in high-definition video or holograms.

But machine-learning algorithms are improving rapidly, and as security researcher Greg Allen wrote at the time in Wired, it is likely only a matter of years before the audio component catches up and makes Peele’s Obama imitation unnecessary. Within a decade, some kinds of forensic analysis may even be unable to detect forged audio.

Generating high-quality fakes does require a ton of raw footage to fine-tune the result, but that isn’t exactly in short supply for celebrities or politicians.

“When tools for producing fake video perform at higher quality than today’s CGI and are simultaneously available to untrained amateurs, these forgeries might comprise a large part of the information ecosystem,” Allen wrote. “The growth in this technology will transform the meaning of evidence and truth in domains across journalism, government communications, testimony in criminal justice, and, of course, national security.”

At the end of Peele’s video, he and the fake Obama repeat one of the real president’s talking points: if we want to prevent a future where deepfaked videos of politicians saying they want to seize everyone’s guns and melt them down into a statue of a heathen god go viral on Facebook, without resorting to censorship, everyone needs to get a lot better at treating random information they see online with scepticism.

“This is a dangerous time,” Peele concludes. “Moving forward, we need to be more vigilant with what we trust from the internet. It’s a time when we need to rely on trusted news sources. It may sound basic, but how we move forward in an age of information is going to be the difference between whether we survive or we become some kind of fucked-up dystopia.”

It’s a fair point, but I’m not sure it’s all that reassuring.

[YouTube via the Verge]

