One of the darkest aspects of our technologically advanced world is how new advancements are inevitably used to sexually exploit people online. Perhaps few know this better than Scarlett Johansson. Speaking to the Washington Post for a story about the rise of deepfakes and how they are being used to harass women online, the actor cut straight to the bone.
“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” Johansson told the Post. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause… The Internet is a vast wormhole of darkness that eats itself.”
Deepfakes—ultra-realistic fake videos that place someone’s face onto another person’s body using machine learning tools—were first reported on by Motherboard last December. A Reddit user going by the name “deepfakes” shared fake porn videos of a number of female actors, including Johansson, on the site. In the months since, the tech behind these videos has only become more powerful and easier to use, with researchers developing ways both to make deepfakes more convincing and to detect them more reliably. (The hope for many is that the latter developments will outpace the former.)
Johansson is hardly a stranger to the insidious ways technology is weaponised against women. She was among the dozens of female celebrities whose nude photos and videos were stolen in 2011. Her photos were then posted around Los Angeles—without her consent—as part of an art project. And in 2016, a product designer in Hong Kong made a humanoid robot in Johansson’s likeness. In response to various programmed commands, the robot would wink and giggle.
And now, Johansson is the subject of numerous deepfake porn videos easily found online. As the Washington Post reported, one of these videos has been viewed more than 1.5 million times. Johansson told the paper that “it’s just a matter of time before any one person is targeted” by deepfakes, characterising the internet as a “virtually lawless (online) abyss.”
Johansson is right to caution that everyone—not just celebrities—is vulnerable to becoming the subject of deepfake videos without their consent. Earlier this year, users on Discord and Reddit were spotted soliciting advice on how to make deepfakes of their crushes and exes. “i made a pretty good vid of a girl i went to high school with using only ~380 pics scraped from insta & FB,” one user reportedly wrote on Discord.
The Washington Post detailed another egregious example: a woman in her 40s who discovered a deepfake porn video of herself circulating online, created without her consent. The person who requested the video supplied 491 photos of her face, mostly ripped from her Facebook page. It reportedly took only two days for the request to be fulfilled.
“I feel violated—this icky kind of violation,” the woman told the Washington Post. “It’s this weird feeling, like you want to tear everything off the Internet. But you know you can’t.”