Pornhub Says Digitally Generated ‘Deepfakes’ Are Non-Consensual And It Will Remove Them

After Reddit and the image-hosting platform Gfycat began removing porn videos digitally altered via machine-learning techniques to swap performers’ faces with those of other people – a genre known as deepfakes – porn site Pornhub is following their lead.

Photo: AP

Per Motherboard, Pornhub said that it considers generating deepfakes of other people to be clearly non-consensual. It compared deepfakes to revenge porn and added that it will remove such content from the site:

“We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it,” a spokesperson told me in an email. “Nonconsensual content directly violates our TOS [terms of service] and consists of content such as revenge porn, deepfakes or anything published without a person’s consent or permission.” Pornhub previously told Mashable that it has removed deepfakes that are flagged by users.

As Motherboard noted and a quick search by Gizmodo confirmed, numerous such videos remained on the site and were easily accessible using the obvious search terms “deepfake” or “deepfakes”. Apparently, Pornhub might be waiting for users to flag the content for them – perhaps a dubious proposition given the likelihood that people searching for the content aren’t particularly interested in having it removed.

Although Reddit’s terms of service clearly prohibit media involving “any person in a state of nudity or engaged in any act of sexual conduct, taken or posted without their permission”, and some deepfakes content appears to have been removed, the r/deepfakes subreddit (link very NSFW) has remained online and active, generating more of the videos. Gfycat told Gizmodo it would be “actively removing it from our platform”. Per Motherboard, chat service Discord has also purged deepfakes-related chat rooms.

In addition to being creepy, non-consensual, and a form of harassment for the people being put into them, almost all deepfakes likely infringe copyright, since they have to be generated using hundreds or thousands of images pulled from search engines. While it might be difficult to demonstrate where the machine-altered images originated, Pornhub does take down user-posted content that infringes on copyrights when asked.

However, as Wired noted, there’s limited legal recourse in the US for deepfake victims, as the videos don’t technically qualify as invasions of privacy “because, unlike a nude photo filched from the cloud, this kind of material is bogus”. US courts could also rule that deepfakes are protected by the First Amendment, though it’s possible they could violate anti-defamation statutes.

Deepfakes creators are just as likely to move on to other sites; as with any other kind of content, it’s pretty much impossible to scrub the videos from the web entirely. Users on the deepfakes subreddit appeared to have turned to alternative hosts such as Erome or Sendvid, among others, following their removal from Gfycat.

Not all deepfake content is pornographic in nature. The techniques used to make them can easily be put to less malicious purposes, such as replacing Harrison Ford with Nicolas Cage in scenes from Raiders of the Lost Ark.