Facebook Says It Will Use AI To Police Revenge Porn, But It Won’t Fully Explain How

In the absence of proactive tools to prevent dirtbags from nonconsensually uploading intimate photos to the internet, victims of revenge porn have little surefire recourse to ensure the photos stop circulating online. It’s an exhausting and devastating game of whack-a-mole, and even when images are effectively scrubbed, the damage has been done.

That’s why Facebook’s announcement that it’s ramping up its program to police nonconsensual image-sharing feels like a glimmer of hope in the fight against revenge porn.

But Facebook’s latest effort, as promising as it is, remains frustratingly vague, leaving vulnerable users with key questions unanswered.

Facebook announced in a blog post on Friday that its “new detection technology” will help flag and remove intimate photos shared on its platform without the subject’s consent.

“By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram,” Antigone Davis, Facebook’s global head of safety, wrote in the post.

“This means we can find this content before anyone reports it, which is important for two reasons: Often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared.”

The technology Facebook announced on Friday will detect intimate photos that have already been uploaded to Facebook and Instagram. A “specially-trained member” of Facebook’s Community Operations team will then review the images and remove them from the platform if they are found in violation of the social network’s Community Standards.

The team will also, “in most cases”, disable accounts that have shared the content, according to Davis.
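
Facebook hasn’t said anything about how the detection model itself works. Purely as an illustration of the shape of the workflow Davis describes, here’s a minimal sketch in Python, where the classifier (`score_near_nudity`) and the review threshold are hypothetical stand-ins for whatever Facebook actually runs:

```python
# A purely illustrative sketch of "flag first, human review second".
# Facebook has not disclosed its model or its threshold, so every
# specific below is an assumption, not a description of its system.
from dataclasses import dataclass, field

REVIEW_THRESHOLD = 0.8  # assumed cutoff; the real value is unknown


def score_near_nudity(image_bytes: bytes) -> float:
    """Hypothetical stand-in for Facebook's undisclosed classifier."""
    return 0.0  # placeholder score


@dataclass
class ReviewQueue:
    """Flagged uploads wait here for a specially-trained human reviewer,
    who decides on removal against the Community Standards."""
    pending: list[bytes] = field(default_factory=list)


def proactive_scan(image_bytes: bytes, queue: ReviewQueue) -> None:
    # The AI only flags content; removal decisions stay with the
    # Community Operations team, per Davis's description of the process.
    if score_near_nudity(image_bytes) >= REVIEW_THRESHOLD:
        queue.pending.append(image_bytes)
```

Note what even a faithful version of this sketch couldn’t show: a nudity score says nothing about consent, which is the crux of the unanswered questions below.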

Facebook’s new detection tech will work in tandem with the pilot program it announced last year, which lets users preemptively submit intimate photos to the company that they don’t want to be shared on Facebook, Instagram or Messenger.

The images are reviewed by a team of five Facebook employees and hashed; any upload matching one of those hashes is then blocked across the three services.
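
Facebook hasn’t published the details of its hashing scheme. As a minimal sketch, assuming an open-source perceptual hash (the imagehash library) rather than whatever Facebook uses internally, hash-based blocking works roughly like this:

```python
# A minimal sketch of hash-based upload blocking, assuming the
# open-source Pillow and imagehash libraries. Facebook has not said
# which hashing scheme its pilot program actually uses.
from PIL import Image
import imagehash

MAX_DISTANCE = 5  # assumed Hamming-distance cutoff for "same image"

# Hashes of images victims preemptively submitted; only the hash,
# not the image itself, needs to be retained long-term.
blocked_hashes: set[imagehash.ImageHash] = set()


def register_submission(path: str) -> None:
    """Hash a victim-submitted image and add it to the blocklist."""
    blocked_hashes.add(imagehash.phash(Image.open(path)))


def upload_allowed(path: str) -> bool:
    """Reject any upload whose perceptual hash is near a blocked one."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash values yields their Hamming distance,
    # so recompressed or lightly edited copies still match.
    return all(candidate - h > MAX_DISTANCE for h in blocked_hashes)
```

A perceptual hash is assumed here because an exact cryptographic hash would be defeated by trivial recompression or resizing; whether Facebook’s pipeline tolerates such near-duplicates is, again, a detail the company hasn’t shared.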

In addition to bolstering its efforts to scrub the site of revenge porn, Facebook is also launching a “support hub” for victims called Not Without My Consent, where “victims can find organisations and resources to support them, including steps they can take to remove the content from our platform and prevent it from being shared further — and they can access our pilot program,” Davis writes.

What remains unclear is how Facebook’s new AI tool will be able to identify whether the intimate photos it flags have been uploaded without someone’s consent.

While it’s sometimes legitimate for tech companies to keep the inner workings of anti-harassment tools under wraps so that bad actors can’t game them, in this case we need to understand more about how this new technology works.

Facebook doesn’t have to release a detailed blueprint of its detection tech, but it should be able to tell us how it determines the intent behind a user sharing an intimate photo. This is a tool meant to help some of Facebook’s most vulnerable users, and leaving them in the dark about how it works undermines faith in the system.

Facebook’s original pilot program made it inherently obvious that a photo someone was trying to upload was being shared nonconsensually: the would-be victim had preemptively submitted the image precisely to ensure it wasn’t posted online.

That is not the case with Facebook’s new detection technology — these photos are flagged after they’ve already been uploaded, and it’s unclear whether they’re the ones that have been reported to Facebook.

How will this detection technology distinguish a lingerie photo that someone confidently uploads of themselves from a photo of a woman in lingerie, taken in private by a former partner who is now vindictively posting it across his social media pages? And how will it differentiate revenge porn from a nude work of art or a historically significant photo?

Moderating platforms as large as Facebook is difficult business, and the company’s efforts to better police revenge porn and protect victims are good and necessary. But it needs to ensure that transparency remains at the forefront.

Facebook told Gizmodo in an email that its detection technology was trained on revenge porn to better understand what these kinds of posts look like, so that it can identify whether an intimate or nude image or video has been shared without someone’s consent.

However, the company didn’t provide any further details on how it would work. AI, as it stands, has yet to prove it can grasp even basic human nuance, but Facebook seems confident that its system will understand whether a post is vindictive.

