Clearview AI Says It Can Do the ‘Computer Enhance’ Thing

Sketchy face recognition company Clearview AI has grown its stockpile of scraped images to over 10 billion, according to co-founder and CEO Hoan Ton-That. What’s more, he says the company has new tricks up its sleeve, like using AI to draw in the details of blurry or partial images of faces.

Clearview AI has reportedly landed contracts with over 3,000 police and government customers, including 11 federal agencies, which it says use the technology to identify suspects when it might otherwise be impossible. In April, a BuzzFeed report citing a confidential source identified over 1,800 public agencies that had tested or were actively using its products, ranging from U.S. police and district attorneys’ offices to Immigration and Customs Enforcement and the U.S. Air Force. It has also reportedly worked with dozens of private companies, including Walmart, Best Buy, Albertsons, Rite Aid, Macy’s, Kohl’s, AT&T, Verizon, T-Mobile, and the NBA.

Clearview has landed such deals despite facing considerable legal trouble over its unauthorised acquisition of those billions of photos, including state and federal lawsuits claiming violations of biometric privacy laws, a consumer protection suit brought by the state of Vermont, the company’s forced exit from Canada, and complaints to privacy regulators in at least five other countries. There have also been reports detailing Ton-That’s past ties to far-right extremists (which he denies), as well as broader pushback against police use of face recognition, which has led to bans in over a dozen U.S. cities.

In an interview with Wired on Monday, Ton-That claimed that Clearview has now scraped over 10 billion images from the open web for use in its face recognition database. According to the CEO, the company is also rolling out a number of machine learning features, including one that uses AI to reconstruct faces that are obscured by masks.

Specifically, Ton-That told Wired that Clearview is working on “deblur” and “mask removal” tools. The first feature should be familiar to anyone who’s ever used an AI-powered image upscaling tool, taking a lower-quality image and using machine learning to add extra details. The mask removal feature uses statistical patterns found in other images to guess what a person might look like under a mask. In both cases, Clearview would essentially be offering informed guesswork. I mean, what could go wrong?
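Clearview hasn’t published how its “deblur” tool actually works, but the general idea it describes, single-image super-resolution, is a well-established technique. As a rough illustration only (not Clearview’s pipeline), here’s a minimal sketch using OpenCV’s contrib dnn_superres module with a pretrained EDSR model; the model and image file names are placeholder assumptions:

```python
# Minimal single-image super-resolution sketch using OpenCV's dnn_superres
# module (ships with opencv-contrib-python). Illustrates the general
# "add detail with machine learning" idea, not Clearview's actual system.
import cv2

# Pretrained EDSR 4x weights (placeholder path; downloaded separately
# from the OpenCV model zoo).
MODEL_PATH = "EDSR_x4.pb"

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel(MODEL_PATH)
sr.setModel("edsr", 4)  # model name and upscale factor must match the weights

low_res = cv2.imread("blurry_face.jpg")  # placeholder input image
upscaled = sr.upsample(low_res)          # learned 4x upscale: the model invents
                                         # plausible detail that was never in
                                         # the original pixels
cv2.imwrite("upscaled_face.jpg", upscaled)
```

The comment on that last step is the whole point: the added detail is a statistically plausible invention, not recovered information, which is exactly why treating the output as identifying evidence is fraught.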

As Wired noted, quite a lot. There’s a very real difference between using AI to upscale Mario’s face in Super Mario 64 and using it to suggest to cops what a suspect’s face might look like. Existing face recognition tools have repeatedly been found to carry racial, gender, and other biases, and police have reported extremely high failure rates when using them in criminal investigations. That’s before you factor in that the software would be guessing at a face it has never actually seen. It’s hard not to imagine such a feature being used as a pretext by cops to fast-track investigative leads.

“I would expect accuracy to be quite bad, and even beyond accuracy, without careful control over the data set and training process, I would expect a plethora of unintended bias to creep in,” MIT professor Aleksander Madry told Wired. Even if it did work, Madry added, “Think of people who masked themselves to take part in a peaceful protest or were blurred to protect their privacy.”

Clearview’s argument goes a little something like this: We’re just out here building tools, and it’s up to the cops to decide how to use them. Ton-That, for example, assured Wired that all of this is fine because the software can’t actually go out and arrest anyone by itself.

“Any enhanced images should be noted as such, and extra care taken when evaluating results that may result from an enhanced image,” Ton-That told the magazine. “… My intention with this technology is always to have it under human control. When AI gets it wrong it is checked by a person.” After all, it’s not like police have a long and storied history of using junk science to justify misconduct or prop up arrests based on flimsy evidence and casework, which often goes unquestioned by courts.

Ton-That is, of course, not so naive as to think that police won’t use these kinds of capabilities for purposes like profiling or padding out evidence. Again, Clearview’s backstory is full of unsettling ties to right-wing extremists, like the reactionary troll and accused Holocaust denier Chuck C. Johnson, and Ton-That’s track record is full of incidents where it looks an awful lot like he’s exaggerating capabilities or deliberately stoking controversy as a marketing tool. Clearview itself is fully aware of the potential for questionable use by police, which is why its marketing once advertised that cops could “run wild” with its tools, and why the company later claimed to be building accountability and anti-abuse features after getting its hooks into our justice system.

The co-founder added in his interview with Wired that he is “not a political person at all,” and Clearview is “not political” either. Ton-That added, “There’s no left-wing way or right-wing way to catch a criminal. And we engage with people from all sides of the political aisle.”
