Clearview AI Thinks The Solution to Dating Is Facial Recognition

Clearview AI, now the byword for unchecked surveillance, has filed a wide-ranging patent application that envisions handing over the most reviled uses of facial recognition technology to just about anyone. The application proposes allowing users to submit a photo and instantly retrieve information including, but not limited to, mental health status, housing status, drug use, home address, email, and employer’s website for such suggested uses as “dating.”

“In many instances, it may be desirable for an individual to know more about a person that they meet, such as through business, dating, or other relationship,” the application reads. It goes on to describe the dilemma of a “newly met person” who may be providing a fake business card or falsifying “oral or written” information. “Alternatively, one may research the newly met person on a web site or to perform background checks,” it says. “However, there are many instances when a person can assume a new name or identity to present a false name and history to the individual.” Clearview AI claims it could solve this problem by providing biometrically-guaranteed-authentic information gathered from “the Internet” (presumably, social media platforms, though several have sent Clearview AI cease-and-desist letters) as well as “professional websites, law enforcement websites, or departments of motor vehicles.”

The application, first reported by Buzzfeed News, was filed in August and became public on Thursday. It is still pending approval. So while it’s important to be clear that this is a patent application that may or may not develop into any of thousands of permutations, even the least harmful version sounds pretty grim given Clearview AI’s long-standing indifference to consent.

Early last year, the New York Times revealed that Clearview AI had compiled a database of over three billion images from social media profiles without users’ knowledge. Soon after the story ran, Clearview AI co-founder Hoan Ton-That demoed the app for CNN, capturing photos of the host’s and producer’s faces and retrieving libraries of images from their past, including poorly lit Facebook photos dating back years and Instagram photos that had since been made private. While the company was secretive about its client list, documents leaked to Buzzfeed News showed that it had offered its tools to over 2,200 law enforcement agencies, including U.S. Immigration and Customs Enforcement, as well as private entities like Macy’s, Bank of America, and a United Arab Emirates sovereign wealth fund.

The revelations brought on a flotilla of lawsuits alleging violation of the Illinois Biometric Information Privacy Act (BIPA) and all sorts of justified fears from civil rights organizations over weaponizing personal data. The ACLU has pointed out that effectively stripping crowds of anonymity could deter people from protesting, attending AA meetings, or joining in religious gatherings. It could pose an even greater threat to sexual exploitation victims, undocumented immigrants, and people of color.

One of the many unnerving elements of this patent application is the timing. In May 2020, just months before the patent application was filed, Clearview AI’s attorney sought to reassure an Illinois court that “Clearview’s customers are currently limited to non-Illinois law enforcement and government entities.” Buzzfeed News reported on other filings declaring that the company would “avoid transacting with non-governmental customers anywhere.”

In a statement sent to Gizmodo, Clearview AI said that its technology “is currently only used by law enforcement for after-the-crime investigations.”

“We do not intend to launch a consumer-grade version of Clearview AI,” the company added. The patent application reads like a business pitch for precisely that.

Screenshot: US Patent and Trademark Office

The patent mentions tamer potential business permutations, like authenticating bank account holders; more ominously, it floats the possibility of creating closed networks for “retail” and “real estate” clients “to share headshots of high-risk individuals.” Obviously, like Clearview’s pledges to cut down on private contracts, these for-instances aren’t legally binding; there’s nothing in the patent application that could stop a real estate agent from refusing to show a home because they discover a person’s religious affiliation or an old smudge on their criminal record. Last November, the Los Angeles Police Department banned the use of outside facial recognition technology after finding that detectives had been using Clearview AI without authorization.

Clearview AI’s patent application even validates privacy advocates’ worst fears, that such technology could be used to profile people based on housing vulnerability and substance abuse issues. In addition to classifications like an “unknown” or “newly met” person, Clearview AI says this technology could be used to assess “a person with deficient memory, a criminal, an intoxicated person, a drug user,” or “a homeless person.”

Clearview AI imagines a scenario in which one could aim a camera at a person and discover that they’ve experienced homelessness. “In one example, the information can be used by social workers to identify homeless people or people in need,” the patent reads. In another example, a law enforcement officer could instantly obtain your health information without your consent. “A person with a history of DUI arrests, revealed by the facial scans, may be treated differently than a person with a history of diabetic low blood sugar symptoms,” it reads.

And (again, hypothetically!) the information retrieved might not only affect that person, but also anyone affiliated with them; one permutation of the tool could pull up information about the subject’s co-workers, friends, family, and partner. This is called “correlative face search,” and it helps identify secondary characters who’ve appeared in photos alongside the subject.

The U.S. Patent and Trademark Office declined to comment on whether it factors potential ethical and legal breaches in the patent review process.

Clearview AI is still fighting to defend what it considers its First Amendment right to capitalize on billions of images, regardless of the potential harm to the people who uploaded them. (Federal law has yet to be written, and states are slowly following Illinois’s lead on biometric privacy regulation.) The ACLU, which is suing Clearview AI on behalf of sex workers, domestic violence survivors, undocumented immigrants, and others, counters that Clearview is allowed to collect images, but it is not (at least in Illinois) permitted to capture faceprints without consent.

“Accepting Clearview’s argument to the contrary would mean agreeing that collecting fingerprints from public places, generating DNA profiles from skin cells shed in public, or deciphering a private password from asterisks shown on a public login screen are all fully protected speech,” the ACLU wrote in a recent post. It compares Clearview’s acts of surveillance to stealing documents and wiretapping. “In other words, the fact that a burglar intends to publish documents they steal doesn’t mean the burglary is protected by the First Amendment.”