The ACLU Is Suing Shady Facial Recognition Startup Clearview AI for Being a Shady Facial Recognition Startup

The facial recognition org Clearview AI — which you might know as the firm that secretly harvested the photos of millions of people from their social media profiles — is being sued by the ACLU, according to a blog post the organisation issued earlier today. Per the post, the suit is being brought on behalf of organisations representing the folks with the most to lose from this kind of data harvesting — namely, “survivors of sexual assault and domestic violence, undocumented immigrants, and other vulnerable communities,” who could face stalking, deportation or even jail time based on Clearview’s facial dossier.

“Face recognition technology offers a surveillance capability unlike any other technology in the past. It makes it dangerously easy to identify and track us at protests, AA meetings, counseling sessions, political rallies, religious gatherings, and more,” Nathan Freed Wessler, an ACLU staff attorney helping spearhead the suit against Clearview, wrote in the post.

“For our clients — organisations that serve survivors of domestic violence and sexual assault, undocumented immigrants, and people of colour — this surveillance system is dangerous and even life-threatening,” he added. “It empowers abusive ex-partners and serial harassers, exploitative companies, and ICE agents to track and target domestic violence and sexual assault survivors, undocumented immigrants, and other vulnerable communities.”

There’s an important reason this suit needed to happen in the state of Illinois rather than, say, the New York turf where Clearview AI is currently based. In 2008, Illinois became the first state to try to wrangle legislation around the unregulated harvesting of biometric data. The resulting Illinois Biometric Information Privacy Act (BIPA) remains a strong piece of legislation more than a decade later, and still serves up hefty penalties for orgs that deliberately sidestep the consent of residents of the state.

On BIPA-based grounds, Clearview certainly appears culpable. Wessler points out that core to the law is that companies “collecting, capturing, or obtaining” any sort of biometric data — like a photograph of a face, the way Clearview did — need to first notify the photo’s subject and get their written consent before harvesting that photo for their own use. Not only did Clearview not get the consent of Illinois residents, but its facial database is loaded with, per the company, billions of faces from every state, all collected without the consent of the people photographed or the platforms the photos were scraped from in the first place.

“In press statements, Clearview has tried to claim its actions are somehow protected by the First Amendment,” Wessler said — which, yes, would be true if company executives were just staring at someone’s digital photo album.

What it can’t do, Wessler went on, “is capture our faceprints — uniquely identifying biometrics — from those photos without consent. That’s not speech; it’s conduct that the state of Illinois has a strong interest in regulating in order to protect its residents against abuse.”

Clearview AI has seemed shady from the beginning, and the more we learn about it, the creepier it gets. If the ACLU can prove it’s run afoul of Illinois’ biometrics law, this likely won’t be the last legal challenge the firm faces.

