The United Kingdom has had it with creepy facial recognition firm Clearview AI. Under a new enforcement notice from the U.K.’s Information Commissioner’s Office, Clearview must cease the collection and use of publicly available U.K. data and delete all data of U.K. residents from its database. The order, which will also require the company to pay a £7,552,800 (A$13,391,378) fine, effectively calls on Clearview to purge U.K. residents from its massive face database, reportedly consisting of over 20 billion images scraped from publicly available social media sites.
The ICO ruling, which determined Clearview violated U.K. privacy laws, comes on the heels of a multi-year joint investigation with the Australian Information Commissioner. According to the ruling, Clearview failed to use U.K. resident data in a way that was fair and transparent and failed to provide a lawful reason for collecting the data in the first place. Clearview also failed, the ICO notes, to put measures in place to stop U.K. residents’ data from being retained indefinitely, and allegedly didn’t meet the higher data protection standards outlined in the EU’s General Data Protection Regulation.
NEW: We’ve fined Clearview AI Inc more than £7.5m for using images of people in the UK, and elsewhere, that were collected from the web and social media to create a global online database that could be used for facial recognition.— ICO – Information Commissioner's Office (@ICOnews) May 23, 2022
Read our press release: https://t.co/VCnmjjcM8D
“The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service,” U.K. Information Commissioner John Edwards said in a statement. “That is unacceptable.”
Though the ICO didn’t speculate on the precise number of U.K. residents caught up in Clearview’s database, it said the prevalence of U.K. users on social media meant the database is “likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge.”
In a statement sent to Gizmodo, Jenner & Block partner and Clearview attorney Lee Wolosky said he viewed the ICO’s ruling as “incorrect as a matter of law.” “Clearview AI is not subject to the ICO’s jurisdiction, and Clearview AI does no business in the U.K. at this time.”
While Clearview previously offered its service to U.K. law enforcement groups like the Ministry of Defence and National Crime Agency, the company does not currently offer services in the U.K. Still, the nature of its social media scraping apparatus means images of individuals from non-client countries can still make their way into the database. The ICO said rulings like this could make that access more difficult for Clearview, both in the U.K. and elsewhere.
“People expect that their personal information will be respected, regardless of where in the world their data is being used,” Edwards added. “That is why global companies need international enforcement. Working with colleagues around the world helped us take this action and protect people from such intrusive activity.”
“I am deeply disappointed that the UK Information Commissioner has misinterpreted my technology and intentions,” Clearview AI CEO Hoan Ton-That said in a statement emailed to Gizmodo. Ton-That said he grew up in Australia and viewed the U.K. as a “majestic” place. “We collect only public data from the open internet and comply with all standards of privacy and law. My company and I have acted in the best interests of the UK and their people by assisting law enforcement in solving heinous crimes against children, seniors, and other victims of unscrupulous acts.”
Clearview Faces an International Privacy Reckoning
It’s been a rough few years for Clearview on the international stage. Late last year, the company was forced to cease all data scraping operations in Australia. As in the U.K. case, Australian regulators also required Clearview to destroy any existing images collected from the country. In Canada, Clearview opted to abandon its business operations back in 2020 amid pressure from multiple investigations. All told, Clearview has had to delete massive data sets in Australia, France, and Italy.
Just this month, Clearview reached a historic settlement with the American Civil Liberties Union effectively banning private companies from using its faceprint database. Clearview’s current partners mostly consist of public law enforcement agencies, but the company had high hopes of expanding its business to a wider commercial audience. The ACLU settlement will make those dreams much more difficult to materialise.
That settlement is significant, but it will do little to affect Clearview’s core U.S. business model. On that front, lawmakers and advocates are amping up pressure on major federal agencies to ditch the company, which some warn could bring about the end of anonymity.