Clearview AI, a controversial tech startup, shocked the world when it was revealed the company had scraped the internet for images of faces, entered them into its facial recognition database, and given law enforcement officials worldwide access to search it. Now, Australia’s privacy watchdog is teaming up with its British counterpart to investigate the company.
Australia’s Office of the Australian Information Commissioner (OAIC), the country’s privacy and information regulator, has announced it is launching a joint investigation with its British equivalent into Clearview AI, a US startup that offers law enforcement agencies around the world access to a huge facial recognition database reportedly containing more than three billion images.
“The Office of the Australian Information Commissioner (OAIC) and the UK’s Information Commissioner’s Office (ICO) have opened a joint investigation into the personal information handling practices of Clearview AI Inc., focusing on the company’s use of ‘scraped’ data and biometrics of individuals,” a joint statement read.
“The investigation highlights the importance of enforcement cooperation in protecting the personal information of Australian and UK citizens in a globalised data environment.”
It comes days after the company said it had left Canada amid two federal investigations into its practice of taking publicly available images of people, linking them to identities and providing access to law enforcement agencies.
Australian police admit to using Clearview AI
Clearview AI first came under criticism after a New York Times report in January revealed the extent of the face database created by Australian developer Hoan Ton-That. Clearview AI was a notable departure from Ton-That’s previous apps, including one that let users place Donald Trump’s infamous hairdo on pictures. The database he had worked to create had already been used by the FBI and the Department of Homeland Security, two of the United States’ biggest federal law enforcement agencies.
The following month, a BuzzFeed News investigation revealed Clearview’s client list included more than 2,200 agencies, companies and individuals, among them agencies closer to home such as the Australian Federal Police (AFP) and state police in Queensland, Victoria and South Australia.
At the time, those agencies denied to Gizmodo Australia any knowledge of their officers using the system, but the AFP later revealed that some of its officers had signed up for a trial in January.
The AFP said in April it was cooperating with the OAIC over its use of the system.
“The AFP seeks to balance the privacy, ethical and legal challenges of new technology with its potential to solve crime and even save victims,” an AFP spokesperson told Gizmodo Australia.
“We are actively looking to improve our processes and governance without necessarily constraining innovative investigative approaches.”
The OAIC has not said when it expects to deliver the findings of its investigation.
Gizmodo Australia has contacted Clearview AI for comment.