Back in January the tech world was up in arms over a new facial recognition app called Clearview AI. Created by Australian Hoan Ton-That, it has raised alarming privacy concerns due to its ability to compare a photo to a database of over 3 billion images that have been scraped from social media sites.
It had already been used by U.S. law enforcement to solve crimes, which led to questions over whether it had been utilised in Australia. At the time, the Australian Federal Police (AFP) denied using Clearview AI. It has now been revealed that some of its officers had.
The original story, published by The New York Times, revealed the app was popular with law enforcement agencies because even imperfect photographs could be used to accurately identify a person. This raised concerns about accuracy and the possibility of false convictions.
And while Clearview AI stated that only publicly available images were used – for example, a public Facebook profile – deleting the images or changing your privacy settings later would not remove those photos from the app’s database.
Another issue that arose was the amount of control the company seemed to have over search results. Journalist Kashmir Hill found some of her own photos in the system, but those images reportedly disappeared after the company discovered she was asking police officers to run her photo through it. Founder Ton-That later dismissed the incident as a ‘software bug’.
The AFP originally denied it used Clearview AI
Despite reports from U.S. police officers of successful arrests made using the app, the abundance of red flags raised the question of whether our own law enforcement agencies – particularly the AFP and ASIO – were trialling it.
Gizmodo Australia asked the AFP in January whether it had used the Clearview AI app. “The AFP does not use Clearview AI,” an AFP spokesperson said in an email. When also asked whether the AFP had been in talks with Clearview AI, a spokesperson said, “We don’t have that advice.”
Gizmodo Australia also submitted a Freedom of Information (FOI) request to the AFP regarding Clearview AI, which was rejected.
It has since been revealed that AFP officers were in fact using the app.
Documents seen by Buzzfeed News in late February revealed email addresses connected to officers at state police departments in Queensland, South Australia and Victoria had signed up for accounts on Clearview AI. It is reported that over 900 searches had been run by these accounts, with a further 100 conducted by the AFP. Some of these searches were conducted in January.
At the time, it seemed the AFP was potentially unaware that some of its officers were using the app.
“The AFP requested the names associated with the accounts registered using AFP email addresses, but these have not been provided. Without this information, the AFP is not in a position to provide further information or comment,” an AFP spokesperson said to Gizmodo Australia in an email.
The AFP finally admits its staff used Clearview AI, despite earlier denials
According to the ABC, despite these earlier denials, the AFP finally admitted on Tuesday to using Clearview AI, in response to a question on notice about its use of the app submitted by Shadow Attorney-General Mark Dreyfus in February.
It was revealed that AFP officers working in the Australian Centre to Counter Child Exploitation (ACCCE) signed up for a free trial of Clearview AI and used it for searches between November 2, 2019 and January 22, 2020. On January 21, a day before the searches stopped, the AFP told Gizmodo Australia it did not use Clearview AI.
While we now know the app was used, the AFP maintains it hasn’t been formally adopted by the department.
“The AFP has not adopted the facial recognition platform Clearview.AI as an enterprise product and has not entered into any formal procurement arrangements with Clearview.AI.”
A Parliamentary Joint Committee on Intelligence and Security review also questioned the AFP about FOI rejections regarding Clearview AI. The AFP has essentially said that use of the app within the ACCCE was discovered after these requests were completed.
“The AFP received three requests under the Freedom of Information Act 1982 (FOI Act) on 22 January 2020, 24 January 2020, and 4 February 2020, seeking documents held by the AFP relating to Clearview AI. The requests were processed in accordance with the FOI Act, in that reasonable searches were undertaken by the AFP portfolio with responsibility for facial identification capabilities. No information relating to Clearview AI was identified.
On 14 February 2020, the AFP wrote to each FOI applicant stating reasonable steps had been taken to find the documents, however no documents had been located. As such, the FOI requests were refused in accordance with section 24A(b)(ii) of the FOI Act, which provides an agency may refuse a request for access to a document if all reasonable steps have been taken to find the document and the agency is satisfied the document does not exist.
It was subsequently discovered, in our response to Question 1 (above), the AFP-hosted Australian Centre to Counter Child Exploitation (ACCCE) held information relevant to Clearview AI, which was not identified in response to the earlier freedom of information requests.”
Concerns surround the AFP using Clearview AI
While Clearview AI has been lauded as a tool for securing convictions in the U.S., the Labor party has questioned whether the AFP's use of an unofficial, private service is problematic.
“The Home Affairs Minister must explain whether the use of Clearview without legal authorisation has jeopardised AFP investigations into child exploitation,” said Dreyfus and other unnamed Labor leaders in a statement to the ABC.
“The use by AFP officers of private services to conduct official AFP investigations in the absence of any formal agreement or assessment as to the system’s integrity or security is concerning.”
Adding to the concern are some of the app's potential features. Gizmodo reporters in the U.S. found a public version of Clearview AI on an Amazon server. They were able to download the APK to an Android device and inspect some of its code and data.
This information revealed development of a private search mode, voice search and in-app functionality to allow officers to take photos of people to run through the database.
It also revealed the name of an AR Glasses company that Clearview AI once planned to partner with.
The AFP is currently cooperating with the Office of the Australian Information Commissioner (OAIC) in relation to its use of Clearview AI but doesn’t seem to be ruling out its potential use in the future.
In response to a request for comment from Gizmodo Australia, an AFP spokesperson said:
“The AFP seeks to balance the privacy, ethical and legal challenges of new technology with its potential to solve crime and even save victims. We are actively looking to improve our processes and governance without necessarily constraining innovative investigative approaches.”