The country’s privacy and information commissioner has ordered the Australian Federal Police to strengthen its privacy practices, following the AFP’s use of the facial recognition tool from Clearview AI.
Australian Information Commissioner and Privacy Commissioner Angelene Falk has determined that the AFP failed to comply with its privacy obligations in using Clearview AI.
The Commissioner found the AFP failed to complete a privacy impact assessment before using the tool, a requirement under the Australian Government Agencies Privacy Code. The AFP also breached one of the Australian Privacy Principles by “failing to take reasonable steps to implement practices, procedures and systems in relation to its use of Clearview AI to ensure it complied with clause 12 of the Code”.
In April last year, the AFP admitted it had used Clearview AI to help counter child exploitation, despite not having an appropriate legislative framework in place. Between 2 November 2019 and 22 January 2020, Clearview AI provided free trials of the facial recognition tool to members of the AFP-led Australian Centre to Counter Child Exploitation (ACCCE).
ACCCE members uploaded facial images of Australians to test the functionality of the tool and, in some cases, to try to identify persons of interest and victims in active investigations. The AFP did not assess the risks of providing personal information to a third party located overseas, nor did it assess the tool’s security practices, accuracy or safeguards.
The Commissioner also considered that the AFP did not have appropriate systems in place to identify, track and accurately record its trial of this new investigative technology, which involved the handling of personal information.
Commissioner Falk has directed the AFP to engage an independent assessor to review and report to the OAIC on residual deficiencies in its practices, procedures, systems and training in relation to privacy assessments, and to make any changes recommended in the report. She has also directed the AFP to ensure that relevant personnel complete an updated privacy training program.
In a determination made last month, the Commissioner found that Clearview AI interfered with Australians’ privacy by scraping biometric information from the web and disclosing it through its facial recognition tool.
“This determination should provide additional assurance to Australians that deficiencies in the AFP’s privacy governance framework will be addressed, under the OAIC’s oversight,” Falk added on Thursday.