Face Recognition CEO Says Use Of This Tech By Police Is 'Irresponsible And Dangerous'

Face recognition will be used to harm citizens if given to governments or police, writes Brian Brackeen, CEO of the face recognition and AI startup Kairos, in an op-ed published by TechCrunch late last night. Last week, news broke that bodycam maker Axon had approached Kairos about a partnership to explore face recognition. Brackeen declined, and now writes that "using commercial facial recognition in law enforcement is irresponsible and dangerous".

Photo: AP

"As the Black chief executive of a software company developing facial recognition services, I have a personal connection to the technology both culturally, and socially," Brackeen writes.

Face recognition is one of the most contentious areas in privacy and surveillance studies, because of issues of both privacy and race. A study by MIT computer scientist Joy Buolamwini published earlier this year found face recognition is routinely less accurate on darker-skinned faces than it is on lighter-skinned faces.

A serious problem, Brackeen reasons, is that as law enforcement relies more and more on face recognition, the racial disparity in accuracy will lead to consequences for people of colour.

"The more images of people of colour it sees, the more likely it is to properly identify them," he writes. "The problem is, existing software has not been exposed to enough images of people of colour to be confidently relied upon to identify them. And misidentification could lead to wrongful conviction, or far worse."

Law enforcement agencies in the US have increasingly relied on face recognition, celebrating the tech as a public safety service. Just last week, Amazon employees rallied against the use of Rekognition, the company's face recognition technology, by police. Face scans at Orlando Airport, once optional for US citizens, are now mandatory for all international travellers. And CBP has moved to institute face recognition at the Mexican border.

Similarly in Australia, facial recognition is used at airports and for law enforcement purposes.

In settings where identifying yourself is tied to physical safety, any inaccuracy or anomaly could lead to secondary searches and more interactions with law enforcement. If non-white faces are already more heavily scrutinised in high-security spaces, face recognition would only add to that scrutiny.

"Any company in this space that willingly hands this software over to a government, be it America or another nation's, is wilfully endangering people's lives," concludes Brackeen. "We need movement from the top of every single company in this space to put a stop to these kinds of sales."

[TechCrunch]
