UK Watchdog Calls For Face Recognition Ban Over 90 Per Cent False-Positive Rate

A privacy watchdog group in the UK is calling for a ban on automated face recognition technologies after a report revealed staggeringly high false-positive rates. Big Brother Watch has released a report, titled "Face Off: The Lawless Growth of Facial Recognition in UK Policing," detailing abuses of the tech, which the watchdog group calls a "threat to the fundamental rights" of citizens.

"We are deeply concerned that the securitisation of public spaces using biometrically identifying facial recognition unacceptably subjects law abiding citizens to hidden identity checks, eroding our fundamental rights to privacy and free expression," the report reads.

As face recognition in public places becomes more commonplace, Big Brother Watch is especially concerned with false identification. In May, South Wales Police revealed that its face-recognition software had erroneously flagged thousands of attendees of the 2017 Champions League final in Cardiff as matches for criminals; 92 per cent of the matches were wrong. In a statement to the BBC, Matt Jukes, the chief constable in South Wales, said "we need to use technology when we've got tens of thousands of people in those crowds to protect everybody, and we are getting some great results from that."


When the system flags someone, even mistakenly, police may approach them and ask for further identification. Big Brother Watch argues that this amounts to "hidden identity checks" that require people to "prove their identity and thus their innocence." At the event, 110 people were stopped after being flagged, leading to 15 arrests.

Simply walking through a crowd could trigger an identity check, and it doesn't end there. South Wales Police reported more than 2,400 "matches" between May 2017 and March 2018, which ultimately led to only 15 arrests. The thousands of photos taken, however, are still stored in the system, and the overwhelming majority of those photographed have no idea their picture was ever taken.
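The gap between matches and arrests can be sanity-checked with simple arithmetic. A minimal sketch using the figures reported above; note that the "share of matches not leading to arrest" it computes is only an upper bound on the false-positive rate (not every correct match ends in an arrest), so it is not the same quantity as the 92 per cent figure quoted earlier:

```python
# Figures reported above: South Wales Police logged more than 2,400
# face-recognition "matches" between May 2017 and March 2018, which
# led to only 15 arrests.
matches = 2400
arrests = 15

# Share of matches that did not result in an arrest. This is an upper
# bound on the false-positive rate, since a correct match need not end
# in an arrest.
non_arrest_rate = 1 - arrests / matches
print(f"{non_arrest_rate:.1%}")  # → 99.4%
```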

The group has raised the issue of image retention for innocent people before, including back in March, when UK officials deemed it "too expensive" to remove the mugshots of innocent people from police databases. Those mugshots still aren't being removed, yet police continue taking photos, accruing biometric data on thousands of people who haven't committed a crime. Big Brother Watch is calling for the immediate deletion of these images.

"We are deeply concerned about the impact of automated facial recognition on individuals' rights to a private life and freedom of expression, and the risk of discriminatory impact," the report concludes. "We call on UK public authorities to immediately stop using automated facial recognition software with surveillance cameras."

[BBC, Big Brother Watch]
