Amazon’s Face Recognition Tech Once Again Pegs Politicians As Criminals

Facial recognition systems are still far from accurate, yet U.S. government agencies continue to push for their deployment. To illustrate the technology’s existing flaws, the American Civil Liberties Union conducted another test of Amazon’s Rekognition software, which incorrectly matched one in five California lawmakers against a mugshot database.

The test was detailed during a press conference on Wednesday in which San Francisco Assemblymember Phil Ting called for support of a California bill that would ban the use of facial recognition in police body cameras. The ACLU ran 120 images of California lawmakers against a 25,000-image mugshot database using Amazon’s Rekognition software and found 26 supposed matches, including Ting.

“We wanted to run this as a demonstration about how this software is absolutely not ready for prime time,” Ting said during the press conference. “While we can laugh about it as legislators, it’s no laughing matter if you are an individual who is trying to get a job, if you are an individual trying to get a home, if you get falsely accused of an arrest, what happens, it could impact your ability to get employment, it absolutely impacts your ability to get housing. So there are real people who could have real impacts.”

ACLU attorney Matt Cagle said during the press conference that a computer scientist who most recently worked at UC Berkeley independently verified the results. The civil liberties organisation employed a methodology similar to the test it conducted last year, in which it ran Amazon’s Rekognition on members of the U.S. Congress and got inaccurate and racially biased results. Cagle added, though, that the two tests shouldn’t be viewed as one and the same, since they involved different images and potentially different algorithms.

When asked for comment, an Amazon spokesperson told Gizmodo:

The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines. As we’ve said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking. We continue to advocate for federal legislation of facial recognition technology to ensure responsible use, and we’ve shared our specific suggestions for this both privately with policy makers and on our blog.

Cagle, however, pushed back on the criticism that the ACLU didn’t use a 99 per cent confidence threshold, saying during the press conference that the group used the Amazon product’s default setting of an 80 per cent confidence score, which is what a user gets when they first turn the system on.

The spokesperson responded by pointing to a blog post an Amazon employee wrote after the ACLU’s last test, which noted that for public safety use cases Rekognition shouldn’t be used with less than a 99 per cent confidence level, a point the spokesperson reasserted to Gizmodo. “But, machine learning is a very valuable tool to help law enforcement agencies, and while being concerned it’s applied correctly, we should not throw away the oven because the temperature could be set wrong and burn the pizza,” Amazon wrote in the blog post from July.

Odd pizza metaphors aside, the question remains: Why not make a 99 per cent threshold the default setting for Rekognition? Amazon did not immediately have an answer for us. 
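To make the dispute concrete: the threshold at issue is simply an optional parameter on Rekognition’s face-search API call, and it is up to the caller to raise it. Here is a minimal boto3 sketch, assuming a pre-indexed face collection; the collection name and image file are hypothetical, used only for illustration.

```python
# Minimal sketch: searching a Rekognition face collection via boto3.
# The "mugshots" collection and the image file are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

with open("lawmaker_headshot.jpg", "rb") as f:  # hypothetical input photo
    image_bytes = f.read()

# If FaceMatchThreshold is omitted, Rekognition falls back to its default of 80,
# the setting the ACLU says it used. Amazon recommends 99 for public safety uses.
response = rekognition.search_faces_by_image(
    CollectionId="mugshots",
    Image={"Bytes": image_bytes},
    MaxFaces=5,
    FaceMatchThreshold=99,  # the threshold Amazon recommends for law enforcement
)

for match in response["FaceMatches"]:
    print(match["Face"]["FaceId"], match["Similarity"])
```

In other words, the stricter 99 per cent filter only applies if the caller explicitly asks for it; leave the parameter out and the service quietly returns matches at the looser 80 per cent level.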

While the confidence threshold the ACLU used was lower than the one Amazon suggests for law enforcement use cases, it was still fairly high, and it is the default for other use cases. If the purpose was simply to show that this technology can fail under normal conditions, the test accomplished that goal.

Cagle said that false matches would put innocent people in these communities at risk not just of being arrested, but of being injured or killed by police. Lawmakers, as well as civil liberties and equity advocates at the press conference, also pointed out that even if Amazon and similar tech companies developed a flawless facial recognition system, installing it in body cameras would transform a device meant for police accountability into one that unjustly harms vulnerable minority communities in the U.S.

“Police body cameras are in communities where police are; those are primarily communities of colour, communities where immigrants live,” Cagle said. “And so even if these algorithms were perfectly accurate, these communities would be the ones who would be disproportionately harmed by the surveillance practices.”

