You Won’t Believe All The Stupid Ways Cops Are Using Face Recognition Tech

There’s plenty of evidence that facial recognition tech is far from a fair and accurate identification tool, and because the sector is mostly lawless, the technology is easily weaponised. According to a new report, in the hands of many cops it is being used in profoundly dumb and irresponsible ways.

A report published on Thursday by the Center on Privacy & Technology at Georgetown Law details a number of ways in which law enforcement agencies are taking creative liberties with the photos they feed into facial recognition databases, increasing the likelihood of an inaccurate match and an unjust arrest.

The report unearths an NYPD presentation laying out suggested methods for using facial recognition. It described some unnerving ways of getting creative with the technology that have actually worked, but did not detail any cases or approaches that failed. In one example from the report, detectives in the Facial Identification Section (FIS), the unit that runs facial recognition searches for the NYPD, fed the system an image of actor Woody Harrelson pulled from a Google image search because the suspect looked like him.

When the algorithm returned a match to the Harrelson photo, investigating officers used it to track down a suspect (not Harrelson). FIS has also run a photo of a New York Knicks player who looked like the doppelganger of an assault suspect.

The report also states that at least six police departments across the US are running forensic sketches through facial recognition systems. So rather than cross-checking someone in the system against a photograph of a suspect’s face, they are trying to find matches based on semi-realistic drawings or computer mock-ups. And these sketches aren’t generated from actual photographs; they are created from what an eyewitness remembers, which is not a particularly reliable account.

Aside from using photos of celebrity doppelgangers and forensic sketches, cops are also reportedly editing photos before feeding them to the algorithm. According to the report, the NYPD has replaced entire facial features with ones found through a Google image search, swapping out an open mouth for an image of lips found on the internet, or closed eyes for open ones found online. Detectives have also combined two lookalike faces into one (think “what would our nonexistent child look like?”) in order to find one of the people included. They’ve also used the Blur effect and the Clone Stamp Tool to doctor photos before running them through the system.

In an email to Gizmodo, Sergeant Jessica McRorie, a spokesperson for the NYPD’s Office of the Deputy Commissioner, Public Information, did not deny the claims that the department used doppelganger photos in its facial recognition system to identify a suspect, or that it replaced facial features in some suspect photos with features found via Google image search. She characterised facial recognition as “merely a lead,” stating that “it is not a positive identification and it is not probable cause to arrest,” but did not say whether the department has an explicit rule prohibiting officers from treating a match as a positive ID rather than just an assist.

“No one has ever been arrested on the basis of a facial recognition match alone,” McRorie said. “As with any lead, further investigation is always needed to develop probable cause to arrest.” She continued:

The NYPD has been deliberate and responsible in its use of facial recognition technology. We compare images from crime scenes to arrest photos in law enforcement records. We do not engage in mass or random collection of facial records from NYPD camera systems, the internet, or social media. In each case, whether it is to identify a lost or missing person or the perpetrator of a violent crime, facial recognition analysis starts with a specific image that is compared to other specific images to develop a possible lead. That lead will need to be investigated by detectives to develop evidence that will verify or discount it.

The NYPD’s use of facial recognition has generated leads that have ultimately led to the recent arrest of one man for throwing urine at MTA conductors, and another for pushing a subway passenger onto the tracks. The leads generated have also led to arrests for homicides, rapes and robberies. The NYPD has also used facial recognition for non-criminal investigations, for example a woman hospitalized with Alzheimer’s was identified through an old arrest photo for driving without a licence.

The NYPD constantly reassesses our existing procedures and in line with that are in the process of reviewing our existent facial recognition protocols.

There are several unsettling consequences to this experimental approach to facial recognition systems. As the center points out in the report, these tweaks and unorthodox photo choices can lead to inaccurate identifications, which in an investigation means the wrong person might be arrested. That’s why, among the recommendations at the end of the report, the center urges agencies to clearly delineate for officers what “sufficient corroboration of a possible match” looks like, and to ban outright the use of a facial recognition match as a positive identification “under any circumstance.”

In other words, cops shouldn’t blindly treat an algorithmic match as pointing to the definitive suspect. The recommendations also call for banning the use of doppelganger photos and forensic art as legitimate inputs to these facial recognition systems.

“As the technology behind these face recognition systems continues to improve, it is natural to assume that the investigative leads become more accurate,” the report states. “Yet without rules governing what can—and can not—be submitted as a probe photo, this is far from a guarantee. Garbage in will still lead to garbage out.”
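
To make the report’s “garbage in, garbage out” point concrete: matchers of this kind typically compare a probe photo’s embedding against every embedding in a gallery and return the nearest neighbour, so they will produce a top “match” whether the probe is pristine or heavily doctored. The sketch below is a toy illustration only, using random vectors in a hypothetical 128-dimensional embedding space rather than any real face recognition system or the NYPD’s actual software.

```python
import numpy as np

# Toy illustration of why probe-photo quality matters in an
# embedding-based matcher. All names and numbers here are
# hypothetical; real systems use learned face embeddings,
# not random vectors.

rng = np.random.default_rng(0)

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend these are 128-dimensional face embeddings in a mugshot gallery.
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}

# A clean probe of person_0 sits close to its gallery entry...
clean_probe = gallery["person_0"] + rng.normal(scale=0.1, size=128)

# ...while a heavily edited probe (blurred, pasted-in features,
# a lookalike's face), modelled here as large added noise, drifts away.
edited_probe = gallery["person_0"] + rng.normal(scale=1.0, size=128)

def best_match(probe):
    # Nearest-neighbour search always returns *something*:
    # there is no built-in notion of "no match".
    return max(gallery.items(), key=lambda kv: cosine_similarity(probe, kv[1]))

for label, probe in [("clean", clean_probe), ("edited", edited_probe)]:
    name, emb = best_match(probe)
    print(f"{label} probe -> {name} (similarity {cosine_similarity(probe, emb):.3f})")
```

Even the doctored probe comes back with a named “match” and a numeric score that can look convincing, which is why the report argues that rules about what may be submitted as a probe photo, not just better algorithms, are what keep bad leads out of investigations.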

We’re very much in the early stages of deploying these surveillance systems on a massive scale, and we’re already seeing how they can be weaponised against ethnic minorities and biased against women and people of colour. These algorithms are also sometimes just comically bad at their job.

And with the facial recognition space mainly lawless and unregulated, it’s crucial to ensure not only transparency and accountability around how powerful agencies use the tech, but also clearly outlined limits on the ways they can use it. Otherwise, we’re going to see more of these idiotic use cases that only exacerbate systemic problems for the most vulnerable.

