For more than a year, people opposed to giving police access to face recognition technology have pointed to criminal cases such as that of Willie Lynch. A Florida appellate court ruled last year that Lynch, a Jacksonville man convicted of selling drugs worth $US50 ($72) to an undercover agent, had no right to view the photos of the other possible suspects the face recognition software had identified alongside his. The software, it turned out, had expressed low confidence that Lynch’s photo was a match for the suspect’s.
At trial, it was revealed that Lynch’s was the only photo ever shown to detectives — though police claimed to have other corroborating evidence, which the defence disputed. Lynch, who had been in and out of prison much of his life, represented himself through most of the trial. He was sentenced to eight years in prison, though prosecutors had asked the judge for four times that. Last year, civil rights lawyers called on the Florida Supreme Court to take up the case.
Now, new cases of erroneous facial recognition matches are cropping up in Michigan where the Detroit Police Department is known to have wrongly arrested at least two men.
In an article Friday, the Detroit Free Press wrote about the case of a 25-year-old Black man, Michael Oliver, who was arrested based on a face recognition match and then fingered in a line-up by a witness. In May 2019, Oliver was charged with larceny. Only afterwards did police realise there was no way he could be the suspect: Oliver had tattoos up and down both arms. The suspect who was photographed had none.
Police told reporters that Oliver’s case was handled prior to new rules around the use of face recognition, which are supposedly stricter and prohibit its use except in violent felonies. In Oliver’s case, charges were filed before a police supervisor had reviewed the evidence, which police told the Free Press is no longer the practice where face recognition is involved. During a public meeting on Monday, Detroit Police Chief James Craig said it was against “current policy” to rely solely on the software, which he acknowledged was almost always inaccurate.
“If we were just to use the technology itself to identify someone,” Craig said, “I would say 96 per cent of the time it would misidentify.”
Craig’s remarks followed a complaint by the ACLU over another case, that of Robert Williams, who was arrested in January. Williams, who, like Oliver, is Black, was falsely identified by face recognition software as a shoplifter, arrested in his own front yard in front of his children, and detained overnight in a city jail. The charges were not immediately dismissed, either, according to the ACLU, even though police acknowledged a computer error was likely at fault. “[Williams] had to explain to his employer and family what had happened. And he had to live with the stigma of being arrested on his front lawn, in front of his family, and where any number of neighbours could have been watching as well,” the complaint says.
“I keep thinking about how lucky I was to have spent only one night in jail — as traumatising as it was,” Williams wrote in an op-ed for the Washington Post last month. “Many black people won’t be so lucky,” he said. “My family and I don’t want to live with that fear. I don’t want anyone to live with that fear.”
“Detroit police’s new policy is a fig leaf that provides little to no protection against a dangerous technology subjecting an untold number of people to the disasters that Robert Williams and Michael Oliver have already experienced,” Dan Korobkin, ACLU of Michigan’s legal director, said Friday. “Lawmakers must take urgent action to stop law enforcement use of this technology until it can be determined what policy, if any, can effectively prevent this technology’s harms.”
Korobkin added that police and prosecutors nationwide should begin to review cases involving face recognition for errors. “This technology is dangerous when wrong and dangerous when right,” he said.