A federal lawsuit filed against the city of Chicago is calling into question law enforcement’s use of controversial gunshot detection technology for gathering key pieces of evidence. The suit, filed this week by the MacArthur Justice Centre at Northwestern University’s law school, alleges Chicago police misused inaccurate and unreliable tech from the firm ShotSpotter and failed to pursue additional leads. The suit accuses police of putting “blind faith” in ShotSpotter’s supposed evidence.
The lawsuit, first reported by The Associated Press, revolves around a 2020 shooting in Chicago that left one man dead. Police linked that alleged killing to a 65-year-old man named Michael Williams using two pieces of evidence. Police reportedly obtained silent security video of Williams’s vehicle passing through an intersection, which was then linked to a gunshot supposedly detected by ShotSpotter’s system. Williams was arrested in 2021 and spent nearly a year in jail before a judge finally dismissed his case after prosecutors admitted they lacked sufficient evidence.
Now Williams and the MacArthur Justice Centre are seeking damages. The lawsuit is attempting to force the city of Chicago to compensate Williams for his loss of income, legal bills, and mental anguish. Moving beyond Williams’s case specifically, the suit reportedly seeks class-action status for any Chicagoans stopped on the basis of ShotSpotter’s tech. The suit reportedly includes details of another plaintiff named Daniel Ortiz, who claims he was wrongfully arrested by police using the tech while he was doing his children’s laundry.
“I come from a good family, a good household. I have children and responsibilities,” Ortiz said according to a statement released by the MacArthur Justice Centre. “But in that moment, all the police saw were stereotypes. That day, I felt that the police could do anything.”
Gizmodo could not independently review the lawsuit. We reached out to the MacArthur Justice Centre for comment but have not heard back.
ShotSpotter’s technology works by setting up surveillance networks of microphones and sensors blanketing specific target areas believed to have higher rates of gunfire. In theory, those sensors pick up the loud crack of a firearm, and the system compares when the sound reached each sensor, using AI to narrow down the precise location where it originated. Multiple major police departments around the country, including the NYPD in New York City, have turned to ShotSpotter in an effort to better track rising gun crime. Activist organisations and police reform groups have spoken out against ShotSpotter, which they say is ineffective and potentially leads to increased and unnecessary interactions between vulnerable communities and police. A 2021 study conducted by the MacArthur Justice Centre found that 89% of alerts registered by ShotSpotter’s system never ended up revealing any gun-related crime.
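The general technique described above, locating a sound source from the times it reaches several fixed microphones, is known as acoustic multilateration. ShotSpotter’s actual algorithm is proprietary; the sketch below is only a minimal, hypothetical illustration of the idea, using made-up sensor positions and a brute-force grid search over candidate locations.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second, roughly, at 20 °C

def locate(sensors, arrival_times, extent=200):
    """Find the grid point whose predicted arrival-time *differences*
    (relative to the first sensor) best match the observed ones.
    Using differences cancels out the unknown moment the shot was fired."""
    best, best_err = None, float("inf")
    t0 = arrival_times[0]
    for x in range(-extent, extent + 1):
        for y in range(-extent, extent + 1):
            d0 = math.dist((x, y), sensors[0]) / SPEED_OF_SOUND
            err = 0.0
            for (sx, sy), t in zip(sensors[1:], arrival_times[1:]):
                predicted = math.dist((x, y), (sx, sy)) / SPEED_OF_SOUND - d0
                err += (predicted - (t - t0)) ** 2
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Simulate a bang at a known point and recover it from arrival times alone.
sensors = [(0, 0), (400, 0), (0, 400), (400, 400)]  # hypothetical layout (metres)
true_pos = (120, -80)
times = [math.dist(true_pos, s) / SPEED_OF_SOUND for s in sensors]
print(locate(sensors, times))  # → (120, -80)
```

Real deployments face much messier conditions than this toy setup: echoes off buildings, wind, overlapping sounds, and the hard problem of deciding whether a bang was a gunshot at all rather than a firework or backfire, which is where critics say systems like this fall down.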
ShotSpotter did not immediately respond to Gizmodo’s request for comment.