We all know there's a racial disparity in US criminal prosecutions — but imagine if a computer algorithm were being used to insert even more bias into the criminal justice system. According to a damning new report from ProPublica, that's exactly what's happening in many states around the country.
In a common practice known as "risk assessment", computer software is used to predict the likelihood of a future crime by a specific individual. The only problem is that the computer algorithm that law enforcement agencies are using appears to have a severe racial bias.
The implications of the accusations are huge. Risk assessment software is used to make many decisions in the criminal justice system, including things like bond amounts. In states like Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington and Wisconsin, risk assessment programs can be used by judges during a criminal sentencing.
As part of its research, ProPublica obtained the risk scores of more than 7000 people arrested in Broward County, Florida, during a two-year period (2013-2014), and tracked these individuals over the two years that followed. What ProPublica found was that the risk scores were "remarkably unreliable" in predicting violent crimes. The results were also weak when a full range of crimes was taken into account: only 61 per cent of those flagged as likely to reoffend were arrested for any subsequent crime within two years.
The research found that the risk assessment algorithm used by Broward County was more likely to falsely flag black defendants as future criminals, wrongly labelling them this way at almost twice the rate of white defendants. In addition, white defendants were mislabelled as low risk more often than black defendants.
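The disparity ProPublica describes is about error rates, not just overall accuracy: among people who did not go on to reoffend, what share of each group was wrongly flagged as high risk? A minimal sketch of that comparison, using made-up toy records rather than ProPublica's actual Broward County data:

```python
# Hypothetical records, NOT ProPublica's dataset. Each entry is
# (group, predicted_high_risk, reoffended_within_two_years).
records = [
    ("black", True,  False),
    ("black", True,  False),
    ("black", False, False),
    ("black", True,  True),
    ("white", False, False),
    ("white", False, False),
    ("white", True,  False),
    ("white", False, True),
]

def false_positive_rate(rows):
    """Share of non-reoffenders who were nonetheless flagged high risk."""
    negatives = [r for r in rows if not r[2]]  # people who did not reoffend
    if not negatives:
        return 0.0
    return sum(1 for r in negatives if r[1]) / len(negatives)

# Compute the false-positive rate separately for each group.
by_group = {
    group: false_positive_rate([r for r in records if r[0] == group])
    for group in {r[0] for r in records}
}

print(by_group)  # a gap between the groups is the kind of bias at issue
```

An algorithm can score well on aggregate accuracy while its mistakes fall disproportionately on one group, which is why this per-group breakdown, rather than a single accuracy number, was central to ProPublica's analysis.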
In one example from the article, an 18-year-old black female was arrested for stealing a kid's Huffy bicycle and Razor scooter. She had a light criminal record, and the total value of the stolen products was $US80 ($111). In a similar case, a 41-year-old white male was caught stealing $US86 ($119) worth of goods from a Home Depot store. The white male had been convicted of armed robbery and served five years in prison. The risk assessment software determined that the black female (who had only previously committed misdemeanours as a juvenile) was more likely to commit a future crime than the white male (who was a seasoned criminal).
In the end, we know the computer algorithm got it exactly wrong. The black female was not charged with any new crimes, and the white male is now serving an eight-year prison term for breaking into a warehouse to steal thousands of dollars in electronics.