Forensic science only looks like magic on shows like CSI, where blood drops quickly reveal a killer's movements and a fingerprint places someone at a crime scene, even if it's half a print smudged off the side of a door. The science behind real-world forensics is far shakier. In 2009, a comprehensive report from the US National Research Council showed just how bad it can be, noting that a wide range of common forensic techniques have never been subject to rigorous scientific evaluation.
According to the Innocence Project, incorrectly used forensic evidence contributes to nearly half of wrongful convictions. In 2015, the FBI admitted that it gave flawed testimony about hair analysis for over two decades.
Even testing for DNA might not be as airtight as it seems. The forensic science community has spent the last decade grappling with the problems inherent to the field, and last month, the Department of Justice (DOJ) announced new policies to improve standards around forensic science in federal labs.
The announcement comes almost one year after US Attorney General Jeff Sessions announced that he would not renew the National Commission on Forensic Science, an independent panel that advised the DOJ on standards in the field. Instead, forensic science improvement and reform would move in-house, with less input from external researchers.
Under the new policies, the DOJ will issue uniform language requirements for forensic scientists in its labs to use during testimony and reporting, standardising the way they talk about different levels and gradations of forensic evidence.
For example, when reporting the results of fingerprint analysis, the guidelines say that federal scientists should not use the term “individualise”, because it implies that their conclusion is based on a comparison with every other fingerprint in the world, and should not claim to be 100 per cent certain of their conclusions. Testimony will also be monitored for consistency, and department labs will post records of their quality management documents and validation studies online.
While the new standards are a step in the right direction, forensic scientists outside of the DOJ say they probably won’t make a huge impact on the day-to-day practice or understanding of forensic analysis.
Jessica Cino, who studies forensic evidence at the Georgia State University College of Law, said it’s promising that the DOJ is taking strides forward on forensic standards. But, she said, the new requirements only apply to the federal labs. It’s difficult to say if they’re going to trickle down to the state crime labs, which is where most of the day-to-day forensic analysis in the United States actually occurs.
“There’s no real teeth to bring along the state labs,” Cino said. “The bulk of all of this work is coming on the state side, but the state labs are left out of this endeavour. If I’m a state lab, I’m saying, why weren’t we asked to come to the dance?”
The federal government can push state-level agencies to adopt federal standards by attaching them as conditions to grants, or simply hope that states take them as a model. But, Cino said, state labs would need more money and manpower to meet those standards: to train analysts in new procedures, update validation reports, and monitor the updated practices. “It’s easier to sit back and say, we have jurisdiction over the federal labs, so we can have the federal labs do it,” she said.
The new language standards, while they add much-needed consistency at the federal level, also don't get at the root of the problems in forensic science: some of the tools scientists are using in criminal cases aren't necessarily based on any real science.
Even couching the language of fingerprint analysis in more uncertainty, as the new language standards do, doesn’t change the fact that there haven’t been any studies that actually prove that everyone’s fingerprint is unique. Context can also sway expert conclusions when they’re comparing prints.
The once widely used technique of bitemark analysis, predicated on dentists’ belief that they could trace a bite back to the mouth that made it, has little evidence to support it: as with fingerprints, there are no solid studies on the uniqueness of bitemarks. There are even questions about the accuracy of DNA evidence.
There’s a pressing need, then, for scientists to study the methods forensic analysts use, or want to use. “There is certainly a problem with foundational validity,” Cino said. “We need federal dollars to fund grants to have that research. We need to get to the ground truth of if we can actually say this is the knife that killed someone based on the striation patterns.”
Language standards should only be an intermediary step on the way to that better science, said Alicia Carriquiry, director of the Center for Statistics and Applications in Forensic Evidence.
“I hope the DOJ’s message on this is that this is acceptable language for the state of the science today, but we need to continue to push the science forward,” she said.
Carriquiry would also like to see independent scientists contributing to the work being done on forensic science standards at the DOJ. “You learn more when you have a larger group of scientists involved,” she said.
The DOJ began working on uniform language standards back in 2015, and at that time, outside scientists, including Carriquiry’s group at the Center for Statistics and Applications in Forensic Evidence, were involved. But since the Trump administration took office last year, there’s been a retreat toward working with only a small group of researchers, and keeping things in-house, she said.
Despite the weaknesses, it is encouraging to see the DOJ continuing to recognise the ongoing issues in forensic science, Cino said. “But going forward, the most interesting thing is going to be the execution. It’s always great to have grand plans, but it would be nice to see how this is all going to be done.”
DNA evidence has led to the exoneration of 354 wrongfully convicted individuals, according to the Innocence Project. Mistakes, invalid methods, and misleading testimony about forensic evidence contributed to the lion’s share of the original convictions.
It often takes a high-profile case, overturned because of faulty forensic evidence, to draw attention to the problems with a particular tool, as in the 2015 case of Steven Mark Chaney, who had spent 28 years in prison for murder and was freed after the forensic dentist who testified against him said the bitemark analysis was wrong.
Soon after, the Texas Forensic Science Commission recommended an end to the use of bitemarks as evidence.
Cino hopes that scientific progress continues, preventing wrongful convictions in the future and shoring up the science used in the courtroom. “It’s not curing cancer,” she said. “But lives are still on the line.”
Nicole Wetsman is a health and science reporter based in New York. She tweets @NicoleWetsman.