Why Is Courtroom Science So Unscientific?

On May 15, 1989, 83-year-old Modine Wise was found naked and beaten in her home. The perpetrator left evidence scattered around her: a bloody handprint, a jacket, a pack of cigarettes, and two hairs. In February of 1991, the court sentenced 23-year-old Timothy Bridges to life in prison for the crime.

The conviction was largely based on statements from police informants (later revealed to have been bribed for their testimony) and expert forensic testimony. By merely examining the two hairs under a microscope and comparing their characteristics with hairs from Bridges, the FBI-trained analyst concluded the hairs all came from the same person.

At trial, the analyst claimed there was a one in 1,000 chance that another Caucasian person could have the same head hair. “That was a totally invented number,” Christopher Fabricant, director of strategic litigation at the Innocence Project and co-counsel for Bridges’ case, told Gizmodo. “That was routinely done in hair microscopy. They learned from the FBI to overstate conclusions.”

In 2015, the Innocence Project took up Bridges’ case after an FBI investigation revealed its analysts gave scientifically flawed or overstated hair microscopy testimony, affecting as many as 2,500 cases nationwide. Bridges was one casualty.

The district attorney for the county, Andrew Murray, vacated Bridges’ conviction due to the unscientific hair microscopy testimony. Additionally, the prosecution revealed that while other crime scene evidence had since been destroyed, the jacket remained. DNA analysts found semen on the jacket and showed that the DNA did not belong to Bridges. He spent over 25 years in prison before his exoneration, while the real perpetrator was never identified.

While it’s now widely accepted that some forensic disciplines, like microscopic hair comparison, are junk science, evidence from similarly subjective pattern-matching disciplines still gets into courtrooms. There are standards meant to prevent this, but they aren’t working, which can contribute to wrongful convictions. According to legal experts, what’s necessary is stopping bad science before it gets to court.

In 2009, a sweeping report from the National Academy of Sciences revealed the questionable nature of the pattern-matching forensic disciplines. These disciplines involve an expert comparing a piece of evidence—like the structure of a hair or a fingerprint—to a sample from a suspect. But no scientific research shows that such analyses can definitively link a person to a crime, and there’s no research supporting the reliability of these methods, contrary to what analysts have testified. Analysts would often say that two samples were definitely “a match” or that their method boasts a “zero error rate”—a claim that is “not scientifically plausible,” the report authors wrote. A subsequent report, put out in 2016 by the President’s Council of Advisors on Science and Technology, revealed that in the intervening years, not much had improved.

One recommendation from both reports is establishing independent crime labs, rather than relying on labs under the control of law enforcement. “By the time [evidence] gets to a judge, in some ways it’s sort of too late,” said Brandon Garrett, a law professor at Duke University who specialises in criminal justice outcomes, evidence, and constitutional rights. “You already may have someone arrested, potentially pleading guilty based on the forensics. We need to focus on improving the science in the laboratories.”

The Houston Forensic Science Centre is working toward that goal. Before the centre’s establishment in 2014, Houston’s crime lab operated under the police department’s control and was rife with scandal. “It was probably the worst example of a mismanaged crime lab in our country,” said Fabricant.

The lab’s leaky roof dripped rainwater onto evidence; analysts failed to keep quality records; and backlogs included over 6,000 untested rape kits, the Houston Chronicle reported. In 2002, the DNA unit was shut down for four years after an audit revealed that technicians misinterpreted data and kept poor records. Hundreds of cases had to be retested as a result.

But now, “Houston is an exemplar,” said Fabricant. Since the new centre was established in 2014, Houston’s forensic analysts have worked outside of law enforcement pressure and with greater transparency. “That we’re closer to an equal partner with law enforcement and prosecution changes the dynamic,” Peter Stout, CEO of the centre, told Gizmodo.

Stout, who has a doctorate in toxicology, was working as a research forensic scientist at the nonprofit research organisation RTI International before he joined the new centre in 2015. “In the forensic world, everybody knows about Houston,” he said. “When the opportunity presented to be part of remediating one of the most failed laboratories in the world, it was just too good to pass up.”

Until last October, the centre was still co-located with the police department, raising questions about its independence. At the end of October, the centre moved into a new, separate facility that boasts state-of-the-art labs. The cost of the move was covered by the existing city-approved budget for the centre’s facility operations.

In an effort to be more transparent, the Houston lab publishes the results of its quality-control tests and the amount of backlogged evidence on its public website. After examining evidence, analysts’ reports detail how they reached their results, which isn’t typical in forensics. Analysts also receive training on how to accurately testify, an effort to ensure that they don’t overstate conclusions on the stand.

“We spend a good bit of time with analysts, educating about the appropriate limitations of what you can and should testify to,” said Stout. In addition, the centre randomly selects and reviews transcripts of analyst testimony every quarter. A committee made up of three staff members—one forensics expert, a quality division staff member, and someone without forensic expertise—looks out for where overstatement or misinterpretation occurs, to help the analyst improve their testimony.

The centre also challenges its own forensic methods. “In crime labs, there has not been any research culture,” Sandra Guerra Thompson, a University of Houston law professor and founding member of the centre’s board of directors, told Gizmodo.

To rectify that, the Houston lab created a blind testing program in 2015. The quality-control division develops mock cases and evidence samples that they submit along with genuine work to test the analysts’ abilities. “A few years down the road, they’re going to be able to say statistically what is the error rate for their work,” said Thompson, which, in theory, is critical to expert testimony.
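To make the arithmetic behind that claim concrete, here is a minimal sketch of how an error rate and a rough confidence interval could be estimated from blind-case outcomes. The numbers are invented for illustration; they are not the Houston centre’s actual figures, and the centre has not described its statistical methodology in these terms.

```python
import math

# Hypothetical blind-testing tallies -- illustrative only, not the centre's real data.
blind_cases_submitted = 120   # mock cases slipped into analysts' routine casework
incorrect_conclusions = 3     # blind cases where the analyst reached the wrong conclusion

# Point estimate of the blind-test error rate.
error_rate = incorrect_conclusions / blind_cases_submitted

# Rough 95% confidence interval using the normal approximation
# (a Wilson or exact binomial interval is preferable for small counts).
se = math.sqrt(error_rate * (1 - error_rate) / blind_cases_submitted)
low = max(0.0, error_rate - 1.96 * se)
high = min(1.0, error_rate + 1.96 * se)

print(f"Estimated error rate: {error_rate:.1%} (95% CI roughly {low:.1%} to {high:.1%})")
```

The interval narrows as more blind cases accumulate, which is why Thompson expects it to take a few years before the lab can state its error rates with statistical confidence.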

While standards meant to keep unreliable forensic testimony out of the criminal courtroom exist, they aren’t often used to do so. Some states have adopted a standard called the Frye test, which emerged from a 1923 D.C. Circuit court decision and states that the scientific technique discussed in an expert’s opinion needs to be generally accepted by the relevant scientific community. Most states and the federal government now refer to the Daubert standard, the result of a 1993 Supreme Court decision. This standard requires that the methodology underlying expert testimony be scientifically valid, which includes having a known potential error rate.

“People expected there to be more rigorous screening of forensic science after the [Daubert] decision,” said Garrett. “The court said you can’t just let in experts because the methods they use are generally accepted. You have to ask whether it has a basis in valid and reliable science.”

But in criminal cases, the courts haven’t consistently been asking this. “What we’ve seen is in civil cases, where there are financial stakes, there is often a careful inquiry,” said Garrett. In criminal cases, by contrast, judges tend to let in evidence if similar types were used in the past. According to Thompson, there’s “this inertia: we’ve always let it in, so we’re going to continue to. I think the courts just feel like they have no option. If we’re going to have any kind of law and order, we have got to let prosecutors come in and use fingerprint testimony,” for instance.

Another problem is that it’s up to the defence team to ask the judge for a hearing to dismiss the prosecution’s scientifically flawed evidence or testimony. But defence attorneys are stretched thin, and many lack the financial resources or the know-how to do this, according to Sandy Feinland, a public defender in San Francisco.

“It becomes up to the individual public defenders to teach themselves, and they’re juggling dozens of cases,” Feinland told Gizmodo. “They have overwhelming caseloads and spare little time and energy to dig deep into the science… But I think the general condition of the public defender’s offices is that they’re dramatically underfunded and don’t have the resources necessary.”

Even when the defence calls for a hearing, evidence is rarely dismissed entirely.

“What’s frustrating is that we want to use scientific evidence to help us reach reliable verdicts,” said Fabricant. “But we won’t recognise that what we once believed to be valid and reliable evidence is in fact not valid and not reliable. It has resulted in miscarriages of justice.”


Jackie Rocheleau is a freelance journalist and editor based in upstate New York. She writes about neuroscience, public health, and medicine.

