Everyone’s Biased But Me: Prejudice in Forensic Science



Published: Dec 9, 2022

By Amanda Knox, exoneree, journalist, public speaker, and author



“If they fire me,” Dr. Itiel Dror tells me, “I’ll retire, sit on the beach in the Bahamas with tequila and say, ‘Thank God they fired me. I’m now enjoying life and not working in the lab.’ I’m trying to look at the silver lining,” he says with a laugh, even though his career of thirty years is hanging in the balance. 

Dr. Dror is genial and slight, and speaks with an endearing lisp in an accent that is hard to place. He often asks new acquaintances to guess—it’s a mix of Israeli and British English. He also sports a diamond stud earring in one ear. If that isn’t your mental image of a renowned scientist, you may be afflicted with cognitive bias, which happens to be Dr. Dror’s specialty, especially as it pertains to forensic science. Dr. Dror has been publishing papers on the subject in major scientific journals for nearly thirty years, and his work has been cited over 10,000 times.

I first met Dr. Dror nearly a decade ago in New York City, through an introduction by Dr. Saul Kassin, a false confessions researcher who helped me understand how the police in Perugia had broken me down through the coercive interrogation I’d endured—53 hours of questioning over five days. That interrogation remains the single most terrifying experience of my life. The study of false confessions and the study of cognitive bias in expert decision making are closely related—both contribute to wrongful convictions and can interact in a vicious feedback loop.

As Dr. Dror put it to me in our recent interview for my podcast, Labyrinths, the two strongest forms of evidence in a criminal trial are forensic science and confession. The problem is, “When they come to someone and say, ‘We have forensic evidence against you,’ they’re more likely to confess. And if the forensic examiner knows that the person confessed, they’re more likely to see that the evidence matches when actually it does not.” Dr. Dror calls this “the bias snowball effect.” 

Just as I owe a great deal of peace of mind to Dr. Kassin, I owe Dr. Dror for helping me to understand how well-meaning and even well-trained people could have been so wrong, and yet so convinced, that forensic evidence proved I was guilty of murder. So it pains me that Dr. Dror is now facing such tremendous backlash for his recent work, which reveals how cognitive bias impacts forensic pathology decisions.

This was a new domain for Dr. Dror, but his aims and his methods were essentially the same as his previous work. “This research is almost boring,” he says. “It’s phenomena that has been shown not only in fingerprints and in DNA, it’s been shown with medical doctors and bankers, and HR, and police officers, and every person on the planet. So I’m not breaking new scientific ground, to be honest. I wish I was.”

Dr. Dror isn’t interested in the kind of bias we think of as prejudice, but in the unconscious bias that shapes our every decision. In this sense, while his work is focused on expert decision making, it applies equally to all humans. The truth is: our brains are lazy. They take cognitive shortcuts whenever possible. These shortcuts allow us to make snap decisions that can improve our chances of survival. But that also means that how we perceive and process the world, how we think and judge and categorize, is often not rational.

Our cognitive biases are so deeply ingrained that even when we’re aware of them, we still fall prey to them. It’s tempting to think that scientists and experts are more rational, better able to transcend these natural cognitive biases. Not according to Dr. Dror’s research. 

His initial foray into showing bias in forensic science was with fingerprint analysis. To figure out whether fingerprint examiners were being affected by extraneous factors, he came up with a clever study design that allowed him to alter the context while holding the evidence itself, the prints, constant. He gave each of the international fingerprint experts who took part in the study copies of prints they themselves had marked as matching years prior. But this time, he altered the extraneous case information around the prints, putting them into the context of a known wrongful conviction.

It’s easy to think that other information shouldn’t matter. Fingerprints are all unique, right? They either match or they don’t. Way back in 1892, at the dawn of forensic science, English polymath Sir Francis Galton estimated the odds of finding two identical prints at one in 64 billion. 

But while it’s true that our fingerprints are unique, most prints examined in a criminal context are latent prints, impressions lifted off objects, which means they are usually distorted, partial, or otherwise degraded. And there’s no accepted scientific standard for how many points of similarity are required to label two prints a match.

This means that a fingerprint examiner’s job is often to judge whether a partial smudge lifted from a crime scene matches a carefully inked print. Dr. Dror saw that this left a lot of room for bias to creep in. And sure enough, when he gave these international experts prints they had previously matched, but now with new context, most of them changed their judgment.

In response to this paper, Martin Leadbetter, then chairman of the Fingerprint Society, wrote in a letter to the editor of Fingerprint Whorld that any examiner who falls prey to such bias “is either totally incapable of performing the noble tasks expected of him/her or is so immature he/she should seek employment at Disneyland.” In other words, he rejected the expertise of the experts in Dr. Dror’s study, claiming that any true expert would be immune to biasing contextual influences.

This is the fallacy of expert immunity. Dr. Dror and others have shown that experts are no less susceptible to cognitive bias than novices, and in fact may be more susceptible to certain biases because of their selective attention, reliance on heuristics, and overconfidence. This has also been shown with police interrogators, who overestimate their lie detection skills.

After demonstrating the role of cognitive bias in fingerprint analysis, Dr. Dror went on to show similarly devastating results in other forensic domains, such as DNA interpretation and forensic anthropology. “We’re talking about basic cognitive architecture,” Dr. Dror says. “How our hope, our expectation, impacts our judgment.” If it applies to fingerprints and DNA, it applies to handwriting and firearm analysis. “Domain after domain will show it again and again,” he says.

Looking back on my own wrongful conviction, I find the cognitive biases revealed in Dr. Dror’s research particularly illuminating. My first conviction, in 2009, rested mostly on a single flawed piece of forensic evidence: a knife that the prosecution claimed had my DNA on the handle and my roommate Meredith’s DNA on the blade. That finding was made by the prosecution’s DNA experts, and when a judge agreed to a review by independent experts, they found that the supposed trace of Meredith’s DNA on the blade was so small it was likely the result of lab contamination (and, furthermore, was not definitively blood, but likely potato starch). That finding led to my first acquittal and set me free in 2011. But how could that faulty piece of evidence have become so central to begin with?

That knife was not found at the crime scene, but in the apartment of my then-paramour of five days, Raffaele. The knife was pulled at random from a kitchen drawer on a police hunch. It had my DNA on the handle because I’d used it for cooking—no surprise there. What is surprising is that no other knives were examined or tested. This knife also did not fit all of Meredith’s wounds. The prosecution asked a jury to believe that I had carried this knife across town to my apartment, then committed a spontaneous and unplanned rape and murder, and then cleaned the knife before returning it to the kitchen drawer in Raffaele’s apartment. How could the investigators ignore all that context to conclude that this knife had to be the murder weapon? Why didn’t the lab technicians conclude that the tiny trace they found was the result of contamination? It has never been a satisfying answer to me that they were all laughably incompetent. 

What Dr. Dror’s research into cognitive bias reveals is that even intelligent, well-meaning experts can arrive at false and absurd conclusions about forensic evidence, all while believing they are being objective. In my case, those investigators were working for the prosecution, and after I had been coerced into a false admission that implicated me, they were tasked with finding DNA evidence linking me to the crime. That is, they sought evidence that would confirm the incoherent and false statements the police authored and pressured me to sign. They were searching for a particular conclusion, not following the evidence wherever it led them (there was copious DNA from Meredith’s killer, Rudy Guede, in the room and on her body). The investigators fell prey to confirmation bias when examining that knife. They ignored all the evidence that discounted it as the murder weapon, and magnified any evidence that could confirm it. 

The police DNA expert, Patrizia Stefanoni, who tested dozens of samples from the crime scene at once, was also biased by my false admission, and like the investigators, was working directly for the prosecution, seeking evidence to confirm their theories. 

All this leads me to believe that none of the people convinced of my guilt were acting in bad faith, actively trying to frame me—but even so, nobody likes to be wrong. So it’s not surprising that Dr. Dror has faced backlash every time he’s shown how cognitive bias affects expert decision making, often from the leaders of those fields. Yet the latest backlash, from medical examiners and forensic pathologists—those who conduct autopsies and determine cause of death—has dwarfed anything that came before. I’ll go into further detail on that in part two of this three-part article.