The science of why eyewitness testimony is often wrong


The advent of DNA testing has made it uncomfortably clear that our criminal justice system often gets it wrong. Convictions go wrong for a variety of reasons, but many of them touch on science, or rather the lack of a scientific basis for a number of forensic techniques. And in 70 percent of cases where DNA has overturned a conviction, it also contradicted the testimony of one or more eyewitnesses to the events in question.

According to a new perspective published in PNAS, that should not surprise us. The paper’s author, Salk neuroscientist Thomas Albright, argues that we’ve learned a lot about how humans perceive the world, process information, and retain memories, and much of it indicates that we shouldn’t value eyewitness accounts as much as we do. Still, Albright offers some suggestions on how we can adapt the investigative process to compensate somewhat for human limitations.

Persistence of memory

Albright has some history in this area, having co-chaired a study group at the National Academies of Sciences on the subject. His new perspective is largely a summary of the report that emerged from that group, and it serves as an important reminder that we have solid, evidence-based recommendations for improving the criminal justice system. The fact that they largely haven’t been implemented several years after the report’s release is a problem.

According to the perspective, things go wrong with eyewitnesses from the start. While human vision is good, many conditions (low light, distance, sudden action) make it difficult to accurately perceive what’s going on. And we don’t always focus on the things that would make us good eyewitnesses; when someone is waving a gun, we tend to look at the gun, not their face. Even if they weren’t brandishing a gun, we can come to believe they were if the visual information we remember is ambiguous but we know a robbery took place.

Research indicates that, faced with partial information, our brain’s response is not simply to commit what we have to memory. Instead, our brains try to construct a coherent picture that makes sense, often filling in gaps using past experience as a guide. The resulting memory may feel satisfyingly complete, but at the cost of containing incorrect information.

Memories are also remarkably malleable. Rather than remaining unchanged and tucked away in the hippocampus, a memory can be updated (or, in some cases, lost) simply by being recalled, as a lot of research shows. And chances are, witnesses will be asked to recall events many times before they are finally called to testify in court. One of those occasions is often viewing a lineup, either in person or through photos. That can expose a witness to suggestions from detectives who already have a suspect in mind, and those suggestions can end up altering the memory itself.

By the time someone comes to testify at trial, their confidence in their memory of the relevant events is not really based on whether they can remember them accurately. Instead, it has largely shifted to confidence in whether they can form a coherent picture of what happened, which is not the same thing. (Albright quotes psychologist Daniel Kahneman as saying, “High confidence statements mainly tell you that a person has built a coherent story in their head, not necessarily that the story is true.”) But for a jury, a witness’s confidence tends to be a crucial part of assessing their credibility.

Do better

None of this is to say that every eyewitness’s memory is unreliable or that our legal system’s reliance on their testimony is wholly misplaced. Instead, Albright echoes the National Academies report in arguing that the justice system needs to do more to accommodate the limits of memory. And police departments can implement procedures designed to limit their own role in influencing a witness’s recall.

Albright warns that the research is still evolving in some areas. For example, an early study praised the use of sequential lineups, in which possible perpetrators are shown one at a time, because the approach reduced false identifications. But follow-up work showed that it reduced the number of identifications overall, false or true, and the pendulum has since swung back toward recommending simultaneous viewing.

That said, some of the report’s recommended changes focus on the most important point in the process: the initial identification in a police lineup. It calls for the procedure to be “blind”, meaning that the police conducting it are not involved in the case and have no idea who the potential suspect might be. The proceedings should be videotaped, and the witness’s confidence in the identification should be recorded at that time. (Albright cites a case in which a person who was uncertain during a lineup later expressed complete certainty about the misidentification in court.)

The courts can also play a role through the instructions they give to both witnesses and juries. Witnesses can be given instructions that encourage consistent, conservative responses and that acknowledge the uncertainty involved. Juries can be informed of some of the limitations of eyewitness testimony.

Albright is somewhat optimistic about the potential for change, citing two recent state court rulings that improved the scientific basis for admitting eyewitness testimony. But he notes that the Supreme Court set the standard governing eyewitness testimony in federal courts back in 1977, and that it was based on legal precedent rather than science. There is no obvious way to update that standard.

But ultimately, the report’s key recommendations need to be implemented at the level of thousands of individual police departments across the country. While courts can help drive that process, change will only happen if those departments prioritize accuracy over maximizing the odds of getting an identification of any sort.

PNAS, 2017. DOI: 10.1073/pnas.1706891114 (About DOIs).

