Crime Labs in Crisis: Shoddy Forensics Used to Secure Convictions
To millions of people whose knowledge of crime labs comes from television shows such as CSI, Bones, Crossing Jordan and the venerable Quincy M.E., the forensic experts who work at such labs seem to be infallible scientists who use validated scientific techniques to follow the evidence to the truth, regardless of where it leads. Sadly, that is far from accurate.
“The CSI effect has caused jurors to expect crime lab results far beyond the capacity of forensic science,” wrote Jim Fisher, a former FBI agent and retired criminalistics professor who taught forensic science at Edinboro University of Pennsylvania, in his 2008 book titled Forensics Under Fire: Are Bad Science and Dueling Experts Corrupting Criminal Justice?
Fisher notes that problems in forensics “have kept scientific crime detection from living up to its full potential.” His conclusion is that “bad science, misadventures of forensic experts [and] human error” exemplify “the inability of our 21st century judicial system to properly differentiate between valid research and junk science.”
Crime lab workers are not necessarily scientists. In fact, sometimes only a high school diploma is required for employment as a forensic technician or arson investigator. Nor are lab examiners and their supervisors always the unbiased investigators portrayed on TV; in fact, many crime labs are run by or affiliated with police departments, which have a vested interest in clearing unsolved crimes and securing convictions.
Police often share their suspicions regarding suspects with lab workers before forensic examinations are performed. This has been shown to prejudice lab personnel in areas as diverse as fingerprint examination and chemical testing for accelerants in arson investigations. Further, some lab examiners feel they are part of the prosecution team, helping the police and prosecutors convict suspects regardless of the results of forensic testing. In such cases, forensic experts and other lab personnel may lie about test results, be misleading about the reliability of their methods, and/or cover up test outcomes when they are beneficial to the defendant.
Some forensic examiners “dry-lab” their tests, writing down results for tests they never performed. They may be motivated by understaffing and excessive workloads, a belief that tests required by lab protocols are unnecessary, an inability to perform the tests due to a lack of training, education and experience, or even the belief that the police have already arrested the right person, so evidence testing would be superfluous.
Then there are the forensic “experts” who lie about their academic credentials or accreditation, either on their résumés or in perjurious testimony. They initially may have been motivated to pad their résumés to secure employment, but might also seek to discourage defense attorneys from questioning their test results and conclusions by presenting overwhelming evidence of expertise they do not actually possess. This often works. Lending a bit of truth to the old adage that lawyers went to law school because they couldn’t do math, few judges, prosecutors or defense attorneys can keep up with complicated developments in the field of forensic science.
These types of problems have led to scandals at dozens of crime labs across the nation, resulting in full or partial closures, reorganizations, investigations or firings at city or county labs in Baltimore; Boston; Chicago; Colorado Springs, Colorado; Dallas; Detroit; Erie County, New York; Houston; Los Angeles; Monroe County, New York; Oklahoma City; San Antonio, Texas; San Diego; San Francisco; San Joaquin County, California; New York City; Nashville, Tennessee; and Tucson, Arizona, as well as at state-run crime labs in Illinois, Montana, Maryland, New Jersey, New York, Oregon, Pennsylvania, Virginia, Washington, North Carolina, West Virginia and Wisconsin, plus the federally-run FBI and U.S. Army crime labs. Forensic “expert” scandals have also been reported in the United Kingdom.
The origins of such problems include unqualified or incompetent lab workers, personnel using false academic credentials, lab contamination that causes false test results, employees falsifying test results to “help the prosecution,” and lab examiners committing perjury. Contributing to these problems is a lack of qualification standards and industry-wide training requirements for lab workers.
One might think that such scandals are caused by a few bad apples in the crime lab barrel, which is the spin typically adopted by the labs themselves. If so, the problem could be fixed by hiring qualified personnel, training them properly and providing adequate oversight. But at least the forensic science that underpins crime lab testing is sound and valid, right? In many cases, wrong.
A 2009 report by the National Academy of Sciences, the most prestigious scientific organization in the United States, revealed that much of the “science” used in crime labs lacks any form of peer review or validation – fundamental requirements for sound science. Such questionable forensic methods include long-established and accepted techniques such as fingerprint comparison, hair and fiber analysis, and bullet matching.
National Academy of Sciences Report
After a series of crime lab scandals and the FBI’s erroneous identification of an American attorney as a terrorist suspect in the 2004 Madrid train bombings, Congress instructed the National Academy of Sciences (NAS) to review the status of forensic techniques used in criminal prosecutions. Following a two-and-a-half year investigation, the NAS’s National Research Council (NRC) released the report, titled “Strengthening Forensic Science in the United States: A Path Forward,” in February 2009. It exploded among crime labs like a bomb.
The NAS was established by President Abraham Lincoln in 1863 with a mandate to advise the government on issues involving science and technology. The 2009 report identified a number of deficiencies in the forensic sciences and with crime lab personnel who use and testify about such evidentiary methods in court.
“In a nutshell, these people aren’t scientists,” stated NAS member Jay A. Siegel. “They don’t know what validation is. They don’t know what it means to validate a test.”
This criticism applies to virtually all disciplines of forensic testing with the sole exception of DNA analysis, which was first developed in academic biology laboratories and then later adopted for forensic science applications. One example of a commonly-used forensic testing method is bullet matching, in which a lab technician examines two spent bullets under a microscope to compare striations caused by the grooves in the gun’s barrel. If the bullets show similar striations, a “match” is declared. If one bullet came from a known gun, the other bullet is “matched” to that firearm and the technician will testify the bullet must have been fired by that gun.
“It is not possible to state with any scientific certainty that this bullet came from any weapon in the world,” said Siegel, who chairs the Department of Chemistry and Chemical Biology and is the director of the Forensic and Investigative Services Program at Indiana University-Purdue University Indianapolis.
The NAS report noted that the techniques used to connect a person with a crime scene often lack any type of scientific validation. This includes frequently-used forensic methods such as handwriting analysis and comparison of hairs, fibers, tool marks, tire treads, shoe prints and even fingerprints.
To validate these techniques, controlled studies would have to be performed such as blind testing (where the answer is known to the evaluators but not to the persons performing the forensic analysis) and statistical testing (in which a large number of samples are compared to see how often a random “match” occurs). Further, issues of investigator bias, unknown error rates, lack of crime lab independence, underfunding, poor training, lack of lab personnel qualifications, low academic requirements, and lab personnel making exaggerated claims about the accuracy of forensic techniques were noted in the NAS report.
Crime lab officials reacted with predictable outrage, claiming that the report says fingerprint and other identification techniques should be discarded, and blaming the slew of crime lab scandals on a few bad employees.
“If somebody drives and drinks and kills somebody, that’s horrible,” stated Ron Fazio of Integrated Forensic Laboratories in Euless, Texas. “But that doesn’t mean that all driving is bad. The person who made the mistake needs to be dealt with, but that doesn’t mean we should outlaw driving.”
Fazio also argued that some studies have validated bullet comparison methods. Siegel, who reviewed the research referred to by Fazio, said the studies contained no scientific criteria for the basis of the identification and no validation of the quality of the match. He also noted that such studies were conducted by firearms examiners and published in forensic journals that, unlike scientific journals, are not peer reviewed.
NRC member Karen Kafadar, a professor of statistics and physics with Indiana University at Bloomington, said the committee members who authored the report spent over two years analyzing research and data submitted by forensic professionals and found no studies that met basic scientific criteria. For example, most of the studies were not “blind” in that the participants were aware of the outcome and merely showed how the results were obtained. Other research studies used sample sizes that were too small or had other structural flaws. Kafadar doesn’t believe there are any other relevant studies.
“If experts in Texas believe there was research [that] we failed to acknowledge ... they would have had every opportunity to get it to the committee before our work was done. They had 2½ years to get it in,” she said.
Additionally, the NAS report did not recommend that all forensic methods except DNA testing be abandoned; rather, it said that such methods should be given stringent scientific scrutiny to ensure they are valid.
“One of the confusions that occurred from the report is that since these tests haven’t been validated, they shouldn’t be used. We’re not saying that they are invalid and shouldn’t be used,” said Siegel. “What we said in the report is that the jury is still out there until this scientific testing is done.”
Kafadar believes that forensic evidence presented in court – upon which a defendant’s freedom or even life may depend – should be the result of rigorously-validated, scientifically-proven techniques. The alternative is to allow people to be convicted based on junk or “voodoo” science.
“Voodoo” Science
If the NAS report raises doubts about generally-accepted forensic methods such as fingerprint identification and bullet, hair and fiber comparisons, what does that mean for less-accepted methods? The truth of the matter is that if a prosecutor can convince a judge to allow evidence into court, it will be used, no matter how absurd its “scientific” basis might be. And judges are not known for their scientific acumen; according to Siegel, judges “just don’t understand a thing about it. That’s the sad fact.”
In a 2001 national survey of 400 state court judges, the vast majority said they firmly believed in their gatekeeping role – deciding whether scientific evidence should be admitted in court, e.g. under the standard set forth in Daubert v. Merrell Dow Pharmaceuticals, 509 U.S. 579 (1993) – but only 4% had a clear understanding of the scientific concepts of probability and error rate.
Those concepts are key to determining whether scientific evidence is useful or meaningless. As Dave Wymore, former director of the Colorado public defender’s office, put it in 1999, when he won the exclusion of the FBI’s since-discredited comparison bullet lead analysis (CBLA): “Sure, you have this wiz-bang, whipper-dipper machine that looks at all the elements of the universe, but it doesn’t mean anything.”
“The partisan adversarial system used in the courts to determine the admissibility of forensic science evidence is often inadequate to the task,” noted Harry T. Edwards, a judge on the U.S. Court of Appeals for the D.C. Circuit who co-chaired the NRC report committee. “And because the judicial system embodies a case-by-case adjudicatory approach, the courts are not well-suited to address the systemic problems in many of the forensic science disciplines.”
Defense attorneys have no better understanding of science either, said Siegel. One hundred years of unscientific evidence being accepted in the courts proves that point, and faulty forensic methods have been shown to contribute to wrongful convictions.
“Too often lawyers don’t do their homework enough so they can properly cross-examine these people,” observed former president of the American Academy of Forensic Sciences and retired McHenry County, Illinois judge Haskell Pitluck. “They come in and say, ‘I’m an expert.’ And some lawyers simply roll over.”
“If lawyers could do science, they’d be doctors,” said Pitluck, noting that while his forensic acumen exceeds that of most jurists he does not “feel qualified to make many of these calls.”
Now that previously-accepted forensic techniques have been questioned by the NAS, can they be challenged in court? Maybe, but it will be difficult unless defendants can afford to hire a forensic expert to testify on their behalf. For the many defendants who are indigent that is highly unlikely because, whereas the U.S. Supreme Court has recognized a constitutional right to legal counsel in criminal cases, it has made no finding of a right to forensic experts. Additionally, the courts’ deference to precedent works against new challenges even if a defendant can afford an expert.
“The habit of judges to defer to prior decisions disinclines appellate courts to revisit possible or actual errors by trial courts in any given case, and it leads trial judges to admit species of evidence that appellate courts had approved in the past, regardless of how flawed that type of evidence can be shown to be with current knowledge,” wrote Arizona State University law and psychology professor Michael J. Saks and David L. Faigman, a University of California Hastings College of the Law professor, in The Annual Review of Law and Social Science.
This essentially means that once a forensic method has been accepted by one trial court, it has a good chance of becoming an acceptable technique elsewhere. This leads to the acceptance of methods that have no basis in science or fact – junk science.
“The art of junk science is to brush away just enough detail to reach desired conclusions, while preserving enough to maintain an aura of authoritative science,” according to “Criminal Law Forensics: Century of Acceptance May Be Over,” an article in the January 8, 2009 issue of the New York Law Journal. That description certainly applies to the FBI’s infamous CBLA forensic technique.
Comparison Bullet Lead Analysis
A 2004 NAS report led the FBI to abandon comparison (or comparative) bullet lead analysis, known as CBLA. CBLA used a nuclear reactor located in the basement of the FBI’s crime laboratory in Quantico, Virginia to perform a highly accurate analysis of trace elements found in the lead of a bullet. The report found that the scientific basis of the bullet lead analysis was sound. However, what was not sound was the underlying premise that bullets with similar chemical makeups must have come from the same box of ammunition. The NAS report showed that millions of bullets have the same chemical composition, rendering CBLA useless as a forensic tool.
FBI technicians gave expert testimony about this impressive but ultimately worthless investigative method in more than 2,500 criminal prosecutions nationwide, including capital cases such as that of Kentucky prisoner Ronnie Lee Bowling, who remains on death row.
The CBLA debacle was neither the first nor the latest FBI crime lab scandal. Dr. Frederic Whitehurst, a chemist formerly employed by the FBI, became a whistleblower who revealed shoddy work and lack of validation in the explosives section of the FBI’s crime lab after the 1993 World Trade Center bombing. Since 2005, Whitehurst has directed the nonpartisan Forensic Justice Project (www.whistleblowers.org), which has tried to force the FBI to release a list of its CBLA cases.
“The new revelations about bullet-lead analysis are just the latest examples of the Department’s inadequate efforts to ensure that sound forensic testing is utilized to the maximum extent to find the guilty rather than merely obtain a conviction. Punishing the innocent is wrong and allows the guilty to go free,” said U.S. Senator Patrick J. Leahy, chairman of the Senate Judiciary Committee, in 2007.
So what did the FBI do when it discovered the CBLA testing it had used since the 1980s was bogus? It discontinued the use of CBLA in September 2005 and sent letters to prosecutors stating that while the basic science was sound, CBLA was being abandoned due to costs and other considerations. Senator Leahy criticized the letters for giving “the false impression that these discredited tests had continuing reliability.”
“I’m also troubled that many cases affected by such analysis still need review, and that numerous cases involving possibly innocent defendants serving long jail terms have not been examined,” Leahy added.
As of January 2010 the FBI was still reviewing CBLA cases, and had found 187 in which FBI experts testified. It sent another letter to the prosecutors in those cases stating that CBLA “exceeds the limits of science and cannot be supported by the FBI.” No notification was given to defense attorneys, though, and nothing is being done in cases where defendants were coerced into pleading guilty by prosecutors who threatened them with presumably damning CBLA evidence.
Several convictions that involved CBLA have since been overturned. In one published opinion, the Appellate Division of the New Jersey Superior Court, upon reversing the conviction of Michael S. Behn, who was found guilty of murdering a coin dealer, held that CBLA was unproven and unreliable. “The integrity of the criminal justice system is ill-served by allowing a conviction based on evidence of this quality, whether described as false, unproven or unreliable, to stand,” the court wrote. See: State v. Behn, 868 A.2d 329 (N.J.Super.A.D. 2005), appeal denied.
At Behn’s trial, a prosecutor had told the jury there was a 99.9987% likelihood that bullets found at Behn’s home “came from the same lot” as the bullet used to kill the victim. Behn was retried in 2006 and reconvicted based on other evidence.
In 1997, Tom Kennedy was found guilty of a double homicide in Colorado. Citing discredited CBLA evidence, a judge overturned his conviction in April 2009. Prosecutors are appealing the reversal.
Phillip Scott Cannon was freed in December 2009 after serving more than 10 years for a triple homicide in Oregon; CBLA evidence had been introduced during his trial. Prosecutors said he would not be retried because the exhibits used in the original trial have been destroyed.
In Florida, Jimmy Ates served 10 years of a life sentence for the 1991 murder of his wife. He was released in December 2008 with the agreement of the state attorney’s office due to the use of CBLA evidence in his case. Prosecutors are still pursuing the charges, though, with a retrial set for January 2011.
Procedural problems have hampered CBLA challenges in other cases. In February 2010, the Texas Court of Criminal Appeals rejected a challenge to a 2003 murder conviction based on CBLA evidence. The appellate court acknowledged that CBLA had been discredited, but found that claims raising new evidence not contained in the trial court record were not generally considered on direct appeal and should instead be raised in post-conviction proceedings.
“To ensure that only ‘good’ science is admitted and ‘bad’ science is excluded in our criminal trials, the parties must shoulder the responsibility of providing and explaining the appropriate educational materials for judges to make that determination,” one of the appellate judges remarked in a concurring opinion. See: Gonzales v. State, Texas Court of Criminal Appeals, Case No. PD 1661-09 (February 24, 2010).
Dog Scent Evidence
According to a September 2009 report by the Innocence Project of Texas, “The use of ‘junk science’ by police and prosecutors in Texas is an ongoing injustice. Nowhere is this more obvious than in the government’s use of ‘scent lineups’ – a practice that is happening today throughout the state.”
Fort Bend County Deputy Sheriff Keith A. Pikett and his team of dogs, which he named Quincy, Columbo, James Bond and Clue, are well known throughout South Texas. Since the 1990s Pikett has performed thousands of “dog scent lineups,” in which a suspect is rubbed with a swatch of cloth and the cloth is placed with similar swatches containing other people’s scents. The dog then sniffs a cloth swatch containing a scent from a crime scene and tries to “match” it with one of the swatches in the lineup.
The FBI has warned against using results from this method as primary evidence, saying it should only be used to corroborate other evidence. Robert Coote, who formerly led a police canine unit in the United Kingdom, viewed a video of one of Pikett’s scent lineups.
“This is the most primitive evidential police procedure I have ever witnessed,” he said. “If it was not for the fact that this is a serious matter, I could have been watching a comedy.” The problem is that Pikett and his followers are true believers despite the fact that several suspects identified by dog scent lineups have been exonerated.
Jeff Blackburn, chief counsel for the Innocence Project of Texas, referred to scent lineups as “junk science injustice” and said Pikett merely gives police the match they were already looking for to confirm their suspicions.
Lawrence J. Meyers, associate professor of animal behavior at the Auburn University College of Veterinary Medicine, has another explanation. He thinks that Pikett believes in scent lineups because he either inadvertently allows the dogs to pick up on a subtle or unconscious signal from the handler or detectives, or allows the swatch samples to become contaminated. Despite its dubious nature, dog scent lineup evidence has been admitted in courts in Alaska, Florida, New York and Texas, according to Meyers.
In Victoria, Texas, a Pikett dog scent lineup led to the arrests of Curvis Bickham and Cedric Johnson for a triple homicide. Bickham served 8 months in jail and Johnson served 16 months before another man confessed to the killings. While incarcerated, Bickham lost his home and cars and developed psychological problems.
Retired Victoria County Sheriff’s Department Captain Michael Buchanek was publicly named a “person of interest” in a rape-murder after a Pikett hound “led” police along a convoluted 5-1/2-mile trail that ended in his neighborhood. Based on the scent trail, police obtained a search warrant for Buchanek’s home and car and began a harassment campaign against him. Another man later confessed to the crimes and pleaded guilty.
“A gypsy reading tea leaves and chicken bones is probably as reliable as a dog doing a scent lineup,” said Victoria County district attorney Stephen B. Tyler, who noted that because dogs can’t be cross-examined, reliable testimony from their handlers is of utmost importance. “A dog might be great, but if a dog handler is not good or not credible, it’s only as strong as their weakest link,” he observed.
Buchanek and Bickham have filed separate federal civil rights suits against various law enforcement officials, including Pikett. In March 2010 the district court denied Pikett’s motion for summary judgment on qualified immunity grounds in Buchanek’s case. See: Buchanek v. City of Victoria, U.S.D.C. (S.D. Texas), Case No. 6:08-cv-00008. That same month, the court denied the defendants’ motions to dismiss in Bickham’s lawsuit. See: Curtis v. McStravick, U.S.D.C. (S.D. Texas), Case No. 4:09-cv-03569. Both cases remain pending.
Another wrongly accused defendant, Calvin Lee Miller, was arrested in March 2009 and charged with raping a woman and robbing another based on a Pikett dog scent lineup. DNA evidence eventually cleared him after he spent two months in jail. He has filed a civil rights suit against Pikett and Fort Bend County law enforcement officials, too. See: Miller v. City of Yoakum, Texas, U.S.D.C. (S.D. Texas), Case No. 6:09-cv-00035.
In Florida, Bill Dillon was exonerated by DNA evidence and released in November 2008 after spending 26 years in prison. He had been convicted in part due to fraudulent dog scent evidence. John Preston, a retired Pennsylvania state trooper, claimed that his German Shepherd, “Harass II,” had found Dillon’s scent despite the fact that the scent trail was eight days old and a hurricane had swept through the area during that time. In hundreds of cases, Preston convinced juries of his canine’s miraculous abilities, even claiming the dog could track a scent underwater and could follow a scent trail that was six months to a year old.
Brevard County, Florida judge Gilbert Goshorn put Preston to the test in 1984 and exposed him as a fraud. Two attorneys jogged down separate paths, and the next morning “Harass II” was provided with a sweat-soaked shirt from one of the lawyers. The dog was unable to follow the trail. Given a chance to try again, Preston instead left town.
“It is my belief that the only way Preston could achieve the results he achieved in numerous other cases was having obtained information about the case prior to the scent tracking so that Preston could lead the dog to the suspect or evidence in question. I believe that Preston was regularly retained to confirm the state’s preconceived notions about a case,” Goshorn stated.
By 1987, Preston had been completely discredited. He was labeled a “total fraud” by a former prosecutor and a “charlatan” by the Arizona Supreme Court. See: State v. Roscoe, 184 Ariz. 484, 910 P.2d 635 (Ariz. 1996).
Unfortunately, Florida prosecutors never notified the many people who were convicted based on Preston’s dog scent testimony. It was 2006 before Dillon learned that Preston had been exposed as a fraud; he then contacted the Florida Innocence Project, which arranged for the DNA testing that led to his exoneration. Two other defendants convicted in part due to Preston’s testimony have been proven innocent, including Wilton Dedge, who served 22 years before DNA evidence set him free in 2004. He received a $2 million settlement from the state. [See: PLN, March 2006, p.17].
Preston was never prosecuted for his outrageous testimony that sent many defendants to prison; he died in 2008. While now retired, Pikett still has a following of law enforcement officials who praise his work, even though he reportedly lied about his academic credentials, falsely claiming he had BS and MS degrees in chemistry.
“He’s been accused 20 different ways of cheating,” said Fort Bend County assistant county attorney Randall W. Morse, who is defending Pikett in several lawsuits. “Critics are trying to throw up a smoke screen to get defendants off.”
Victoria County Sheriff T. Michael O’Connor referred to dog-scent lineups as a “vital tool in working toward a determination of a case,” and said he would use them again. “I feel they’re credible. I’ve watched those dogs,” he stated. “I looked on in absolute amazement. We believe in this stuff.”
Like many proponents of questionable forensic methods, O’Connor considers an obvious failure to be a success. “We did the right thing, and the wrong person was not convicted,” he said, referring to the Michael Buchanek case. But it was another person’s confession, not Pikett’s dogs, that got an innocent man off the hook, and it was dog scent evidence that had falsely implicated Buchanek in the first place.
The Innocence Project estimates that between 15 and 20 people “are in prison right now based on virtually nothing but Pikett’s testimony.”
In Florida, Seminole and Brevard Counties have agreed to review 15 to 17 cases from the 1980s in which people remain incarcerated after Preston’s dog scent evidence was used during their prosecutions. Preston also testified in other states and counties and in federal court.
Another form of scent evidence was misused by Michigan dog handler Sandra Marie Anderson, who claimed her cadaver dog, Eagle, could detect human remains. Her expert testimony was used in numerous criminal cases; in one, she said Eagle had “no unsuccessful hits.” However, Anderson was charged in 2003 with planting body parts and other evidence for her dog to “discover,” apparently to boost her credibility. In one case, Eagle found a bloody saw in a murder suspect’s basement; it was later determined the blood was Anderson’s.
Anderson eventually admitted that she had planted evidence in seven criminal investigations, including blood, bones and a toe. She was indicted on ten counts of obstruction, evidence tampering and lying to federal officials, pleaded guilty, and was sentenced in September 2004 to 21 months in prison.
“There are no national standards” for dog scent evidence, admitted Steve Nicely, a professional dog trainer in Austin, Texas who has trained police dogs. “Our standards are so lacking, it’s pathetic. We should be ashamed of ourselves.”
Lie Detection
Almost everyone knows that polygraph examinations, commonly but inaccurately referred to as “lie detectors,” aren’t admissible in court. That is because polygraphs and related truth-detection methods lack reliability – despite what is often portrayed on TV, such as in the Fox series Lie to Me.
Of course this doesn’t stop the government from using lie detectors to screen people applying for certain jobs or to monitor sex offenders on parole. [See: PLN, Dec. 2008, p.1]. It also doesn’t prevent people from trying to come up with new ways of “detecting” lies. One such method, the voice-stress analyzer, is even less reliable than the polygraph, with an accuracy rate about the same as flipping a coin.
John Sullivan, a CIA polygraph examiner for 31 years, screened employees for security purposes. His own security clearance was temporarily revoked due to what he claims was an abusive polygraph exam.
“The irony of my situation certainly did not escape me,” said Sullivan. CIA polygraph operators are not routinely screened. Sullivan got special attention after he published a book titled Gatekeepers: Memoirs of a CIA Polygraph Examiner, which he described as “a window to the often acrimonious and sometimes alarming internal politics of the CIA.”
Despite alleging an abusive polygraph exam in his own case, Sullivan does not believe CIA polygraph examiners should be asked whether they ever misused a test, because “it would open a can of worms.” He filed suit against the CIA in 2007 due to the revocation of his security clearance, which he claimed was retaliatory. The lawsuit was resolved under undisclosed terms in July 2009. See: Sullivan v. CIA, U.S.D.C. (D. DC), Case No. 1:07-cv-00685-JR.
The latest method to be held up as the long-sought-after lie detector is functional magnetic resonance imaging (fMRI), which is being pushed by two private companies, No Lie MRI and Cephos Corp. The technique uses MRI scans and blood oxygen levels to display which parts of the brain are being used and to what degree they are used when people respond to questions.
About 20 peer-reviewed scientific studies have shown that certain parts of the brain become more active when a person is telling a lie. The studies used healthy subjects who lied about simple things, such as which card they were looking at. Whether the technique works for more complicated lies, for people with mental or physical health problems, for people who face serious consequences if they are caught lying, or for those who falsely believe they are telling the truth has not been researched.
Nonetheless, Cephos Corp. claims it has the ability to detect a lie 79 to 97 percent of the time, while No Lie MRI claims at least a 93 percent lie detection rate. No court has allowed fMRI to be admitted as evidence in a criminal case, and the technique has been criticized – including in an article titled “Playing Devil’s Advocate: The case against fMRI lie detection,” published in the February 2008 issue of Legal and Criminological Psychology.
In May 2010, a Tennessee federal court heard testimony as to whether fMRI evidence from Cephos should be introduced in the criminal prosecution of Lorne Semrau, a psychologist accused of Medicare fraud. The court rejected the fMRI evidence, noting that “While it is unclear from the testimony what the error rates are or how valid they may be in the laboratory setting, there are no known error rates for fMRI-based lie detection outside the laboratory setting, i.e. in the ‘real-world’ or ‘real-life’ setting.”
Yet the district court also stated that “should fMRI-based lie detection undergo further testing, development, and peer review, improve upon standards controlling the technique’s operation, and gain acceptance by the scientific community for use in the real world, this methodology may be found to be admissible even if the error rate is not able to be quantified in a real world setting.” See: United States v. Semrau, U.S.D.C. (W.D. Tenn.), Case No. 1:07-cr-10074-JPM-tmp.
That might take a while. In one amusing 2008 study, neuroscientist Craig Bennett took an Atlantic salmon to a lab at Dartmouth College and used fMRI to study it while it was shown a series of photographs. The fMRI data indicated activity in the fish’s brain, despite the fact that the salmon was “not alive at the time of scanning.” Bennett used the unexpected test results to warn against false positives. “We could set our [test result] threshold so high that we have no false positives, but we have no legitimate results,” he observed.
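Bennett’s point about thresholds is easy to demonstrate with a minimal simulation. In the sketch below, the voxel count and significance cutoffs are illustrative assumptions, not figures from his study; the idea is simply that testing tens of thousands of locations at a lenient threshold guarantees some “hits” in pure noise:

```python
import numpy as np

# Minimal sketch of the multiple-comparisons problem behind the salmon
# result. The voxel count and thresholds are assumed for illustration;
# they are not figures from Bennett's study.
rng = np.random.default_rng(0)
n_voxels = 60_000                    # assumed number of locations tested per scan
noise = rng.normal(size=n_voxels)    # pure noise: no genuine signal anywhere

for p_cut, z_cut in [(1e-3, 3.09), (1e-8, 5.61)]:
    false_hits = int((noise > z_cut).sum())
    print(f"p < {p_cut}: expect ~{n_voxels * p_cut:.4f} false positives, "
          f"found {false_hits}")
```

Raising the threshold eliminates the phantom activity, but, as Bennett observed, a threshold strict enough to guarantee zero false positives can also wipe out legitimate results.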
Such a fishy outcome should make one wonder about the validity of fMRI for lie detection purposes. Indeed, in a 2009 article in Perspectives on Psychological Science, MIT graduate student Ed Vul, a statistician, concluded that “a disturbingly large, and quite prominent, segment of fMRI research on emotion, personality, and social cognition is using seriously defective research methods and producing a profusion of numbers that should not be believed.”
Regardless, Cephos Corp. CEO Steven Laken stated, “We’re not going to stop doing what we’re doing.” Referring to the Semrau prosecution in which fMRI evidence was rejected, he said, “The judge in this case is just one person.”
In India, a woman was convicted of murdering her former fiancé based on a Brain Electrical Oscillations Signature (BEOS) profile. Proponents claim BEOS tests can detect lies, but there have been no BEOS studies published in peer-reviewed scientific journals.
With BEOS, supporters claim to be able to tell if a person has a memory or “experiential knowledge” of committing a crime. This is achieved by showing the person a photo of the crime scene while observing the BEOS data. However, neuroimaging studies have found that imagining events triggers similar brain activity to experiencing those events, and there is no scientific evidence that BEOS can differentiate between the two. No U.S. court has addressed the admissibility of BEOS testing.
In the case in India, although Aditi Sharma and her husband, Pravin Khandelwal, were convicted of murder in June 2008 and received life sentences, the Bombay High Court suspended Pravin’s sentence and released him on bail because there was no actual evidence tying him to the crime as a conspirator. Aditi also was released on bail, due to a lack of sufficient evidence. BEOS was not mentioned in the court’s ruling. In September 2008, India’s National Institute of Mental Health and Neuro Sciences declared that brain scans were unreliable in criminal cases.
Hank Greely, a professor at Stanford Law School, has received a $10 million grant to study the legal and ethical implications of neuroscientific practices. “We worry a lot that juries and judges are going to be way too impressed by fancy pictures of brain scans,” he said. “But these are not photographs: they are computer-generated images of radio-wave information taken at a certain time and configured or manipulated in certain ways. Studies already show that people are more likely to give credence to a statement about the brain if it includes a picture of a brain scan, no matter how spurious it is.”
Bite Marks, Ear Prints, Lip Prints
Prosecutors have introduced evidence of bite marks, lip prints and even ear prints to win convictions in a number of jurisdictions. The problem with this type of evidence is that – just as with other kinds of forensic techniques – there have been no systematic, scientific studies to demonstrate the accuracy of such methods.
Lavelle L. Davis was convicted of murder in 1997 and sentenced to 45 years after an Illinois State Police crime lab examiner testified that lip prints found on a roll of duct tape near the murder scene “matched” Davis’ lips. The Illinois Appellate Court upheld the murder conviction, noting that the state’s expert witnesses had testified that lip prints were accepted by the FBI as “a means of positive identification,” and that they “did not know of any dissent inside the forensic community” regarding the validity of lip print comparison. See: People v. Davis, 710 N.E.2d 1251 (Ill.App. 2 Dist. 1999).
Except the experts weren’t telling the truth. According to FBI crime lab spokeswoman Ann Todd, who was quoted in a 2004 news report, the FBI “to this day hasn’t validated lip print comparisons.” In fact, the Davis prosecution is the only known case where such evidence was used.
According to several of the jurors at Davis’ trial, the lip print evidence was the reason they convicted him because the prosecution had otherwise put on a very weak case using an eyewitness who was an admitted liar. Also, Davis’ lawyer had attempted to understand the science behind lip print comparisons and cross-examine the state’s experts without help from an expert of his own.
“You can’t rely on your own cross-examination of the state’s witnesses,” said Kim Campbell, Davis’ appellate attorney. “You have to have your own expert to say why this kind of science is unreliable. And there was nobody saying that at his trial.”
Campbell filed a post-conviction petition and submitted an affidavit by Andre Moenssens, author of the book Scientific Evidence in Civil and Criminal Cases and law professor emeritus at the University of Missouri-Kansas City. The affidavit stated that “making the quantum leap ... to the ultimate notion of identifying an individual by the visible imprint of his or her lips is a journey fueled by two elements: pure speculation and unadulterated conjecture.”
The Circuit Court granted Davis’ post-conviction petition in 2006 and ordered another trial, heeding new expert testimony that lip print comparison evidence is not accepted science. The court also found the eyewitness testimony was “wrought with contradictions and lies and inconsistencies,” and cited prosecutorial misconduct and ineffective assistance of counsel as contributing factors. The trial court was affirmed on appeal and prosecutors dropped the charges against Davis in April 2009. See: People v. Davis, 879 N.E.2d 996 (Ill.App. 2 Dist. 2007).
As odd as it sounds, ear prints have been used in several criminal cases, too. On November 10, 1999, the Washington Court of Appeals reversed the aggravated murder conviction of David Wayne Kunze “because the trial court had improperly admitted ear-print identification evidence” that did not meet the Frye test. [Frye v. United States, 293 F. 1013 (D.C. Cir. 1923)]. The ear print evidence had been introduced even though a supervisor at the Washington State Crime Laboratory “thought that earprint identification was ‘out of the expertise of the [crime lab’s] latent unit.’”
During the retrial prosecutors moved to dismiss the charges, saying they could not prove Kunze’s guilt beyond a reasonable doubt. See: State v. Kunze, 97 Wash.App. 832, 988 P.2d 977 (1999), review denied.
In the United Kingdom, Mark Dallagher was convicted of murder in 1998 and sentenced to life after Dutch ear expert Cornelis Van der Lugt testified that a latent ear print on a window at the home of the victim was “a unique match” to Dallagher. The conviction was reversed on appeal, DNA evidence later implicated another suspect, and Dallagher was exonerated and freed in 2004. See: R. v. Mark Anthony Dallagher, Court of Appeal (Criminal Division), [2002] EWCA Crim 1903.
Another U.K. conviction, of Mark Kempster, was based on an ear print left at the scene of a burglary and resulted in a 10-year sentence in March 2001. Kempster’s conviction was overturned on appeal in 2008 due to insufficiency of the ear print evidence, though he remained in prison on unrelated charges.
“At this stage in the game, you can put ear prints and lip prints and nose prints and elbow prints all in the same category – unverified and unvalidated,” noted Ronald Singer, president of the American Academy of Forensic Sciences and crime lab director for the Tarrant County medical examiner’s office in Fort Worth, Texas.
Bite mark comparisons suffer from the same problems as ear and lip prints – a lack of scientific validation. Nonetheless, Mississippi state pathologist Dr. Stephen Hayne and odontologist Michael West have testified that individuals can be identified by bite marks.
Such evidence came under scrutiny after two men who were sent to death row on the basis of bite mark testimony were exonerated by DNA evidence.
One of those cases involved Ray Krone, who was convicted in 1991 of murdering a Phoenix, Arizona cocktail waitress and sentenced to death. The evidence against him included a bite mark on the victim’s breast that matched Krone’s teeth, according to an expert for the prosecution, even though another expert had previously determined there was no match. Krone was exonerated by DNA a decade later and released in 2002.
Krone’s attorney, Christopher Plourd, was outraged that dubious bite mark evidence had led to his client’s conviction, twice – Krone had been convicted at a retrial in which bite mark testimony was again introduced. Plourd decided to test the proficiency of bite mark experts. He hired a private investigator who contacted Dr. West and sent him photos from the victim in Krone’s case, telling him they were from an unsolved murder in a different case. The investigator also provided a dental mold of his own teeth, which he told Dr. West were from the prime suspect. Two months later West presented detailed results of his research – that the dental mold was a match to the bite marks in the photos, a clear impossibility.
In addition to his questionable bite mark testimony, Dr. Hayne also has been criticized for performing between 1,500 and 1,800 autopsies a year, over four times the recommended standard. Another medical examiner who reviewed one of Hayne’s autopsies called it “near complete malpractice.” [Note: An article on problems involving medical examiners will appear in an upcoming issue of PLN].
“There is no question in my mind that there are innocent people doing time at Parchman Penitentiary due to the testimony of Dr. Hayne,” said former Columbus, Mississippi Police Chief J.D. Sanders, who tried for years to have Hayne’s work scrutinized. “There may even be some on death row,” he added.
Arson Investigations
For many years arson investigators testified that the presence of glass with spider-web-like cracks at the scene of a fire indicated the presence of an accelerant, proving arson. It is now known that such “crazed glass” can be caused by water – e.g., from fire hoses – quickly cooling heated glass.
In February 2004, Texas prisoner Cameron Todd Willingham was executed for the arson-murder of his three young daughters. Key to Willingham’s conviction was the testimony of Corsicana Fire Department arson investigators who found crazed glass, “puddle” and V-shaped burn patterns, and a melted aluminum door threshold, which, they concluded, indicated the presence of an accelerant and thus arson.
The Willingham case has recently generated a great deal of controversy in Texas after separate investigations indicated the finding of arson was simply wrong. Reports by two highly-regarded fire science experts, Dr. Craig Beyler, chairman of the International Association for Fire Safety Science, and Dr. Gerald Hurst, a chemist hired by Willingham’s attorney to review the case, faulted the original investigation.
Beyler found that the arson determination “could not be sustained,” and said the fire investigators who testified at Willingham’s trial “had poor understanding of fire science and failed to acknowledge or apply the contemporaneous understanding of the limitations of fire indicators.” Hurst said the original investigation contained “major errors” and compared the methodology used by investigators to an “old wives’ tale.”
Beyond the crazed glass, Dr. Hurst noted that no accelerant was needed to melt aluminum or cause charring under an aluminum threshold, since wood fires alone can reach temperatures of more than one thousand degrees. Further, puddle burn patterns on the floor and V-shaped patterns on walls form naturally after a flashover event, when a fire ignites all combustible material in an enclosed space. Based on witness descriptions, it was likely that a flashover had occurred in the Willingham fire. Hurst found that other presumed indicators of arson were likewise discredited.
The Texas Forensic Science Commission scheduled a hearing on Willingham’s case in October 2009, but two days before the hearing Governor Rick Perry – who had declined to stay Willingham’s execution – replaced the commission’s chairman and two other members. The hearing was postponed, and although an interim report in July 2010 found the original arson investigation had relied on “flawed science,” the commission held the investigators were not negligent or guilty of misconduct. The final results of the commission’s report have not yet been released. [See related article in this issue of PLN, “Texas Controversy: Governor Guts Forensic Science Commission”].
Edward Cheever, a deputy state fire marshal who was involved in the original arson investigation, acknowledged that errors were made but defended the findings. “At the time of the Corsicana fire, we were still testifying to things that aren’t accurate today,” he said. “They were true then, but they aren’t now.”
Of course, an investigator’s findings cannot be true one day and not true the next. Investigatory methods may change, but the truth does not. Further, other arson experts found that the theories used in the original Willingham investigation were included in a 1960s arson textbook that has long since been debunked.
In 1997, the International Association of Arson Investigators filed a brief arguing that the requirements for scientific evidence set forth by the Supreme Court in Daubert should not apply to arson investigators, as their methods are “less scientific.”
Even when the methodology is accurate, the results can still be wrong if arson investigators lie about them. In February 1993, Joe Castorena, Assistant Chief Toxicologist for the Bexar County Forensic Science Center in San Antonio, Texas, testified that a class II accelerant had been used to set a deadly fire. The analysis had actually shown that no accelerant was present. Castorena also testified, falsely, that he had performed the test himself. The defendant was found guilty.
Another arson case under examination involves George Souliotes, who was convicted of setting a 1997 California house fire that killed a woman and two children. He was sentenced to life without parole. Souliotes’ conviction was based on arson investigation techniques that have since been discredited. For example, investigators testified that medium petroleum distillates (MPDs), a class of flammable substances, were found both on Souliotes’ shoes and on a carpet at the burned house. More modern tests revealed that the MPDs on the shoes and carpet were from different substances.
Testimony from a witness who placed Souliotes at the scene of the crime has also been questioned. The Ninth Circuit is presently considering Souliotes’ habeas appeal. See: Souliotes v. Hedgpeth, U.S.D.C. (E.D. Cal.), Case No. 1:06-cv-0667-OWW-WMW.
On May 7, 2010, ABC’s “20/20” aired an investigative report on criminal cases involving fires that may have been incorrectly blamed on arson due to “bad fire science.” In one case, a defendant was convicted of arson because investigators found multiple points of origin for the fire. However, other experts noted that aerosol cans present at the scene could have exploded and spread flames to other areas, creating multiple points of origin that mimicked arson – a theory confirmed by testing conducted by the Bureau of Alcohol, Tobacco and Firearms.
Tarnishing the DNA Gold Standard
The 2009 National Academy of Sciences report held out DNA testing as the gold standard of forensic techniques, but even the gold standard is a little tarnished.
In 2001, Kathryn Troyer, an Arizona crime lab analyst, was running a test on the state’s DNA database when she happened across two entries that matched at 9 of the 13 locations on chromosomes (loci) that are commonly used to identify a person.
Since the odds of a random 9-loci match between two unrelated people were estimated by the FBI as 1 in 113 billion, she assumed it was a duplicate entry. That belief was dispelled when she discovered that one person was black and the other white. Troyer then found dozens of similar 9-loci matches in Arizona’s 65,493-profile DNA database.
While labs in the U.S. try to match 13 loci, DNA recovered from crime scenes may be degraded or damaged; thus, sometimes fewer than 13 loci are used for DNA matches. In the United Kingdom, only 10 loci are required for a match, making Troyer’s findings a matter of significant concern. “It surprised a lot of people,” said William C. Thompson, professor of Criminology, Law, and Society and Psychology and Social Behavior at the University of California-Irvine. “It had been common for experts to testify that a nine-locus match is tantamount to a unique identification.”
When word of Troyer’s discovery got out, questions were raised about the accuracy of DNA match statistics. The FBI moved to prevent her findings from being distributed and attempted to block similar research elsewhere, even when court-ordered. Dismissing the “Arizona searches” as misleading and meaningless, the FBI suggested that states could be expelled from the national DNA database (CODIS) if they “tie up the database” with Arizona-type match searches.
FBI experts persuaded judges to block searches in some states. Regardless, “Arizona searches” were performed in two other states pursuant to court orders. The searches of DNA databases in Illinois and Maryland turned up almost 1,000 additional matches with at least 9 loci. In the 220,000-profile Illinois DNA database, 903 nine-loci matches were found. See: People v. Wright, 2010 WL 1194903 (Ill.App. 1 Dist. 2010) (reversing conviction based on a 9-loci DNA match where the trial judge had refused to allow an Arizona-type search of the state’s DNA database).
Only 32 nine-loci matches were found in Maryland’s 30,000-profile database, but three of those matched at 13 loci. The odds of that occurring randomly are quadrillions to one unless the profiles are duplicates or belong to identical twins or siblings. Maryland officials did not conduct follow-up research to analyze the 13-loci matches.
A subsequent search of Arizona’s DNA database found 122 nine-loci matches, 20 ten-loci matches, 1 eleven-loci match and 1 twelve-loci match. The latter two belonged to people who were related to each other.
At the very least, the results of the “Arizona searches” should spur experts to investigate whether the statistical basis of DNA testing used by crime labs – and often cited by prosecutors – is flawed. Admittedly, searching a database for any matches is quite different from comparing a single sample to the entire database, or comparing two samples to each other. Regardless, the existence of so many potential matches in DNA databases should call into question some of the basic premises of DNA statistics.
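For readers who want to see the arithmetic, here is a minimal sketch using the figures cited above (65,493 Arizona profiles and the FBI’s 1-in-113-billion estimate). The closing combinatorial note is a general statistical observation offered for illustration, not a finding from the NAS or the FBI:

```python
from math import comb

# Why full-database ("Arizona-type") searches behave differently from
# one-to-one comparisons.
n_profiles = 65_493                  # Arizona DNA database size cited above
p_pair = 1 / 113_000_000_000         # FBI estimate for a random 9-locus match

# A search of the whole database implicitly compares every profile to
# every other profile:
pairs = n_profiles * (n_profiles - 1) // 2
print(f"pairwise comparisons: {pairs:,}")                      # ~2.1 billion

# Even so, if the 1-in-113-billion figure held for each pair, dozens of
# matches would remain wildly improbable:
print(f"expected coincidental matches: {pairs * p_pair:.3f}")  # ~0.019

# One standard statistical caveat (our illustration): a match at *any*
# 9 of 13 loci can occur on C(13, 9) = 715 different subsets of loci,
# making a partial match far more likely than a match at 9 pre-specified
# loci; relatives and duplicate entries inflate the count further.
print(f"ways to choose 9 loci from 13: {comb(13, 9)}")
```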
More than 40 researchers, forensic scientists, statisticians and academics urged the FBI to provide them with access to the 8.6 million DNA profiles in the nationwide CODIS database, after removing identifying information, so they could test statistical assumptions related to DNA matches. The FBI declined, citing privacy concerns.
PLN previously reported a similar issue regarding DNA comparisons in cold cases. When a DNA sample is run against a database, the statistical odds of an erroneous match increase with the size of the database. In some cases, the odds of an incorrect match may be as high as 1 in 3. But DNA experts frequently cite much lower odds as if the size of the database were not involved, which greatly exaggerates the ability of DNA databases to identify a single unique individual as a perpetrator in cold cases. [See: PLN, Jan. 2009, p.24].
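The effect is easy to quantify. In the sketch below, the random-match probability and database size are illustrative assumptions chosen to show how odds on the order of 1 in 3 can arise; they are not figures from any specific case:

```python
# Database-size effect on "cold hit" odds: the chance that at least one
# unrelated profile in a database matches by coincidence. The numbers
# here are assumed for illustration only.
p = 1 / 1_100_000    # assumed random-match probability for a partial profile
n = 450_000          # assumed number of profiles searched

p_any_hit = 1 - (1 - p) ** n    # roughly n * p when p is small
print(f"chance of a coincidental hit somewhere in the database: {p_any_hit:.2f}")
```

With these inputs the chance of at least one coincidental hit is about 0.34 – nowhere near the one-in-millions figure a jury might hear for a single one-to-one comparison.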
“Fingerprinting and other forensic disciplines have now accepted that subjectivity and context may affect their judgment and decisions,” said Itiel Dror, a neuroscientist at University College London. “It is now time that DNA analysts accept that under certain conditions, subjectivity and even bias may affect their work.”
Dr. Dror and a colleague at Boise State University in Idaho conducted an experiment in which they independently provided a mixed DNA sample from the victim, the defendant and other suspects in a real criminal case to 17 analysts. The result? One examiner said the defendant “could not be excluded” based on the DNA evidence. Four reported the results were inconclusive, while 12 said the defendant could be excluded. In the actual case, two prosecution experts testified that the defendant, Kerry Robinson, charged in connection with a gang rape in Georgia, could not be excluded based on the DNA evidence. He was convicted.
Additionally, even when the accuracy of DNA testing is not at issue, the correct identification of the person whose DNA is being tested is equally important. In several cases labs have mixed up or improperly labeled DNA samples, then used the test results to implicate the wrong person.
In California, for example, a lab called Cellmark Diagnostics switched the labels on DNA samples from the victim and the suspect in a 1995 sexual assault case, then reported there was a match even though the victim’s sample actually contained no DNA from the suspect. A Cellmark employee caught the error at trial. See: State v. Kocak, Superior Court of San Diego (CA), Case No. SCD110465.
In a similar 1999 case in Philadelphia involving defendant Joseph McNeil, the initial test results found that McNeil was the source of DNA left on the victim’s panties with a 99.99% exclusion rate for other potential suspects. However, McNeil’s sample had been switched with that of the victim. Revised test results excluded him as a suspect – though at the time the error was discovered, McNeil’s attorney had reportedly convinced him to take a plea bargain due to the apparently damning DNA evidence.
Another case that involved transposed DNA samples occurred in Nevada in 2001. Lazaro Sotolusson, a prisoner at the North Las Vegas Detention Center, was accused of raping his cellmate. DNA samples were taken from both men. The samples were swapped when they were entered into a lab computer, and a match was returned after they were compared to evidence in unsolved sex crimes. Because the samples had been switched, Sotolusson rather than the other prisoner was prosecuted for those crimes – which involved the sexual assaults of two juveniles at gunpoint. He spent a year in jail before the mistake was discovered in April 2002 by an expert retained by the public defender’s office, a month before he was scheduled to go to trial.
Prior to the error being found, authorities had said the odds that someone other than Sotolusson had committed the crimes were 1 in 600 billion, based on the presumed DNA match. “Despite the credibility commonly afforded to these [DNA] test results, this demonstrates there is always the possibility of human error,” said Public Defender Marcus Cooper.
Ominously, in August 2009 it was reported that scientists in Israel were able to fabricate DNA evidence, including blood and saliva samples. They claimed that if they could access a particular DNA profile in a database, they could make an artificial sample that would match it. “You can just engineer a crime scene,” said Dr. Dan Frumkin, the lead author of an article on fabricating DNA evidence published in the February 2010 issue of Forensic Science International: Genetics. “Any biology undergraduate could perform this,” he added.
Fingerprints: The Old Gold Standard
Prior to the advent of DNA testing in the mid-1980s, fingerprint comparison was considered the gold standard of forensic techniques. Unfortunately, validation problems aside, fingerprint analysis – which has been used as evidence in criminal cases for over 100 years – suffers from a large error rate that is rarely mentioned in the courtroom.
The problems with fingerprint comparison recently gained international notoriety when three FBI fingerprint experts falsely identified, “with 100% certainty,” an Oregon attorney as a terrorism suspect in the 2004 train bombings in Madrid, Spain. This might not have come as such a surprise had the error rate in fingerprint identification been more widely known.
Since fingerprint lab units are usually run by police departments, the examiners almost always know the outcome desired by police investigators. Furthermore, the standard method of matching fingerprints is for an examiner to be given prints taken from a crime scene and a set of a suspect’s prints, and compare them. Thus, the examiner already knows that the set being reviewed for a match is from a person the police suspect of committing the crime, potentially biasing the results.
Researchers in the U.K., including Dr. Itiel Dror, asked five crime lab fingerprint examiners to compare sets of prints after falsely telling them they were the mistakenly-identified fingerprints from the Madrid train bombing case – “thus creating an extraneous context that the prints were a non-match.” Three analysts agreed that the prints were not a match and one was unsure. Unbeknownst to the examiners, they had reviewed the exact same prints and declared them to be a match five years earlier. The study, published in Forensic Science International in 2005, indicated that an 80% error rate can be induced when examiners are informed in advance of the desired outcome.
A November 2009 audit of the Houston Police Department crime lab’s fingerprint unit revealed “irregularities” in over half of the 548 cases that were reviewed.
It was reported in October 2008 that a Latent Print Unit analyst at the Los Angeles Police Department crime lab was fired, two supervisors replaced and three other employees suspended after revelations surfaced that the lab had misidentified fingerprints in two burglary cases. One of the misidentified suspects was extradited from Alabama before the error was discovered by an independent expert hired by a court-appointed defense attorney.
“As a former prosecutor I know that we rely on fingerprint evidence without question,” said Los Angeles councilman Jack Weiss, chair of the city’s public safety committee. “Given the kind of conduct indicated in this report, it’s reasonable to assume that these problems aren’t isolated.”
The Journal of Forensic Sciences published a study in 1995 that revealed a false positive rate (erroneous match) of 2% for fingerprint comparisons, with 25% of U.S. crime labs reporting false positives.
A 2006 study by the University of Southampton in the U.K. showed that the false positive rate doubled when examiners were told about the circumstances of a case before making fingerprint comparisons.
Further, a proficiency test administered to 156 fingerprint analysts by the Collaborative Testing Service in 1995 resulted in only 44% of the examiners correctly classifying all of the latent samples. “Errors of this magnitude within a discipline singularly admired and respected for its touted absolute certainty as an identification process have produced chilling and mind-numbing realities,” said David Grieve, editor of the Journal of Forensic Identification.
These studies are well known among fingerprint examiners, yet a top FBI expert has testified that fingerprint comparisons have a “zero error rate.”
“I’ll preach fingerprints till I die. They’re infallible,” said John Massey, one of the FBI fingerprint examiners who falsely matched Oregon attorney Brandon Mayfield to the Madrid train bombings. When asked about that incorrect fingerprint match, he said, “We just did our job and made a mistake. That’s how I like to think of it – an honest mistake. I still consider myself one of the best in the world.”
But even honest mistakes by top-notch examiners mean that fingerprint comparisons are not the infallible technique Massey claims. And therein lies one of the biggest problems in forensics: the inability or unwillingness of experts, analysts and lab personnel to admit fallibility in their methods or themselves, even in the face of obvious errors.
Brandon Mayfield filed suit against federal officials for his arrest and two-week detention as a result of the FBI’s faulty fingerprint identification, and received a formal apology and a $2 million settlement in November 2006.
There are also problems with how some fingerprint comparisons are made. In a 1995 case in Massachusetts, a defendant was convicted of murder and armed robbery based on a collective fingerprint match. That is, while none of the individual latent prints at the crime scene were sufficient for a match, investigators used multiple prints, which were presumed to have come from the same person, to obtain a match as a group.
The Massachusetts Supreme Judicial Court held that this technique, based on the examination of “simultaneous impressions” of latent prints, did not meet the necessary reliability standard. “[A] fingerprint examiner’s opinion regarding the individualization of simultaneous impressions is less bounded by objective factors,” the court wrote. See: Commonwealth v. Patterson, 445 Mass. 626, 840 N.E.2d 12 (Mass. 2005).
In another Massachusetts case, Stephan Cowans was accused of shooting a police officer in a May 30, 1997 incident. After the shooting the assailant entered a nearby residence and drank a glass of water; two experts with the Boston Police Department later testified that a fingerprint on the glass matched Cowans. A defense expert agreed, and Cowans was also identified by two eyewitnesses, including the injured officer. He was convicted and sentenced to 30 to 45 years in prison.
DNA testing was performed in May 2003 on saliva from the glass and on a cap and sweatshirt discarded by the assailant. The DNA on the glass, the cap and the sweatshirt all matched, but not to Cowans. The fingerprint evidence was reexamined and Suffolk Assistant District Attorney David E. Meier admitted that the “purported [fingerprint] match was a mistake.” Cowans was exonerated in February 2004.
Crime Lab Scandals Nationwide
Former Douglas County, Nebraska CSI commander Dave Kofoed was sentenced on June 1, 2010 to 20 to 48 months in prison after he was convicted of planting blood evidence in a double homicide investigation. Kofoed reportedly planted the victim’s blood in a car linked to two suspects who were later determined to be innocent. He has professed his own innocence.
In March and April 2010, San Francisco prosecutors dismissed around 700 criminal cases after it became known that Deborah Madden, a 27-year crime lab employee, had been skimming cocaine booked as evidence. Public defenders estimated that up to 40,000 cases might need to be reviewed. Madden admitted she stole cocaine from the lab for her own use, and retired on March 1, 2010. The drug analysis section of the SFPD crime lab was closed following the scandal.
“There is no doubt this lab is a mess. ... I know there are issues in the lab,” said Assistant Police Chief Jeff Godown. Madden was later charged with cocaine possession and an unrelated gun charge. It was subsequently learned that she had been convicted of domestic violence in 2008, a fact known to her supervisor but not disclosed to defense attorneys who could have used it to impeach her in criminal cases.
“It was kind of like a ‘don’t ask, don’t tell’ policy,” said Jeff Adachi, San Francisco’s chief public defender. “The police weren’t telling and the district attorney wasn’t asking.”
In April 2010 another scandal hit a crime lab in Ripon, California that serves San Joaquin and four other counties, which led to the examination of almost 2,000 cases in one county alone. Methamphetamine samples handled by Hermon Brown, an examiner at the lab, ended up weighing less than initially reported. Brown was charged with five felony drug and theft-related offenses on September 9, 2010.
Sacramento County, California criminalist Jeffrey Herbert testified in one case that a defendant’s DNA sample matched just one in 95,000 people, despite the fact that a supervisor had told him the odds were closer to 1 out of 47 people. A subsequent review determined that Herbert had “an insufficient understanding” of how to analyze mixed DNA samples. A technical reviewer at the Sacramento County crime lab, Mark Eastman, resigned after problems with his work were found in May 2006. He had entered a DNA sample into a law enforcement database even though the sample did not meet minimum standards, and failed to recheck his results in other cases.
In December 2009, the New York State Inspector General issued a 119-page report blasting the New York State Police crime lab. A private accrediting organization had discovered that a lab worker with 30 years’ experience in trace evidence analysis had so little training he could not even operate a microscope. Garry Veeder was the lab’s only fiber evidence expert, and a check of his work showed that he routinely dry-labbed results, reporting tests he never actually performed.
Veeder admitted in an internal investigation to “bypassing an analysis required by forensic center protocols and then creating data to give the appearance of having conducted an analysis not actually performed.” He committed suicide in May 2008, two days after he was asked to attend another interview regarding his work at the lab.
Another New York State Inspector General’s report on crime labs in Monroe and Erie Counties, also released in December 2009, revealed problems with workers dry-labbing, misreporting weights and failing to perform vital steps in tests.
A chemist at the Erie County lab, Kelly McHugh, was fired after falsifying a report. Although she said she had conducted a certain test on a cocaine sample, the chain of custody record indicated the cocaine was in the evidence room at the time McHugh claimed she performed the test. She pleaded guilty to misdemeanor attempted tampering with public records and received a conditional discharge with no judicial punishment in January 2010.
The crime lab in Colorado Springs, Colorado reported in December 2009 that it had performed hundreds of faulty blood-alcohol tests that overstated alcohol levels due to errors by a chemist who was later fired. At least nine defendants were wrongly convicted; the lab was unable to explain how the errors occurred.
Dirk Janssen, the chemistry supervisor at Wisconsin’s state crime lab, was reprimanded in August 2009 for failing to obtain peer reviews in 27 toxicology cases involving drug evidence. Following a review by the lab director, only five of the cases were found to meet required standards; of the others, half needed corrections.
In July 2009, a Texas man won a $5 million federal jury award against the Houston Police Department’s crime lab after the lab fabricated evidence that led to his conviction for the 1987 rape and kidnapping of a child. DNA testing freed George Rodriguez after he had served 18 years in prison. The jury found that the city “had an official policy of inadequate supervision or training of its Crime Lab personnel” [See: PLN, Jan. 2010, p.32].
Rodriguez is one of four men who have been exonerated after they were convicted based on false evidence from the Houston crime lab. An auditor who reviewed 850 serology cases processed by the lab between 1980 and 1992 found problems in 599 of those cases.
DNA expert Dr. Elizabeth A. Johnson, a vocal critic of Houston’s crime lab, said there were systemic problems at the lab in regard to forensic testing, including with DNA evidence. “They can’t do a sperm sample separation to save their lives,” Dr. Johnson contended in a 2003 article. “If you put a gun to their heads and said you have to do this or you will die, you’d just have to kill them.”
On January 10, 2008, Vanessa G. Nelson, head of the Houston crime lab’s DNA division, was forced to resign after she was caught helping two DNA technicians cheat on proficiency exams. The DNA division was closed. This was the second time the division had been temporarily shut down. Following a November 2002 scandal revealed by local news media, the lab’s DNA division was suspended from December 2002 until July 2006.
Two months after she resigned, Nelson was hired by the Texas Department of Public Safety (DPS) to supervise the agency’s McAllen Division DNA lab. DPS officials said they were aware of the circumstances of her resignation from the Houston crime lab but her supervisor had described her as an outstanding employee.
In March 2009, Wayne County, Michigan prosecutor Kym Worthy admitted that 147 cases in which people were sent to prison would have to be investigated due to failings at the firearms unit in the Detroit police crime lab, which was closed in late 2008. The lab also conducted testing on fingerprints, DNA and drug evidence.
A preliminary audit of the Detroit lab found there was a 10% error rate in ballistics testing, evidence may have been contaminated, lab workers were not given competency tests, it could not be determined if testing equipment was routinely maintained or calibrated to ensure accuracy, and lab findings were hard to validate because notes, photos and other documentation were “almost nonexistent in the case file records.”
“If we have even one person in prison on evidence that was improperly done, that’s a huge problem,” said Worthy. “As prosecutors, we completely rely on the findings of police crime lab experts every day in court and we present this information to juries. And when there are failures of this magnitude, there is a ... betrayal of trust.”
A defense attorney uncovered the problems in Detroit’s crime lab when he had an independent expert examine evidence in a case involving firearms. Prosecutors intend to retest evidence from all cases less than five years old. A Michigan State Police report blamed an inadequate budget and lack of qualifications, training and equipment for the errors. Sixty-eight lab employees were reassigned. None were fired. There are plans to create a new Detroit crime lab staffed by the State Police.
In April 2008, 30-year veteran Nashville, Tennessee police officer Michael Pyburn resigned after he was accused of falsifying records to cover up a botched ballistics test. He had worked in the Nashville Metro Police crime lab for over a decade. The lab’s ballistics unit was shut down, and two other officers who worked in the unit were found to be unqualified and reassigned.
Arnold Melnikoff, formerly the manager of Montana’s state crime lab, testified in a 1987 trial that hair comparisons indicated there was less than one chance in 10,000 that Jimmy Ray Bromgard was not the man who raped an 8-year-old girl. By the time DNA tests exonerated Bromgard almost 15 years later, Melnikoff had moved on to become a chemist at the Washington State Patrol’s lab.
A peer review committee of forensic scientists referred to Melnikoff’s testimony in the Bromgard case as containing “egregious misstatements not only of the science of forensic hair examinations but also of genetics and statistics. These statements reveal a fundamental lack of understanding of what can be said about human hair comparisons and about the difference between casework and empirical research. His testimony is completely contrary to generally accepted scientific principles.”
Two other Montana men convicted due to Melnikoff’s dubious testimony have been exonerated by DNA testing. Washington officials fired Melnikoff in 2004 after an audit revealed problems with his lab procedures; he was also accused of exaggerating his testimony to assist prosecutors. [See: PLN, March 2006, p.28; Nov. 2004, p.12; Feb. 2003, p.10].
Barry Logan, director of the Washington State Patrol crime lab, resigned in March 2008 amid a scandal in which toxicology lab manager Ann Marie Gordon was accused of falsely claiming she had verified solutions used in breath-alcohol testing. Also, firearms and tool mark examiner Evan Thompson was accused in 2006 of shoddy work resulting in serious errors. Gordon’s fabrications invalidated the results of hundreds of breath tests in DUI cases. The effect of Thompson’s incompetence in the more than 1,000 criminal cases he worked on remains unclear. In an initial review of 13 of Thompson’s most complex cases, mistakes were found in all 13. [See: PLN, Oct. 2008, p.8].
Another Washington State Patrol crime lab examiner, Charles Vaughan, who formerly worked for the Oregon State Police crime lab, provided questionable testimony concerning gunpowder residue that sent two Oregon men to prison for life. They were later exonerated and released. [See: PLN, March 2006, p.28].
Two employees of the New York City Police Department crime lab were discovered lying about drug evidence in 2007. The employees dry-labbed, writing reports on tests they never performed. The head of the lab was transferred to another post.
In 2006, the U.S. Army crime lab at Fort Gillem, Georgia admitted that civilian employee Phillip R. Mills had falsified results in as many as 479 DNA tests. Mills had a history of shoddy work, including allowing contamination during testing. He had been suspended in 2004 and retrained, then suspended again in 2005.
Former FBI analyst Jacqueline Blake pleaded guilty in May 2004 to making false statements about following protocol in around 100 DNA analyses. She reportedly failed to compare DNA evidence with control samples, and resigned from the FBI in 2002. According to a report by the Inspector General’s office, she also “falsified her laboratory documentation” to conceal her misconduct. She was sentenced to two years’ probation and community service.
Oklahoma City Police Department crime lab employee Joyce Gilchrist was terminated in September 2001 after being accused of questionable hair and fiber analysis, destroying or withholding exculpatory evidence, overstating results and contributing to several wrongful convictions, including one that sent an innocent man to death row. [See: PLN, Dec. 2009, p.44]. Gilchrist was nicknamed “Black Magic” for her apparent ability to find DNA matches that other examiners could not. The FBI recommended a review of all of her cases.
From 1967 until her retirement in 1991, Janice Roadcap was a chemist for the Pennsylvania State Police crime lab. Her bogus testimony included an explanation for why semen from a rape-murder victim did not match the defendant’s blood type: she claimed that antibiotics taken by the victim may have altered the semen’s blood type. The defendant, Barry Laughman, was convicted and, after spending 16 years in prison, was exonerated by DNA evidence.
In a 1970 murder case, Roadcap claimed that a palm print had been made in the victim’s blood when, in fact, there were indications the blood had splattered across an already-present palm print. The accidental discovery of Roadcap’s original notes in 2001 was used by defense attorneys to win a retrial. Prosecutors then dropped the charges. By that time the defendant, Steven Crawford, had spent 28 years in prison.
Nor are such forensic mistakes relegated to the distant past. In the Ohio case of Derris Lewis, who was prosecuted for the murder of his twin brother in 2008, experts testified that Lewis’ palm print had been made in his brother’s blood on a wall at his mother’s home, where the homicide occurred. After a jury deadlocked on the murder charge and a retrial was scheduled, it was discovered the palm print had not, in fact, been made in the blood. Lewis, who spent 18 months in jail, received a $950,000 settlement from the City of Columbus in February 2010. [See: PLN, Sept. 2010, p.23].
Fred Zain, a former state trooper, was a serologist for the West Virginia State Police crime lab for 10 years and later worked as chief serologist for the Bexar County Forensic Science Center in San Antonio, Texas until he was fired in 1993. Zain was popular among prosecutors and police because the evidence he produced led to numerous convictions.
It was later learned that Zain testified about tests he never performed, some of which the crime lab did not even have the equipment to conduct, and that he falsified his credentials. This was one of the earliest major crime lab scandals; it came to light in 1993 after the West Virginia Supreme Court of Appeals released a damning report on Zain’s misconduct. See: In the Matter of an Investigation of the West Virginia State Police Crime Laboratory, Serology Division, 438 S.E.2d 501 (W.Va. 1993) [PLN, March 1998, p.24; Oct. 1994, p.5].
The report, by the Laboratory Accreditation Board of the American Society of Crime Laboratory Directors, found “multiple incidents of misconduct on the part of former State Police serologist Fred Zain” that “may have resulted in serious miscarriages of justice in cases in which he was involved.” The report also found “evidence that Mr. Zain’s supervisors may have ignored or concealed complaints of his misconduct.”
The scandal came to light after DNA evidence exonerated Glendale Woodall, who had been convicted of a West Virginia rape in 1987 based on falsified serological tests. Zain had told the jury that the assailant’s blood types “were identical” to Woodall’s, and that only 6 in 10,000 West Virginia men had similar blood characteristics. Zain presented evidence in over 130 cases in West Virginia, and the state’s Supreme Court held that “[a]ny testimony or documentary evidence offered by Zain, at any time, in any criminal prosecution, should be deemed invalid, unreliable and inadmissible.”
The two states handled the scandal very differently. West Virginia compiled a list of the cases in which Zain testified, ordered court clerks to preserve the evidence in those cases, and retested blood samples. Texas, however, said defendants would have to challenge each case individually through the post-conviction process, in most cases without an appointed attorney, and that those who had pleaded guilty under the threat of the false forensic evidence could not obtain any relief.
Zain was indicted on perjury charges in Texas but the charges were dismissed due to the statute of limitations. A West Virginia jury deadlocked on charges that Zain had obtained money (his salary) under false pretenses, and he died of colon cancer in 2002 before the case was retried.
At least nine defendants in West Virginia were exonerated following an investigation into Zain’s misconduct, and the state has paid $6.5 million in damages for wrongful convictions that resulted due to his fraudulent testimony. Woodall, who was exonerated in 1992, received a $1 million settlement.
Most recently, in March 2010, North Carolina’s State Bureau of Investigation (SBI) was accused of manipulating bloodstain evidence. The Attorney General’s office ordered an audit of the SBI blood analysis unit, including the work of Duane Deaver, a bloodstain expert and forensic trainer for 22 years who was suspected of tailoring test results to please prosecutors. Bloodstain analysis at the SBI lab was suspended in July 2010.
The previous year, a federal court held that Deaver gave “misleading testimony” that “falsely portrayed” he had found blood on a defendant’s boot in a capital murder case. The defendant, George E. Goode, Jr., had his death sentence reduced to life because his attorney had failed to challenge Deaver’s inaccurate testimony. See: Goode v. Branker, U.S.D.C. (E.D. NC), Case No. 5:07-hc-02192-H.
The audit of the SBI blood analysis unit ordered by the Attorney General, conducted by two former FBI officials, was released in August 2010. It found that lab examiners had overstated, excluded or falsely represented blood evidence in dozens of criminal cases, including three that resulted in executions.
The report called for a review of 190 cases, noting that “information that may have been material and even favorable to the defense of an accused defendant was withheld or misrepresented.” The deficiencies were blamed on “poorly crafted policy, inattention to reporting methods which permitted too much analyst subjectivity; and ineffective management and oversight.” In at least 40 cases, lab examiners reported there were indications of blood and no additional tests were performed; however, handwritten lab notes indicated that follow-up tests were negative or inconclusive.
“The documented policies and practices of our state lab support the long-held concern that North Carolina’s lab is the prosecution’s lab, not the justice system’s lab,” said Christine Mumma, director of the North Carolina Center on Actual Innocence.
The SBI’s crime lab director is being replaced and additional audits of the lab’s DNA and firearm and tool mark units have been requested. State lawmakers criticized ASCLD-LAB, the organization that accredited the SBI lab during the time the blood analysis unit was withholding or falsifying evidence. ASCLD-LAB, which is managed by three former SBI officials, accredits most forensic crime labs in the U.S. “Accreditation truly is the final check against this stuff, and it didn’t happen here,” said North Carolina state Representative Rick Glazier.
The above are just a sampling of crime lab scandals and of lab employees who, intentionally or through incompetence, presented false or misleading evidence in criminal cases under the guise of scientific expertise. There have been many more such incidents, some of which are discussed in The Elephant in the Crime Lab, an article by Sheila Berry and Larry Ytuarte that appeared in the Spring 2009 issue of The Forensic Examiner.
Based upon the failings of crime labs, there is a strong argument for oversight by independent organizations. On June 3, 2010, though, the California Crime Lab Task Force, which had been formed by the state legislature three years earlier, voted to disband itself. The Task Force issued a report in 2009 that made 41 recommendations for improving forensic techniques, including improved training and increased staffing.
Consider that even if only a small number of forensic examiners are incompetent or corrupt, due to the large caseloads that crime labs handle they may produce faulty test results in hundreds or thousands of cases before their errors are caught.
But at least one organization, Crime Lab Report, which seeks “to properly frame the issues for those who require access to accurate information about forensic science,” believes that crime labs have been given a bad rap. In a July 16, 2008 study titled The Wrongful Conviction of Forensic Science, which critiques a report by the Innocence Project, Crime Lab Report editors John M. Collins and Jay Jarvis argue that “false eyewitness identifications exacerbated by bad lawyering, and in some cases, government misconduct” are responsible for most wrongful convictions, not forensic failures.
They suggest that the “Innocence Project needs attention and money to drive its public policy agenda. ...[and] taking on crime laboratories will turn heads more quickly than esoteric procedural debates among litigators,” and claim that “[t]he overall statistical weight that can be honestly assigned to faulty forensic science is very small.”
That does not, however, explain the many well-documented examples of crime lab scandals and fraudulent or incompetent forensic experts such as those mentioned above and in the section below. Nor does it take into account the influence of junk science in wrongful conviction cases that do not involve DNA, which is the province of the Innocence Project.
Credentials? What Credentials?
Joseph Kopera, who had a 21-year career as a forensic expert, first as a firearms examiner for the Baltimore Police Department’s crime lab and then as head of the Maryland State Police firearms unit, perjured himself for decades in thousands of criminal cases by claiming he held degrees and certificates he was never awarded. He even presented a forged college diploma and falsely claimed to have taught courses at local universities to back up his claims. When defense attorneys confronted Kopera with the fact that he had never earned any kind of college degree, he resigned, then committed suicide on March 1, 2007.
Dr. Saami Shaibani, a Wisconsin physicist, frequently testified for the prosecution in high-profile murder cases across the country, from South Dakota to Washington, D.C., on “injury mechanism analysis” – an amalgam of physics, trauma medicine and engineering.
He falsely claimed to be a clinical associate professor at Temple University until a defense attorney discovered that was a lie. “He’s a fraud. Basically, he was trying to create himself as an expert so he could run around the country and testify in these cases,” said Wisconsin defense counsel Stephen Willett.
In one case, Douglas Plude claimed that his wife had tried to commit suicide by taking pills and died with her head in a vomit-filled toilet. After positioning volunteers of roughly the victim’s size over a toilet and studying their movements, Shaibani testified that this scenario was impossible and that Plude must have forced his wife’s head into the toilet to drown her.
“[Shaibani] had women sticking their heads in toilets,” said Willett. “That’s just not science. How do you peer review that? How do you test his conclusions?”
Plude was found guilty of murder but his conviction was overturned in June 2008 by the Wisconsin Supreme Court, which cited Shaibani’s fraudulent testimony about his credentials and called his conduct “egregious.” Prosecutors are retrying Plude on the murder charges. See: State v. Plude, 310 Wis.2d 28, 750 N.W.2d 42 (Wis. 2008).
In May 2007, James Earl Edmiston pleaded guilty to two counts of perjury in California, where he had served as the state’s expert witness on computers in two cases involving child pornography. Edmiston falsely claimed to have degrees from CalTech, UNLV and UCLA, and had often been used as an expert in both state and federal cases. He received a 21-month federal sentence.
In the United Kingdom, Gene Morrison of Greater Manchester was exposed as a fraudulent forensic expert in 2007; police called him a “complete charlatan.” Four underage girls then came forward accusing Morrison of sexual abuse, saying he had used his prestigious position as a forensics expert to convince them that no one would believe them if they told anyone. Morrison was convicted of the sex abuse charges and sentenced in December 2009 to an indeterminate prison term with a minimum of 7-1/2 years; he had previously received a 5-year sentence for the forensics fraud. Police officials said they would re-investigate 700 cases in which Morrison participated. He had no academic credentials other than the ones he had purchased through mail order.
Another U.K. case involved Trevor “Jim” Bates, who was found guilty of four counts of making a false written witness statement for claiming he had a degree in electronic engineering. A former TV repairman, he had been a top prosecution information technology witness in cases involving child pornography; his lack of credentials was exposed after he appeared as a defense witness in a high-profile case. Bates received a two-year suspended sentence and was ordered to pay court costs of £1,000 (approx. $1,500) in April 2008.
And San Diego prosecutors were faced with the possibility of having thousands of DUI cases overturned in 2006. The reason? Prosecution expert witness Ray Cole, who said he had a degree in premedical studies and was an expert on the effects of alcohol and driving, had lied under oath. His degree was in political science. Nonetheless, he had testified as an expert for more than three decades.
What About Eyewitnesses?
If there are so many problems with crime labs and forensic experts, can we at least trust eyewitness testimony in criminal cases? No, according to eminent memory researcher and University of California-Irvine psychology professor Elizabeth F. Loftus. Memory is not recalled as a whole, but is pieced together anew each time, “more akin to putting puzzle pieces together than retrieving a video recording,” said Dr. Loftus. This allows errors to become incorporated into memories, corrupting them. Yet a person with a corrupted memory believes with total sincerity that the memory is an accurate account.
How can memory become compromised in a criminal investigation? One way is poor or suggestive police procedures. For example, if the police ask “Is this the man who committed the crime?” while showing the witness a man in handcuffs, such questioning is considered suggestive. Likewise, using multiple photographic lineups that share a single common photo – that of the person the police want the witness to identify – corrupts the integrity of the lineup process. Both of these methods are common, though, and witnesses frequently are allowed to testify regarding their identification of a suspect despite the suggestiveness of the identification procedure used by police.
Studies also indicate that cross-racial or ethnic identifications, i.e., when the victim and suspect are of different races, are especially prone to error. Additionally, when witnesses are not told that the guilty party might not be in a lineup or photo array, they may feel compelled to pick one of the suspects.
Dr. Loftus has found that stress at the crime scene or during the identification process, the presence of a weapon during the crime, the use of a disguise by the perpetrator, brief viewing times during the identification procedure and a lack of distinctive characteristics in the suspect can all reduce the accuracy of eyewitness identification.
Another issue concerning eyewitness testimony is the practice of “recovering” repressed memories of events that may have never occurred, using suggestive and manipulative questioning by examiners. Repressed memories have been used in a number of cases, usually involving sexual abuse, though the accuracy of that method is unknown and likely unknowable. Dr. Loftus has expressed concerns about manufacturing false memories, and wrote that “it might be virtually impossible to tell reliably if a particular memory is true or false without independent corroboration.”
According to a 2009 Innocence Project report, of the first 239 DNA exonerations, 75% involved eyewitness misidentification. In 38% of those cases there were two or more eyewitnesses, such as in the prosecution of Stephan Cowans. Further, studies show that highly confident eyewitnesses are only slightly more accurate than less confident eyewitnesses. Thus, eyewitness testimony cannot be relied upon to correct or contradict errors that result from shoddy forensic work.
Improving Crime Labs
The Innocence Project estimates that about 50% of defendants exonerated by DNA evidence were convicted in part due to “unvalidated or improper forensic science.” Considering that DNA is only an issue in 5 to 10% of criminal cases, it is likely that bad science has resulted in numerous other wrongful convictions that have gone unreported. Also, even when the science may be sound, examiners often obfuscate their testimony by saying forensic results “are consistent with” or “cannot exclude” a suspect, when such language is at best vague and at worst intentionally misleading.
As one example, when Alejandro Dominguez was tried for rape in Illinois in 1990, a forensic serologist testified that blood typing on semen found on the victim could not exclude Dominguez as the rapist. The serologist did not tell the jury that 67% of men in the U.S. also could not be excluded: the sample was a mixture from both the victim and the rapist, and the victim’s own blood group markers could mask the perpetrator’s. Dominguez served four years of a 9-year sentence and was exonerated by DNA evidence in 2002.
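A simplified sketch shows how this kind of masking inflates the “cannot exclude” fraction. The blood-group frequencies below are hypothetical and the scenario is generic, not a reconstruction of the Dominguez case:

```python
# A simplified sketch of marker masking in a mixed sample. The
# frequencies below are hypothetical, not figures from any real case.

# Hypothetical population frequencies of ABO blood types among men.
freq = {"O": 0.44, "A": 0.42, "B": 0.10, "AB": 0.04}

# Suppose the victim is type A and the mixed stain shows only A and O
# markers. A type-O perpetrator contributes the O marker; a type-A
# perpetrator contributes nothing visible beyond the victim's own
# markers and is masked entirely. Both types are "consistent."
consistent_types = ["A", "O"]

not_excluded = sum(freq[t] for t in consistent_types)
print(f"Fraction of men who cannot be excluded: {not_excluded:.0%}")
# With these numbers, 86% of men "cannot be excluded": testimony that
# sounds incriminating while conveying very little information.
```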
In April 2010, the Scientific Working Group on DNA Analysis Methods released new guidelines for U.S. labs that perform DNA testing, including recommendations to develop stricter criteria for analyzing samples containing mixed DNA. However, the guidelines are not mandatory and “not intended to be applied retroactively.”
The National Academy of Sciences and other experts on forensic science have made a myriad of suggestions for improving the competence of crime labs and the testing techniques they employ. Those suggestions include:
1) Make crime labs independent of law enforcement agencies such as police departments and district attorneys’ offices.
2) Create a strong and independent national crime lab oversight authority that can set standards for certification of labs and employees, employee qualifications, continuing education, training and methodologies.
3) Require crime labs to comply with those standards.
4) Submit forensic methods to rigorous scientific validation that includes determining error rates and realistic probability statistics.
5) Give crime labs as little information as possible when conducting forensic tests, so as not to influence the results. This may require an intermediate office that strips investigatory information from evidence samples and replaces it with coded identification numbers, as illustrated in the sketch following this list.
6) Mandate that the crime lab employee who actually performs a forensic test be the person who testifies about that test in court. This was recently addressed by the U.S. Supreme Court in Melendez-Diaz v. Massachusetts, 129 S.Ct. 2527 (2009) [PLN, Oct. 2009, p.8], in which the 2009 NAS report was cited by the Court as one reason for making this a requirement. In Melendez-Diaz, the admission of notarized certificates regarding forensic drug testing was held to violate the defendant’s Sixth Amendment right to confront witnesses.
7) End the practice of allowing forensic examiners to make exaggerated claims, and require them to acknowledge any uncertainties in or limits to their findings.
8) Sufficiently fund crime labs to permit the hiring of qualified personnel, continuing training and education, and the elimination of backlogs.
9) Subject lab results to random testing by other forensic labs. Just the knowledge that such testing may occur has been shown to dramatically reduce error rates in crime labs.
10) Provide court-appointed attorneys with forensic experts when the case involves forensic evidence and defense counsel requests an expert.
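The blinding step in recommendation 5 could be as simple as an intake office that logs case context privately and forwards only an opaque identifier to the analyst. The following is a minimal hypothetical sketch, not a description of any lab’s actual system:

```python
# A minimal, hypothetical sketch of the blinding intermediary suggested
# in recommendation 5; illustrative code, not any lab's actual system.

import secrets

class EvidenceIntake:
    """Intake office that keeps investigatory context away from analysts."""

    def __init__(self):
        # coded ID -> full submission details; analysts never see this.
        self._context = {}

    def register(self, case_number: str, investigator_notes: str) -> str:
        """Log a submission and return the only identifier the analyst sees."""
        coded_id = secrets.token_hex(4).upper()
        self._context[coded_id] = {
            "case_number": case_number,
            "notes": investigator_notes,  # e.g., "suspect already in custody"
        }
        return coded_id

    def unblind(self, coded_id: str) -> dict:
        """Re-attach case context only after the analyst's report is final."""
        return self._context[coded_id]

intake = EvidenceIntake()
sample_id = intake.register("2010-CR-0417", "detective believes prints match suspect")
print(f"Analyst receives only: sample {sample_id}")  # no names, no theory of the case
```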
These reforms would go a long way toward ensuring that crime labs realize their full potential to help identify the guilty while protecting the innocent. However, reaction to the 2009 NAS report appears to indicate that most of the people and organizations involved in the forensic sciences are merely gearing up to protect their own turf rather than helping to reform the field of forensic investigation in general.
If that proves to be the response to the report, then any reforms will be piecemeal at best, crime lab scandals will continue to proliferate, and little will be done to ensure that junk science does not lead to the innocent going to prison or the guilty going free.
“I have no problem with forensic science. I have a problem with the impression that’s being given that those disciplines ... can make an absolute identification of someone, and that’s not the case,” said Terrence Kiely, a law professor at DePaul University and author of Forensic Evidence: Science and the Criminal Law.
“It’s the white coat-and-résumé problem,” Kiely continued. “[Forensics experts] are very, very believable people. And sometimes the jurors will take [their testimony] as a ‘yes,’ where the science can only say it’s a ‘maybe.’”
Or, in some cases, when the science in fact says it’s a definite “no.”
Ed. Note: This 15,000-word article merely scratches the surface of questionable forensic techniques and crime lab scandals. Problems with junk science, unreliable evidence and misconduct among lab workers are apparently endemic in the forensics field. Numerous law and academic journal articles have been written on these topics, and this PLN cover story easily could have been more extensive.
For more information, over 100 forensic misconduct cases are archived on the following website: www.corpus-delicti.com/forensic_fraud.html. An extensive list of forensic and crime lab scandals is available at: www.truthinjustice.org/junk.htm. The 2009 NAS report is available at: www.nap.edu/catalog.php?record_id=12589. A list of cases involving forensic evidence that resulted in 116 wrongful convictions can be found here: www.innocenceproject.org/docs/DNA_Exonerations_Forensic_Science.pdf.
Sources: NAS press release, Arizona Daily Star, Associated Press, Baltimore Sun, Brownsville Herald, Chicago Tribune, Detroit Free Press, Detroit News, Durango Herald, Fort Worth Star-Telegram, Virginian-Pilot, Houston Chronicle, Houston Press, Los Angeles Times, Michigan Lawyers Weekly, New York Daily News, New York Law Journal, New York Times, San Francisco Chronicle, Seattle Times, USA Today, Washington Post, Enterprise Security, Forbes, Scientific American, www.ablee.us, www.gritsforbreakfast.blogspot.com, www.kentucky.com, www.miller-mccune.com, www.journal-star.com, www.omahasheriff.org, www.cass-news.com, www.kget.com, www.signonsandiego.com, www.mercurynews.com, www.bakersfield.com, www.thonline.com, http://reason.com, www.northcountrygazette.com, www.coloradoconnecton.com, http://press.senategop.state.il.us, www.newschannel5.com, www.theforensicexaminer.com, http://jimfisher.edinboro.edu, www.msnbc.msn.com, www.newschanne15.com, www.cqpolitics.com, www.southcoasttoday.com, www.post-gazette.com, http://news.stanford.edu, www.bbc.co.uk, www.v3.co.uk, Science Insider, www.floridatoday.com, http://blog.pennlive.com, www.cnn.com, www.slate.com, www.informationliberation.com, www.fortbendnow.com, http://standdown.typepad.com, www.crimelabreport.com, http://dailyme.com, Lexington Herald Leader, www.ketv.com, www.k9fleck.org, www.timesonline.co.uk, www.orlandoweekly.com, www.wired.com, http://lawandbiosciences.wordpress.com, www.mercatornet.com, www.law.stanford.edu, www.forensic-evidence.com, www.marymeetsdolly.com, http://darwin.bio.uci.edu, Las Vegas Review Journal, www.law.northwestern.edu, www.officer.com, Modesto Bee, www.newscientist.com, www.innocenceproject.org, Texas Monthly, News & Observer, http://abclocal.go.com, www.victoriaadvocate.com, www.truthinjustice.org