From xxxxxx <[email protected]>
Subject Is It Forensics or Is It Junk Science?
Date February 7, 2023 1:05 AM
  Links have been removed from this email. Learn more in the FAQ.

IS IT FORENSICS OR IS IT JUNK SCIENCE?  
[[link removed]]


 

Sophia Kovatch, Pamela Colloff and Brett Murphy
January 31, 2023
ProPublica
[[link removed]]



_ Dubious forensic techniques have spread throughout the criminal
justice system for decades. Here’s what ProPublica has learned about
junk forensic science techniques and how they proliferate. _


 

It’s been decades since the intersection of forensic science and
criminal justice first became a pop culture phenomenon, popularized by
countless TV shows, movies and books. But the public’s growing
awareness of forensic techniques obscures a far more complex field
that’s chock full of bogus science — and the people who champion
it, often for profit.

For years, ProPublica has reported on these dubious techniques as
they’ve wormed their way into every corner of our real-life criminal
justice system.

So, what’s legitimate forensic science and what’s junk? Let’s
start with the basics.

What Is Junk Science?

Junk science refers to any theory or method presented as scientific
fact without sufficient research or evidence to support it. Some types
of junk science have virtually no supporting evidence, while others
are oversimplifications of real but complex scientific findings.

Adding to the risk they pose to the justice system, many forms of junk
science are highly subjective and depend heavily on individual
interpretation.

How to Spot Junk Science in Forensics

When ProPublica has reported on junk science, we’ve found common
warning signs. A technique may be junk science if:

* It has limited or no scientific evidence or research supporting it.
* It is presented as absolutely certain or conclusive, with no
mention of error rates.
* It relies on subjective criteria or interpretation.
* It oversimplifies a complex science.
* It takes just a few days to become an “expert.”

Examples of Junk Science in Forensics and Law Enforcement

Tracing the spread of junk science through the criminal justice system
can be difficult. But ProPublica has followed forensic junk science in
various forms for years.

911 Call Analysis

Police and prosecutors trained in 911 call analysis
[[link removed]]
are taught they can spot a murderer on the phone by analyzing speech
patterns, tone, pauses, word choice and even the grammar used during
emergency calls. These are known as “guilty indicators,” according
to the tenets of the program. A misplaced word, too long of a pause or
a phrase of politeness could reveal a killer.

Analysis of 911 calls appears in the criminal justice system in lots
of different ways. Some detectives say it’s a tool to help build a
case or prepare to interrogate a suspect. They have used it to help
extract confessions. Others present their analyses to prosecutors or
enlist Tracy Harpster, the program’s creator and a retired deputy
police chief from Ohio, to consult on cases.

During his career, Harpster had almost no homicide investigation
experience or scientific background. He developed the 911 call
analysis technique based on a small study for his master’s thesis in
2006. After he teamed up with the FBI to promote his findings
nationwide, demand from law enforcement grew strong enough to support
a full-fledged training curriculum.

Since the technique’s development, 911 call analysis has been used
in investigations across the country. ProPublica documented more than
100 cases in 26 states where Harpster’s methods played a role in
arrests, prosecutions and convictions — likely a fraction of the
actual figure. In addition, Harpster says he has personally consulted
in more than 1,500 homicide investigations nationwide.

Despite the seeming pervasiveness of the technique, researchers who
have studied 911 calls have not been able to corroborate Harpster’s
claims. A 2020 study
[[link removed]] from
the FBI warned against using 911 call analysis to bring actual cases.
A separate FBI study in 2022
[[link removed]]
said applying 911 analysis may actually increase bias. And academic
studies from researchers at Villanova and James Madison universities
came to similar conclusions.

Ultimately, five studies have not been able to find scientific
evidence that 911 call analysis works.

In a 2022 interview, Harpster defended his program and noted that he
has also helped defense attorneys argue for suspects’ innocence. He
maintained that critics don’t understand the research or how to
appropriately use it, a position he has repeated in correspondence
with law enforcement officials for years. “The research is designed
to find the truth wherever it goes,” Harpster said.

EXAMPLE: ProPublica chronicled how 911 call analysis was used in the
case of Jessica Logan
[[link removed]],
who was convicted of killing her baby after a detective trained by
Harpster analyzed her call and then testified about it during trial.

Bloodstain-Pattern Analysis

Bloodstain-pattern analysis is a forensic discipline whose
practitioners regard the drops, spatters and trails of blood at a
crime scene as clues that can sometimes be used to reconstruct and
even reverse-engineer the crime itself.

The reliability of bloodstain-pattern analysis has never been
definitively proven or quantified, but largely due to the testimony of
criminalist Herbert MacDonell
[[link removed]],
it was steadily admitted in court after court around the country in
the 1970s and ’80s. MacDonell spent his career teaching weeklong
“institutes” in bloodstain-pattern analysis at police departments
around the country, training hundreds of officers who, in turn,
trained hundreds more.

While there is no index that lists cases in which bloodstain-pattern
analysis played a role, state appellate court rulings show that the
technique has been a factor in felony cases across the country
[[link removed]].
Additionally, it has helped send innocent people to prison. From
Oregon to Texas to New York, convictions that hinged on the testimony
of a bloodstain-pattern analyst have been overturned and the
defendants acquitted or the charges dropped.

In 2009, a watershed report commissioned by the National Academy of
Sciences [[link removed]] cast
doubt on the discipline, finding that “the uncertainties associated
with bloodstain-pattern analysis are enormous,” and that experts’
opinions were generally “more subjective than scientific.” More
than a decade later, few peer-reviewed studies exist, and research
that might determine the accuracy of analysts’ findings is close to
nonexistent.

When MacDonell, who died in 2019, was asked whether he ever considered
changing his course structure or certification process after seeing
students give faulty testimony, he said he had not. “You can’t
control someone else’s thinking,” he said. “The only thing you
can do is go in and testify to the contrary.”

EXAMPLE: ProPublica has also reported on how bloodstain-pattern
analysis was used to convict Joe Bryan
[[link removed]]
of killing his wife, Mickey.

Other Junk Science Examples

ProPublica’s reporting on junk science in forensics goes beyond
bloodstain-pattern analysis and 911 call analysis. We’ve also
covered:

* The pervasiveness of Scientific Content Analysis, or SCAN
[[link removed]],
a means of dissecting written suspect statements while looking for
markers of deception.
* How the FBI used unproven photo analysis
[[link removed]]
in its investigations for years.
* Why police and prosecutors kept using roadside drug tests
[[link removed]]
with known high rates of false positives.

How Does Junk Science Spread in Forensics?

Junk science can spread in many different ways, but some common
patterns emerge in how it takes hold across forensics and law
enforcement.

Often, junk science originates when an individual devises a forensic
technique based on minimal or narrow experience and data. For example,
the original 911 call analysis training curriculum was based on a
study of just 100 emergency calls, most of which came from a single
state.

The creators of these techniques then put together curriculums and
workshops targeting law enforcement at every level around the country.
As more police officers take these courses, these techniques are
employed more often in investigating crimes and interrogating
suspects. When those officers testify in court, junk forensic
techniques make their way into the justice system.

Other times, prosecutors call the creators and trainees of these
forensic methods as expert witnesses, as was common with
bloodstain-pattern analysis
[[link removed]].

In the courtroom, it’s up to the judge to decide whether certain
evidence is admissible. While judges are experts in the law, they
aren’t necessarily experts in the scientific disciplines that make
up forensics. Once a type of junk science is admitted in a case, other
prosecutors and judges can use that as precedent to allow it in future
cases too. In this way, new junk science methods like 911 call
analysis can spread quickly through the justice system.

How Long Has Junk Science Been a Problem in Criminal Justice?

Forensic science has had a junk science problem for decades. In the
1980s and ’90s, the FBI and other law enforcement agencies used
faulty microscopic hair comparison
[[link removed]]
in hundreds of cases, only formally acknowledging the problematic
science in 2015. Since at least the 1990s, law enforcement has used a
written content analysis tool with no scientific backing
[[link removed]]
to interpret witness and suspect statements.

The 2009 report from the National Academy of Sciences
[[link removed]], which reviewed
the state of forensic science in the United States, found that a lot
of forensic evidence “was admitted into criminal trials without any
meaningful scientific validation, determination of error rates, or
reliability testing to explain the limits of the discipline.” A 2016
report from the President’s Council of Advisors on Science and
Technology
[[link removed]]
found that despite efforts to fund forensic science research, there
was still a major gap in understanding the scientific validity of many
forensic methods.

In 2017, the Trump administration allowed the charter for the National
Commission on Forensic Science [[link removed]]
to expire, further limiting progress on validating forensic science
methods. Since then, many forensic professionals have criticized
[[link removed]]
the junk science problems that remain rampant in forensics and criminal justice.

_Sophia Kovatch [[link removed]] is
an audience editor, SEO, at ProPublica._

_Pamela Colloff [[link removed]] is
a reporter at ProPublica and a staff writer at The New York Times
Magazine._

* _ [email protected]
[[link removed]]_
* _ @pamelacolloff [[link removed]]_

_Brett Murphy [[link removed]] is a
reporter at ProPublica._

* forensics
[[link removed]]
* Science
[[link removed]]
* junk science
[[link removed]]


 

 

 

