From: Innocence Project <[email protected]>
Subject: The threat of biased AI in policing
Date: February 28, 2024, 9:09 PM
  Links have been removed from this email. Learn more in the FAQ.
John,

As Black History Month comes to a close, we want to share some growing concerns about artificial intelligence (AI), whose use has the potential to deeply exacerbate racial disparities in policing.

Robert Williams' harrowing experience exemplifies this danger. Mistakenly identified by facial recognition technology (FRT), the Michigan resident and father of two spent 30 hours wrongfully detained for theft. This wasn't an isolated incident. There are at least seven confirmed cases of misidentification due to the use of facial recognition technology, six of them involving Black people who were wrongfully accused: Nijeer Parks, Porcha Woodruff, Michael Oliver, Randall Reid, and Alonzo Sawyer, along with Robert Williams.

Facial recognition technology has been shown to misidentify people of color, in part because its algorithms are less accurate at distinguishing the facial features of people with darker skin tones. This, coupled with potential officer biases, creates a dangerous cocktail that can lead to misidentifications and wrongful arrests. It also bears stark parallels to past misapplications of forensic techniques like bite mark analysis.

The Innocence Project is actively challenging the misuse of AI in policing. We advocate for proactive measures like pre-trial litigation, policy interventions, and community involvement to prevent unreliable and biased AI from doing further harm.

Last year, the Biden administration issued an executive order to set standards and manage the risks of AI, including a directive to develop “standards, tools, and tests to help ensure that AI systems are safe, secure, and trustworthy.” However, there are no federal policies currently in place to regulate the use of AI in policing.

In the meantime, Innocence Project policy advocate Amanda Wallwin notes that concerned community members can influence and encourage local leaders to regulate how local law enforcement and other agencies use these technologies in their communities.

This Black History Month, let's honor the fight for equality by actively opposing the use of biased AI in policing. Read our full editorial on FRT here.

You can lend your voice by contacting local representatives and supporting organizations like the Innocence Project.

Together, we can ensure that technology serves justice without letting the future multiply the mistakes of the past.

Sincerely,
The Innocence Project Team

SHOP: [link removed]
DONATE: [link removed]

The Innocence Project works to free the innocent, prevent wrongful convictions, and create fair, compassionate, and equitable systems of justice for everyone. Founded in 1992 by Barry C. Scheck and Peter J. Neufeld at the Benjamin N. Cardozo School of Law at Yeshiva University, the organization is now an independent nonprofit. Our work is guided by science and grounded in anti-racism.

Copyright © 2024 Innocence Project, All rights reserved.
212.364.5340
[email protected]

unsubscribe from all emails [link removed]
update subscription preferences [link removed]
privacy policy [link removed]
disclosures [link removed]