John,
As we come to the end of Black History Month, we want to share some growing concerns about artificial intelligence (AI), whose use in policing has the potential to deeply exacerbate racial disparities.
Robert Williams' harrowing experience exemplifies this danger. Mistakenly identified by facial recognition technology (FRT), the Michigan resident and father of two spent 30 hours wrongly detained on a theft charge. This wasn't an isolated incident. There are at least seven confirmed cases of misidentification due to the use of FRT, six of which involve Black people who were wrongfully accused: Nijeer Parks, Porcha Woodruff, Michael Oliver, Randall Reid, and Alonzo Sawyer, along with Robert Williams.
Facial recognition technology has been shown to misidentify people of color in part because its algorithms are less accurate at distinguishing the facial features of people with darker skin tones. This, coupled with potential officer biases, creates a dangerous cocktail that can lead to misidentifications and wrongful arrests. It also has stark parallels to past misapplications of forensic techniques like bite mark analysis.
The Innocence Project is actively challenging the misuse of AI in policing. We advocate for proactive measures like pre-trial litigation, policy interventions, and community involvement to prevent unreliable and biased AI from doing further harm.
Last year, the Biden administration issued an executive order to set standards and manage the risks of AI, including a directive to develop “tools, and tests to help ensure that AI systems are safe, secure, and trustworthy.” However, no federal policies are currently in place to regulate the use of AI in policing.
In the meantime, Innocence Project policy advocate Amanda Wallwin reminds us that concerned community members can encourage local leaders to regulate how local law enforcement and other agencies in their communities use these technologies.
This Black History Month, let's honor the fight for equality by actively opposing the use of biased AI in policing. Read our full editorial on FRT here.
You can lend your voice by contacting local representatives and supporting organizations like the Innocence Project.
Together, we can ensure that technology serves justice instead of multiplying the mistakes of the past.
Sincerely,
The Innocence Project Team