John,
Dangerous AI surveillance tools like facial recognition, automated license plate readers (ALPRs), gunshot detection technology, and predictive policing programs are becoming increasingly widespread within police agencies and the criminal legal system.[1,2,3,4]
Right now, the Department of Justice (DOJ) is accepting public comments on law enforcement's use of AI, with a specific focus on privacy, civil rights, and civil liberties concerns. We have to make it clear to the DOJ that police use of discriminatory, rights-infringing AI should be banned.
Add your name to Fight for the Future’s comment to the DOJ highlighting the dangers of biased AI tools in policing.
The U.S. police and law enforcement systems have a long history of weaponizing data against communities of color, and numerous studies show that the algorithms powering AI technologies replicate the biases in historical “crime data,” exacerbating racism and discrimination rather than solving the problem of discriminatory decision-making.[5,6,7] To make matters worse, most AI policing tools simply don’t work: they constantly misidentify people, misread license plates, and mistake loud noises for gunshots, which brings more police into communities that are already over-policed and makes everyone less safe. No amount of regulation, transparency, or oversight will fix the dangers inherent in the widespread use of biased AI tools in policing.
This comment period will help inform a new federal report with recommendations on the use of AI within law enforcement. Our top priority is to communicate that there is no safe way for law enforcement to use flawed and biased AI tools, and adding your name to our comment will maximize the impact of that message. Sign the comment here. This is a key opportunity to make our voices heard about abusive policing tech, so let’s not pass it up.
With gratitude,
Leila & the team at Fight for the Future
Footnotes:
1. Center on Privacy and Technology, Georgetown Law: https://www.law.georgetown.edu/privacy-technology-center/publications/a-forensic-without-the-science-face-recognition-in-u-s-criminal-investigations/
2. Fight for the Future: https://www.stopalprs.com/
3. New/Mode: https://act.newmode.net/action/mpower-change/tell-shotspotter-stop-selling-surveillance
4. MIT Technology Review: https://www.technologyreview.com/2021/02/05/1017560/predictive-policing-racist-algorithmic-bias-data-crime-predpol/
5. MIT Technology Review: https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/
6. The Brink: https://www.bu.edu/articles/2023/do-algorithms-reduce-bias-in-criminal-justice/
7. NAACP: https://naacp.org/resources/artificial-intelligence-predictive-policing-issue-brief#