John,
A Black woman, 8 months pregnant, was falsely arrested due to racist facial recognition technology.
This technology discriminates against Black and brown people, and it’s making us less safe. We must urgently stop its use to protect our communities.
So I’m supporting the Facial Recognition and Biometric Technology Moratorium Act, which would:
- Place a prohibition on the use of facial recognition technology by federal entities
- Place a prohibition on the use of other biometric technologies such as voice recognition by federal entities
- Condition federal grant funding to state and local entities, including law enforcement, on those entities enacting their own moratoria on the use of facial recognition and biometric technology
- Prohibit the use of federal dollars for biometric surveillance systems
- Prohibit the use of information collected via biometric technology in violation of the Act in any judicial proceedings, and more
Please join the call to ban racist facial recognition technology by signing on in support of the Facial Recognition and Biometric Technology Moratorium Act today.
Thank you for taking action to make our communities safer. Together, we’re challenging racist policing and building a true justice system rooted in equity for all.
In solidarity,
Rashida
---------- Forwarded message ---------
From: Rashida Tlaib
Date: Mon, Aug 21, 2023
Subject: Join the urgent call to ban racist facial recognition technology
To: [email protected]
John,
Porcha Woodruff was 8 months pregnant and getting her daughters ready for school when police officers arrested her at her home and detained her for 11 hours, during which she was traumatized, dehydrated, and in pain.
This horrific story marks the third wrongful arrest in my hometown of Detroit based on facial recognition falsely identifying our residents, making them suspects in crimes they did not commit.
This technology is making us less safe, and its racist inaccuracies disproportionately harm Black and brown communities that are already at risk for over-criminalization.
Please sign today to support the Facial Recognition and Biometric Technology Moratorium Act to prevent the government from using racist facial recognition technology.
I’m co-leading the Facial Recognition and Biometric Technology Moratorium Act that would:
- Place a prohibition on the use of facial recognition technology by federal entities;
- Place a prohibition on the use of other biometric technologies such as voice recognition by federal entities;
- Condition federal grant funding to state and local entities, including law enforcement, on those entities enacting their own moratoria on the use of facial recognition and biometric technology;
- Prohibit the use of federal dollars for biometric surveillance systems;
- Prohibit the use of information collected via biometric technology in violation of the Act in any judicial proceedings, and more.
Research shows that nearly half of U.S. adults’ faces exist in facial recognition databases, and the faces of Black, brown, and Asian people are up to 100 times more likely to be misidentified than white male faces.
For years I’ve called for a ban on this racist technology, which is being used in neighborhoods across the country to invade privacy, surveil, and criminalize. It’s a flawed and unconstitutional system that endangers our lives and our freedoms.
I’ve met with local community advocates and lifted up their demands to end the Detroit Police Department’s use of the dangerous technology. Inspired by tenants’ rights activists, I also co-introduced a bill to prohibit the technology’s use in public housing that gets HUD funding.
This year, my colleagues and I re-introduced the Facial Recognition and Biometric Technology Moratorium Act. With more false arrests occurring around the country, we must urgently pass this bill and protect our communities.
Please join the urgent call to ban racist facial recognition technology by signing on in support of the Facial Recognition and Biometric Technology Moratorium Act today.
Thank you for taking action.
In solidarity,
Rashida