From Center for Democracy & Technology <[email protected]>
Subject CDT’s Spotlight on Discriminatory Data Use
Date April 29, 2022 2:01 PM
  Links have been removed from this email. Learn more in the FAQ.
We are fighting to protect your privacy




From the U.S. to the EU, data practices are under scrutiny. Baseline protections for privacy, and against data-driven discrimination, need to be put into place.

Everywhere we turn, our data is collected and used to analyze our behavior. Such extensive data collection has the potential to improve our lives, break down barriers of access, and increase our society’s capabilities. Unfortunately, data is often used, inadvertently or not, to discriminate against historically marginalized groups including people of color, immigrants, Indigenous communities, women, people with disabilities, and LGBTQIA+ communities.

Increasingly, data feeds powerful automated decision-making systems that have direct impacts on everything from what type of healthcare we receive to whether we are chosen for a job. These tools must be analyzed critically to determine who they are harming and how, and what can be done to eliminate those harms. At the Center for Democracy & Technology (CDT), we are working hard on these problems.



Use of consumer health technologies spiked during the COVID-19 pandemic, with people turning to various connected devices and apps to help with everything from symptom tracking and exposure notifications to maintaining exercise goals. While these technologies have the potential to help us all be healthier and to address inequities, including those surrounding the provision of healthcare, not all of this data receives the protections and regulation that apply to health information held by doctors. A recent CDT report,
Placing Equity at the Center of Health Care & Technology ([link removed]), identifies and suggests ways in which new privacy protections around unregulated consumer health information can benefit everyone, including underrepresented and overlooked communities harmed by current data practices.

Similar to healthcare, employment and housing have been testing grounds for new technologies. More employers are turning to algorithmic tools at every level of employment, from recruitment and hiring to monitoring and evaluating current employees. A
CDT report ([link removed])
highlights the various ways monitoring poses risks to workers, particularly those with disabilities, and offers recommendations for policymakers and enforcement agencies on how to address those risks.

More and more, landlords and property management companies are also using algorithmic tools to conduct
tenant background checks and screenings ([link removed])
during the rental application process. These and other types of algorithmic tools used in
credit decisions ([link removed])
can result in denied applications or adverse loan terms because they evaluate characteristics that disproportionately disadvantage marginalized communities.

Discrimination through data occurs even in schools. Disabled students are more likely to be flagged for suspicious behavior by
automated proctoring tools ([link removed]). Schools across the country are still using COVID-era
student activity monitoring tools ([link removed])
to assist teachers in everything from evaluation to discipline, often to the detriment of student privacy and safety — especially for students of color, disabled students, and LGBTQIA+ students.




READ ([link removed])

Comments on AI’s Impact on People with Disabilities to UN Special Rapporteur




READ ([link removed])

We Should Protect Children’s Privacy Through a Comprehensive Federal Privacy and Civil Rights Bill




READ ([link removed])

CDT Comments to OSTP Highlight How Biometrics Impact Disabled People




READ ([link removed])

CDT Joins Civil Society Orgs in Letter Calling on the FTC to Initiate Rulemaking to Protect Against Data Harms



Inadequate data protection means that consumers are more likely to be harmed when their data is used in ways they don’t anticipate, expect, understand, or even know about. The need for federal privacy legislation grows with each new technological advancement: legislation can better protect individuals’ data and shift the burden of data hygiene from consumers to the entities collecting and using their data.

CDT is advocating for comprehensive federal privacy legislation that puts people first. Partners like you have been indispensable in this work. If you are not yet engaged and want to learn more, please reply to this email to join the conversation. You can help put civil rights and civil liberties at the center of the digital age.


LEARN MORE ([link removed])






CONNECT WITH CDT


SUPPORT OUR WORK ([link removed])


[Manage]([link removed]) your preferences | [Opt Out]([link removed]) using TrueRemove™
Got this as a forward? [Sign up]([link removed]) to receive our future emails.
View this email [online]([link removed]).

1401 K St NW Suite 200 | Washington, DC xxxxxx US

This email was sent to [email protected].
To continue receiving our emails, add us to your address book.

