We are fighting to protect your digital rights
An equal society depends on everyone having equal access to work.
Sadly, the reality is far from that. People of color, women, disabled people, and other marginalized communities experience unemployment or underemployment at disproportionately high rates, especially amid the economic fallout of the COVID-19 pandemic.
This discrimination is only amplified by the increasing use of artificial intelligence (AI) technology in the professional sphere. Algorithms are taking over for supervisors across industries, determining everything from who gets an interview to what efficiency looks like. While these tools may seem beneficial on the surface, they can have a harmful impact on workers and their rights.
It is critical that policymakers and employers actively work to mitigate threats to safe and just employment. We're asking the Biden Administration ([link removed]), companies ([link removed]), and regulators around the world to do exactly that. At the Center for Democracy & Technology (CDT), we are dedicated to ensuring that workers' rights remain a priority, especially in the face of algorithmic hiring, management, and surveillance.
Algorithms are marketed by vendors as a way to increase efficiency — why have a human screen résumés when you can have a program do it for you? It may sound like a dream come true for some HR professionals, who can turn over the identification of skills, aptitudes, and company "fit" to a process that claims to do it better and faster than they can. However, these algorithms are often trained on data reflecting the company's current demographics, ultimately perpetuating the homogeneity of a workplace and increasing the risk of discrimination.
Some localities are working to address these potentially discriminatory practices, and CDT is here to help guide them. Earlier this year, CDT sent an open letter ([link removed]), cosigned by 20 local and national civil society organizations, to the New York City Council in response to a draft ordinance that would require automated decision-making tools used in hiring and employment to be audited. The first of its kind in the United States, the proposed ordinance's impact has the potential to be magnified nationally. While commending the Council for addressing potentially discriminatory impacts of automated employment decision tools, CDT urged the Council to be more explicit in what "audit" actually means, and to ensure the protection of people with disabilities.
California is also grappling with the issue of algorithms and employment. CDT's own Lydia X. Z. Brown, a policy counsel for the Privacy & Data Project, testified before the state's Fair Employment & Housing Council ([link removed]) on how algorithms can be used in the hiring process, and how they can have a negative impact on people with disabilities. From screening software and gamified assessments to automated video interviews and personality tests, a full third of U.S. businesses are using AI-powered hiring tools. Unfortunately, many of these tools are inaccessible to people with disabilities and tend to unfairly screen out disabled applicants for reasons unrelated to the job ([link removed]).
Even once hired, employees with and without disabilities can still be subjected to surveillance and algorithmic decision-making. Employers are increasingly deploying technologies, called bossware, that allow for both the continuous surveillance of workers' activities and the automation of the task of supervising them. CDT's latest report ([link removed]) details the risks: rather than rely on human managers who may be better equipped to deal with nuance and individual needs, these systems standardize the expected behavior across all employees, ignoring needed accommodations and even the need for breaks. Safety standards need to be institutionalized in the algorithms, and the current laws in place that safeguard workers' health need to be enforced.
READ | Report | Algorithm-driven Hiring Tools: Innovative Recruitment or Expedited Disability Discrimination? ([link removed])
READ | Report | Warning: Bossware May Be Hazardous to Your Health ([link removed])
READ | CDT Partners with Leadership Conference on Civil Rights Principles on the Use of AI in Hiring ([link removed])
READ | Opinion | We Need Laws to Take on Racism and Sexism in Hiring Tools ([link removed])
READ | NYC Draft Bill on AI in Hiring Needs Higher and Clearer Hurdles ([link removed])
READ | How Opaque Personality Tests Can Stop Disabled People from Getting Hired ([link removed])
From New York to California, and everywhere in between, CDT is fighting against algorithm-driven discrimination in hiring and employment. CDT is committed to supporting and advocating for workers' rights across the country. Partners like you have been indispensable in this work. If you have not yet engaged and want to learn more, please reply to this email to join the conversation. You can help CDT fight for civil rights and civil liberties in the digital age.
LEARN MORE ([link removed])
#CONNECT WITH CDT
SUPPORT OUR WORK ([link removed])
1401 K St NW Suite 200 | Washington, DC xxxxxx United States