An equal society depends on everyone having equal access to work.
Sadly, the reality is far from that. People of color, women, disabled people, and other marginalized communities experience unemployment or underemployment at disproportionately high rates, especially amid the economic fallout of the COVID-19 pandemic.
This discrimination is only amplified by the increasing use of artificial intelligence (AI) technology in the professional sphere. Algorithms are taking over for supervisors across industries, determining everything from who gets an interview to what counts as efficient work. While these technologies may seem beneficial, they can have a harmful impact on workers and their rights.
It is critical that policymakers and employers actively work to mitigate threats to safe and just employment. We're asking the Biden Administration, companies, and regulators around the world to do exactly that. At the Center for Democracy & Technology (CDT), we are dedicated to ensuring that workers' rights remain a priority, especially in the face of algorithmic hiring, management, and surveillance.
Algorithms are marketed by vendors as a way to increase efficiency — why have a human screen résumés when you can have a program do it for you? It may sound like a dream come true for some HR professionals, who can turn over the identification of skills, aptitudes, and company "fit" to a process that claims to do it better and faster than they can. However, these algorithms are often trained on data that reflects a company's current demographics, ultimately perpetuating workplace homogeneity and increasing the risk of discrimination.
Some localities are working to address these potentially discriminatory practices, and CDT is here to help guide them. Earlier this year, CDT sent an open letter, cosigned by 20 local and national civil society organizations, to the New York City Council in response to a draft ordinance that would require automated decision-making tools used in hiring and employment to be audited. The first of its kind in the United States, the proposed ordinance could have an impact that reaches well beyond New York. While commending the Council for addressing the potentially discriminatory impacts of automated employment decision tools, CDT urged the Council to be more explicit about what an "audit" actually means, and to ensure the protection of people with disabilities.
Even once hired, employees with and without disabilities can still be subjected to surveillance and algorithmic decision-making. Employers are increasingly deploying technologies, called bossware, that allow for both the continuous surveillance of workers' activities and the automation of the task of supervising them. CDT's latest report details the risks: rather than rely on human managers who may be better equipped to deal with nuance and individual needs, these systems standardize expected behavior across all employees, ignoring needed accommodations and even the need for breaks. Safety standards need to be built into these algorithms, and the existing laws that safeguard workers' health need to be enforced.
Report | Algorithm-driven Hiring Tools: Innovative Recruitment or Expedited Disability Discrimination?
Report | Warning: Bossware May Be Hazardous to Your Health
CDT Partners with Leadership Conference on Civil Rights Principles on the Use of AI in Hiring
Opinion | We Need Laws to Take on Racism and Sexism in Hiring Tools
NYC Draft Bill on AI in Hiring Needs Higher and Clearer Hurdles
How Opaque Personality Tests Can Stop Disabled People from Getting Hired
From New York to California, and everywhere in between, CDT is fighting against algorithm-driven discrimination in hiring and employment, and is committed to supporting and advocating for workers' rights across the country. Partners like you have been indispensable in this work. If you have not yet engaged and want to learn more, please reply to this email to join the conversation. You can help CDT fight for civil rights and civil liberties in the digital age.