From Center for Democracy & Technology <[email protected]>
Subject CDT AI Governance Lab Tackles Measuring AI Bias
Date May 23, 2024 4:40 PM
  Links have been removed from this email. Learn more in the FAQ.
To view this email online, paste this link into your browser:
[link removed]





MAY NEWSLETTER  


CDT AI Governance Lab Tackles Measuring AI Bias

This month, CDT’s AI Governance Lab ([link removed]) released its first major report, “Navigating Demographic Measurement for Fairness and Equity ([link removed]),” which responds to growing government expectations that practitioners proactively identify and address bias or discrimination in AI systems. The 120-page report aims to help companies, government agencies, and researchers meet this goal, particularly in cases where the demographic data needed to do so is not available. It offers insight into the implications of different measurement approaches, helping to ensure that AI systems serve all communities fairly and responsibly.


CDT report, entitled "Navigating Demographic Measurement for Fairness and Equity." Illustration of a compass and streams of data.

CDT’s AI Governance Lab also recently released an actionable guide for implementing sociotechnical approaches to AI ([link removed]), which describes how to safely govern and develop AI systems so that they live up to their promise without causing undue harm, as well as recommendations for building more effective ([link removed]) and responsible large language models for non-English languages.

At this week’s AI Safety Summit ([link removed]) in Seoul, companies made new commitments to address AI risks, and governments from around the world promised deeper collaboration on research and safety mitigations. CDT’s AI Governance Lab is engaging closely with this work via the U.S. and UK’s AI Safety Institutes, and continuing to work on model documentation and transparency, open-source AI, and responsible approaches to model releases.

In Case You Missed It

— Earlier this month, we were pleased to welcome ([link removed]) Lisa Rice and Dominique Shelton Leipzig to our Board of Directors. Rice is the President and CEO of the National Fair Housing Alliance. Shelton Leipzig is a partner at Mayer Brown, where she leads the firm’s Ad Tech Privacy & Data Management team.

CDT's Jake Laperruque testifies at a House Homeland Security hearing on May 22, 2024.

— CDT’s Jake Laperruque testified before a House Committee on Homeland Security hearing ([link removed]) on AI use in national security. Jake described key principles that should govern the Department of Homeland Security’s use of AI, examined facial recognition as a case study into why responsible use of AI is so critical, and provided a set of steps Congress can take to increase oversight and accountability for the use of AI in the national security space.

— CDT welcomed the passage of Colorado’s Senate Bill 205 ([link removed]), a first-of-its-kind state law to increase transparency and assessment of automated decision systems. While the law falls short of the full protections CDT and other public interest advocacy groups called for, it establishes basic safeguards for the use of AI in high-stakes decisions affecting consumers and workers.

— CDT’s Europe team examined the ways in which the newly signed EU AI Act will interact with the Digital Services Act ([link removed]) and other EU laws to impact online expression. The post is the latest in a series examining various aspects of the EU AI Act; earlier installments focus on privacy and surveillance issues ([link removed]) and provide an overview of the Act’s structure and key human rights considerations ([link removed]).

CDT CEO Alexandra Reeve Givens is pictured at left, signing an official memo. Givens is seated at a table with a blue tablecloth reading “United States Access Board.”

— CDT launched a partnership ([link removed]) with the U.S. Access Board and the American Association of People with Disabilities (AAPD) to support the Access Board’s work, under the Biden AI Executive Order, on how AI impacts people with disabilities.

— CDT joined a Ninth Circuit amicus brief ([link removed]) with several other civil society organizations in Alario v. Knudsen, explaining how Montana’s S.B. 419 — which bans TikTok in the state — would violate the free expression and First Amendment rights of TikTok users.

CDT in the Press

— CDT’s Aliya Bhatia and Michal Luria authored an op-ed for CNN ([link removed]) on why restricting and monitoring online content won’t protect kids, and what will.

— Speaking to the Associated Press ([link removed]) about the bipartisan Senate AI roadmap released last week, CDT’s Alexandra Givens said: “It’s time for Congress to act… It’s not enough to focus on investment and innovation. We need guardrails to ensure the responsible development of AI.”

— Washingtonian Magazine named CDT CEO Alexandra Reeve Givens one of its 500 Most Influential People of 2024 ([link removed]).


Image of the U.S. Capitol Building with a computer-shaped head.

CDT "in Person"

— A June 24 symposium organized by CDT and The Future of Free Speech, “Artificial Intelligence & The First Amendment: Protecting Free Speech in the AI Era ([link removed]),” will bring together leading voices from civil society, U.S. institutions, and the private sector for conversations that put proposed AI regulations in the context of the First Amendment and other free speech protections. Participants will discuss how freedom of expression principles, both in the U.S. and abroad, should apply to generative AI, and will explore ways to create a resilient free-speech culture. To RSVP or see more information, visit The Future of Free Speech’s website ([link removed]).

— On June 26, CDT’s Ariana Aboulafia will appear on an American Bar Association panel, “AI in Housing and Benefits: Automating Discrimination, Enhancing Surveillance, and Scaling Bias ([link removed]).” Experts will discuss state and federal social services agencies’ increasing reliance on automated fraud detection systems, eligibility screening and assessment tools, and benefits determination systems that disproportionately harm disabled and elderly people. To register, visit the ABA’s website ([link removed]).

Partner Spotlight

Data & Society ([link removed]) studies the social implications of data-centric technologies, automation, and AI. CDT routinely collaborates with them in urging AI practitioners, researchers, policymakers, and governance professionals to center their work in equity and human rights. Just last week, Data & Society released a brief arguing for social science and the humanities to play an important role ([link removed]) in AI governance, and the CDT AI Governance Lab issued coordinated guidance for AI practitioners ([link removed]) on how to realistically integrate these areas of expertise. In March, we were also pleased to jointly lead civil society groups in asking ([link removed]) the U.S. Secretary of Commerce to articulate the expectation that the National Institute of Standards and Technology (NIST) maintain a broad view of “AI safety” that accounts for the entire range of algorithmic harms. 

Kevin Bankston, wearing black glasses, a jacket and shirt in front of a white background.

Staff Spotlight
Kevin Bankston ([link removed]), Senior Advisor on AI Governance

How long have you been working in digital rights? My first day on the job in the world of digital rights — as a First Amendment fellow at the National ACLU office in New York City, litigating internet free speech issues — was on September 10th, 2001. My primary interests then were online speech and copyright, which I now teach in the context of AI regulation at Georgetown Law. But the events of my second day on the job changed the course of my career, and I ended up focusing for the next decade on internet privacy and surveillance in the wake of 9/11 and the USA PATRIOT Act. It was while doing that work at my next organization, the Electronic Frontier Foundation, that I first collaborated with colleagues at CDT, which eventually led to my working here.

What is your proudest moment while here at CDT? This is my second time working at CDT; I was previously the Free Expression Project Director about ten years ago, then rejoined last year as Senior Advisor on AI Governance. In both instances, my proudest moments have revolved around building big-tent coalitions to achieve big policy impacts, a CDT specialty.

A decade ago I helped build a broad coalition of civil society and industry stakeholders to demand more transparency around the government's surveillance orders to internet and phone companies, leading Congress to pass new transparency requirements. And this year, I helped build another coalition of civil society and academic experts to highlight the benefits of an open AI ecosystem and push back against the idea of new restrictions on the open publication of large AI models, which could threaten free speech, competition, and security. We've yet to see whether we'll win that argument too, but we're crossing our fingers.

What is your fandom? If I have to pick only one, it would definitely be science fiction, in print and on screen. Many, many folks working in tech policy do what they do in no small part because of the science fiction they read and watched as younger people. Good sci-fi is like a gymnasium for your mind, forcing you to build the skill of thinking through all the ways a new technology might change a society for good or ill, which is a key part of what tech policy professionals do. 

Sci-fi narratives also often directly impact tech development and tech policy discourse. For a recent example of how sci-fi stories can influence how we think about new technologies, look no further than the competing sci-fi-tinged visions of AI utopian tech optimists versus those predicting a "Terminator"-style AI dystopia; each of these visions is animating different communities in the policy conversation over how to regulate AI. I find this feedback loop between fictional tomorrows and today's real-world tech policy issues so personally fascinating that, in my spare time, I study that cycle of influence as a fellow at Arizona State University's Center for Science and the Imagination.

What is the most recent cultural activity you’ve been to? Mother's Day Jazz Brunch at The Hamilton, a storied old music venue and restaurant next to the White House. It's one of the best southern brunch buffets you'll find in DC — biscuits and gravy, fried chicken and waffles, that sort of thing — paired with excellent live jazz. The music, the food, and the friendly and appreciative audience of celebrating families were all comforting reminders of my beloved hometown of New Orleans.

#CONNECT WITH CDT

SUPPORT OUR WORK ([link removed])






1401 K St NW Suite 200 | Washington, DC xxxxxx United States

This email was sent to [email protected].
To ensure that you continue receiving our emails,
please add us to your address book or safe list.

manage your preferences ([link removed])
opt out ([link removed]) using TrueRemove®.

Got this as a forward? Sign up ([link removed]) to receive our future emails.
email powered by Emma®