U.S. NEWSLETTER
([link removed])
CDT Marks the ADA's 35th Anniversary
July 26 was the 35th anniversary of the passage of the Americans with Disabilities Act (ADA), and CDT’s Disability in Technology Policy Project ([link removed]) — which analyzes the impact of technology and tech policy on people with disabilities — marked the occasion with a flurry of new work.
In the San Francisco Chronicle ([link removed]), CDT’s Ariana Aboulafia argued that emerging technologies like AI and algorithmic systems risk discriminating against people with disabilities, posing significant threats to progress made under the ADA. As Ariana wrote, time-tested strategies like raising awareness, fighting for inclusive policies, and prioritizing community can help minimize the risks of AI-enabled technologies and build a world where disabled people are valued and respected.
Graphic for CDT's Plain Language Resource Hub.
Ariana also wrote for the American Bar Association ([link removed]) about how worker surveillance technology specifically impacts people with disabilities, and joined Tech Policy Press for a podcast ([link removed]) to discuss tech’s impacts on disabled people outside the context of accessible technology. The episode also featured Professor Blake Reid; Maitreya Shah, Tech Policy Director at the American Association of People with Disabilities (AAPD); and Cynthia Bennett, Senior Research Scientist at Google.
CDT also released new work on how assistive technology should be developed with privacy in mind. In partnership with AAPD, our report argued that making assistive technologies more privacy-protective ([link removed]) benefits both consumers and developers, and explained how developers of assistive technologies can incorporate privacy into their designs.
Finally, we launched a Plain Language Resource Hub ([link removed]) with plain language versions of CDT reports that explore how technology impacts disabled people in employment, voting, public benefits, and more. Plain language reports make our work accessible to everyone who is interested, and we’ll add new resources regularly as we continue to publish reports on how technology affects people with disabilities.
In Case You Missed It
([link removed])
CDT AI Governance Lab report, entitled “Tuning Into Safety: How Visibility into Safety Training Can Help External Actors Mitigate AI Misbehavior.” Illustration of a collection of “blocks,” representing foundational AI models, in varying color gradients and textures. In back: a stack of model spec papers, and an orange “digital” safety cone.
— CDT’s AI Governance Lab released two reports in July. The first explores what it would take to professionalize ([link removed]) the new field of AI assurance, which comprises practices to ensure that AI systems are safe, fair, and effective. The second report outlines how and why developers can provide transparency ([link removed]) into a critical dimension of AI systems — the safety training they undergo to prevent them from misbehaving.
— We joined a Supreme Court amicus brief in NetChoice v. Fitch ([link removed]), led by the Foundation for Individual Rights and Expression (FIRE), supporting NetChoice’s request to keep a Mississippi statute from going into effect pending litigation. We argue that the law — which requires age verification for anyone to access social media sites — violates the First Amendment rights of both minors and adults, burdening or entirely blocking their access to constitutionally protected expression on services critically important to the exercise of free expression rights.
([link removed])
Graphic for CDT report, entitled “Rapid Response: Building Victim-Centered Reporting Processes for Non-Consensual Intimate Imagery.” Illustration of four schematic panels; side-by-side images with definitions on top of them, a cursor pointing to multiple buttons with a flag icon, a group of images with an arrow pointing to a trash can, and an icon of documents, videos, charts, a question mark, a user and an eye with an X over it.
— CDT released a model policy and infographic ([link removed]) for K-12 schools on how to approach the growing problem of non-consensual intimate imagery in the education context. We also released a report examining current mechanisms for reporting non-consensual disclosure of intimate imagery ([link removed]) across eight popular content platforms; it assesses how these services structure their policies, design their reporting tools, and support victims through the process of reporting and removal.
— With the Leadership Conference’s Center for Civil Rights and Technology ([link removed]) and Protect Democracy ([link removed]), we analyzed the potential harms and lasting impacts of the federal government’s expanded efforts to access sensitive information ([link removed]) that states collect to administer public benefits programs.
— We published the final two reports in our series on how content moderation systems operate across multiple regions in the Global South. One focuses on the Quechua context ([link removed]), and the other synthesizes our insights across ([link removed]) regions and presents our recommendations for improving content moderation in low-resource languages of the Global South.
— CDT’s 2024 Annual Report ([link removed]) is live: for more on how we made a difference last year, check it out on our website.
CDT In the Press
— CDT VP of Policy Samir Jain was quoted by the New York Times ([link removed]) on the White House’s AI Action Plan released last month: “The government should not be acting as a ministry of A.I. truth or insisting that A.I. models hew to its preferred interpretation of reality,” he said.
— CDT’s Elizabeth Laird was quoted by Wired ([link removed]) on the sharing of Medicaid data with U.S. Immigration and Customs Enforcement: “By turning over some of our most sensitive health care data to ICE, Health and Human Services has fundamentally betrayed the trust of almost 80 million people,” she said. “Over 90 percent of entitlement fraud is committed by US citizens, underscoring the false pretense of sharing this information with ICE. The results of this decision will be devastating. It will sink trust in government even lower, force individuals to choose between life-saving care and turning over data to immigration authorities, and erode the quality and effectiveness of government services.”
— CDT’s Kate Ruane was quoted by Fast Company ([link removed]), discussing implementation of the UK’s Online Safety Act, which requires age verification: “It’s no wonder VPN downloads soared in the U.K. over the weekend. Privacy and free expression are human rights, and governments should protect them by passing laws to enhance people’s privacy and free expression rights, not endanger them.”
— CDT’s George Slover was quoted by Vox ([link removed]) on Delta’s plans to use AI to determine pricing: “This is a different animal than what the airlines have been doing in the past, and it is more personalized and more intrusive,” he said. “It is a more sophisticated and algorithmically driven and selective price gouging. You are focusing on one particular individual based on their vulnerability and susceptibility.”
— CDT’s Miranda Bogen was quoted by the Washington Post ([link removed]) on a Gemini feature allowing users to turn photos into video: “We need more than vague assurances that developers have stress-tested their systems,” Bogen said. “Unfortunately, there are too few safeguards today to ensure companies are not cutting corners on safety testing as they race to launch cutting-edge tools like this.”
([link removed])
Event graphic for CDT's 2025 Annual Benefit, Tech Prom. Abstract gradients of dark blue, and dark blue and gold text.
CDT "In-Person"
— CDT is pleased to announce our annual Tech Prom ([link removed]) on Thursday, October 23, 2025, at The LINE DC. Join us for a fun night of networking and conversation — you won’t want to miss it! Sponsorships and individual tickets are available now.
— Join CDT on August 7 for a discussion ([link removed]) of the results of our research project addressing how content moderation systems operate in low-resource languages of the Majority World, particularly Maghrebi Arabic, Kiswahili, Tamil, and Quechua. The webinar will feature a series of conversations between CDT researchers and civil society representatives from the Global South. Live interpretation will be available in English, Swahili, Arabic, and Spanish.
— On July 15, CDT’s Jake Laperruque testified before the Massachusetts Joint Committee on the Judiciary ([link removed]) in support of a pair of bills that would provide strong safeguards on law enforcement use of facial recognition.
— On July 16, CDT and the American Enterprise Institute hosted a panel ([link removed]) featuring CDT President and CEO Alexandra Reeve Givens, discussing the federal government’s effort to create a national database of the personal information it collects and analyzing the privacy, security, and civil liberties implications of data consolidation.
Partner Spotlight
CDT is proud to partner with the Ada Lovelace Institute on our publication examining considerations in AI assurance. Through their research, the Ada Lovelace Institute works to ensure that AI and data work for people and society. You can learn more about the Institute’s research on their website ([link removed]).
([link removed])
Image of Travis Hall wearing a light blue shirt.
Staff Spotlight
Travis Hall ([link removed]), Director for State Engagement
How long have you been working in digital rights?
An interesting question! In many ways I have been working in digital rights since 2006, when I started my PhD and did my dissertation on identification technologies, and then at the National Telecommunications and Information Administration (NTIA) I used to say that my job was fighting for human rights by arguing that it is good for commerce. This, however, is my first job in a formal advocacy role, and it is great to be able to champion digital rights because they are rights.
What is your proudest moment while here at CDT?
It has been great to dive right into the fight on the proposed moratorium on state-level AI regulations, and it was awesome to notch that win even if it is ultimately a battle deferred. Being able to quickly spin up as a part of the team has been extraordinarily fun and gratifying. That said, I'm also pretty proud of the post-Spring Fling karaoke session, even though most of my go-to songs are apparently no longer included in the booth playlists.
What is the best book you've read recently?
This one is hard, because I haven't been reading good books recently. I've been thoroughly enjoying pulp! Murderbot, Lies of Locke Lamora, Priory of the Orange Tree, and I am currently listening to the audiobook version of Killers of a Certain Age. None of those are really good — certainly not "best" — but are all fun. I guess the best book that I am reading is The Hobbit, which I am reading before bedtime to my son Eugene. I'm looking forward to him being mature enough for Lord of the Rings, which will be like introducing him to an old friend (My daughter Kimmie, on the other hand, enjoys cookbooks before bedtime).
What is the most recent cultural activity you've been to?
Unless you count the live Lumberjack show in Alaska, we recently went to a live showing of the Washington Metropolitan Video Game Symphony Orchestra. We were only able to make it through half of the show, and my kids were disappointed that they did not sing Diggy Diggy Hole ([link removed]), but it was otherwise quite fun!
#CONNECT WITH CDT
SUPPORT OUR WORK ([link removed])
([link removed])
([link removed])
([link removed])
([link removed])
1401 K St NW Suite 200 | Washington, DC xxxxxx United States