Recently, CDT’s Equity in Civic Tech team issued two comprehensive analyses of how state governments across the country are implementing AI policies in the public sector. The first project analyzed trends in state public sector AI legislation, assessing dozens of bills introduced (and, in some cases, passed) in 2024. While legislatures are taking a variety of approaches to this issue, several themes emerged, most prominently the creation of task forces and the adoption of risk management practices. Based on these findings, CDT highlighted New York’s SB 7543 and Maryland’s SB 818 (both of which passed in 2024) as important examples for other states considering the path forward. Both bills contain provisions mandating impact assessments and strong reporting requirements to ensure people’s rights are at the center of the AI conversation.
Simultaneously, CDT reviewed executive orders from 13 states and the District of Columbia governing public sector use of AI. Here too, states prioritized steps to combat AI risks and proposed governance mechanisms, including AI task forces. While we found differences in how states defined their AI focus (some explicitly focused on generative AI), we noted some overall trends in what states’ executive orders prioritized, including the development of preemptive strategies for AI use, along with pilot projects that test AI tools before wider adoption. Overall, when proposing AI regulations, CDT recommended that states maintain consistent definitions of AI, implement risk management practices, and maintain public transparency to ensure that strong rights-protecting measures are in place.
But public sector use of AI is only one part of the conversation at the state level. When it comes to private sector regulation, CDT has likewise been intently focused on sharing its expertise with lawmakers across the country.
In Colorado, CDT collaborated with its partners in the development of SB 24-205, which passed into law in May 2024. An important first step towards comprehensive regulation, the legislation mandates consumer protections, reporting requirements, and risk assessments for companies using AI in key decision-making processes in Colorado. In partnership with civil society allies, CDT has advocated for measures that would strengthen the legislation to ensure that individual rights are protected and that consumers have adequate recourse for the misuse of AI systems. Matt Scherer, CDT’s Senior Policy Counsel for Workers’ Rights and Technology, was appointed to Colorado’s Artificial Intelligence Impact Task Force to provide recommendations on AI topics. CDT’s FAQ on the Colorado law provides a thorough overview of the key policy issues that the bill addresses, as well as the changes that need to be made to ensure the bill adequately protects consumers and workers.
In addition, CDT remains committed to advocating for key state legislation on a wide range of issues focused on digital rights and privacy. Following the Dobbs decision in 2022, when statewide reproductive rights legislation became a critical focus of our work, CDT released a report on the passage of shield laws and other state legislation, advocating for the preservation of privacy protections in reproductive healthcare and beyond. Along with key civil society partners, CDT sent a letter to Vermont Governor Phil Scott encouraging him to sign the bipartisan Vermont Data Privacy Act, to ensure consumer safeguards against the collection and sale of sensitive information online. CDT and its allies also advocated on behalf of worker welfare and privacy in support of New York’s BOT Act, which aims to prevent electronic surveillance and the use of automated employment decision tools (AEDTs) that result in biased decision-making.