CDT Releases New Reports Geared Towards Online Safety
Few topics raise such contentious debate — or passionate feelings — as keeping kids safe online. That’s why CDT has devoted so much energy to practical, actionable work that helps move safety forward while respecting the fundamental rights of children and adults alike.
[Graphic for CDT Research report, entitled “What Kids and Parents Want: Policy Insights for Social Media Safety Features.” Illustration of two hands reaching towards a phone.]
A new CDT qualitative research report, "What Kids and Parents Want: Policy Insights for Social Media Safety Features," is a perfect example. Lots of people talk about what families want; CDT’s research team actually found out. We evaluated how 45 teens and parents perceived four widely proposed approaches to online child safety: age verification, screen-time features, algorithmic feed controls, and parental access. Across all areas, participants highlighted the importance of flexibility, transparency, and respect for family dynamics and individual needs. These results can guide platforms and policymakers to develop tools, processes, and policies that actually address the needs of families.
Another set of risks playing out on online platforms — and facing users of all ages — is the non-consensual distribution of intimate images (NDII). It’s a multifaceted problem, but one part of the solution (which has successfully been applied to other content types) is deterrence messaging — notifications or other text aimed at influencing human behavior, in this case discouraging users from seeking out or sharing NDII in the first place.
A new CDT report examines how platforms might use deterrence messaging to reduce the prevalence of NDII on their services. This work grows out of discussions of CDT’s Working Group to Combat Image-Based Sexual Abuse, which brings together representatives from major tech platforms, survivor support groups, and privacy and digital rights experts to identify gaps in best practices and develop actionable solutions. By bringing together voices from all parts of the policy community, CDT is doing what it does best: driving forward policies that make a difference on the issues that matter most.
[Graphic for CDT’s report, entitled “From Symptoms to Systems: A Stakeholder-Informed Taxonomy of Generative AI Risks for Eating Disorders.” Illustration of an array of panels alternating between people and iconography relating to the taxonomy of generative AI risks for eating disorders; near the bottom there are four clinicians interacting and looking at the panels.]
— In a new CDT report, we find that some publicly available AI systems currently display risky behaviors that may contribute to eating disorder-related harms. Drawing on interviews with experts, we break down those behaviors into several categories and offer guidance for clinicians, caregivers, and AI developers on how to better identify and manage the risks generative AI may pose to vulnerable users. An accompanying blog post explores larger questions the report raises about how to address AI risks when systems can be deployed in many domains, affecting groups with different risk profiles.
— CDT and the National Network to End Domestic Violence (NNEDV) collaborated on a report that discusses how tech companies should handle users’ accounts of traumatic experiences — including sexual and domestic violence — associated with their tools and services. From dating apps to ridesharing services and other “sharing economy” apps, the report offers important lessons for trauma-informed response.
— At a time of growing investment and focus on the use of AI in education, a new CDT report calls on edtech companies to offer greater transparency about important elements of their AI products. In a new rubric, we outline eight core elements of AI-driven products that edtech companies should disclose, and that school administrators should demand before purchasing and using such products, helping educators make informed decisions as they adopt AI in schools.
— In the wake of reporting by 404 Media and other news outlets about U.S. airlines selling their passengers’ private travel information, we called upon Congress and the Department of Transportation to take strong action to protect travelers’ privacy. CDT celebrated when the Airlines Reporting Corporation — a data broker owned by the U.S.’s leading airlines — announced that, by the end of this year, it would stop selling passengers’ data to law enforcement without a warrant or subpoena.
— Searches of electronic devices at the border — conducted without a warrant under an outdated and legally convoluted rule — have hit a record high. In a blog post, we explain how the combination of emerging AI tools and an ever-growing stockpile of phones, tablets, and laptops collected through warrantless CBP border searches could have disastrous consequences for privacy and civil liberties.
[Graphic for an event hosted by CDT and the Benefits Tech Advocacy Hub titled: Human Oversight in AI for Public Benefits]

CDT "In Person"
— Join us on December 2 for a discussion of Human Oversight in AI for Public Benefits, in partnership with the Benefits Tech Advocacy Hub. CDT’s Hannah Quay-de la Vallee will join a panel of experts discussing best practices for meaningful human oversight of AI.
— CDT President & CEO Alexandra Reeve Givens participated in a keynote fireside chat at the Privacy & Security Forum hosted by George Washington University.
— On November 12, CDT’s Eric Null served as a questioner at the People’s Oversight Hearing. The event was hosted by Public Knowledge, and covered the consequences for American consumers of the firings, funding cuts, and politicized agendas upending the Federal Communications Commission, Federal Trade Commission, and Consumer Financial Protection Bureau under the current Administration.
— CDT’s Dhanaraj Thakur wrote for Tech Policy Press, arguing that livestreaming platforms must demonstrate the effectiveness of their safety measures.
— CDT’s Jake Laperruque was quoted by 404 Media about ICE use of facial recognition technology: “Handing this powerful tech to police is like asking a 16-year old who just failed their drivers exams to pick a dozen classmates to hand car keys to,” he said. “These careless and cavalier uses of facial recognition are going to lead to U.S. citizens and lawful residents being grabbed off the street and placed in ICE detention.”
— CDT’s Quinn Anex-Ries discussed a concerning Department of Labor plan with NextGov: “The Department of Labor is seeking to compel states to hand over sensitive information about their unemployment insurance programs against the wider backdrop of the Trump administration's pretty sprawling efforts to amass large quantities of information about everyday Americans … largely under the guise of preventing ‘waste and abuse,’ but as we've seen, repurposed to fuel surveillance and immigration enforcement.”
[Jake Laperruque, smiling and wearing a blue suit and white shirt. CDT logo in the background.]

Staff Spotlight
How long have you been working in digital rights?
13 years. I began my career in tech policy with a fellowship on surveillance at CDT right after law school, starting the same week as the Snowden disclosures. It was an exciting time to be working on privacy!
What is your proudest moment while here at CDT?
Testifying before the House Homeland Security Committee last year about the impact of AI on national security and how we should respond.
What is your fandom?
No way I can pick just one, but Star Wars, Game of Thrones, and Lord of the Rings are top three.
What is the best book you've read recently?
"The Devils" — Joe Abercrombie is one of my favorite authors currently writing and always crafts fun and thoughtful characters, an engaging story, and fascinating worlds.