From Center for Democracy & Technology <[email protected]>
Subject Today, CDT Celebrates Global Encryption Day
Date October 21, 2022 6:57 PM
  Links have been removed from this email. Learn more in the FAQ.
CDT’s U.S. Newsletter — October Edition



OCTOBER NEWSLETTER  


CDT Welcomes the White House's AI Bill of Rights

Earlier this month, CDT welcomed the White House's release ([link removed]) of a Blueprint for an AI Bill of Rights ([link removed]), along with an accompanying fact sheet ([link removed]) listing past and future actions by federal agencies to protect those rights across sectors including healthcare, housing, and employment.

The announcement is the result of a year-long process by the Office of Science and Technology Policy (OSTP), in which CDT engaged throughout, to “make sure new and emerging data-driven technologies abide by the enduring values of American democracy.” With this goal in mind, the AI Bill of Rights establishes that automated systems should — among other things — be demonstrably safe and effective; not cause “unjustified” discrimination; have built-in data privacy protections; and provide timely and accessible notice and explanation of how they operate.

As CDT President and CEO Alexandra Reeve Givens said, “The AI Bill of Rights marks an important step in recognizing the ways in which algorithmic systems can deepen inequality. In particular, we commend the White House for considering the diverse ways in which discrimination can occur, for challenging inappropriate and irrelevant data uses, and for lifting up examples of practical steps that companies and agencies can take to reduce harm.”

Notably, the announcement built on recommendations made by CDT and other civil society advocates, including communities directly impacted by algorithmic harms, and cited CDT’s research on automated test proctoring ([link removed]), bossware ([link removed]), and other surveillance technologies ([link removed]). Further, the Blueprint recognized that to fully address data-driven harms, federal privacy protections — which CDT has long called for ([link removed]) — are an essential supplement to agency actions.

In Case You Missed It

This year, Global Encryption Day ([link removed]) (GED) is today, October 21. In the lead-up, CDT joined our partners in the Global Encryption Coalition ([link removed]) in calling on governments to legally protect end-to-end encryption rather than undermine the communications technology that protects the privacy and security of everyone around the world. Global Encryption Day comes on the heels of a September report, 'The right to privacy in the digital age' ([link removed]), by the United Nations Office of the High Commissioner for Human Rights, which unequivocally defended the importance of end-to-end encryption to the protection of democracy and human rights worldwide. Follow all of CDT’s GED 2022 action via Twitter at @CenDemTech ([link removed]), and join the conversation online using the hashtags #GED2022 ([link removed]) and #GlobalEncryptionDay ([link removed]).


Illustration depicting a pixelated user cursor reaching through the browser screen to affect the platform algorithm. CDT Research report entitled "This is Transparency to Me: User Insights into Recommendation Algorithm Reporting."

In a new CDT research report ([link removed]), we use design research methods to explore how platforms can help users meaningfully understand how recommendation algorithms, which by and large determine what people see on social media, actually work. The report builds on prior research on explainability, transparency, and accountability of algorithms, and examines what a recommendation algorithm transparency report may include and how it could present information to users.

With the 2022 U.S. midterm elections approaching, CDT joined a coalition of organizations ([link removed]) in urging social media platforms to take measures to curb the spread of election disinformation. The signatories highlighted the importance of addressing false narratives around past elections, and urged the platforms to invest more resources in preventing the spread of non-English disinformation — an issue that CDT has noted is driving a wedge between voters ([link removed]) in non-English-speaking U.S. communities.


Photo of Will Adler, CDT's Senior Technologist for Elections and Democracy. Image has a gray background and shows a portrait of a smiling man wearing glasses and a blue collared shirt.

CDT in the Press

CDT Senior Technologist Will Adler talked with the Washington Post ([link removed]) about new research ([link removed]) CDT conducted with researchers from Georgetown University’s Foo Law Lab, which found that only one in four official election websites in the U.S. uses the .gov domain. That low rate undermines voter trust and creates an opportunity for bad actors to create fake election websites and spread disinformation.

Samir Jain, CDT Director of Policy, discussed the Supreme Court's recent decision to rule on challenges to Section 230 ([link removed]) — a bedrock law concerning the liability of online intermediaries — with Axios: “If the Court were to substantially narrow Section 230 in a way that made online services potentially liable for third-party content, then that might result in significantly less ability for people to speak freely online,” he said.

Alexandra Reeve Givens, CDT President and CEO, spoke with the Washington Post about use of data by political campaigns ([link removed]): “What we want is strong privacy protections across the board, no matter who it is that’s ultimately accessing that information… We want there to be a free flow of information around different campaigns and movements. But the infringement on people’s privacy to identify a target-rich environment is deeply problematic and doesn’t match what users want to see.”


Graphic for a joint event hosted by CDT and the National Center for Learning Disabilities, entitled "Hidden Harms: Mental Health and Students with Disabilities." November 3, 2022 from 4-5 PM ET. Text in blue and black.

CDT "in Person"

In the coming months, CDT and several partners will host a series of webinars on how policymakers can help mitigate the hidden harms of student activity monitoring. A recording of our event ([link removed]) on how this surveillance affects LGBTQ+ students ([link removed]) is already available; future events will spotlight how the technology affects students with disabilities ([link removed]) (Nov. 3), how it leads to increased discipline and contact with law enforcement, and how the conversation has changed following the Supreme Court’s Dobbs decision.

On Friday, November 4, join Jake Laperruque, Deputy Director of CDT’s Security and Surveillance Project, for a session at the Privacy + Security Forum ([link removed]). The discussion will explore changes companies can make to protect users from state law enforcement demands for personal data concerning reproductive health choices.

On behalf of all of us at CDT, thank you to everyone who made this year’s Tech Prom ([link removed]) a success! Alexandra Reeve Givens, CDT’s President & CEO, shined a spotlight on the current and future challenges and opportunities shaping technology policy, and CDT's commitment to solutions that advance equity, civil liberties, and democratic values. You can find a copy of her remarks on our website ([link removed]).

Partner Spotlight

CDT is proud to have partnered with a broad range of civil society organizations ([link removed]) throughout the consultation process for the recently released White House AI Bill of Rights, providing guidance on the use and governance of algorithmic systems, including context on positive use cases, potential harms, and oversight possibilities for these technologies. In particular, we are honored to work closely with the American Civil Liberties Union, Algorithmic Justice League, Color of Change, Data & Society, Lawyers’ Committee for Civil Rights Under Law, Leadership Conference on Civil and Human Rights, NAACP Legal Defense Fund, People’s Tech Project, and Upturn.


Photo of Michael Yang, CDT's Fellow for Equity in Civic Technology. Image has a green foliage background, and shows a portrait of a smiling man with dark hair wearing glasses and a blue collared shirt.

Staff Spotlight: Michael Yang ([link removed]), Fellow, Equity in Civic Technology

How long have you been working in digital rights? I've only recently gotten my start in digital rights. I'm fortunate to have started last summer as a CDT intern, and now I'm doubly fortunate to be back as a fellow. I hope to keep working for many years at the intersection of emerging technology, policy, and civil rights.

What is your proudest moment while here at CDT? I love the little moments where I can help somebody else, whether they're a co-worker or an external CDT partner. One recent moment that comes to mind is listening to and offering feedback on a lawyer's explanation of quantum computing. It's just fun to mind-meld with somebody else and work together toward a common goal.

What is the best book you’ve read recently? The Dispossessed, by Ursula K. Le Guin. I read it for CDT's internal book club actually, and I've had withdrawal ever since finishing it. It's just such a wonderful blend of sci-fi, speculative fiction, character development, and social commentary. I was really touched by the protagonist's struggle to find belonging in two different worlds, a plot point which I think must have been inspired by Le Guin's parents' work together on documenting the life of an Indigenous person who was the last of his tribe. It's deeply sad and profound.

Cats or dogs? Why not both?

CONNECT WITH CDT

SUPPORT OUR WORK ([link removed])





