From Center for Democracy & Technology <[email protected]>
Subject How to Widen Researcher Access to Data and Protect Privacy
Date January 31, 2023 10:45 PM
  Links have been removed from this email. Learn more in the FAQ.
CDT’s U.S. Newsletter — January Edition







JANUARY NEWSLETTER  


How to Widen Researcher Access to Data and Protect Privacy

In the U.S. and EU, regulators’ attention is increasingly focused on transparency by social media companies, especially on how researchers can access the data those companies hold. While laws requiring social media companies to make data available to independent researchers would have many benefits, they could inadvertently become tools for unjustified and increased law enforcement surveillance of social media users.


Graphic for a CDT report, entitled "Defending Data: Privacy Protection, Independent Researchers, and Access to Social Media Data in the US and EU." Yellow and orange hands attempt to grasp for data behind a grid / barrier.

In a new report from CDT ([link removed]), we make several recommendations for how policymakers can mitigate the risk that law enforcement might take advantage of legal mechanisms widening researchers’ access to data. These include preventing law enforcement from qualifying as vetted researchers, granting researchers access only at or through a social media company, and requiring researchers to destroy data after a certain time period.

We also examine existing protections for stored social media data in the U.S. and EU, and how they might intersect with new laws on researcher access.

In some cases, law enforcement demands for social media data are lawful and justified, but in others, social media data has been used for illegitimate purposes such as monitoring protestors, dissidents, and members of religious or racial minorities. In light of that risk, policymakers should carefully consider how they widen researchers’ access to data and the downstream effects doing so may have.

In Case You Missed It

CDT and six technologists with expertise in online recommendation systems filed an amicus brief with the U.S. Supreme Court in the pivotal online free expression case Gonzalez v. Google ([link removed]). The brief urges the Court to hold that the liability shield of Section 230, the bedrock intermediary liability law, applies to claims against interactive computer service providers based on their recommendation of third-party content, because those claims treat providers as publishers. CDT also filed an amicus brief in another major online expression case before the Court, Twitter v. Taamneh ([link removed]).

In an amicus brief in Facebook v. New Jersey ([link removed]), CDT — joined by EPIC and EFF — pushed back on an attempt by the State of New Jersey to evade stringent warrant requirements for obtaining wiretapping orders. We argued that, since New Jersey is conducting surveillance that looks and operates like a wiretap, and is seeking disclosures like those of a wiretap, it is conducting wiretaps. We asked that the Supreme Court of New Jersey guard the heightened protections attendant to wiretaps and reject the State’s attempt to evade those protections by compelling repeated disclosures of Facebook users' content with a single warrant.

A white outline of the United States, with a digital gradient. Text: "U.S. Privacy Legislation."

In light of January’s International Data Privacy Day, which saw President Biden pen an op-ed for the Wall Street Journal ([link removed]) that named privacy as his top tech policy priority, CDT called upon the new Congress to continue the push for national privacy protections ([link removed]) and prioritize the American Data Privacy and Protection Act ([link removed]) (ADPPA).

In a post, CDT’s Matt Scherer takes a closer look at a memo from the General Counsel of the National Labor Relations Board (NLRB) ([link removed]) that represents the first significant federal regulatory guidance confronting the harms that intrusive electronic surveillance and management systems inflict on workers. Workers’ rights advocates have expressed increasing concern about these systems in recent years, and CDT has called for regulatory action to help protect workers from the harmful effects of intrusive and nearly continuous worker surveillance systems.

CDT in the Press

CDT’s Gabriel Nicholas spoke to the Washington Post ([link removed]) about opaque content moderation practices by social media companies, popularly termed “shadowbanning”: “It prevents users from knowing what the norms of the platform are — and either act within them, or if they don’t like them, leave,” he said.

CDT’s Elizabeth Laird discussed software that monitors students’ social media activity with Reuters ([link removed]): “The stated purpose is to keep students safe, and here we have set up a system that is routinizing law enforcement access to this information and finding reasons for them to go into students’ homes.”

Speaking with Foreign Policy about suppression of content by internet companies at the direction of the Indian government ([link removed]), CDT’s Aliya Bhatia said, “[Indian Prime Minister] Modi has always seen the media as an area of control. Tech companies are an extension of this area of control of this government. The real issue here is the impunity and opacity with which Modi is using emergency powers to control what users can say online.”

Screenshot of CDT's Matt Scherer testifying virtually at the EEOC. Image shows woman with red glasses against a blue background at top left, a woman in a pink jacket with flags in the background at top right, and at bottom center, a man in a suit and tie with a picture of a city skyline in the background.

CDT "in Person"

CDT’s Alexandra Reeve Givens participated in a fireside chat with Laurie Locascio, Director of the National Institute of Standards and Technology (NIST), at the launch of NIST’s AI Risk Management Framework ([link removed]).

CDT’s Matt Scherer testified at the U.S. Equal Employment Opportunity Commission’s (EEOC) January 31, 2023 meeting ([link removed]), entitled “Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier.” Matt’s testimony notes the unique risks of discrimination that automated and other technology-driven selection procedures pose.

On the International Day of Women in Multilateralism, January 25, UNESCO convened a global dialogue to advance effective responses to online gendered disinformation. CDT’s Dr. Dhanaraj Thakur, Research Director, spoke on a panel highlighting the main issues and challenges in developing a gender-responsive model regulatory framework for digital platforms. A recording of the event is available on the event page ([link removed]).

On Tuesday, February 7, CDT’s Eric Null will join ITIF for a discussion ([link removed]) about the progress Congress has made in crafting bipartisan privacy legislation, the ADPPA’s current legislative status, and the remaining areas of debate regarding the legislation.

On Thursday, February 9, Georgetown University will host a symposium bringing together people who craft and interpret data privacy regulations and the lead engineers who design and build data systems that meet those regulations. CDT’s Mallory Knodel, Chief Technology Officer, will be a featured speaker. Learn more and register on the event page ([link removed]).


Photo of Aliya Bhatia, Policy Analyst, CDT's Free Expression Project. Image shows a portrait of a smiling woman with dark hair wearing a green sweater, against a white background.

Staff Spotlight: Aliya Bhatia ([link removed]), Policy Analyst, Free Expression Project

How long have you been working in digital rights? Even before I worked in the digital rights space, I was advancing similar principles: most recently, I worked with local government, libraries, and the Census Bureau to ensure New Yorkers had access to digitally secure spaces to complete the 2020 decennial census, the first that could be completed online. After that, I worked with Ranking Digital Rights to increase public accountability of corporate data sharing and governance practices, and then came here to CDT!

What is your proudest moment while here at CDT? I have so many! But my top two: the first is contributing to the Facebook Oversight Board case regarding UK law enforcement requests to take down a drill music video from Instagram. The FX team drafted a comment on the case highlighting the importance of protecting free expression ([link removed]), particularly of Black drill artists, who have a history of dealing with law enforcement, and of creating transparency and accountability mechanisms to ensure individuals know when law enforcement is circumventing official legal avenues and exerting pressure to take down lawful speech.

My second — if I may! — is developing a new workstream around content analysis and moderation of speech in languages other than English. I'm working on a project with Gabe Nicholas, our research fellow, on examining the types of machine learning systems that are used to analyze non-English speech and the challenges these systems face due to the Anglo-centrism of the machine learning field and internet. We are releasing our technical explainer of these systems ([link removed]) this spring!

What is the best book you've read recently? The Immortal King Rao! Vauhini Vara's multi-generational science fiction novel is a must-read for tech policy folks. In it, she paints a picture of an all-too-real near future in which governments cede control to private technology giants, and these companies shape our lives, our relationships with each other, and even our thoughts!

Cats or dogs? Both!

#CONNECT WITH CDT

SUPPORT OUR WORK ([link removed])






1401 K St NW Suite 200 | Washington, DC xxxxxx United States

This email was sent to [email protected].
To ensure that you continue receiving our emails,
please add us to your address book or safe list.

manage your preferences ([link removed])
opt out ([link removed]) using TrueRemove(r).

Got this as a forward? Sign up ([link removed]) to receive our future emails.
email powered by Emma(R)