From Center for Democracy & Technology <[email protected]>
Subject Content Moderation Draws Focus as 2022 Winds Down
Date December 22, 2022 3:01 PM
  Links have been removed from this email. Learn more in the FAQ.
CDT’s U.S. Newsletter — December Edition







DECEMBER NEWSLETTER  


Content Moderation Draws Focus as 2022 Winds Down

As 2022 draws to a close, how social media companies moderate content is a hot topic. The Supreme Court will soon examine cases concerning Section 230, the bedrock online intermediary liability law, which could dramatically reshape the legal landscape for online speech. Meanwhile, the change in ownership at Twitter — and subsequent changes to the company’s content moderation processes and policies — has focused social media users’ attention on how content moderation shapes their online experiences.

At CDT, we’re focused on these issues. When Twitter disbanded its Trust and Safety Council, of which we were a member, we emphasized our belief that social media platforms ([link removed]) “should follow human rights-based substantive and procedural rules to ensure fair treatment and help mitigate abuse,” and that they should also “consult with outside experts and follow due process to protect users’ online speech and safety.” A statement from a group of former Council members ([link removed]) further argues that “substantive policies prohibiting hate speech, harassment, disinformation, and other forms of abuse are vital to setting expectations for healthy discourse and robust participation from a diverse array of users. Transparent moderation procedures that follow norms of due process are likewise crucial to ensuring that users are treated fairly.”

At our sixth annual Future of Speech Online event ([link removed]) earlier this month, “The Supreme Court’s Pivotal Term,” we explored the potential consequences of the pending Supreme Court cases that will address the scope of protections for speech online. Decisions in those cases could seriously restrict online services’ ability to moderate content, while also exposing them to significant legal risk over users’ speech — a legal tightrope that could prove impossible to navigate.

In Twitter v. Taamneh, the Supreme Court will determine when an online intermediary can be held liable under the Anti-Terrorism Act for aiding and abetting an act of international terrorism. In an amicus brief, CDT and six other civil society organizations argued ([link removed]) that holding intermediaries liable for aiding and abetting, based solely on general knowledge that terrorists use their services, would force platforms to over-remove content or otherwise sharply limit the content they allow users to post. For that reason, we urged the Court to hold that, unless speech intermediaries know that they are hosting a specific piece of user-generated content that substantially assists a terrorist act, they cannot be held liable for aiding and abetting.


Graphic for report, entitled "Civil Rights Standards for 21st Century Employment Selection Procedures." A blue and green workstation, including desk, chair, and laptop.

In Case You Missed It

As employers increasingly use new tools for hiring and employee management, workers are left with little insight into how they are assessed, or whether they could be subject to unfair or discriminatory decisions. To ensure that tools used to make employment decisions are fair and equitable, CDT partnered with leading civil rights organizations to publish new recommendations and guidance ([link removed]) that help policymakers, industry groups, and employers determine what information candidates should receive, how selection procedures should be audited, and how to ensure accountability when selection procedures threaten workers’ civil rights.

CDT, the Electronic Frontier Foundation, and Fight for the Future led over 90 organizations in a letter opposing the Kids Online Safety Act ([link removed]) (KOSA). Despite its good intentions, KOSA would undermine the privacy, online safety, and digital well-being of all people, but especially children, by effectively forcing providers to use invasive filtering and monitoring tools. It would also jeopardize private, secure communications, incentivize increased data collection on children and adults, and undermine the delivery of critical services to minors by public agencies like schools. We warned that the bill would harm LGBTQ+ youth especially ([link removed]), and could be weaponized by Attorneys General to censor online resources and information for queer and trans youth, people seeking reproductive healthcare, and more.

In comments to the Federal Trade Commission ([link removed]), CDT examined common modern online data practices — in which companies collect, share, and process huge amounts of people’s data — and their negative effects on marginalized communities and consumers. We called on the FTC to pass rules that address these harms, and made suggestions about what those rules should accomplish, particularly in the context of data practices by private contractors for educational institutions and other governmental entities.

CDT celebrated news that Apple has taken on board the advice of CDT and other advocates and experts ([link removed]) in moving away from implementing client-side scanning on its devices, which would have introduced surveillance and security risks ([link removed]) in addition to fundamental civil liberties concerns ([link removed]). Apple’s new features and service commitments that provide broader accessibility of strong end-to-end encryption are a victory for privacy and security, including safer online services for children. We are also encouraged that Apple is moving toward ongoing protections for children, parents, and other users that protect data stored in the cloud and provide on-device cues for safer communication.

Support CDT This Year End

As a nonprofit, CDT relies on the generosity of donors like you. Your support makes everything we do possible—each fight for privacy, free expression, and democracy is because of, and for, you. Support our work this holiday season by donating to CDT. ([link removed])

CDT in the Press

"When you have a really unclear set of rules that can change at a moment's notice at the whim of the owner of the company, you are starting to create a situation where people are going to self-censor," Emma Llansó, Director of CDT's Free Expression Project, told CNET ([link removed]).

Screenshot of CDT Research Fellow Gabriel Nicholas appearing on CNN. Chyron reads, "New tonight: Critics, Elon Musk fans debate future of social media network Twitter." Portrait-style image shows gray and white background, and man with short dark hair wearing a gray sweater over a blue collared shirt.

CDT Research Fellow Gabriel Nicholas joined CNN for a conversation ([link removed]) about how opaque content moderation practices, or so-called “shadowbanning,” work.

CDT President and CEO Alexandra Reeve Givens joined NPR's All Things Considered ([link removed]) to talk about the impact of content policy changes at Twitter on free speech around the globe.


Photo of Dhanaraj Thakur, CDT's Director of Research. Image has a light grey background and shows a portrait of a smiling man wearing a purple collared shirt.

CDT "in Person"

Earlier this month, CDT joined the Knight Foundation’s INFORMED 2022, which explores the intersection of technology, media, and democracy. Dhanaraj Thakur, Director of Research, shared insights from CDT’s research into mis- and disinformation targeting women political candidates of color. You can watch the recording ([link removed]) and find out more information on the event ([link removed]).

On December 13, CDT hosted a conversation on how schools’ monitoring software affects students’ access to information about private health issues, and how decision-makers can support students as they meet their reproductive healthcare needs. You can find out more about the Hidden Harms series and find a recording on the event's page ([link removed]).

On December 15, CDT and Georgetown’s Massive Data Institute hosted a conversation on how government data can be publicly shared in a responsible manner. For more information and access to the recording, visit the event's page ([link removed]).


Photo of Asha Allen, CDT's Advocacy Director for Europe, Online Expression & Civic Space. Image has a blurred background and shows a portrait of a smiling woman wearing a blue shirt and a silver necklace.

Staff Spotlight: Asha Allen ([link removed]), Advocacy Director for Europe, Online Expression & Civic Space

How long have you been working in digital rights? I've been working in equality rights for nearly 8 years and have specialised in digital rights since 2017. Much of my work has been focused on analysing the online environment from a gender equality and intersectional perspective, tackling issues such as online gender-based violence and ensuring protections for the free expression of the most marginalised communities in Europe. This is pretty expansive, but this policy field really is at the cutting edge of human rights debate.

What is your proudest moment while here at CDT? My proudest moment (so far) was seeing the growing CDT Europe team in full action during our highest-level external event to date, just this November. The culmination of months of advocacy and hard work to get the EU Institutions, the United Nations and government representatives around the table with civil society was incredible—the power of our small yet mighty team was definitely a highlight for me.

What is your fandom? Without a doubt Star Trek: The Next Generation. I grew up watching the show religiously with my father and would certainly defend Captain Jean-Luc Picard as the best Captain of the starship USS Enterprise.

Cats or dogs? Definitely cats. I'm a proud feline mum and have grown up with cats. I truly believe they are one of the best species there is.

#CONNECT WITH CDT

SUPPORT OUR WORK ([link removed])






1401 K St NW Suite 200 | Washington, DC xxxxxx United States

This email was sent to [email protected].
To ensure that you continue receiving our emails,
please add us to your address book or safe list.

manage your preferences ([link removed])
opt out ([link removed]) using TrueRemove(r).

Got this as a forward? Sign up ([link removed]) to receive our future emails.
email powered by Emma(R)