Content Moderation Draws Focus as 2022 Winds Down
As 2022 draws to a close, how social media companies moderate content is a hot topic. The Supreme Court will soon examine cases concerning Section 230, the bedrock law governing online intermediary liability, which could dramatically reform the legal landscape for online speech. Meanwhile, the change in ownership at Twitter — and subsequent changes to the company’s content moderation processes and policies — have focused social media users’ attention on how content moderation shapes their online experiences.
At CDT, we’re focused on these issues. When Twitter disbanded its Trust and Safety Council, of which we were a member, we emphasized our belief that social media platforms “should follow human rights-based substantive and procedural rules to ensure fair treatment and help mitigate abuse,” and that they should also “consult with outside experts and follow due process to protect users’ online speech and safety.” A statement from a group of former Council members further argues that “substantive policies prohibiting hate speech, harassment, disinformation, and other forms of abuse are vital to setting expectations for healthy discourse and robust participation from a diverse array of users. Transparent moderation procedures that follow norms of due process are likewise crucial to ensuring that users are treated fairly.”
At our sixth annual Future of Speech Online event earlier this month, “The Supreme Court’s Pivotal Term,” we explored the potential consequences of the pending Supreme Court cases that will address the scope of protections for speech online. Decisions in those cases could seriously restrict online services’ ability to moderate content, while also exposing them to significant legal risk over users’ speech — a legal tightrope that could prove impossible to navigate.
In Twitter v. Taamneh, the Supreme Court will determine when an online intermediary can be held liable under the Anti-Terrorism Act for aiding and abetting an act of international terrorism. In an amicus brief, CDT and six other civil society organizations argued that holding intermediaries liable for aiding and abetting, based solely on general knowledge that terrorists use their services, would force platforms to over-remove content or otherwise sharply limit the content they allow users to post. For that reason, we urged the Court to hold that, unless speech intermediaries know that they are hosting a specific piece of user-generated content that substantially assists a terrorist act, they cannot be held liable for aiding and abetting.
Graphic for report, entitled "Civil Rights Standards for 21st Century Employment Selection Procedures." A blue and green workstation, including desk, chair, and laptop.
In Case You Missed It
As employers increasingly use new tools for hiring and employee management, workers are left with little insight into how they are assessed, or whether they could be subject to unfair or discriminatory decisions. To ensure that tools used to make employment decisions are fair and equitable, CDT partnered with leading civil rights organizations to publish new recommendations and guidance that help policymakers, industry groups, and employers determine what information candidates should receive, how selection procedures should be audited, and how to ensure accountability when selection procedures threaten workers’ civil rights.
CDT, the Electronic Frontier Foundation, and Fight for the Future led over 90 organizations in a letter opposing the Kids Online Safety Act (KOSA). Despite its good intentions, KOSA would undermine the privacy, online safety, and digital well-being of all people, but especially children, by effectively forcing providers to use invasive filtering and monitoring tools. It would also jeopardize private, secure communications, incentivize increased data collection on children and adults, and undermine the delivery of critical services to minors by public agencies like schools. We warned that the bill would harm LGBTQ+ youth especially, and could be weaponized by Attorneys General to censor online resources and information for queer and trans youth, people seeking reproductive healthcare, and more.
In comments to the Federal Trade Commission, CDT examined common modern online data practices, where companies collect, share, and process huge amounts of people’s data, and their negative effects on marginalized communities and consumers. We called on the FTC to pass rules that address these harms, and made suggestions around what those rules should accomplish, particularly in the context of data practices by private contractors for educational institutions and other governmental entities.
Support CDT This Year End
CDT in the Press
- "When you have a really unclear set of rules that can change at a moment's notice at the whim of the owner of the company, you are starting to create a situation where people are going to self-censor," Emma Llansó, Director of CDT's Free Expression Project, told CNET.
Screenshot of CDT Research Fellow Gabriel Nicholas appearing on CNN. Chyron reads, "New tonight: Critics, Elon Musk fans debate future of social media network Twitter." Portrait-style image shows gray and white background, and man with short dark hair wearing a gray sweater over a blue collared shirt.
Photo of Dhanaraj Thakur, CDT's Director of Research. Image has a light grey background and shows a portrait of a smiling man wearing a purple collared shirt.
CDT "in Person"
- Earlier this month, CDT joined the Knight Foundation’s INFORMED 2022, which explored the intersection of technology, media, and democracy. Dhanaraj Thakur, Director of Research, shared insights from CDT’s research into mis- and disinformation targeting women political candidates of color. You can watch the recording and find more information on the event’s page.
- On December 13, CDT hosted a conversation on how schools’ monitoring software affects students’ access to information about private health issues, and how decision-makers can support students as they meet their reproductive healthcare needs. You can find out more about the Hidden Harms series and find a recording on the event’s page.
- On December 15, CDT and Georgetown’s Massive Data Institute hosted a conversation on how government data can be publicly shared in a responsible manner. For more information and access to the recording, visit the event’s page.
Photo of Asha Allen, CDT's Advocacy Director for Europe, Online Expression & Civic Space. Image has a blurred background and shows a portrait of a smiling woman wearing a blue shirt and a silver necklace.
Staff Spotlight: Asha Allen, Advocacy Director for Europe, Online Expression & Civic Space
How long have you been working in digital rights? I've been working in equality rights for nearly 8 years and have specialised in digital rights since 2017. Much of my work has focused on analysing the online environment from a gender equality and intersectional perspective, tackling issues such as online gender-based violence and ensuring protections for the free expression of the most marginalised communities in Europe. This is pretty expansive, but this policy field really is at the cutting edge of human rights debate.
What is your proudest moment while here at CDT? My proudest moment (so far) was seeing the growing CDT Europe team in full action during our highest-level external event to date, just this November. The culmination of months of advocacy and hard work to get the EU Institutions, the United Nations and government representatives around the table with civil society was incredible—the power of our small yet mighty team was definitely a highlight for me.
What is your fandom? Without a doubt Star Trek: The Next Generation. I grew up watching the show religiously with my father and would certainly defend Captain Jean-Luc Picard as the best Captain of the starship USS Enterprise.
Cats or dogs? Definitely cats. I'm a proud feline mum and have grown up with cats. I truly believe they are one of the best species there is.