For the past 25 years, the law governing content moderation and the liability of online intermediaries in the United States has been clear. Between the First Amendment and Section 230, the heavily debated intermediary liability law, content hosts have been free to set and enforce their policies as they see fit while being shielded from liability for most user-generated content. Seismic changes in this legal landscape, however, could soon upend the status quo.
At the sixth annual Future of Speech Online event, “The Supreme Court’s Pivotal Term,” the Center for Democracy & Technology (CDT) and Stand Together Trust will explore the consequences of these potential changes. The Court is poised to consider multiple cases that address the scope of protections for speech online, including its first-ever examination of Section 230. Decisions in those cases could force online services to walk a tightrope between a reduced ability to moderate content and increased legal risk over users’ speech, potentially threatening their very existence and jeopardizing people’s ability to find places for their speech online.
On December 6-8, join us to deliberate over strategies for protecting online speech and addressing abuse, and to consider questions including: Is content moderation “censorship,” or a necessary part of responsible online content hosting? When should intermediaries be liable for speech posted by their users? Who should decide who has access to major online platforms for speech?
We’ll hear from legal experts, online service providers, advocates fighting online hate and disinformation, and more — including representatives of parties involved in the Supreme Court cases at hand, as well as European lawmakers. Several members of the Meta Oversight Board will also join us to introduce the Board's opinion on Meta's "cross-check" program and to discuss the Board's accountability function.