Tech & Terrorism: Supreme Court Rulings Should Prompt Congress To Pass Section
230 Reforms
(New York, N.Y.) — The rulings issued last week
<[link removed]>
by the U.S. Supreme Court in Gonzalez v. Google and Twitter Inc. v. Taamneh
allowed an overly broad interpretation of the liability shield in Section 230
of the Communications Decency Act to stand, affording tech companies a reprieve
from legal liability for their inability to curb the spread of terrorist
content on their platforms. The Counter Extremism Project (CEP) and CEP senior
advisor Dr. Hany Farid <[link removed]> had filed an amicus curiae brief in
Gonzalez arguing that Google's recommendation algorithms are not
"content-neutral" as the company maintains and that it "pushed ISIS videos
onto the devices of terrorists" to monetize content.
During oral arguments and in the rulings themselves, however, the Court
repeatedly emphasized that the responsibility to address questions over
liability for platforms hosting extremist content lay with Congress, where
urgency has been building to reform the 27-year-old law.
“These were disappointing but not unexpected rulings after the oral
arguments,” Dr. Farid said. “The good news is that the Court did not uphold
tech’s argument that it has blanket immunity under Section 230 and that
momentum is building for legislative reform. Congress understands that it
cannot punt this issue to the Court in hopes of a satisfactory resolution.”
In 2021, Dr. Farid and CEP supported
<[link removed]>
the introduction of the Protecting Americans from Dangerous Algorithms Act
(PADAA). The bill would narrowly amend Section 230, lifting the liability
shield when an online platform knowingly or recklessly deploys recommendation
algorithms to promote content relevant to, among other things, cases involving
international terrorism.
As Dr. Farid told
<[link removed]>
the House Energy and Commerce Subcommittee on Communications and Technology in
2019, “modest changes to Section 230 would go a long way to forcing the
technology sector to invest in more effective technological and human
moderation. Despite years of public outcry and bad press, profits at Google and
Facebook are up. Change will only happen when these companies are held
financially responsible for their failure to create safe products that don’t
lead to the disruption of our democratic elections, don’t lead to horrific
violence against our citizens, don’t lead to allowing child predators to freely
exploit children, and don’t lead to the daily abuse and marginalization of
women and under-represented groups. Like every other industry, the technology
sector should be held responsible when their products are unsafe and lead to
real and measurable harm.”
To read the amicus curiae brief filed by CEP and Dr. Farid, please click here
<[link removed]>.
###