From Counter Extremism Project <[email protected]>
Subject Tech & Terrorism: Top EU Official Details Proposal To Address Content Moderation, Liability Issues With Tech Companies
Date July 10, 2020 6:30 PM
  Links have been removed from this email. Learn more in the FAQ.
Tech & Terrorism: Top EU Official Details Proposal To Address Content
Moderation, Liability Issues With Tech Companies

(New York, N.Y.) – European Union (EU) digital-policy and antitrust czar
Margrethe Vestager took sharp aim at U.S. tech companies last week, detailing
a comprehensive plan to regulate the industry, including proposals to curb
anticompetitive behavior. The new measures aim to compel tech firms to pay
more taxes and take more responsibility for illegal content on their
platforms.

The renewed focus on holding tech companies accountable for material on their
sites builds upon the EU’s recently proposed Terrorist Content Regulation,
which generated significant opposition from tech companies. As the Counter
Extremism Project (CEP) has previously noted, the measure would allow EU
member states to impose fines on tech firms of up to four percent of their
revenue for failure to consistently remove extremist content from their
platforms. It would also require the takedown of such content within one hour
of receiving notice from public authorities, a reasonable standard given that
the longer terrorist propaganda and recruiting materials remain online, the
more their viewership grows and the greater the likelihood that they will be
copied and uploaded elsewhere.

“The EU’s proposed regulations and continued focus on extremist content online
are a welcome tough line on big tech that will contribute to the establishment
of responsible content moderation policies worldwide,” said CEP Executive
Director David Ibsen. “The EU’s efforts, together with the U.S. Justice
Department’s proposal to roll back tech companies’ broad legal protections
under Section 230 of the Communications Decency Act (CDA), illustrate a
growing international consensus to reform outdated and ineffective policies
concerning the tech industry. These developments are clearly a result of
increasing frustration with tech’s ongoing failure to keep extremist and
terrorist content off their platforms.”

The proposals are the latest in a movement by policymakers on both sides of
the Atlantic to rein in the power and influence of big tech companies by
rolling back legal protections that have long shielded them from liability. In
June, the U.S. Department of Justice outlined recommendations to modify
Section 230 of the CDA to remove blanket immunity for harmful content posted
by third parties, including terrorist content that appears on online
platforms. As CEP has previously addressed, reform of Section 230 would be a
necessary step toward updating the legal framework of an aged and ineffective
policy that shields big tech from liability for content that proliferates on
its sites.

 ###

Unsubscribe

Message Analysis

  • Sender: Counter Extremism Project
  • Political Party: n/a
  • Country: n/a
  • State/Locality: n/a
  • Office: n/a
  • Email Providers:
    • Iterable