From Counter Extremism Project <[email protected]>
Subject Tech & Terrorism: YouTube And Meta Platforms Announce New Effort To Fight Online Extremism
Date September 23, 2022 8:50 PM
  Links have been removed from this email. Learn more in the FAQ.
Tech & Terrorism: YouTube And Meta Platforms Announce New Effort To Fight
Online Extremism



(New York, N.Y.) — Last week, Google-owned YouTube, along with Meta, announced
<[link removed]>
that they would expand their efforts to fight online extremism. YouTube stated
that the company would “expand its policies on violent extremism to remove
content that glorifies violent acts, even if the creators of the videos are not
related to a terrorist organization,” while Meta stated that it would partner
with a third-party organization to study online extremism. The announcements
came as part of the Biden Administration’s summit
<[link removed]>
on racism and extremism, where Biden announced his intention to ask Congress
to hold social media companies accountable
<[link removed]> by
revoking their Section 230 liability immunity—which, in essence, is legal
protection from liability for content created by their users.



“YouTube and Meta’s announcements represent another example of the tech
industry’s reactive policymaking,” said Counter Extremism Project (CEP)
Executive Director David Ibsen. “Rather than announce new initiatives and issue
platitudes, companies should focus on and invest the necessary resources in
removing terrorist content. By focusing on removing the ‘worst of the worst’
content from internationally designated terrorist organizations and their
supporters from these sites and platforms, companies will establish clear
guidelines and enforce them in a transparent manner.”



For years, tech companies have simply reacted, often inadequately, to
extremist and terrorist content found on their sites. Rather than taking
proactive, preventative measures to protect their users, social media
platforms have implemented quick fixes that fail to achieve long-term
results.



During the summer of 2017, for example, YouTube launched several initiatives
relating to terrorist content on its platform, including its Redirect Method, a
program intended to direct individuals searching for ISIS-related content on
YouTube to counter-narrative videos. Between August 2 and August 3, 2018, CEP
reviewed <[link removed]> 649 YouTube videos for
extremist and counter-narrative content, based on searches for six terms
related to Islamic extremism. CEP found a decrease in the number of
counter-narrative videos, indicating that Google had not improved the
performance of its Redirect Method. CEP found only nine videos (1.4 percent
of the 649 videos checked) that may have included counter-narrative messaging,
meaning that a user searching for extremist material on YouTube was four times
more likely to encounter extremist content than counter-narrative content.



CEP has documented instances in which both Google
<[link removed]>
and Meta’s Facebook
<[link removed]>
have made express policy changes only after public accusations, scandals, or
pressure from lawmakers. While one would hope that both companies are
continuously working to improve security on their platforms, there is no excuse
for so many policy changes being reactive, and it raises the question of what
other scandals are in the making.



###





Unsubscribe
<[link removed]>

Message Analysis

  • Sender: Counter Extremism Project
  • Political Party: n/a
  • Country: n/a
  • State/Locality: n/a
  • Office: n/a
  • Email Providers:
    • Iterable