From Counter Extremism Project <[email protected]>
Subject Tech & Terrorism: New Study Confirms YouTube Algorithm Promotes Misinformation, Conspiracies, Extremism
Date July 8, 2021 3:02 PM
  Links have been removed from this email. Learn more in the FAQ.
Tech & Terrorism: New Study Confirms YouTube Algorithm Promotes
Misinformation, Conspiracies, Extremism

(New York, N.Y.) — YouTube’s recommended videos algorithm suggests extremist
content, misinformation, and conspiracy theories to its users, according to a
new report
<[link removed]>
by the Mozilla Foundation. The results were based on a crowdsourced research
project in which thousands of YouTube users reported dangerous content
through a web browser extension for Mozilla researchers to analyze. Of the
videos flagged by volunteers as harmful, 71 percent
<[link removed]>
had been recommended by YouTube’s algorithm.

The Mozilla Foundation’s report adds to an existing body of research
demonstrating how the algorithms of major tech platforms play a key role in
amplifying extremist and hateful content.

Counter Extremism Project (CEP) Senior Advisor Dr. Hany Farid, professor at
UC Berkeley, recently discussed in a webinar
<[link removed]>
the role of algorithmic amplification in promoting misinformation and
divisive content online and its devastating consequences. Speaking on
algorithmic amplification, Dr. Farid stated, “Algorithmic amplification is
the root cause of the unprecedented dissemination of hate speech,
misinformation, conspiracy theories, and harmful content online. Platforms
have learned that divisive content attracts the highest number of users and
as such, the real power lies with these recommendation algorithms.”

In March 2020, Dr. Farid and other UC Berkeley researchers authored a study,
A Longitudinal Analysis Of YouTube’s Promotion Of Conspiracy Videos
<[link removed]>, that analyzed
YouTube’s policies and efforts toward curbing its recommendation algorithm’s
tendency to spread divisive conspiracy theories. After reviewing eight
million recommendations over 15 months, the researchers determined that the
progress YouTube claimed (a 50 percent reduction in June 2019, and a 70
percent reduction in December 2019, in the amount of time users spent
watching recommended conspiracy videos) did not make the “problem of
radicalization on YouTube obsolete nor fictional.” The study ultimately
found that a more complete analysis of YouTube’s algorithmic recommendations
showed that conspiratorial recommendations are “now only 40 percent less
common than when the YouTube measures were first announced.”

CEP also conducted a study on August 2 and 3, 2018, titled OK Google, Show
Me Extremism <[link removed]>, in which 649
YouTube videos were reviewed for extremist and counter-narrative content.
Counter-narratives are part of Google’s Redirect Method, which seeks to
target individuals searching for ISIS-related content and direct them to
counter-narrative videos that try to undermine the messaging of extremist
groups. Across the 649 videos, CEP was four times more likely to encounter
extremist material than counter-narratives. The results of CEP’s searches
highlighted the extent of extremist content on YouTube and undermined
YouTube’s claims touting the efficacy of its efforts to promote
counter-narrative videos.

To watch a recording of the webinar, How Algorithmic Amplification Pushes
Users Toward Divisive Content, please click here
<[link removed]>.

To read Dr. Farid’s report, A Longitudinal Analysis Of YouTube’s Promotion Of
Conspiracy Videos, please click here
<[link removed]>.

To read CEP’s report, OK Google, Show Me Extremism: Analysis Of YouTube’s
Extremist Video Takedown Policy And Counter-Narrative Program, please click here
<[link removed]>.

###

Unsubscribe
<[link removed]>

Message Analysis

  • Sender: Counter Extremism Project
  • Political Party: n/a
  • Country: n/a
  • State/Locality: n/a
  • Office: n/a
  • Email Providers:
    • Iterable