From Counter Extremism Project <[email protected]>
Subject Tech & Terrorism: Online Gaming Platforms Facing Scrutiny Over Extremist Content
Date April 20, 2023 6:40 PM
  Links have been removed from this email. Learn more in the FAQ.
Tech & Terrorism: Online Gaming Platforms Facing Scrutiny Over Extremist
Content



(New York, N.Y.) — Concerns are growing in the U.S. and Europe that extremists
are propagandizing and spreading hateful ideologies through online gaming
platforms. This has prompted inquiries from members of the U.S. House of
Representatives and U.S. Senate, a U.S. Department of Homeland Security-funded
study, as well as an acknowledgment by Germany’s Federal Ministry of the
Interior that at least one online gaming platform “allows right wing extremist
content to be disseminated.”



Video games are attractive to extremists as a way to spread propaganda,
especially to a young audience. In 2021, the Counter Extremism Project (CEP)
located a Telegram channel that shared downloadable white supremacist video
games. Soon thereafter, CEP researchers located recreations of the Christchurch
terrorist attack video in the online video game Roblox, a map of one of the
mosques targeted in that attack for the game Minecraft, and multiple examples
of group pages on the gaming platform and social network Steam that promoted
white supremacism, fascism, or the extreme right.



In February 2022, CEP researchers located a recreation of the Christchurch
attack in Minecraft that had been uploaded to Meta-owned Instagram, as well as
several posts glorifying the terrorist attack, the attack video, the
attacker’s manifesto, and files for recreating the attack video in the video
game Garry’s Mod (also known as GMod) that had been shared on an online file
archive site. Three months later, CEP researchers found a game on Steam that
glorified Kyle Rittenhouse and contained anti-transgender sentiment.



Roblox reportedly had 58.8 million daily active users in the fourth quarter of
2022, and Minecraft had 141 million active users worldwide as of August 2021.
Over the last year, 21,000-29,000 users played Garry’s Mod each month.



These as well as other gaming platforms and gaming social media sites maintain
policies against extremist content. Steam’s Content Rules prohibit posting
“threats of violence or harassment, even as a joke.” Roblox prohibits content
that “incites, condones, supports, glorifies, or promotes any terrorist or
extremist organization or individual (foreign or domestic), and their
ideology, or actions.” Minecraft maintains a “zero-tolerance policy towards
hate speech, terrorist or violent extremist content, bullying, harassing,
sexual solicitation, fraud, or threatening others.”



Nevertheless, the ongoing presence of clearly extremist and terrorist content
on gaming platforms and social media sites demonstrates that companies must
continue to diligently enforce their terms of service to ensure a safe online
environment.



“Gaming platforms must adopt proactive content moderation rules and enforce
them in a manner that protects users and stymies efforts by extremists to
misuse and spread propaganda through their platforms,” said CEP Executive
Director David Ibsen. “These companies can avoid potential liability and
reputational damage by consistently enforcing their own terms of service,
proactively locating extremist content that violates those agreements, and
quickly removing offending content from their platforms. Big tech companies
have spent tens of millions of dollars to create a mirage that proactive
content moderation is an impossible task. It’s not. It can and should be done
to protect the safety of groups and individuals targeted by extremists across
the globe.”



###






Message Analysis

  • Sender: Counter Extremism Project
  • Political Party: n/a
  • Country: n/a
  • State/Locality: n/a
  • Office: n/a
  • Email Providers:
    • Iterable