Tech & Terrorism: Online Gaming Platforms Facing Scrutiny Over Extremist Content
(New York, N.Y.) — Concerns are growing in the U.S. and Europe that extremists are spreading propaganda and hateful ideologies through online gaming platforms. These concerns have prompted inquiries from members of the U.S. House of Representatives and Senate, a study funded by the U.S. Department of Homeland Security, and an acknowledgment by Germany’s Federal Ministry of the Interior that at least one online gaming platform “allows right wing extremist content to be disseminated.”
Video games are attractive to extremists as a way to spread propaganda, especially to a young audience. In 2021, the Counter Extremism Project (CEP) located a Telegram channel that shared downloadable white supremacist video games. Soon thereafter, CEP researchers located recreations of the Christchurch terrorist attack video in the online video game Roblox, a map of one of the mosques targeted in that attack for the game Minecraft, and multiple examples of group pages on the gaming platform and social network Steam that promoted white supremacism, fascism, or the extreme right.
In February 2022, CEP researchers located a recreation of the Christchurch attack in Minecraft that had been uploaded to Meta-owned Instagram as well as several posts glorifying the terrorist attack, the attack video, the attacker’s manifesto, and files for recreating the attack video in the video game Garry’s Mod (also known as GMod) that had been shared on an online file archive site. Three months later, CEP researchers found a game on Steam that glorified Kyle Rittenhouse and contained anti-transgender sentiment.
Roblox reportedly had 58.8 million daily active users in the fourth quarter of 2022, and Minecraft had 141 million active users worldwide as of August 2021. Over the last year, between 21,000 and 29,000 users played Garry’s Mod each month.
These as well as other gaming platforms and gaming social media sites maintain policies against extremist content. Steam’s Content Rules prohibit posting “threats of violence or harassment, even as a joke.” Roblox prohibits content that “incites, condones, supports, glorifies, or promotes any terrorist or extremist organization or individual (foreign or domestic), and their ideology, or actions.” Minecraft maintains a “zero-tolerance policy towards hate speech, terrorist or violent extremist content, bullying, harassing, sexual solicitation, fraud, or threatening others.”
Nevertheless, the ongoing presence of clearly extremist and terrorist content on gaming platforms and social media sites demonstrates that companies must continue to diligently enforce their terms of service to ensure a safe online environment.
“Gaming platforms must adopt proactive content moderation rules and enforce them in a manner that protects users and stymies efforts by extremists to misuse and spread propaganda through their platforms,” said CEP Executive Director David Ibsen. “These companies can avoid potential liability and reputational damage by consistently enforcing their own terms of service, proactively locating extremist content that violates those agreements, and quickly removing offending content from their platforms. Big tech companies have spent tens of millions of dollars to create a mirage that proactive content moderation is an impossible task. It’s not. It can and should be done to protect the safety of groups and individuals targeted by extremists across the globe.”
###