(New York, N.Y.) — The Counter Extremism Project (CEP) produces a weekly report on the methods used by extremist and terrorist groups on the Internet to spread their ideologies and incite violence. Last week, CEP researchers identified around three dozen accounts on TikTok, Instagram, and Twitter/X that praised white supremacist terrorism, including the Christchurch attack.
Additionally, CEP researchers located an audio message from ISIS spokesperson Abu Hudhayfah al-Ansari on RocketChat, Telegram, and pro-ISIS websites calling for attacks worldwide on Jews and their allies, as well as on Western and Arab states. In another post on RocketChat, the pro-ISIS tech group Qimam Electronic Foundation (QEF) shared a guide for inspecting suspicious URLs to avoid clicking on harmful links.
CEP researchers also located a video on a Russian video streaming platform in which the founder of the neo-Nazi group The Base recommended that followers prepare for war. On Telegram, the same individual instructed followers on ways to allegedly bypass the restrictions of an AI assistant to potentially use it for criminal purposes.
Finally, following the arrest of Jewish worshippers accused of unlawfully constructing a tunnel in a Brooklyn synagogue, CEP researchers found large quantities of AI-generated content, antisemitic rhetoric, and conspiracy theories, including blood libel and QAnon-related posts, on Twitter/X and Telegram.
Extreme-Right and Neo-Nazi Content, Including Content Glorifying White Supremacist Terrorists, Located on TikTok, Instagram, and Twitter/X
In a sample of content located on January 10, CEP researchers found 37 accounts on TikTok, Instagram, and Twitter/X that spread extreme-right or neo-Nazi content or that glorified acts of terrorism committed by white supremacists.
CEP found 18 accounts on TikTok that glorified or encouraged acts of violence. Content included modified and unmodified footage from the Christchurch attack video, including clips edited into longer violent videos. Other accounts praised the white supremacist perpetrators of the 2015 Charleston church shooting and the May 2022 Buffalo attack. Two accounts posted sections of the manifestos of both the Christchurch attacker and the perpetrator of the August 2019 attack targeting Latinos at an El Paso Walmart. One account with over 1,000 followers specifically advocated for acts of violence against Jews. The 18 accounts had an average of 438 followers, ranging from zero (in one instance) to 1,388; the largest account expressed support for the attacks in Christchurch, Halle, and Charleston.
CEP located nine extreme-right accounts on Instagram. Six accounts were affiliated with white supremacist Active Clubs: five in Slovakia, Estonia, Ireland, Brazil, and Romania, and one for an affiliated group covering New York, New Jersey, and Pennsylvania. Other accounts located on Instagram posted antisemitic and neo-Nazi propaganda. One account with over 70 followers posted a video containing violent footage from the Christchurch and Buffalo terrorist attacks. The nine accounts averaged 261 followers, ranging from 1 to 1,597.
Lastly, CEP researchers located ten extreme-right or neo-Nazi accounts on Twitter/X. Four accounts, two created in November 2023 and two in December 2023, were affiliated with Active Clubs in Romania, Ohio, Canada, and an area spanning Missouri and Illinois. CEP located six additional Twitter/X accounts that posted violent footage from either the Christchurch or the Buffalo terrorist attack. Despite the low follower counts (0 to 39) of the Twitter/X accounts that glorified or advocated for acts of terrorism, the violent videos they posted received hundreds of views in some cases. An account with only one follower posted a violent clip from the Christchurch attack in response to a tweet from a far-right British politician; the video glorified the Christchurch attacker and received over 700 views. A different account with only eight followers posted a violent video clip glorifying the Buffalo attacker that received over 550 views.
“TikTok, Instagram, and Twitter/X should significantly increase resources devoted to identifying and immediately removing content from their platforms that depicts extreme violence, glorifies terrorists, and spreads extremist propaganda. Tech companies are well aware of the scope of the threat to users and the public, and they have a responsibility to uphold their content moderation policies,” said CEP researcher Joshua Fisher-Birch. “Accounts connected to Active Clubs, a known white supremacist movement that recruits around the world and uses standardized logos and symbols, should be removed. Extremist and terrorist content is spreading on these major platforms, among others, increasing the potential for radicalization, recruitment, and real-world violence.”
CEP reported all accounts either to the relevant national authorities or directly to TikTok, Instagram, or Twitter/X. CEP directly reported six accounts to TikTok; six days later, three had been removed, but the accounts that remained on the platform glorified white supremacist and antisemitic attackers. Of the eight accounts reported directly to Instagram, only one was removed within the same time frame. The accounts that were not removed from Instagram included those affiliated with Active Clubs as well as profiles that posted antisemitic and anti-transgender content. All the accounts reported to Twitter/X, which were connected to neo-Nazi Active Clubs, were still on the platform six days later.