With your support, Campaign for Accountability is working to expose corruption and hold the powerful accountable.
This Week's Updates:
New Studies from TTP and ADL Shed Light on Algorithm-Driven Antisemitism
Yesterday, CfA’s Tech Transparency Project (TTP) and the Anti-Defamation League (ADL) released a pair of studies revealing how mainstream social media platforms amplify antisemitism through content recommendation algorithms, search features, and auto-generated pages. As platforms like Facebook, Instagram, Twitter, and YouTube have shifted from chronological feeds to individualized recommendations, a user’s activity and preferences have come to shape the type of content they are served. Using test accounts with an established interest in conspiracy theories, TTP and ADL’s first study found that Instagram, Facebook, and Twitter began recommending extreme antisemitic content; Instagram, in particular, pushed even more egregious posts to an account registered as a teen.
For the second study, researchers tested the auto-complete search functions of several social media platforms, as well as Google, to explore how easy hate groups are to find. Facebook, Instagram, and YouTube were all found to be hosting extremist content, which each platform’s search-autocomplete features made easier to locate. In their drive to engage users, platforms like YouTube and Facebook even auto-generated pages or channels for hate groups – an issue that TTP has previously covered. Meanwhile, Google created “knowledge panels” in searches for hate groups, some of which linked directly to the groups’ official pages or to promotional events featuring “recruiters.” Of all the platforms in the study, YouTube did the best job of ignoring a user’s interest in antisemitic content and avoided recommending hateful videos. Given YouTube’s success, it is especially disturbing that Meta allowed Instagram to recommend the most extreme antisemitic content to a teenage test user.
A Possible Crypto Walk-Back in Arkansas
Two weeks ago, CfA’s newsletter highlighted “right to mine” crypto bills, which have been signed into law in Arkansas and Montana and which limit the ability of local governments to regulate crypto mining operations. Notably, both bills were based on model legislation created by a lobbying firm called the Satoshi Action Fund, which is led by a pair of former Trump Administration officials. In Arkansas, the law has been criticized by residents and lawmakers who feel its passage was rushed. Some towns have managed to quickly pass ordinances limiting the decibel level or placement of crypto mining operations before the law takes effect. Now, Arkansas Sen. Bryan King (R) is attempting to repeal the legislation, though his bill cannot be considered until 2025 unless the General Assembly convenes a special session.
“We’ve discovered that the crypto facility proposed in Arkansas would use enough power for 7,000 or 8,000 homes, while only employing two or three people,” King told Government Technology. “A lot of times whenever these facilities are placed and energy usage goes up, it’s basic economics that citizens’ energy rates go up as well.”
Boogaloo Movement Returns to Facebook (Again)
On Thursday, Vice News published a story highlighting the Boogaloo movement’s presence on Facebook, which has been growing steadily despite Meta’s promise to address it. Drawing on data and insights provided by TTP, Vice’s Tess Owen explained how violent, anti-government movements like the Boogaloo Bois have been able to avoid detection by using coded terms and a network of backup pages. As Meta’s “year of efficiency” eats into its trust and safety teams, platforms like Facebook are gradually returning to a more unmoderated state, where extremist organizations can fly under the radar by changing their lingo. In September 2022, TTP noted that Boogaloo activity was beginning to surge on Facebook, as groups and meme pages gathered thousands of members. While Meta CEO Mark Zuckerberg has previously extolled AI as a content moderation solution, extremist organizations can simply move faster, inventing new signifiers for their ideology that AI systems fail to pick up on. For the Boogaloo Bois, then, Facebook’s disinvestment in human trust and safety workers has been a gift.