I’ll give you a little peek behind the curtain here at Poynter.
My Poynter colleague Angela Fu started working on a piece a couple of weeks ago about the relationship (or, more accurately, the nonrelationship) between X and fact-checkers.
That piece — “It’s easy to find misinformation on social media. It’s even easier on X” — was published on Poynter’s website on Thursday.
In it, Fu writes, “A platform that used to downgrade hoaxes, conspiracy theories and false claims has become one where even the boss now spreads the stuff. That change didn’t happen immediately, but the shift of X from a useful information source to a locus of misinformation has alarmed fact-checkers worldwide.”
The boss of X, of course, is Elon Musk, who seemingly spends his days coming up with new ways to use his social media platform as a toy for misinformation and divisiveness. To be clear, he has every right to endorse someone for president. In this case, he made it clear that he supports Donald Trump for the White House.
Nothing wrong with that, and that’s not what this is about.
Let’s go back to what Fu wrote in her story: “The relationship between large technology companies and professional fact-checkers has always been contentious, with fact-checkers accusing the platforms of not doing enough to combat the spread of misinformation. But at least there is a relationship. Meta — which owns Facebook, Instagram and WhatsApp — partners with independent fact-checkers to review and rate posts on its platforms, for example. TikTok operates a similar program.”
Fu continues, “X under Elon Musk has shown no interest in doing the same. It does not have a formal relationship with fact-checkers and instead relies on its crowdsourced fact-checking program ‘Community Notes,’ Maldita.es co-founder and CEO Clara Jiménez Cruz said. While experts acknowledge that there are some advantages to the Community Notes system, it also has its flaws, allowing many pieces of mis- and disinformation to go unchecked and viral.”
Jiménez Cruz said, “It doesn’t only involve misinformation itself, but also hate speech and other forms of manipulative content.”
Fu writes, “The result, fact-checkers say, is a worse experience for users as misleading and hateful posts clutter people’s feeds and disinformation campaigns run rampant. Many worry about the effects of those campaigns, especially during a year when nearly half the world’s population votes in national elections.”
So let’s get back to Musk. In the past couple of weeks, he posted a creepy and cringy tweet about Taylor Swift, then followed that up with a dangerously irresponsible response to an X user who, after a second apparent assassination plot against Trump, posted, “Why they want to kill Donald Trump?”
Musk wrote, “And no one is even trying to assassinate Biden/Kamala.” He then included a “person thinking” emoji.
Musk eventually deleted his remark, but only after nine hours and more than 4 million views. He then followed up with two flippant posts that basically said he was joking.
And now here’s another example of Musk acting like a donkey. It’s CNN’s Liam Reilly with “Elon Musk boosts fake Trump rally bomb threat and false claims about the election.”
Reilly also noted that, just this week, Musk shared a false video tied to the baseless story about Haitian immigrants eating pets in Springfield, Ohio.
Again, this all happened since Fu started working on her story.
As Fu wrote, “Musk has emerged as a major spreader of misinformation, amplifying false claims to his 197 million followers. The Center for Countering Digital Hate found that Musk made 50 false or misleading posts about the U.S. elections between Jan. 1 and July 31, generating nearly 1.2 billion views, and The Washington Post reported last week that Musk’s online posts have coincided with harassment campaigns towards election administrators.”
Be sure to check out Fu’s story for more.