Researchers argue the debate over free speech versus online harm requires a better understanding of how social media feeds us information.
In this edition you will find:
• A look at the debate over moderating misinformation
• Full Fact corrects a former judge’s COVID-19 falsehoods
• The Wall Street Journal unpacks TikTok’s algorithm
A lackluster debate
AP Photo/Susan Walsh
President Joe Biden and Facebook’s tête-à-tête over social media companies’ culpability in spreading false information about vaccines raised familiar questions about protecting online speech while mitigating offline harms.
For a quick recap, Biden told NBC News’ Peter Alexander that social media companies like Facebook were “killing people” by enabling the spread of anti-vax content. Facebook’s vice president of integrity, Guy Rosen, responded in a blog post ([link removed]) outlining Facebook’s efforts to fight vaccine misinformation. Then on Monday, Biden clarified to CNN’s Kaitlan Collins that he was referring to the “disinformation dozen” ([link removed]) — a group of 12 prominent anti-vax activists with large social media followings.
Critics of the White House’s approach cited press secretary Jen Psaki’s briefing ([link removed]) where she laid out the ways in which the Biden administration has taken a more active role in flagging false content on Facebook. Evan Greer, director of the internet advocacy group Fight for the Future ([link removed]) , warned of unintended consequences if the White House or the U.S. government more broadly gets into the business of regulating online content.
“If you think the government can require platforms to ‘only remove misinformation about COVID’ without a) leading to massive collateral damage / censorship of legitimate speech, and b) setting a precedent that will be weaponized by future authoritarians, plz read a history book,” she tweeted ([link removed]) . Beyond the history books, Greer’s warning is backed up by recent scholarship.
IFCN advisory board member Peter Cunliffe-Jones, along with fellow researchers Alan Finlay and Anya Schiffrin, found in a study of anti-misinformation laws in Africa ([link removed]) that rules meant to curb demonstrably false speech wound up stifling all speech. The study found most of these laws aimed to punish false speech as opposed to correcting it, and this punitive approach did not have a demonstrable impact on reducing falsehoods.
At the same time, we’ve seen the real-world impact of both mis- and disinformation — whether that be dropping vaccination rates for diseases like measles or the genocide of Rohingya Muslims in Myanmar. Fact-checkers around the world have made a valiant effort to clean up the information ecosystem, but as Duke Reporters’ Lab director Bill Adair said during his June IFCN Talk ([link removed]) , “We need more troops.”
Independent columnist Charlie Warzel argued this dichotomy between free expression and fighting falsehoods misses the point. In his Galaxy Brain newsletter ([link removed]) , Warzel wrote that the influence of companies like Facebook has locked us in a nuance-free debate about how to improve the information ecosystem.
One solution Warzel offered was for social media companies to be more transparent with their data. Facebook does offer a transparency page ([link removed]) and an ad library ([link removed]) , but it gets to set the terms of what information it chooses to share, and as Kevin Roose ([link removed]) reported last week, Facebook may not be too keen on sharing data that has the potential to make it look bad.
Alexi Drew, a technology researcher and senior analyst at RAND Europe, argued that more transparency around recommendation algorithms will help both researchers and the public better understand how they get their information. She used Google’s personalized search results as an example.
“Are we shaping the algorithm or is the algorithm shaping us?” she said. “We don’t actually know which dynamic that actually lies in for certainty.” Drew argued that a better understanding of how these systems work will empower individual users to know how these technologies are shaping their worldview. Let’s hope it can also move the debate about content moderation forward.
Interesting fact checks
AP Photo/Lindsay Moller, POOL
• Full Fact: "Lord Sumption made several errors about Covid on Today" ([link removed]) (in English)
• A former justice of the United Kingdom’s Supreme Court stepped outside his expertise to make three factually incorrect claims about the U.K.’s COVID-19 death numbers on BBC Radio 4’s “Today” program. Jonathan Sumption gave incorrect death figures while criticizing the British government’s pandemic response. The BBC issued a correction following Full Fact’s fact check.
• Teyit: "Claim: Vaccines alter the human genome, make people "humanoids" and deprive them of their human rights" ([link removed]) (in Turkish)
• A Twitter user who claims to be an academic stated that COVID-19 vaccines turn you into a “humanoid” — a nonhuman creature or being with characteristics resembling those of a human, according to Merriam-Webster — and rob you of free will. Teyit debunked this claim by explaining that mRNA vaccines do not alter the human genome. They also discovered the Twitter user was not an academic and that the username had links to online scams dating back to 2009.
Quick hits
Photo: KGC-330/STAR MAX/IPx / AP Photo
From the news:
• "Study: Anti-refugee disinformation connects extremists, politicians," ([link removed]) from EURACTIV.com. A report by the Institute for Strategic Dialogue found that anti-refugee disinformation has connected far-right extremists across borders, and that mainstream politicians are helping to spread these falsehoods to a wider audience. It also found that nongovernmental organizations working with refugees are a common target for abuse by these networks, both online and off.
• "Investigation: How TikTok's Algorithm Figures Out Your Deepest Desires," ([link removed]) from The Wall Street Journal. Journal reporters used dozens of automated bot accounts to learn more about how TikTok’s recommendation algorithm works. They found that time spent on each video was a strong indicator of what kinds of videos the platform would recommend.
• "How Blockchain Can Help Combat Disinformation," ([link removed]) from the Harvard Business Review. The emerging technology most commonly associated with online currencies like Bitcoin could be used to keep track of verified information, help users build a reputation for spreading reliable news, and remove the financial incentive for spreading online falsehoods.
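The Journal’s core finding — that watch time drives what gets recommended — can be illustrated with a toy scoring model. Everything here is invented for illustration (the topics, the completion-rate rule, the function names); it is not TikTok’s actual system.

```python
# Toy illustration of a watch-time-driven recommender signal.
# All topics, weights, and scoring rules are hypothetical,
# not a description of TikTok's real algorithm.
from collections import defaultdict

def update_interest(profile, topic, watched_s, video_len_s):
    """Nudge a viewer's per-topic interest score by how much of a video they watched."""
    completion = min(watched_s / video_len_s, 1.0)  # fraction watched, capped at 1.0
    profile[topic] += completion
    return profile

def rank_candidates(profile, candidates):
    """Rank candidate videos by the viewer's accumulated topic interest."""
    return sorted(candidates, key=lambda v: profile[v["topic"]], reverse=True)

profile = defaultdict(float)
# A bot account lingers on two "sadness" videos and skips a cooking one.
update_interest(profile, "sadness", watched_s=28, video_len_s=30)
update_interest(profile, "sadness", watched_s=30, video_len_s=30)
update_interest(profile, "cooking", watched_s=2, video_len_s=45)

feed = rank_candidates(profile, [
    {"id": 1, "topic": "cooking"},
    {"id": 2, "topic": "sadness"},
])
print(feed[0]["topic"])  # prints "sadness": the lingered-on topic rises to the top
```

Even this crude sketch shows the dynamic the Journal documented: no likes or follows are needed — dwell time alone is enough to steer the feed.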
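The property the HBR piece leans on — a tamper-evident record of verified information — can be sketched as a minimal hash chain, where each record commits to the one before it. This is a deliberate simplification: real blockchains add consensus, signatures, and distribution across many parties.

```python
# Minimal hash chain: each block commits to the previous block's hash,
# so silently editing any earlier record breaks every later link.
# A simplification of a blockchain; no consensus or signatures here.
import hashlib
import json

def make_block(record, prev_hash):
    """Create a block whose hash covers both the record and the prior hash."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return {"record": record, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def verify(chain):
    """Recompute every link; any edited record invalidates the chain."""
    prev = "genesis"
    for block in chain:
        payload = json.dumps({"record": block["record"], "prev": prev}, sort_keys=True)
        if block["prev"] != prev or hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = []
prev = "genesis"
for claim in ["Claim A: fact-checked, false", "Claim B: fact-checked, true"]:
    block = make_block(claim, prev)
    chain.append(block)
    prev = block["hash"]

print(verify(chain))   # True: the history is intact
chain[0]["record"] = "Claim A: fact-checked, true"
print(verify(chain))   # False: the retroactive edit is detected
```

The point is not that a hash chain stops falsehoods from being published, but that it makes after-the-fact rewriting of a verification record detectable — the foundation for the reputation systems HBR describes.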
From/for the community:
• "Religious Exemption From Taking COVID Vaccine? Anti-Vaxxers New Tactic," ([link removed]) from BOOM. Anti-vaccination groups in India are spreading a form letter requesting a religious exemption from taking the COVID-19 vaccine that is rife with falsehoods about the vaccines’ ingredients and effects.
• "Disinformation in June exploits soccer and vacations to hit COVID-19 vaccines," ([link removed]) from the European Digital Media Observatory. A report by 12 European fact-checking agencies (10 of which are signatories to the IFCN’s Code of Principles) found that 38% of their fact checks in June were related to vaccines.
• "Knight announces new investments in tech and democracy research, including an open call for research into disinformation’s impact on communities of color," ([link removed]) from the Knight Foundation. Knight announced it distributed $4 million to three separate projects researching and developing solutions and policies to fight online misinformation and harassment. It also announced an additional $1.5 million would be set aside for projects researching ways to prevent targeted misinformation attacks on communities of color. Projects have until Sept. 15 to submit their proposals.
If you are a fact-checker and you’d like your work/projects/achievements highlighted in the next edition, send us an email at [email protected] by next Tuesday.
Any corrections? Tips? We’d love to hear from you: [email protected].
Thanks for reading Factually.
Harrison Mantas
Reporter, IFCN
@HarrisonMantas ([link removed])
© All rights reserved Poynter Institute 2021
801 Third Street South, St. Petersburg, FL 33701