Researchers argue the debate about free speech versus online harm requires a better understanding of how social media feeds us information
In this edition you will find:
• A look at the debate over moderating misinformation
• Full Fact overrules a judge’s COVID-19 falsehoods
• The Wall Street Journal unwinds TikTok’s algorithm
AP Photo/Susan Walsh
President Joe Biden and Facebook’s tête-à-tête over social media companies’ culpability in spreading false information about vaccines raised familiar questions about protecting online speech while mitigating offline harms.
For a quick recap, Biden told NBC News’ Peter Alexander that social media companies like Facebook were “killing people” by enabling the spread of anti-vax content. Facebook’s vice president of integrity, Guy Rosen, responded in a blog post outlining the company’s efforts to fight vaccine misinformation. Then on Monday, Biden clarified to CNN’s Kaitlan Collins that he was referring to the “disinformation dozen,” a group of 12 prominent anti-vax activists with large social media followings.
Critics of the White House’s approach cited press secretary Jen Psaki’s briefing where she laid out the ways in which the Biden administration has taken a more active role in flagging false content on Facebook. Evan Greer, director of the internet advocacy group Fight for the Future, warned of unintended consequences if the White House or the U.S. government more broadly gets into the business of regulating online content.
“If you think the government can require platforms to ‘only remove misinformation about COVID’ without a) leading to massive collateral damage / censorship of legitimate speech, and b) setting a precedent that will be weaponized by future authoritarians, plz read a history book,” she tweeted. History books aside, Greer’s warning is backed up by recent scholarship.
IFCN advisory board member Peter Cunliffe-Jones, along with fellow researchers Alan Finlay and Anya Schiffrin, found in a study of anti-misinformation laws in Africa that rules meant to curb demonstrably false speech wound up stifling speech more broadly. The study found that most of these laws aimed to punish false speech rather than correct it, and that this punitive approach had no demonstrable impact on reducing falsehoods.
At the same time, we’ve seen the real-world impact of both mis- and disinformation — whether that be dropping vaccination rates for diseases like measles or the genocide of Rohingya Muslims in Myanmar. Fact-checkers around the world have made a valiant effort to clean up the information ecosystem, but as Duke Reporters’ Lab director Bill Adair said during his June IFCN Talk, “We need more troops.”
Independent columnist Charlie Warzel argued this dichotomy between free expression and fighting falsehoods misses the point. In his Galaxy Brain newsletter, Warzel wrote that the influence of companies like Facebook has locked us in a nuance-free debate about how to improve the information ecosystem.
One solution Warzel offered was for social media companies to be more transparent with their data. Facebook does offer a transparency page and an ad library, but it gets to set the terms of what information it chooses to share, and as Kevin Roose shared last week, Facebook may not be too keen on sharing data that has the potential to make it look bad.
Alexi Drew, a technology researcher and senior analyst at RAND Europe, argued that more transparency around recommendation algorithms will help both researchers and the public better understand how they get their information. She used Google’s personalized search results as an example.
“Are we shaping the algorithm or is the algorithm shaping us?” she said. “We don’t actually know which dynamic that actually lies in for certainty.” Drew argued that a better understanding of how these systems work will empower individual users to know how these technologies are shaping their worldview. Let’s hope it can also move the debate about content moderation forward.
AP Photo/Lindsay Moller, POOL
• A former justice of the United Kingdom’s Supreme Court stepped outside his expertise to make three factually incorrect claims about the U.K.’s COVID-19 death numbers on BBC Radio 4’s “Today” program. Jonathan Sumption cited incorrect death figures while criticizing the British government’s pandemic response. The BBC issued a correction to Sumption’s claims following Full Fact’s fact check.
• A Twitter user who claims to be an academic stated that COVID-19 vaccines turn you into a “humanoid” (a nonhuman creature or being with characteristics resembling those of a human, according to Merriam-Webster) and rob you of free will. Teyit debunked this claim by explaining that mRNA vaccines do not alter the human genome. It also discovered that the Twitter user was not an academic and that the username had links to online scams dating back to 2009.
Photo by KGC-330/STAR MAX/IPx/AP Photo
• "Study: Anti-refugee disinformation connects extremists, politicians," from EURACTIV.com. A report by the Institute for Strategic Dialogue found that anti-refugee disinformation has connected far-right extremists across borders, and that mainstream politicians are helping to spread these falsehoods to a wider audience. It also found that the nongovernmental organizations working with refugees are a common target for abuse by these networks, both online and off.
• "How Blockchain Can Help Combat Disinformation," from the Harvard Business Review. The emerging technology most commonly associated with cryptocurrencies like Bitcoin could be used to keep track of verified information, help users build a reputation for sharing reliable news, and remove the financial incentive for spreading online falsehoods.
If you are a fact-checker and you’d like your work/projects/achievements highlighted in the next edition, send us an email at [email protected] by next Tuesday.
Any corrections? Tips? We’d love to hear from you: [email protected].
Thanks for reading Factually.
© All rights reserved Poynter Institute 2021
801 Third Street South, St. Petersburg, FL 33701