This is the first time Facebook is levying repercussions on individual users
In this edition you will find:
• Facebook announces repercussions for individuals spreading misinformation
• MediaWise teaches viewers to unpack a hoax about aliens and George Floyd
• PolitiFact’s Angie Holan gets real about fact-checking during the pandemic
Take care before you share |
By Wachiwit/Shutterstock
Facebook announced Wednesday it would begin limiting the reach of individual users who repeatedly share posts flagged by members of its Third-Party Fact-Checking Program. (Fact-checkers are required to be signatories to the IFCN’s Code of Principles to participate in the program).
This announcement marks a change in the company’s approach to handling misinformation, which previously only reduced the distribution of individual posts flagged by members of the 3PFC. Now, users who repeatedly share misinformation will see the reach of all their posts reduced, regardless of whether those posts contain misinformation. A user who repeatedly shares U.S. election and COVID-19 falsehoods, for example, would also see reduced distribution of their vacation pics or posts about a family apple pie recipe.
Facebook noted in its announcement it has taken actions in the past against pages, groups, and individual Instagram accounts that have repeatedly shared misinformation, but this is the first time the company will levy repercussions on individual Facebook users.
Users will also start seeing pop-up notifications anytime they try to join a group that has repeatedly shared false information. This has the potential to limit the reach of groups spreading anti-vaccine misinformation, especially those targeting new moms.
Facebook has had different versions of these pause-and-consider notifications both in its fact-checking program and in its efforts to fight COVID-19. In a March op-ed for Morning Consult, Guy Rosen, Facebook’s vice president of integrity, cited a statistic that 95% of users did not click on a post that had a fact-check warning label.
The last big change is a redesign of how users are notified that something they’ve posted or interacted with has been fact-checked. Users will now get a direct link to the fact-check article along with the option to share it, which has the potential to increase both fact-checkers’ visibility on the platform and traffic to their individual sites.
Photo courtesy of MediaWise
• MediaWise reporter Heaven Taylor-Wynn showed viewers of this video fact check how to use keyword searches and reading-upstream techniques to suss out that this viral clip of Derek Chauvin’s lawyer was a truncated version of his full speech. Taylor-Wynn also reminded viewers to be wary of posts online that elicit a strong emotional response, as this is a common attribute of disinformation.
• An image claiming to be the front page of The New York Times international edition used a picture of a crocodile under the headline, “India’s PM cried.” NewsMobile discovered the image was a Photoshopped fake from a Times parody account that went viral amid political tension over Prime Minister Narendra Modi’s handling of the COVID-19 pandemic.
AP Photo/Ng Han Guan
• "Can COVID-19 uncertainty be fact-checked?" from Poynter. PolitiFact editor-in-chief Angie Holan wrote about the challenges of fact-checking during the COVID-19 pandemic while urging readers to get more comfortable with changing information and not having all the answers.
If you are a fact-checker and you’d like your work/projects/achievements highlighted in the next edition, send us an email at factually@poynter.org by next Tuesday.
Any corrections? Tips? We’d love to hear from you: factually@poynter.org.
© All rights reserved Poynter Institute 2021
801 Third Street South, St. Petersburg, FL 33701