A study that dropped last week strongly suggests “prebunking” is an effective way to counter the propaganda techniques at the center of mis- and disinformation.
Prebunk is a term frequently used in the fact-checking space. Derived from debunk, it means to preemptively refute expected false narratives, misinformation or manipulation techniques. Rather than fact-checking every instance of a false claim, which many argue is impossible, prebunking seeks to inoculate the public against anticipated narratives in advance.
Researchers compare fact-checking to treating the symptoms of an illness, and prebunking to vaccination.
The study, published in Science Advances and led by Cambridge researchers in partnership with Jigsaw — a research branch of Google — exposed millions of YouTube users to 90-second clips that explained manipulation techniques, like fearmongering, scapegoating and playing into emotions. Users then completed follow-up surveys at later dates that tested their ability to determine whether a manipulation technique had been used.
“So think about when you get a vaccine. It has a microdose of the virus. It’s not the whole virus, but it’s like a little piece of it that your body can recognize,” Beth Goldberg, head of research at Jigsaw, told the International Fact-Checking Network. “It’s the same thing in a prebunking video; we show you a little clip of the propaganda, so that you can recognize the manipulation tactics going forward.”
The five manipulation categories depicted in the videos were emotional language, incoherence, false dichotomies, scapegoating and ad-hominem attacks.
Researchers discovered:
- Users who viewed the emotional language video clips were 1.5 to 1.67 times more likely than the control group to recognize the manipulation technique in the future.
- Users who watched the false dichotomies video clips were nearly twice as likely as the control group to recognize the technique.
- Users who watched the incoherence video were more than twice as good at identifying the technique at a later date.
After watching the videos on YouTube, users’ ability to recognize manipulation techniques increased by 5% on average, according to Jigsaw.
“Our interventions make no claims about what is true or a fact, which is often disputed. They are effective for anyone who does not appreciate being manipulated,” said lead author Dr. Jon Roozenbeek, a postdoctoral fellow with the Social Decision-Making Lab at Cambridge. “The inoculation effect was consistent across liberals and conservatives. It worked for people with different levels of education, and different personality types. This is the basis of a general inoculation against misinformation.”
"We had a control group and a treatment group and could actually see whether people paid attention to our ad — and we found that this seems like a pretty useful approach overall,” said Goldberg. "The next step will be, 'How does that affect sharing of misinformation?' We haven't gotten there yet. But at least we know that we're effectively teaching people these concepts."
Researchers also touted the scalability of the experiments.
“If anyone wants to pay for a YouTube campaign that measurably reduces susceptibility to misinformation across millions of users, they can do so, and at a minuscule cost per view,” said Roozenbeek.