Separating Viral Truths and Lies
Nudging people to be more skeptical on social media
As the new coronavirus began to shake the world, before the outbreak morphed into the COVID-19 pandemic, wild rumors started to pollute social media. The Chinese government did not, as it happens, concoct the virus deliberately. Nor did massive amounts of sulfur dioxide emerge from cemeteries in Wuhan, China, where the outbreak was first detected. Even so, those false claims made their way onto Facebook feeds and Twitter timelines around the world.
“On social media, we shut our brains off a little bit and stop thinking about whether things are true in the first place,” says cognitive psychologist Gordon Pennycook of the University of Regina in Saskatchewan, Canada.
Pennycook and his research partner David Rand, a cognitive psychologist at the Massachusetts Institute of Technology, suggest that there’s a simple way to stem the flow of bad information on social media: nudging people to think about the accuracy of what they share before they share it.
Pennycook and Rand’s research team recently recruited 1,000 Americans to participate in a study to see if encouraging accuracy on social media platforms could reduce misinformation about COVID-19. Participants were demographically representative of the American public by age, region, gender, and race. The researchers posted their analysis this week, just as the pandemic intensified in the United States.
Half of the participants first received a prompt asking them to judge whether a sample headline, formatted to look like a social media post, was true or false. (The example claimed that the TV show Seinfeld would begin streaming worldwide on Netflix in 2021.) The researchers wanted these participants primed to think about accuracy in general, not yet about the accuracy of claims about COVID-19 in particular. The rest of the participants received no such nudge.
Everyone then indicated how likely they were to share each of 30 headlines about COVID-19, half of them true and half false, all designed to look like real social media posts. People who had not received the accuracy nudge before rating these headlines were just as prone to share false headlines as true ones. The group that did receive the nudge, by contrast, showed a clear preference for sharing true headlines about COVID-19.
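To make that comparison concrete, here is a minimal sketch, in Python, of the general logic: compute a "truth discernment" score, the average willingness to share true headlines minus the average for false ones, for each group. The ratings, scale, and group labels below are illustrative assumptions, not the study's data or analysis code.

    # Minimal sketch (not the study's actual analysis): compare "truth
    # discernment" -- willingness to share true minus false headlines --
    # between a nudged group and a control group.
    from statistics import mean

    # Hypothetical average sharing-intention ratings on a 1-6 scale;
    # all numbers are made up for illustration.
    control = {
        "true":  [3.1, 2.8, 3.0, 2.9],
        "false": [3.0, 2.9, 3.1, 2.8],
    }
    nudged = {
        "true":  [3.4, 3.2, 3.5, 3.3],
        "false": [2.5, 2.6, 2.4, 2.7],
    }

    def discernment(group):
        """Average sharing intention for true headlines minus false ones."""
        return mean(group["true"]) - mean(group["false"])

    print(f"Control discernment: {discernment(control):+.2f}")
    print(f"Nudged discernment:  {discernment(nudged):+.2f}")

In this toy example the control group's score hovers near zero (no preference for truth), while the nudged group's score is positive, mirroring the pattern the researchers report.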
Pennycook argues that it should be easy for social media companies to incorporate such accuracy nudges into routine practice.
Although the accuracy nudge appears to have potential, University of Cambridge social psychologist Sander van der Linden cautions that the strategy is no silver bullet on its own. The differences between the two groups, nudge versus no nudge, though real, were modest, and the study captured self-reported intentions rather than actual sharing behavior.
In an interview, van der Linden also noted that the effectiveness of accuracy nudges could fade over time as people become accustomed to them and eventually ignore them entirely. In what he hopes will be another remedy for the spread of misinformation on the Internet, van der Linden has developed a 15-minute online game that teaches people to recognize the tricks used to create bad information; it is now being adapted to focus on COVID-19. He argues that this approach likely has more staying power for changing people’s online behavior than priming them with a single true-false question. Still, solutions like the accuracy nudge should be part of the conversation right now, van der Linden adds. To protect us from bad information, he says, “we need a multilayered defense system in our society.”