How to Fight Vaccine Misinformation
Study shows personal stories are more effective than facts in countering anti-vaxxers
Just after Christmas, a Wisconsin pharmacist attempted to destroy 570 doses of a COVID-19 vaccine, yanking precious vials from a storage refrigerator. According to multiple reports, he’d become convinced it could alter human DNA. It can’t. Nor, as other false rumors have claimed, will it allow the government to track you or fill your body with fetal tissue — but that hasn’t stopped vaccine misinformation from spreading online, spooking people concerned about potential side effects.
“Misinformation is more impactful than the correction,” says Michelle A. Amazeen, a Boston University associate professor of mass communication.
For the past year, Amazeen and Arunima Krishna, a Boston University assistant professor of public relations, have explored the spread of vaccine misinformation and the efficacy of different efforts to halt it. Although their study started before COVID-19 tore across the United States — and their research has focused on vaccines in general — Amazeen says the coronavirus pandemic has “magnified how important the work is that we’re doing.”
As part of their research, they created a fictitious Facebook post telling the emotional story of a boy who supposedly developed autism after receiving the measles, mumps, and rubella (MMR) vaccine. Amazeen and Krishna showed the post to around 1,000 participants, then tested three approaches to countering its false message: the story of an uneventful vaccine success (a kid got the vaccine, but nothing bad happened), a conversion event narrative (a reformed anti-vaxxer lauding the positive impact of a vaccine), and a factual chart listing the true numbers of adverse events. All were displayed as comments to the original post.
Although they’re still preparing the results for publication, Amazeen and Krishna found that straight facts did little to shift opinions, while personal stories were more effective. The researchers also tested what would happen if they changed the source of the rebuttal posts, swapping in a friend, a doctor, the government, and a vaccine awareness group — and sometimes giving no source.
“We found that in roughly half the cases, a lot of people didn’t notice the source, even if we told them to,” says Amazeen, whose research focuses on persuasion and misinformation. That’s important, she says, because of the way social media works, encouraging us to scroll through post after post. “Misinformation is out there, it’s amplified, and social media platforms have been slow to respond,” she says. “And repetition breeds familiarity. You see this here, you see this there, and all of a sudden, ‘Yeah, I’ve seen that somewhere before, so it must be true.’”
Although Amazeen says we should all pay more attention to where information comes from — whether about vaccines or election results — social media companies need to start taking a harder line on fast-spreading untruths, even if it hurts their bottom line.
For those who did notice the source, it mattered: they were more likely to trust a friend or a known healthcare professional than a vaccine awareness group or the government. For the COVID-19 vaccine, says Krishna, that means local doctors are better placed than the Centers for Disease Control and Prevention (CDC) to correct misinformation and stop its spread.
“Among people who are vaccine-hesitant, trust for their healthcare professionals, their doctors and nurses, is much, much higher than for the government, the pharmaceutical companies,” she says.
Amazeen says a recent coronavirus public health campaign that effectively pushed out valid information — and cracked down on falsehoods — was one developed by BU College of Communication students, F*ck It Won’t Cut It. The effort, which included student-created posters and social media messages about masking, distancing, and more, was recognized by the American Marketing Association and the CDC.
“They’ve done so much right, they’ve been effective. They’re using students to communicate with students,” says Amazeen. “You don’t have the administration telling students, ‘Here’s what you need to do,’ you have peers.” She touted the campaign’s success in The COVID-19 Vaccine Communication Handbook, a practical guide to countering misinformation she coauthored with scientists and experts from around the world. The handbook has advice for community leaders, healthcare professionals, policymakers, communications experts, and the general public to help them stay ahead of misinformation — and fight back when they see it spreading.
With the social media misinformation study wrapping up, Amazeen and Krishna are expanding their research to examine the role of race in vaccine communication.
“The history of medical research is fraught for different communities of color,” says Krishna. They’ll be joined for that work by Rob Eschmann, a BU School of Social Work assistant professor and assistant director of research at the University’s Center for Antiracist Research.
For additional commentary by Boston University experts, follow us on Twitter at @BUexperts. Follow Michelle Amazeen at @commscholar, Arunima Krishna at @ArunimaKPhD, Rob Eschmann at @robeschmann, College of Communication at @COMatBU and Center for Antiracist Research at @AntiracismCtr on Twitter.