In 2017, Kids Plus Pediatrics (KPP), an independent medical practice in Pittsburgh, posted a 90-second video on its Facebook page encouraging parents to have their children vaccinated against human papillomavirus (HPV), a sexually transmitted infection that can cause cancer.
For three weeks, the post garnered positive, higher-than-average interest and engagement from the page’s followers. Then, seemingly out of nowhere, the anti-vaccination attacks began. Thousands of comments from vaccine skeptics and opponents poured in from around the world. Chad Hermann, KPP’s communications director, says he spent 18 hours per day, over the next eight days, combing through the remarks, individually banning commenters, and trying to remove dozens of negative Yelp and Google reviews left by strangers who’d never set foot in the practice.
“It’s incredibly overwhelming,” Hermann says, reflecting on the experience.
Now that incident is the subject of a new study published in the journal Vaccine.
Hermann and Todd Wolynn, a physician and CEO of KPP, partnered with researchers at the University of Pittsburgh to learn more about what drove the commenters to relentlessly harass the practice for days on end.
The team randomly selected a group of 197 individuals who commented on the post, then used coding and computer analysis to assess every publicly available post on their personal profiles over a two-year period from 2015 to 2017. The researchers logged and coded posts, for example, that addressed water fluoridation, genetically modified crops, and “chemtrails,” topics that suggest broader views on the role of government and science in people’s lives. They also evaluated how the commenters were connected to each other or to the same Facebook groups.
Women who identified themselves as parents made up the majority of the sample. Of the 55 people who indicated a political affiliation, more than half supported Donald Trump and a handful backed Vermont Sen. Bernie Sanders. The authors worked to verify that each account belonged to an authentic user by looking at a profile’s posting history, friend relationships, and pictures of real-life events. (The commenters surely didn’t expect to become part of a study in which researchers pored over their profiles. They weren’t notified of the study, but the researchers anonymized their information.)
The study found that the individuals clustered into four thematic subgroups: those concerned about “trust” in the scientific community as well as infringements on personal liberty; those focused on “alternatives” to vaccination, including homeopathic treatments; those interested in vaccine “safety” who also felt vaccination might be immoral; and those emphasizing “conspiracy” on the part of the government, scientific community, or others to hide information about diseases and vaccines.
“By knowing and categorizing, we’re going to be able to tailor what we do much more specifically,” says study co-author Brian A. Primack, director of the Center for Research on Media, Technology, and Health at the University of Pittsburgh School of Medicine.
Portraying vaccines as triggering the body’s natural immune system, for instance, could be an effective response to someone in the “alternatives” subgroup. Similarly, people worried about trust and liberty might be swayed by the argument that vaccines leave children free to pursue their lives.
The study arrives at a moment of reckoning for social media companies, particularly Facebook and YouTube, accused of letting misinformation about vaccines proliferate unchecked for years, potentially playing a role in measles outbreaks across the country and globe. Recent reporting also suggests that anti-vaxxers coordinate or participate in massive digital harassment campaigns designed to silence people who advocate for vaccinations, including physicians and parents.
Facebook announced this month that pages and groups disseminating false information about vaccines will receive lower rankings and won’t appear in recommendations and predictions driven by the company’s algorithms. Misinformation itself will not be removed from the platform.
“We have seen Facebook and other platforms making these adjustments, for lack of a better word, but our study provides an evidence base for the need for these policies,” says study co-author Beth L. Hoffman, a research assistant at the Center for Research on Media, Technology, and Health at the University of Pittsburgh School of Medicine. “We’re hoping to provide the science for regulators to make evidence-based policies.”
Walter Quattrociocchi, head of the Laboratory of Data and Complexity at the University of Venice in Italy, wrote in an email that while the Vaccine study bases its conclusions on a small sample and “simple” analytics, it captures the “echo chamber” effect of social media documented in previous studies. Quattrociocchi, who was not involved in the new research, has studied vaccine misinformation on Facebook as well as echo chambers.
Naomi Smith, a digital sociologist at Federation University Australia who has also studied the anti-vaccine movement on Facebook, agreed that the study confirms previous research but cautioned against generalizing its findings.
“I think this may be useful [in] understanding flurries of anti-vaccination sentiment such as the one detailed in this paper,” she wrote in an email. “However, it is difficult to make a broader statement from a single incident about the overall landscape of anti-vaccination activity on Facebook.”
Wolynn, who is a co-author of the study and has worked on “vaccine confidence” programs for Merck and Sanofi, knows that some readers may learn about his involvement with pharmaceutical companies and dismiss the new research out of hand. But he’s used to anti-vaxxers going after people who advocate for vaccinations and is more interested in coming up with strategies to help the public make evidence-based decisions about vaccines.
The study co-authors collectively call for increased media literacy, using entertainment to convey pro-vaccine messages and storylines, developing interventions to target the subgroups with effective messaging, and looking at the role medical professionals can play online.
Both Hermann and Wolynn say they’ve spoken to physicians who’ve either been attacked after sharing pro-vaccine content or remain silent out of fear they’ll be targeted, which is why Hermann is developing a social media and communications toolkit for physicians. He also wishes that Facebook would make it possible for professional pages to restrict or pre-ban members of private Facebook groups, which would make it easier to prevent members of anti-vaccination groups from mobbing a medical provider’s page.
“One of the reasons we’re doing this is if [physicians] don’t post pro-science, pro-vaccine information, that leaves a gigantic void on social media,” says Hermann. “And guess who’s going to fill it?”