Health / Coronavirus

Social Media Preps for Wave of COVID Vaccine Misinformation

After years of a hands-off approach to anti-vaccination content, Facebook, YouTube and Twitter are looking to reduce false info on COVID vaccines.
Health and misinformation experts tell Newsy they anticipate a new wave of misinformation with upcoming COVID-19 vaccine rollouts.

"There's already mistrust about this because of the perception of, you know, rushing things, trying to get things out. And not knowing really who to trust in terms of information," said Jason Shepherd, associate professor of neurobiology and anatomy at the University of Utah.

"Facebook has really been a hub for a lot of vaccine-opposed and anti-vaccine-related communities, particularly because of the use of private groups," said Kolina Koltai, a post-doctoral scholar at the University of Washington's Center for an Informed Public. "The uniqueness — being able to facilitate a certain community, to be able to talk to people in a private space — is what really lent Facebook to be so successful."

Newsy reached out to Facebook, YouTube and Twitter about new policies to deal with COVID-19 vaccine misinformation. 

A Twitter spokesperson told us the site is “still working through our policy and product plans in advance of the introduction of a viable, medically-approved vaccination.” Its priority now is removing COVID-19 content with potentially harmful calls to action.

Facebook recently said it will “start removing false claims about these vaccines that have been debunked by public health experts.” YouTube’s policies state it will remove videos with COVID-19 misinformation that goes against consensus from the World Health Organization or local health authorities.

Experts told Newsy that some of the proactive fact-checking efforts by social media companies should be commended. 

"I think what we saw before with anti-vaxxer movements associated with the MMR vaccine, those types of claims just went unfettered, nothing whatsoever debunked," said Robert McKeever, associate professor of mass communication at the University of South Carolina. "And now you're seeing on social media not only debunking, but getting out in front of potential misinformation that may arise related to vaccines." 

"I think it's been really useful in the last few months to see social media companies beginning to label information as factually untrue or to suggest that there's other sources of information," said Jennifer Reich, professor of sociology at the University of Colorado-Denver and author of "Calling the Shots: Why Parents Reject Vaccines."

But even if the biggest social platforms use robust artificial intelligence and widespread fact checking to identify harmful COVID-19 misinformation, users still may find ways to post or share it.

"When Facebook a few years ago started to put down more restrictions on the way that anti-vaccine content was being shared on the platform [...] one thing that users started doing is not spelling out the word vaccine," Koltai said. "They started adopting new terminology. So anything from X-vaxxer instead of antivaxxer. You know, we're talking about medical freedom. And so these are terms that are even harder to monitor because you don't want to say stuff like, 'Oh, I'm against medical freedom.'"

To effectively combat COVID-19 vaccine misinformation, experts say, social media companies need to be aggressive and fast in removing or labeling content. 

"What Facebook and other platforms need to be doing is really actively monitoring in the same way that they monitored for election misinformation," Koltai said. "This misinformation can happen very quickly. And then after it spreads, even if you take it down the next day, the damage from that misinformation can already be done."