Anti-COVID-19 vaccine accounts thrive on social media, despite platform rules

With vaccination against COVID-19 in full swing, social platforms like Facebook, Instagram and Twitter say they’ve stepped up their fight against misinformation that aims to undermine trust in the vaccines. But problems abound.

For years, the same platforms have allowed anti-vaccination propaganda to flourish, making it difficult to stamp out such sentiments now. And their efforts to weed out other types of COVID-19 misinformation — often with fact-checks, informational labels and other restrained measures — have been woefully slow.

Twitter, for instance, announced this month that it will remove dangerous falsehoods about vaccines, much the same way it’s done for other COVID-related conspiracy theories and misinformation. But since April 2020, it has removed a grand total of 8,400 tweets spreading COVID-related misinformation — a tiny fraction of the avalanche of pandemic-related falsehoods tweeted out daily by popular users with millions of followers, critics say.

"While they fail to take action, lives are being lost," said Imran Ahmed, CEO of the Center for Countering Digital Hate, a watchdog group. In December, the nonprofit found that 59 million accounts across social platforms follow peddlers of anti-vax propaganda — many of whom are immensely popular superspreaders of misinformation.

Efforts to crack down on vaccine misinformation now, though, are generating cries of censorship and prompting some posters to adopt sneaky tactics to avoid the axe.

"It’s a hard situation because we have let this go for so long," said Jeanine Guidry, an assistant professor at Virginia Commonwealth University who studies social media and health information. "People using social media have really been able to share what they want for nearly a decade."

The Associated Press identified more than a dozen Facebook pages and Instagram accounts, collectively boasting millions of followers, that have made false claims about the COVID-19 vaccine or discouraged people from taking it. Some of these pages have existed for years.

Of more than 15 pages identified by NewsGuard, a technology company that analyzes the credibility of websites, roughly half remain active on Facebook, the AP found.

One such page, The Truth About Cancer, has more than a million Facebook followers after years of posting baseless suggestions that vaccines could cause autism or damage children’s brains. The page was identified in November as a "COVID-19 vaccine misinformation super spreader" by NewsGuard.

Recently, the page stopped posting about vaccines and the coronavirus. It now directs people to sign up for its newsletter and visit its website as a way to avoid alleged "censorship."

Facebook said it is taking "aggressive steps to fight misinformation across our apps by removing millions of pieces of COVID-19 and vaccine content on Facebook and Instagram during the pandemic."

"Research shows one of the best ways to promote vaccine acceptance is by showing people accurate, trusted information, which is why we’ve connected 2 billion people to resources from health authorities and launched a global information campaign," the company said in a statement.

Facebook also banned ads that discourage vaccines and said it has added warning labels to more than 167 million pieces of additional COVID-19 content thanks to its network of fact-checking partners. (The Associated Press is one of Facebook's fact-checking partners.)

YouTube, which has generally avoided the same type of scrutiny as its social media peers despite being a source of misinformation, said it has removed more than 30,000 videos since October, when it started banning false claims about COVID-19 vaccinations. Since February 2020, it has removed over 800,000 videos related to dangerous or misleading coronavirus information, said YouTube spokeswoman Elena Hernandez.

Prior to the pandemic, however, social media platforms had done little to stamp out misinformation, said Andy Pattison, manager of digital solutions for the World Health Organization. In 2019, as a measles outbreak slammed the Pacific Northwest and left dozens dead in Samoa, Pattison pleaded with big tech companies to consider tightening rules around vaccine misinformation that he feared might make the outbreak worse — to no avail.

It wasn’t until COVID-19 struck with a vengeance that many of those tech companies started listening. Now he meets weekly with Facebook, Twitter and YouTube to discuss trends on their platforms and policies to consider.

"When it comes to vaccine misinformation, the really frustrating thing is that this has been around for years," Pattison said.

The targets of such crackdowns are often quick to adapt. Some accounts use intentionally misspelled words — like "vackseen" or "v@x" — to avoid bans. (Social platforms say they're wise to this.) Other pages use more subtle messaging, images or memes to suggest that vaccines are unsafe or even deadly.

"When you die after the vaccine, you die of everything but the vaccine," read one meme on an Instagram account with more than 65,000 followers. The post suggested that the government is concealing deaths from the COVID-19 vaccine.

"It’s a very fine line between freedom of speech and eroding science," Pattison said. Purveyors of misinformation, he said, "learn the rules, and they dance right on the edge, all the time."

Twitter said it is continuously reviewing its rules in the context of COVID-19 and changes them based on guidance from experts. Earlier this month, it added a strikes policy that threatens repeat spreaders of coronavirus and vaccine misinformation with bans.

But blatantly false COVID-19 information continues to pop up. Earlier this month, several articles circulating online claimed that more elderly Israelis who took the Pfizer vaccine were "killed" by the shot than those who died from COVID-19 itself. One such article from an anti-vaccination website was shared nearly 12,000 times on Facebook, contributing to a spike of nearly 40,000 mentions of "vaccine deaths" across social platforms and the internet, according to an analysis by media intelligence firm Zignal Labs.

Medical experts point to a real-world study showing a strong correlation between vaccination and decreases in severe COVID-19 disease in Israel. The nation’s health ministry said in a Thursday statement that the COVID-19 vaccine has "profoundly" reduced the rate of deaths and hospitalizations.

As U.S. vaccine supplies continue to increase, immunization efforts will soon shift from targeting a limited supply to the most vulnerable populations to getting as many shots into as many arms as possible. That means tackling the third of the country’s population who say they definitely or probably won’t get vaccinated, as measured by a February AP-NORC poll.

"Vaccine hesitancy and misinformation could be a big barrier to getting enough of the population vaccinated to end the crisis," said Lisa Fazio, a professor of psychology at Vanderbilt University.

Some health officials and academics generally believe that the social-platform efforts are helpful, at least on the margins. What’s not clear is how big of a dent they can put in the problem.

"If someone truly believes that the COVID vaccine is harmful and they feel a responsibility to share that with friends and family ... they will find a way," Guidry said.

And some still blame business models that they say encouraged the platforms to serve up engaging, if false, coronavirus content in order to profit from advertising.

When the Center for Countering Digital Hate recently studied the crossover between different types of disinformation and hate speech, it found that Instagram tended to cross-pollinate misinformation via its algorithm. Instagram might feed an account that followed a QAnon conspiracy site further posts from, say, white nationalists or anti-vaxxers.

"You continue to allow things to disintegrate because of the seamless intermingling of misinformation and information on your platforms," Ahmed, the center’s CEO, said.
