
No, Marburg Virus Isn't in COVID Shots: Users Debunk Vax Misinfo on X

— Community notes were accurate and cited mostly credible sources, researchers say

The Community Notes feature, an open-source misinformation countermeasure introduced to X (formerly Twitter) in 2022, was successful in correcting COVID-19 vaccine misinformation in viral posts, a study suggested.

Of 205 randomly sampled Community Notes on posts mentioning COVID vaccination, 97% were entirely accurate, 2% were partially accurate, and 0.5% were inaccurate, reported John Ayers, PhD, MA, of the Altman Clinical and Translational Research Institute at the University of California San Diego, and colleagues.

Most notes cited high- or moderate-credibility sources (49% and 44%, respectively), while 7% cited low-credibility sources, they detailed in a research letter published in JAMA.

About half of the notes addressed adverse events (51%), while 37% responded to conspiracy theories, 7% related to vaccine recommendations, and 5% covered vaccine effectiveness. On average, each post with a Community Note had more than a million views, the authors said.

Ayers told MedPage Today that while misinformation is a hot topic, he rarely hears about solutions for combatting it, which is what inspired this research.

"If you give the public the right tools ... and give them a forum to identify and respond to potential misinformation, then they can do that and do it in a way where it mirrors what experts would have done, which is what our study shows," he said.

For instance, one viral post claimed that COVID vaccines caused blood clots. But a Community Note corrected it, citing a literature review that did not find strong evidence to back that claim, though "it did find that 'occurrence of blood clots in COVID-19 is up to 10 times more common than from the vaccines' injection.'"

Another post claimed that "Marburg Virus is baked into the covid shots and will be activated by four 1 min pulse waves at 15GHz from 5G towers throughout the country." A Community Note, deemed entirely accurate and citing a source of moderate credibility, was added, pointing out that "there is no evidence for Marburg virus being in the COVID-19 vaccines. 5G does not cause viral illnesses. COVID-19 vaccines do not contain 5g technology."

While the posts featuring vaccine misinformation were viewed hundreds of millions of times, there were still many more posts with fewer views that didn't have a Community Note.

"The small number of notes addressing posts with COVID-19 vaccine misinformation suggests opportunities for health professionals to contribute to this mission via participating in Community Notes," Ayers and team wrote. Previous research found that the few physicians spreading COVID misinformation online mostly used X as their platform.

Becky Smullin Dawson, MPH, PhD, of Allegheny College in Meadville, Pennsylvania, told MedPage Today that this study demonstrates the enormous scope of misinformation online.

"I think the authors are pointing out an opportunity -- notes are a way to get accurate and credible information onto X. It is a band-aid solution to the wicked problem of misinformation, but a step forward nevertheless," she said, adding that this study proved that the notes are "both accurate and highly visible."

Dawson, who was not involved in the study, also highlighted the Swiss cheese model of pandemic defense used in the early days of the COVID pandemic, where layers of imperfect interventions add up to slow or stop transmission. She said that Community Notes are one slice of that model for combatting misinformation.

"It may be a layer with many holes in it, but if combined with other methods of combatting misinformation, it could help slow the spread of misinformation, or at least correct misinformation in a public space like X," she said.

For this study, the researchers assessed X's Community Notes data from December 2022 to December 2023. The 45,783 notes were filtered to identify the 657 that included "vaccin*" and "covid*" or "coronavirus" and then divided by primary topic: adverse events, conspiracy theories, vaccine recommendations, and vaccine effectiveness. They were also categorized by accuracy and by credibility based on what types of sources the notes cited.
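The filtering step the researchers describe amounts to a simple keyword screen of the public Community Notes export. The sketch below shows one way such a screen might look in Python; the file name ("notes-00000.tsv") and the free-text "summary" column are assumptions about the export format for illustration, not details taken from the study.

```python
# Minimal sketch of a keyword filter like the one described above.
# Assumes the Community Notes export is a tab-separated file with a
# free-text "summary" column (file name and column name are illustrative).
import pandas as pd

notes = pd.read_csv("notes-00000.tsv", sep="\t")
text = notes["summary"].fillna("").str.lower()

# Keep notes mentioning a vaccine term AND a COVID/coronavirus term,
# mirroring the "vaccin*" and ("covid*" or "coronavirus") criteria.
has_vaccine = text.str.contains("vaccin")
has_covid = text.str.contains("covid|coronavirus", regex=True)

covid_vaccine_notes = notes[has_vaccine & has_covid]
print(f"{len(covid_vaccine_notes)} notes matched the keyword filter")
```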

The authors noted that their primary limitation was that they assessed only note quality, though quality is predictive of a note's effectiveness and persuasiveness. They also did not study user engagement with the notes, nor how the notes affected perceptions or behavior.

In the future, they recommended investigating other health topics and encouraged other social media platforms to open-source their misinformation countermeasures for researchers to assess.


Rachael Robertson is a writer on the MedPage Today enterprise and investigative team, also covering OB/GYN news. Her print, data, and audio stories have appeared in Everyday Health, Gizmodo, the Bronx Times, and multiple podcasts.

Disclosures

This study was in part funded by the National Center for Advancing Translational Sciences and the Burroughs Wellcome Fund.

Ayers reported owning equity in Health Watcher and Good Analytics.

Other co-authors reported receiving personal fees from Pearl Health, Bloomberg LP, and Good Analytics; consulting fees from Bayer, Gilead, Model Medicines, Fluxergy, Linear Therapies, and Vx Biosciences; and grant support from the National Institute on Drug Abuse and the National Institutes of Health Clinical and Translational Science Awards.

Dawson had no conflicts of interest.

Primary Source

JAMA

Allen MR, et al "Characteristics of X (formerly Twitter) community notes addressing COVID-19 vaccine misinformation" JAMA 2024; DOI: 10.1001/jama.2024.4800.