Misinformation Alerts
Knowing what misinformation is being shared can help you generate effective messaging.
These insights are based on a combination of automated media monitoring and manual review by public health data analysts. Media data are publicly available data from many sources, such as social media, broadcast television, newspapers and magazines, news websites, online video, blogs, and more. Analysts from the Public Good Projects triangulate these data with data from fact-checking organizations and investigative sources to provide an accurate, but not exhaustive, list of currently circulating misinformation.
Alerts are categorized as high, medium, and low risk.
- High risk alerts: We recommend directly addressing and debunking the misinformation.
- Medium risk alerts: We recommend monitoring the situation but not actively engaging.
- Low risk alerts: Provided for informational purposes. We do not recommend additional action at the moment.
Several popular vaccine opponents claim that the updated COVID-19 vaccines were not properly safety tested in humans, specifically emphasizing that Pfizer’s vaccine was tested in 10 mice and Moderna’s in 50 people.
Recommendation: This misleading claim underscores a persistent misconception about how vaccines are developed and tested. Debunking messaging may explain that updated COVID-19 vaccines use the same basic formula as the original vaccines and therefore don’t require new large-scale clinical trials, similar to annual flu shots and other vaccines that are updated to protect against new strains of the same pathogen. COVID-19 vaccine ingredients have been safety tested and monitored for over three years, and minor changes to the vaccine’s target, like those in the updated COVID-19 vaccines, don’t affect its safety. Messaging may also emphasize that the updated COVID-19 vaccines are effective against the XBB.1.5 subvariant and its close relatives, which make up around 90 percent of currently circulating variants. Fact Checking Source(s):
An anti-vaccine organization published an article claiming a two-month-old died 34 hours after being vaccinated due to toxic levels of aluminum in his blood. There is no evidence that the death is related to vaccines.
Recommendation: The widespread nature of this myth and its potential to cause general vaccine hesitancy elevates its risk. Prebunking messaging may explain that aluminum is not inherently toxic; it’s a naturally occurring element in the air, water, and earth around us. Debunking messaging may emphasize that certain vaccines contain aluminum as an adjuvant (an ingredient that boosts immune response) and that the amount of aluminum in vaccines is a tiny fraction of the amount that would be toxic to an infant. It is up to health officials, not anti-vaccine organizations, to determine the cause of death. Fact Checking Source(s): Children's Hospital of Philadelphia, CDC
Several trending posts share a video of Anthony Fauci “finally admitting” that COVID-19 vaccines can cause myocarditis, falsely insinuating that he previously denied a potential link.
Recommendation: The potential for this misinformation to damage trust in federal health authorities and public health guidelines increases its risk. Debunking messaging may emphasize that federal health agencies were the first to detect and alert the public to myocarditis as a rare potential risk of mRNA COVID-19 vaccines in June 2021. In the following months, Fauci discussed vaccine-related myocarditis risk several times. Messaging may also emphasize that data from vaccine clinical trials, three years of safety monitoring, and real-world data demonstrate the safety of the mRNA vaccines. Infections like COVID-19 and the flu are the most common cause of myocarditis. Many large-scale studies have shown that the risk of developing myocarditis and other heart complications after COVID-19 infection is far higher than the risk after vaccination. Additionally, vaccine-related myocarditis cases are significantly milder than viral myocarditis, with most resolving on their own. Fact Checking Source(s): Yale Medicine, American Heart Association
A cancer researcher claims that vitamin D is a “key factor” and “all that was needed” against COVID-19 in a viral post. The post claims that researchers studying a potential link between vitamin D and COVID-19 are being silenced and “disappeared,” despite one such study being published earlier this year. Notably, the researcher works for a wellness company that sells supplements like “optimized vitamin D” and a fake COVID-19 vaccine “detox.”
Recommendation: The promotion of supplements as alternatives to vaccination is widespread and persistent. Debunking messaging may emphasize that while some research suggests vitamin D deficiency may affect COVID-19 risk and outcomes, there is no evidence that vitamin D supplements are a key factor in fighting COVID-19, as the post claims. Studies are ongoing to determine whether the nutrient may play a role in preventing or treating COVID-19. Vaccines remain the best-proven protection against COVID-19. Fact Checking Source(s): JAMA, The Scientist, BMJ
A surgeon discourages parents from vaccinating their children in a recent episode of a podcast, claiming childhood vaccines contain “aborted babies,” aluminum, formaldehyde, “baby cow blood,” yeasts, and mercury.
Recommendation: Responding to each false claim may detract from priority talking points. We recommend ensuring that FAQs include informational and educational resources about vaccine ingredients. Prebunking messaging could highlight the anti-vaccine tactic of invoking fake and “scary”-sounding vaccine ingredients to discourage vaccination. Fact Checking Source(s):
A popular anti-vaccine account claims that unvaccinated children are healthier than vaccinated children, citing a scientist whose study linking autism to vaccines was retracted.
Recommendation: The claim that routine vaccines weaken children’s immune systems and make them more susceptible to disease is persistent. Debunking messaging may emphasize that decades of research have found no evidence that vaccines weaken the immune system or cause any chronic illness. Children are far more likely to be harmed by a vaccine-preventable disease than by a vaccine. Fact Checking Source(s): Health Feedback, Public Health Association of BC
A video clip has resurfaced of a social media CEO claiming that they were worried mRNA vaccines might alter DNA. The CEO is being criticized for silencing people who questioned the vaccines’ safety while allegedly expressing similar beliefs privately.
Recommendation: This misleading clip has resurfaced and been debunked several times before. Debunking messaging may explain that the clip was deceptively edited to hide that the video was recorded while COVID-19 vaccines were still in development and before the CEO understood how they work. Messaging may also reinforce that vaccine mRNA cannot enter the cell nucleus or alter DNA. Fact Checking Source(s): Snopes, USA Today
A social media post with tens of thousands of engagements claims that vaccines work by causing health issues like seizures, allergies, and heart problems and keeping the medical industry in business.
Recommendation: The high engagement on the post increases its risk. Debunking messaging may emphasize that vaccines are one of the most cost-effective medical interventions available. Doctors recommend immunizations to protect children and their families from debilitating and deadly diseases, not to keep the medical industry in business. Fact Checking Source(s): Boost Oregon, PublicHealth.org
A popular anti-vaccine social media account shared a screenshot of a social media user claiming to have developed 34 blood clots after getting a COVID-19 booster. The post received tens of thousands of engagements.
Recommendation: Misinformation linking COVID-19 vaccines to blood clots is widespread and persistent online. Debunking messaging may explain that there is no evidence that mRNA COVID-19 vaccines increase blood clot risk and that blood clots have been reported as an extremely rare adverse reaction to two non-mRNA COVID-19 vaccines. Messaging may also emphasize that COVID-19 infection increases blood clot risk far more than any COVID-19 vaccine and highlight that viral images of alleged vaccine-related blood clots are almost always unsourced and, in some cases, are not blood clots at all. Fact Checking Source(s): Nebraska Medicine, Poynter, University at Buffalo
A bizarre conspiracy theory circulating on social media claims the Burning Man festival was shut down and a national emergency declared due to an Ebola outbreak. Some posts repeating the claim include a fake screenshot of a health authority post confirming the outbreak.
Recommendation: Responding to every conspiracy theory may detract from priority talking points. There was no Ebola outbreak, and no national emergency was declared; in fact, thousands of festival-goers were left stranded in the Nevada desert after storms caused flooding and road closures. This misinformation about Ebola in online vaccine conversations, as opposed to COVID-19 or other vaccines like HPV, was unusual in part because ERVEBO, the Ebola Zaire vaccine, is not currently commercially marketed in the U.S. It is only stockpiled in the Strategic National Stockpile and made available through CDC for pre-exposure vaccination of individuals. Fact Checking Source(s):
Vaccine Misinformation Guide
Get practical tips for addressing misinformation in this guide.