Editorials urge clinicians and researchers to speak out against misinformation. Public health leaders warn that trust in evidence-based guidance is declining. Scientists describe being “under attack for someone else’s political gain.”
These pleas, warnings, repetitions of scientific facts and proclamations of authority in response to misinformation and conspiracy beliefs are welcome but ineffective. It is more productive to think about the conditions that allow conspiracy theories to take root and flourish.

Francisco Goya – The Sleep of Reason Produces Monsters (1799). When understanding fails, fear and fantasy rush in.
Insights from social science
Conspiratorial beliefs were once treated as outliers – signs of individual pathology or cognitive weakness, driven by ignorance or bad-faith actors.
But in a polarised world marked by declining trust in institutions, misinformation is rife, even mainstream. In the US, three quarters of people endorse at least one conspiratorial belief on major public issues. This forces a re-evaluation.
The key question is not who believes these ideas, but when and under what conditions.
Research offers the following insights:
- Education and intelligence are no protection; conspiracy beliefs take hold even in literate, educated populations.
- Conspiracy beliefs are almost universal across cultures and historical periods.
- They are consequential, shaping health behaviour, political participation and social cohesion.
- They are emotional and social responses to anxiety, threat and group identity.
- They are context-sensitive, increasing during periods of rapid social change or institutional failure – such as the COVID-19 crisis – and during longer-term disruptions like climate change and technological transformation.
Healthcare is particularly vulnerable
Conspiracy beliefs are predictable responses to specific conditions, especially uncertainty, power asymmetries and perceived threat. Modern healthcare generates these conditions.
- Uncertainty relates to causes (“why did I get cancer?”), risks (“what are the chances this vaccination will harm me or my child?”) and outcomes (“what happens if I do nothing?”).
- Power asymmetry refers to the gap between regulators, public health agencies, pharmaceutical companies and governments, and those who live with the consequences of their decisions – a gap that’s easy to exploit.
- For many people, medical decisions – such as whether to accept vaccination for themselves or a child – feel threatening to their physical health, autonomy, identity and values.
Visibility and probability
Healthcare also puts pressure on trust and sense-making because mechanisms such as immunity, viral transmission and genetic risk operate invisibly; when vaccines work, nothing happens and disease never appears.
Clinical recommendations are framed in probabilities – population averages and risk reduction – while the patient experience is singular, immediate and binary.
The paradox of prevention
Many healthcare interventions are offered to people who are well, or at least feel well. Childhood immunisation, cancer screening, statins and other preventive therapies ask individuals to accept interventions today, with some degree of risk, for the possibility of a future benefit.
When trust is established, these challenges are manageable. When trust is fragile, alternatives take hold.
The three “C”s
If healthcare creates conditions for adoption of conspiracy beliefs, the next question is how people respond. Communication scholar Dannagal G. Young argues that misinformation spreads because it meets three basic human needs – for control, comprehension and community – the “three Cs”.

Control
When people feel they have little control over decisions in their lives, conspiracy explanations become more appealing.
Conspiracy narratives present interventions like vaccination as something done to individuals – and something they can resist. Even when the stories are false, they restore a sense of personal agency.
Comprehension
Official explanations emphasise uncertainty and trade-offs, and are revised with new knowledge. This is scientifically honest, but also hard work for the listener.
Conspiracy narratives have simple causes, identifiable actors and obvious motives. People who adopt these explanations often feel they understand what is really going on, sometimes more clearly than experts do. Psychologists call this the illusion of explanatory depth.
Community
Health beliefs are formed and sustained in families, peer groups, religious communities, online networks and political movements. Information from within the group is readily accepted; outside information is viewed with suspicion even when it is accurate.
Examples from Nigeria and South Africa
During polio eradication efforts in northern Nigeria in the early 2000s, vaccination was rumoured to be a foreign plot. This idea drew on long-standing social and political tensions and fears. Uptake improved only when the programme shifted from top-down messaging to trusted local leaders who explained in practical terms how vaccination worked and why it mattered.
In South Africa during the COVID-19 vaccine rollout, hesitancy clustered in specific social networks and misinformation spread quickly through channels like WhatsApp. Uptake improved when trusted local figures – clinicians, faith leaders and community organisers – engaged directly within those same networks.

Hieronymus Bosch – The Garden of Earthly Delights. In closed communities, shared beliefs can drift far from objective reality.
Evidence-based actions
The final question is what to do about it. If conspiracy beliefs grow out of social and psychological conditions then responses should address those conditions directly.
Make explanations easier to understand
Good explanations focus on how and why decisions are made without overwhelming amounts of detail. They spell out the evidence that has been considered, note what is still uncertain, and explain why guidance may change over time. People are more likely to accept health advice when they can form a clear mental picture of how it works and where it comes from.
Support people’s sense of choice
People feel more confident when they understand their options and feel heard. Community consultation, clear consent processes and shared decision-making are approaches that do not require abandoning evidence-based recommendations; they help people engage with them.
Work through people who are already trusted
Local clinicians, nurses, pharmacists, community health workers and faith leaders are generally trusted. Vaccination and other health programmes work best when they are delivered through these existing relationships. Effective programmes invest in community partnerships and locally led outreach.
Encourage reflection, don’t argue
People are more open to reconsidering their views when they are invited to explain their thinking, ask questions and reflect rather than being challenged or corrected. This approach helps people recognise gaps in their own understanding without feeling attacked or dismissed.
Limit the spread of harmful misinformation
On the supply side, alongside community engagement, deliberate and organised disinformation needs faster identification and disruption, as do the digital platform practices that amplify misleading or harmful content.

Albrecht Dürer – Melencolia I (1514). Surrounded by tools of reason, yet unable to act.
Misinformation may come from government
In the United States, we are witnessing a complex shift where narratives that diverge from established science are articulated and promoted by scientific and public health agencies under government direction.
What was once external pressure has, in some cases, become internalised. Major changes to national vaccine advisory structures and immunisation guidance have been criticised by health experts and state officials as lacking scientific basis or transparency.

Pieter Bruegel the Elder – The Blind Leading the Blind (1568).
This reflects the same demand-side dynamics that drive conspiracy beliefs. The three Cs – control, comprehension, and community – help explain why government actors may adopt, amplify or tolerate claims that diverge from scientific evidence, even while invoking the language of trust and inclusion.
Control inside institutions
New policies emphasise individual choice and decentralised decision-making, often framed as respect for personal autonomy. This language resonates with public concerns but weakens the role of evidence. Personal agency becomes an end in itself rather than something exercised within boundaries set by defined population-level risks and benefits. The tension is that political manoeuvrability may override scientific consensus.
Comprehension and strategic ambiguity
Scientific uncertainty – for example about the origins of autism or claimed unsafe vaccine ingredients – has been highlighted without an equivalent effort to explain how decisions are actually made. The scientific basis for shifts in guidance, and the distinction between evidence and political, legal or operational constraints, is left unclear. The result fertilises the ground for suspicion and conspiracy, now reinforced by official silence or inconsistency.
Community and institutional identity
Government agencies operate within an increasingly fragmented political and social landscape. When sceptical or conspiratorial views are echoed, legitimised, or left unchallenged by official voices, they gain credibility. Beliefs that once circulated in oppositional communities are being normalised in official structures.
A governance problem, not a communication failure
A charitable interpretation is that these dynamics reflect attempts to govern in a challenging, polarised environment. A less generous one is that mistrust is being accommodated rather than addressed. Demand-side forces that drive conspiracy beliefs are now shaping policy from within.
This is a governance problem with consequences for trust, public health and the credibility of science, both domestically in the US and globally.
Conclusions
Rebuilding trust is a structural challenge. It requires healthcare systems to restore a sense of agency (control), provide accessible logic (comprehension), and leverage trusted local networks (community). Only by addressing the conditions that allow conspiracies to flourish can we expect them to be replaced with more truthful and beneficial versions of reality.
References and resources
Centers for Disease Control and Prevention. (2024). Measles cases and vaccination coverage. https://www.cdc.gov/measles
Douglas K, Sutton R, Cichocka A. The psychology of conspiracy theories. Current Directions in Psychological Science. https://doi.org/10.1177/0963721419881991
Engelbrecht M, Heunis C, Kigozi G. COVID-19 Vaccine Hesitancy in South Africa: Lessons for Future Pandemics. Int J Environ Res Public Health. 2022 May 30;19(11):6694. doi: 10.3390/ijerph19116694.
Hotez P. (2023). Scientists are under attack. Nature. https://www.nature.com/articles/d41586-023-02981-z
How the Power of Shared Identity Can Help Defuse Health Misinformation. https://www.ihi.org/library/blog/how-power-shared-identity-can-help-defuse-health-misinformation
Rozenblit L., Keil, F. (2002). The misunderstood limits of folk science: An illusion of explanatory depth. Journal of Experimental Psychology: General, 131(4), 521–534.
UNICEF. (2021). Vaccine misinformation management field guide. https://www.unicef.org/mena/reports/vaccine-misinformation-management-field-guide
UNICEF. (2023). New data indicates 30% decline in confidence in childhood vaccines in South Africa. https://www.unicef.org/southafrica/press-releases/new-data-indicates-30-decline-confidence-childhood-vaccines-south-africa
van Prooijen J, Douglas K. (2018). Belief in conspiracy theories: Basic principles of an emerging research domain. European Journal of Social Psychology, 48(7), 897–908. https://doi.org/10.1002/ejsp.2530
World Health Organization. (2020). How Nigeria eradicated polio.
World Health Organization. (2022). Behavioural and Social Drivers of Vaccination (BeSD): Practical guide. https://www.who.int/publications/i/item/9789240049680
World Health Organization. (2023). Global immunization coverage. https://www.who.int/news-room/fact-sheets/detail/immunization-coverage
Young, D. (2023). Wrong: How Media, Politics, and Identity Drive Our Appetite for Misinformation. Johns Hopkins University Press. https://www.press.jhu.edu/books/title/12834/wrong