A renewed focus on epistemology—the study of how knowledge is constructed and validated—is essential for addressing the challenges of 21st-century science. Central to this is cultivating epistemic humility in the training of future scientists and physicians in order to revitalize the scientific method at a time when it is urgently needed to confront global challenges such as climate change, pandemics, and the ethical development of artificial intelligence.
As early human civilizations emerged, they required more precise knowledge of nature and control over natural phenomena, which gave rise to formal philosophical inquiry into the nature of truth. Classical epistemological metaphors—such as Plato's allegory of the cave and Zhuangzi's butterfly dream—underscore the enduring challenge humans face in distinguishing illusion from reality. The scientific revolution introduced a modern approach to investigating the natural world.
The scientific method succeeds precisely because it acknowledges the limitations of human cognition in discerning truth. Evolution prioritized survival over accuracy, and the human mind adapted to make rapid judgments using what psychologists call the “fast pathway”—relying on heuristics, analogies, and emotional cues.1 Our reasoning is further shaped by social pressures that prioritize group cohesion over objective analysis. Scientists are no exception: they, too, are subject to ambition, competition, and herd mentality.
To counter these tendencies, the scientific method is specifically designed to engage a slower, more deliberate mode of cognition. Rooted in the intellectual traditions of Socrates and Descartes, it embraces systematic doubt and builds understanding from first principles, using structured methods to guard against bias and self-deception. Epistemic humility is a modern extension of this tradition. It begins by acknowledging that the human observer is inseparable from what is being observed, and that bias and social influence are ever-present. This mindset holds conclusions as provisional and subject to revision in light of new evidence or broader perspectives.
Epistemic humility begins with metacognition: the ability to reflect on one's own thought processes. Because the fast, intuitive pathway dominates human cognition, conscious effort and self-regulation are required to engage slower, more analytical reasoning. This means resisting first impressions, seeking out diverse perspectives, and remaining open to the possibility of error. Systems thinking complements this mindset by breaking down complex problems into constituent parts and then reintegrating these into a broader context—recognizing interdependencies across variables and scales. This approach is especially relevant today, as science constructs increasingly sophisticated models, from the molecular level of biology and genetics to the macroscale in the design of health systems and care delivery.
Teaching epistemic humility in light of present-day psychological science requires confronting the “bias blind spot”—our tendency to recognize flaws in others' reasoning more readily than in our own. Research suggests no single intervention reliably overcomes this, but several promising strategies exist. Diverse teams can enhance critical thinking by introducing heterogeneity of thought. Artificial intelligence (AI) offers another potential tool: its ability to approach problems differently from humans has already proven useful in many domains. However, AI is not a panacea—it can inherit human biases from its training data, and users may override an AI's suggestions even when those suggestions are valid. While recent studies have raised concerns that over-reliance on AI may dull critical thinking, others have shown that AI can enhance learning when applied appropriately.2 Another approach, mindfulness training, is intended to support metacognition and self-regulation, though evidence for its impact on critical thinking remains limited. Further research is needed to identify effective strategies for strengthening critical thinking and teaching epistemic humility.
A growing crisis of trust in science and its institutions adds urgency to these efforts. Polarized media ecosystems, algorithm-driven information feeds, and rampant misinformation have eroded public confidence in experts and created ideological echo chambers. False narratives often spread faster than truth, distorting public discourse and weakening the authority of science.3 The information space is dominated by moneyed interests seeking to influence not only spending behavior but also political opinions and voting patterns. Even scientific meetings and clinical guideline development are not impervious to the influence of outside interests such as pharmaceutical companies.
Yet there is reason for hope: research suggests that scientists who openly acknowledge uncertainty and limitations are perceived as more trustworthy.4 With proper oversight, AI could serve not as a destabilizing force but as a catalyst for epistemic rigor. We must foster a culture of epistemic humility—one that values doubt, embraces complexity, and remains open to revision. Such a culture can renew the scientific enterprise and guide it toward a new era of discovery.
During the preparation of this work, the authors used ChatGPT for editing and formatting suggestions. After using this tool, the authors reviewed and edited the content as needed and take full responsibility for the content of the published article.
Declaration of interests
The authors declare no competing interests and received no specific funding for this work.
Contributor Information
Leo Anthony Celi, Email: lceli@mit.edu.
Matilda Dorotic, Email: matilda.dorotic@bi.no.
Joseph Dubin, Email: joseph.dubin@bmc.org.
Noushin Nazarian, Email: n.nazarian@unimelb.edu.au.
Reza Salarikia, Email: salarikiareza@gmail.com.
References
- 1. Kahneman D. Thinking, fast and slow. New York (NY): Farrar, Straus and Giroux; 2011.
- 2. Celi L.A. Teaching machines to doubt. Nat Med. 2025. doi:10.1038/s41591-025-04013-x.
- 3. Vosoughi S., Roy D., Aral S. The spread of true and false news online. Science. 2018;359(6380):1146–1151. doi:10.1126/science.aap9559.
- 4. Koetke J., Schumann K., Bowes S.M., Vaupotič N. The effect of seeing scientists as intellectually humble on trust in scientists and their research. Nat Hum Behav. 2025;9(2):331–344. doi:10.1038/s41562-024-02060-x.
