JMIR Medical Education. 2026 Mar 31;12:e95205. doi: 10.2196/95205

Authors’ Reply: Why Medical Education Without Artificial Intelligence Still Matters: A Neuroscience-Informed Perspective

Juan S Izquierdo-Condoy 1, Marlon Arias-Intriago 1, Laura Montero Corrales 2, Esteban Ortiz-Prado 1
Editor: Tiffany Leung
PMCID: PMC13037753  PMID: 41915771

We thank the author of the pertinent commentary, “Why Medical Education Without Artificial Intelligence Still Matters: A Neuroscience-Informed Perspective” [1]. We believe this letter makes a valuable contribution to the debate on artificial intelligence (AI) in medical education by emphasizing the cognitive consequences of training clinicians in environments where these tools are available yet not always reliable or accessible in everyday practice.

From our perspective, the author’s argument expands on a concern we raised previously: if AI, particularly generative AI, can compromise critical thinking and cognitive autonomy, this erosion may also translate into a reduced capacity to sustain clinical reasoning when the tool fails or is unavailable [2]. In our article, “Artificial Intelligence in Medical Education: Transformative Potential, Current Applications, and Future Implications” [3], we identified uncritical dependence on algorithmic outputs as one of the main limitations of incorporating AI into medical training.

The neurocognitive evidence cited by the author reinforces this argument. Recent studies suggest that the use of generative AI may alter how individuals engage in complex tasks. Although this evidence remains preliminary, it is consistent with what has already been described regarding automation bias and cognitive off-loading. Likewise, accepting automated suggestions without sufficient analysis may increase the risk of error when the system fails or is unavailable [4,5]. Taken together, these findings support a central idea: AI should support reasoning, not replace expert judgment.

In our viewpoint, we addressed these concerns through concrete proposals. Importantly, our discussion was not limited to generative AI but also encompassed other educationally relevant technologies, including natural language processing, intelligent tutoring systems, virtual reality, and augmented reality. In this context, we advocated for AI literacy curricula that would enable students not only to use these tools but also to critically evaluate them, recognize their limitations, identify biases, and assess their outputs against sound clinical reasoning. We also supported governance structures aligned with international ethical frameworks, such as the UNESCO (United Nations Educational, Scientific and Cultural Organization) Recommendation on the Ethics of Artificial Intelligence and the World Health Organization guidance on the ethics and governance of AI in health [3]. We consider human oversight, transparency, bias monitoring, and accountability to be essential safeguards for the responsible integration of AI.

We also agree with the author’s suggestion to include structured exercises in which students must function without AI support, as this may strengthen independent diagnostic reasoning and decision-making under uncertainty. At the same time, this discussion must acknowledge persistent technological inequalities, particularly in low-income countries, where access to advanced AI tools remains uneven [6].

We agree that AI integration in medical education should not displace human intelligence but rather enhance professional competence without undermining clinical autonomy or critical thinking. Preparing future physicians to function appropriately with or without AI will be essential to protect patient safety, respond across diverse contexts, and preserve the human dimension of medicine. We again thank the author for this valuable contribution and concur on the need for further empirical evidence, particularly longitudinal studies, to guide a prudent, ethical, and evidence-based integration of AI into medical education.

Acknowledgments

Language editing assistance was used during the preparation of this manuscript. A generative artificial intelligence (AI) tool (ChatGPT, OpenAI, GPT-5.4 Thinking) was consulted solely to improve grammar, clarity, and sentence structure, as English is not the authors’ first language. The tool was used exclusively for linguistic refinement and formatting support. All scientific concepts, interpretations, arguments, and conclusions presented in this manuscript were independently developed by the authors. The authors take full responsibility for the intellectual content, accuracy, and integrity of the work.

Abbreviations

AI

artificial intelligence

UNESCO

United Nations Educational, Scientific and Cultural Organization

Footnotes

Funding: The authors declared that no financial support was received for this work.

Conflicts of Interest: None declared.

References

  1. Verdonk C. Why medical education without artificial intelligence still matters: a neuroscience-informed perspective. JMIR Med Educ. 2026;12:e94594. doi: 10.2196/94594
  2. Izquierdo-Condoy JS, Arias-Intriago M, Tello-De-la-Torre A, Busch F, Ortiz-Prado E. Generative artificial intelligence in medical education: enhancing critical thinking or undermining cognitive autonomy? J Med Internet Res. 2025 Nov 3;27:e76340. doi: 10.2196/76340
  3. Izquierdo-Condoy JS, Arias-Intriago M, Montero Corrales L, Ortiz-Prado E. Artificial intelligence in medical education: transformative potential, current applications, and future implications. JMIR Med Educ. 2026 Feb 17;12:e77127. doi: 10.2196/77127
  4. Abdelwanis M, Alarafati HK, Tammam MMS, Simsekler MCE. Exploring the risks of automation bias in healthcare artificial intelligence applications: a Bowtie analysis. J Safety Sci Resilience. 2024 Dec;5(4):460–469. doi: 10.1016/j.jnlssr.2024.06.001
  5. Kosmyna N, Hauptmann E, Yuan YT, et al. Your Brain on ChatGPT: accumulation of cognitive debt when using an AI assistant for essay writing task. arXiv. 2025 Jun 10. doi: 10.48550/arXiv.2506.08872
  6. Izquierdo-Condoy JS, Arias-Intriago M, Nati-Castillo HA, et al. Exploring smartphone use and its applicability in academic training of medical students in Latin America: a multicenter cross-sectional study. BMC Med Educ. 2024 Nov 30;24(1):1401. doi: 10.1186/s12909-024-06334-w
