Annals of Medicine and Surgery

Editorial

2025 Jun 10;87(8):4693–4694. doi: 10.1097/MS9.0000000000003456

How AI quietly undermines the joy and effort of learning: a call for rebalancing education in the digital age

Md Kamrul Hasan a,b,*
PMCID: PMC12333830. PMID: 40787558

The integration of artificial intelligence (AI) into the learning process has undoubtedly reshaped education, providing learners with access to resources, tailored experiences, and efficient tools for tackling problems. However, an important question remains: what is the cost of this transformation to the intrinsic joy of learning[1]? While discussions about AI ethics, including fairness, transparency, and accountability, have taken center stage, the impact on the experience of learning itself has not received equal attention. This editorial examines how excessive use of AI diminishes the satisfaction that comes from overcoming challenges through personal effort.

Learning is fundamentally rewarding because of the intellectual challenges it involves. Overcoming such challenges activates the brain’s reward system, producing dopamine that reinforces motivation and engagement. The satisfaction derived from this process – the effort-reward cycle – is a cornerstone of human development[2]. However, many AI tools, by offering immediate solutions, sidestep this process. For instance, tools that generate essays or solve complex problems in seconds remove the need for critical thinking or creativity, reducing the learner’s role to one of passive consumption. Although these tools save time, they often shift the focus from understanding to mere completion, undermining the depth and richness of learning[3].

Another core aspect often overlooked in discussions of AI-driven learning is the value of background knowledge. When learners are exposed to the context or history behind a topic – even briefly – it tends to stimulate curiosity, deepen engagement, and create cognitive hooks for long-term retention. In contrast, relying on AI to deliver direct answers bypasses this natural entry point into learning. Without understanding the “why” behind a concept, learners may struggle to apply knowledge meaningfully or retain it over time. This undermines not only the joy of learning but also its sustainability and relevance in real-world settings[4].

Consider the example of a student using an AI-driven tool to write an essay. While the final product may appear polished, the student bypasses critical thinking and the iterative process of drafting and refining ideas. Similarly, reliance on AI for solving mathematical problems deprives learners of the opportunity to develop problem-solving skills, which are vital not only academically but also in real-life scenarios. This impact can differ across education levels. For example, overuse of AI in early education may hinder foundational cognitive and emotional development due to decreased active engagement[2]. In secondary and higher education, AI can undermine the development of independent thinking and deeper conceptual learning[1]. In professional and medical training, AI is useful for standardizing procedural skills but should be used alongside hands-on learning to preserve clinical reasoning and decision-making capabilities[3,5]. For example, AI-driven simulation tools allow trainees to repeatedly practice procedural steps in a controlled environment, with built-in feedback mechanisms that enhance learning precision and confidence[5]. Yet the ease these tools provide may create an illusion of mastery while inhibiting the brain’s capacity to adapt and grow through effortful engagement.

The long-term psychological implications of this shift are significant. Cognitive neuroscience underscores that the brain’s plasticity thrives on effortful learning experiences. Delegating tasks that require deep thought to AI reduces the brain’s ability to form and strengthen the neural pathways essential for critical thinking and creativity[6]. This phenomenon, often referred to as “cognitive outsourcing,” not only hampers intellectual growth but also diminishes confidence in one’s own abilities over time. Furthermore, the standardization inherent in many AI systems risks homogenizing educational experiences. By promoting generic solutions, AI discourages individuality and the diverse approaches that are central to human learning[7].

Some argue that policies and regulations can address these concerns by controlling the extent to which AI is used in education. For example, the European Commission’s Ethics Guidelines for Trustworthy AI highlight the importance of principles such as human agency and societal well-being[8]. Similarly, Canada’s proposed Artificial Intelligence and Data Act aims to ensure that AI systems are transparent, fair, and accountable[9]. In the United States, the Blueprint for an AI Bill of Rights seeks to protect individuals from potential harms of AI systems, emphasizing fairness and security[10]. While these frameworks are critical for guiding responsible AI use, they do little to tackle the experiential aspects of learning. Policies may limit AI usage, but they cannot force learners to engage deeply or reclaim the joy of learning through effort.

The real challenge lies in redefining AI’s role in education. AI should act as a complement to, rather than a replacement for, human effort. It can be used to enhance learning by providing personalized feedback, identifying areas of difficulty, and suggesting resources to overcome challenges. Such applications allow learners to engage meaningfully with the material while still benefiting from AI’s capabilities. However, this requires a cultural shift in how both educators and students perceive AI. Integrating AI responsibly into education demands a commitment to preserving the integrity of the learning process[7].

Educators play a pivotal role in this transformation. By developing curricula that balance traditional methods with AI tools, they can ensure that students actively participate in the learning process. For example, instead of permitting students to rely solely on AI-generated answers, teachers can encourage them to use AI as a supportive resource while still demonstrating their understanding through original work. Assessments can also be restructured to reward creativity and effort rather than focusing solely on accuracy or efficiency. Similarly, students must recognize that the value of learning lies not just in acquiring knowledge but in the intellectual and emotional growth that comes with grappling with challenges. Experiences like struggling with a difficult concept or solving a tough problem are irreplaceable and cannot be replicated by any AI tool[8].

While this editorial offers a theoretical perspective, the absence of primary data limits empirical validation of the ideas discussed. Nonetheless, the overuse of AI in education risks eroding the essence of what makes learning fulfilling: the effort-reward cycle. While ethical frameworks and policies are necessary for regulating AI use, they do not address the cognitive and emotional dimensions of this issue. The solution lies in fostering a balanced approach where AI is used to enhance rather than replace human effort. As we continue to explore the potential of AI in education, it is crucial to remember that learning is more than a transaction of knowledge; it is a deeply human experience characterized by discovery, effort, and growth.

Acknowledgements

The author affirms that all ideas and content presented in this manuscript are their own. ChatGPT (OpenAI) was used solely to assist with language refinement and structural clarity. All content was reviewed, edited, and approved by the author.

Footnotes

Sponsorships or competing interests that may be relevant to content are disclosed at the end of this article.

Published online 10 June 2025

Ethical approval

Not applicable.

Consent

Not applicable.

Sources of funding

No funding from any public, private, or non-profit research agency was received for this study.

Author contributions

Conceptualization, investigation, formal analysis, methodology, project administration, writing – original draft, writing – review and editing: M.K.H.

Conflicts of interest disclosure

The authors report no competing interests. The authors alone are responsible for the content and writing of this article.

Research registration unique identifying number (UIN)

Not applicable.

Guarantor

Md. Kamrul Hasan.

Provenance and peer review

Not commissioned, externally peer-reviewed.

Data availability statement

Not applicable.

Declarations

All relevant ethical, funding, authorship, and data-related declarations are included in the sections above.

References


