Journal of Participatory Medicine

Editorial. 2025 May 16;17:e75794. doi: 10.2196/75794

From E-Patients to AI Patients: The Tidal Wave Empowering Patients, Redefining Clinical Relationships, and Transforming Care

Susan S Woods 1, Sarah M Greene 2, Laura Adams 2, Grace Cordovano 3, Matthew F Hudson 4
Editor: Amy Price
PMCID: PMC12101788  PMID: 40378413

Abstract

Artificial intelligence (AI) and large language models offer significant potential to enhance many aspects of daily life. Patients and caregivers are increasingly using AI for their own knowledge and to address personal challenges. The growth of AI has been extraordinary; however, the field is only beginning to explore its intersection with participatory medicine. For many years, the Journal of Participatory Medicine has published insights on tech-enabled patient empowerment and strategies to enhance patient-clinician relationships. This theme issue, Patient and Consumer Use of AI for Health, will explore the use of AI for health from the perspective of patients and the public.

Introduction

Artificial intelligence (AI) and large language models (LLMs) offer boundless potential to enhance many aspects of daily life. The promise of AI for health is profound: to discover new treatments, gain efficiencies, and deliver precision medicine—the right intervention to the right person at the right time [1]. Experts are effusive about AI, which can reduce cognitive workload, enhance prevention, and lower costs. Many temper this enthusiasm with caution, as the field struggles to genuinely address AI ethics, accountability, privacy, and governance [2].

Along with the hope (and hype) of AI within health care, people are swiftly taking AI into their own hands. Consumers are at the forefront in this era of AI. A survey conducted in January 2025 by the Imagining the Digital Future Center found that 52% of US adults used ChatGPT, Gemini, Copilot, or other LLMs. Among LLM users, half reported personal learning as their goal, and 39% sought information about physical or mental health [3]. Patients burdened with life-changing or rare conditions commonly search for the resources they need to solve problems. As consumer costs of care keep rising and health care remains relentlessly hard to navigate, patients and caregivers are building skills and gathering intelligence with LLMs across a breadth of topics. These information seekers go beyond clinical content, using AI for personalized advice on legal, financial, social, and many other of life's challenges.

While people may not realize the ubiquity of AI, millions interact with AI daily using assistants such as Siri or Alexa and streaming platforms such as Netflix and Spotify [4]. Launched in November 2022, ChatGPT reached 100 million users in 2 months and hundreds of millions of users by March 2024 [5]. This scorching adoption has been faster than for personal computers and the internet. In 2024, a total of 39.4% of US adults aged 18-64 years reported using generative AI, and 32% used it weekly. In contrast, 20% of the public used the internet 2 years after its launch, and 20% owned a computer after 3 years of availability. While price and ease of use play a role in the difference, the advancement of AI is without historic parallel.

Projections of the health AI market over the next decade are staggering, with estimates of US $27 billion in 2024 climbing to US $613 billion by 2034 [6]. At this early stage, the direct-to-consumer market may mature faster and more readily than inside health care [7]. Yet, current research on AI for health largely focuses on clinician and professional users. It is essential to study how AI can best serve patients while mitigating risks. Although papers on the use of AI by patients and the public are starting to emerge, we believe this is the first theme issue in a medical journal that is dedicated to the topic.

Rise in AI in Health Care Delivery Settings

Across health care, AI tools vary in their capabilities and stage of adoption (eg, to analyze data or optimize workflows) [8]. AI models now evaluate x-rays and other medical images, enhancing radiologists' diagnostic accuracy. AI is even in operating rooms, helping surgeons with the use of robotics during procedures. AI-enabled wearable devices gather patient data remotely to inform and augment cardiologists' decision-making. AI is synthesizing vast volumes of data locked in electronic health records, transforming raw data into actionable information. AI is accelerating pharmaceutical development, expediting drug discovery, and reducing the costs of clinical trials [9]. Notably, patient-physician-scientist partnerships are expanding, and the use of AI for "drug repurposing," or searching existing medications for those that may work against rare diseases, is also accelerating [10].

For patients, the visibility of AI in health care is low but rising. AI scribes are being used to record conversations during encounters and summarize visits. Automating visit documentation may realize a "holy grail" by giving clinicians more time for patients and families. One study found that a year after deploying AI scribes, most physicians had a positive experience. All patients in the study reported that AI had either a positive or neutral impact on the quality of their visit; only 8% of patients felt some level of discomfort [11]. These AI agents remain a work in progress, as AI documentation continues to improve in accuracy and completeness.

Health systems are using AI-derived content to respond to patients’ emails. Research on AI automated responses suggests that patients find messages to be satisfactory, with many comparable to emails from physicians; moreover, patients rated some responses as more empathetic than human clinician replies [12]. While AI messaging may help, health systems recognize the inherent risks in responding with inaccurate or potentially harmful information. Further, ethical concerns have been raised when patients believe responses are from a human and not a computer, or if they cannot ascertain whether replies are written by AI [13].

AI will remodel the patient experience and affect patient-clinician relationships. AI assistants do not replace the need for human judgment, particularly in cases requiring nuanced decisions. Importantly, patient and public involvement in AI development and refinement is critical to improve value, ensure safety, and engender trust. Further, the growth of AI tools that patients and caregivers use independently for their health warrants more attention [5].

The (R)evolution of Patient and Public Agency and Empowerment

The 21st century will be the age of the net empowered medical end user, and the patient-driven online support networks of today will evolve into more robust and capable medical guidance systems that will allow end users to direct and control an ever-growing portion of their own medical care.

[Tom Ferguson, MD, 2002] [14]

Ferguson was a family physician and pioneer who advocated for consumer use of the internet, believing that clinicians had much to learn from patients and families. He observed that patients who possessed internet-derived knowledge were more involved in their health and their care—the hallmark of participatory medicine [15]. He presciently wrote about tech-savvy patients who disengage from doctors who do not support their access to online information for self-care.

Participatory medicine continues to evolve, albeit sluggishly. For over three decades, the internet has served patients as a powerful tool to access previously unavailable information and connect with peers [16]. This shift in how people manage their health also altered power dynamics at medical visits and led to the term "Dr. Google" [17]. While greater patient control and contribution unfolded, not all clinicians have been comfortable with patients being online or with serving as a "guide" or "partner" rather than as an expert authority.

The Journal of Participatory Medicine (JoPM) has been a pioneer, contributing insights on tech-enabled patient empowerment and enhancing patient-clinician relationships. JoPM's early content was published on the Society for Participatory Medicine website, edited by Charlie Smith, Joe Graedon, and Terry Graedon, from 2009 to 2017. Authors included luminaries such as Esther Dyson, George Lundberg, Jessie Gruman, Kurt Stange, Kate Lorig, "e-patient Dave" deBronkart, and many others. In 2017, JoPM joined JMIR Publications as a peer-reviewed, open access journal to advance the science of participatory care (also referred to as coproduction and co-design). Published papers mirror the 15-year shift in relationships between patients, their health information, and their providers.

Health professionals often overestimate the risks of e-patients (patients and caregivers online) and underestimate their value [18]. Despite the long-standing evidence that a participatory decision-making style leads to greater patient satisfaction and trust in health professionals [19], medical educators and practitioners have yet to fully acknowledge that patients are already active managers of their care, failing to support patients in this role [20]. Yet the evidence is there: e-patients are more prepared, feel more in control of their care, and achieve better outcomes [21].

The value of patient-facing technology continues to soar. Patients can now access all of their clinical notes and test results online, as mandated by the 21st Century Cures Act. Open notes ushered in a wealth of research showing the benefits of shared data for patients and families [22]. Along with technology empowering patients, health care has adopted a more holistic perspective, shifting patient inquiry from "What is the matter with you?" to "What matters to you?" This approach robustly assesses social drivers of health and clarifies patient context, allowing care teams to codevelop realistic and achievable care plans.

The democratization of information and near-universal access to the internet have helped innumerable patients. Not all health care organizations celebrate such progress, however. Patient portals, a splendid tool for patients, also contribute to clinicians' administrative burden. Patient messaging volume has escalated, leading some organizations to charge for e-communication. Real-time access to laboratory, imaging, and pathology results causes apprehension among clinicians who feel unprepared when patients are the first to see results. Some clinicians also believe that patient access to their health information threatens therapeutic relationships and extends the length of visits [23].

AI advancements introduce a range of new challenges. Too much information may overwhelm patients and caregivers, adding uncertainty and anxiety as they seek credible and reliable resources, while too little information can itself cause anxiety. Lack of internet connectivity or device access excludes patients from benefiting from digital tools [24]. Consequently, and somewhat paradoxically, there are expectations that AI tools will solve the problem of too much information and narrow the digital divide. At the same time, AI-derived outputs carry inherent biases, in part because much peer-reviewed research sits behind paywalls accessible only to institutional subscribers.

AI Patients and Consumers: It Is Already Here

Often considered "the future," AI is here today, integrated into everyday life. AI is positioned to move patients and families into this new age, amplifying earlier e-patient behaviors: obtaining relevant health information, increasing patient control over health and care, enhancing health literacy, stimulating coequal contributions to decision-making, and strengthening relationships with clinicians. Society has moved from e-patients to AI patients.

The public use of AI will grow exponentially. AI assistants will be increasingly used to explore symptoms; help with managing chronic diseases; and offer advice on nutrition, exercise, and more. AI-enabled wearable and smart devices, which people now use to track their activities and make real-time adjustments, will flourish. Those with life-altering diagnoses or rare diseases will use AI as a research assistant and copilot to obtain tailored data to guide treatment planning, especially when traditional forms of care have been exhausted. AI-powered peer support will transform into patient-led knowledge networks, and caregivers will use AI tools to monitor their loved ones while aiming to lower their own stress.

As AI augments traditional care, there will be consequences. One example is the surge of low-cost AI chatbots targeting adolescents and young adults to address mood and mental health. Promoted as "personal intelligence" tools, these on-demand chatbots engage users in reflecting on their feelings, organizing their thoughts, and making decisions. Early research on AI chatbots for anxiety and depression has been mixed. Some studies show reductions in symptoms and perceived loneliness among frequent users [25]. Challenges, however, include emotional attachment and user dependency, lack of professional oversight, harmful messaging, and legal and privacy issues [26].

As health systems adopt "virtual first" approaches to care, the boundaries between patients using AI alone and patients using AI with clinicians may become blurred. Ensuring AI accuracy and trustworthiness will require incorporating human intelligence and feedback (human in the loop). Still, because patients' needs often go unmet, any tool that helps them navigate care and solve problems could be valuable.

The Need for Research, Education, and Co-Design

These challenges underscore the need for research to identify both AI benefits and risks, especially among vulnerable populations. As in the e-patient era, stakeholders in the AI patient era may underestimate the significance of people using information to manage their health. Unlike the past, however, when the risks to patients online were overestimated, AI stakeholders may now underestimate the risks of AI to patients. These tools are powerful yet presently subject to only minimal regulation and governance. AI researchers must study how patients and caregivers use AI and assess how it impacts their lives. AI developments need to be co-designed with patients, with governance that includes rigorous regulatory and other guardrails to prevent harm while promoting beneficial use [27]. Reputable organizations provide salient approaches to meaningfully involving patients and the public in research and care delivery, including the Patient-Centered Outcomes Research Institute [28] and the UK Standards for Public Involvement [29]. Critical guidelines are available from the National Academy of Medicine's AI Code of Conduct [30] and The Light Collective's AI Rights for Patients, which outlines seven patient rights critical to the development and deployment of AI in health care [31].

Finally, there is a fundamental educational imperative to equip patients and consumers with the knowledge and skills necessary to critically engage with AI tools for health. Educational offerings should encompass basic concepts and principles of AI and LLMs, effective prompting strategies, and understanding that machine learning systems may generate inaccurate or misleading outputs (ie, “hallucinations”). Learners must be aware of AI’s considerable variability in quality, transparency, equity, and reliability. Such instruction is essential to ensure individuals use AI tools responsibly and effectively to support their health and well-being.

Our journal's theme issue, Patient and Consumer Use of AI for Health, begins exploring the use of AI for health from the perspective of patients and the public. The scope of this theme issue addresses the following questions:

  • What is the patient and caregiver experience using AI tools for health and care?

  • How can patients, caregivers, and the public use AI for maximum benefit?

  • What are the risks and unintended consequences of AI use by patients, and how can these be mitigated?

  • What is the impact of AI-derived content that health systems present to patients?

  • How does AI affect patient-clinician relationships or patient–health care relationships?

  • How can patient and public involvement be a standard in designing, developing, and deploying AI for health?

The growth of AI has been extraordinary; however, the field is only beginning to explore its intersection with participatory medicine. Health care must expand its "patient-centered" views and embrace the power that AI use affords patients and caregivers, as they are not seeking permission but are already using LLMs. Researchers must investigate consumer use of AI, co-design studies with patients and caregivers, and determine how to avoid unintended consequences. The innovation community must embrace patient and public involvement throughout the development life cycle. We hope that this work inspires others to contribute to this new era of #PatientsUseAI.

Abbreviations

AI: artificial intelligence

JoPM: Journal of Participatory Medicine

LLM: large language model

Footnotes

Conflicts of Interest: None declared.

References

1. Goldberg CB, Adams L, Blumenthal D, et al. To do no harm - and the most good - with AI in health care. Nat Med. 2024 Mar;30(3):623-627. doi: 10.1038/s41591-024-02853-7
2. National Academies of Sciences, Engineering, and Medicine. Diagnosis in the Era of Digital Health and Artificial Intelligence: Proceedings of a Workshop—in Brief. The National Academies Press; 2024.
3. Close encounters of the AI kind: the increasingly human-like way people are engaging with language models. Imagining the Digital Future Center at Elon University. March 2025. URL: https://imaginingthedigitalfuture.org/wp-content/uploads/2025/03/ITDF-LLM-User-Report-3-12-25.pdf [accessed May 2, 2025]
4. Real-time population survey. Google Sites. URL: https://sites.google.com/view/covid-rps [accessed March 8, 2025]
5. Burmagina K. Artificial intelligence usage statistics and facts. Elfsight. 2025. URL: https://elfsight.com/blog/ai-usage-statistics [accessed March 8, 2025]
6. Precedence Research. Artificial intelligence (AI) in healthcare market size expected to reach USD 613.81 bn by 2034. GlobeNewswire. August 12, 2024. URL: https://www.globenewswire.com/news-release/2024/08/12/2928598/0/en/Artificial-Intelligence-AI-in-Healthcare-Market-Size-Expected-to-Reach-USD-613-81-Bn-by-2034.html [accessed March 8, 2025]
7. Mandl KD. How AI could reshape health care - rise in direct-to-consumer models. JAMA. 2025 Feb 24. doi: 10.1001/jama.2025.0946
8. Topol EJ. High-performance medicine: the convergence of human and artificial intelligence. Nat Med. 2019 Jan;25(1):44-56. doi: 10.1038/s41591-018-0300-7
9. Paul D, Sanap G, Shenoy S, Kalyane D, Kalia K, Tekade RK. Artificial intelligence in drug discovery and development. Drug Discov Today. 2021 Jan;26(1):80-93. doi: 10.1016/j.drudis.2020.10.010
10. Morgan K. Doctors told him he was going to die. Then AI saved his life. New York Times. March 20, 2025. URL: https://www.nytimes.com/2025/03/20/well/ai-drug-repurposing.html [accessed March 31, 2025]
11. Tierney AA, Gayre G, Hoberman B, et al. Ambient artificial intelligence scribes: learnings after 1 year and over 2.5 million uses. NEJM Catalyst. 2025 Apr 16;6(5). doi: 10.1056/CAT.25.0040
12. Tai-Seale M, Baxter SL, Vaida F, et al. AI-generated draft replies integrated into health records and physicians' electronic communication. JAMA Netw Open. 2024 Apr 1;7(4):e246565. doi: 10.1001/jamanetworkopen.2024.6565
13. Cavalier JS, Goldstein BA, Ravitsky V, et al. Ethics in patient preferences for artificial intelligence-drafted responses to electronic messages. JAMA Netw Open. 2025 Mar 3;8(3):e250449. doi: 10.1001/jamanetworkopen.2025.0449
14. Ferguson T. From patients to end users. BMJ. 2002 Mar 9;324(7337):555-556. doi: 10.1136/bmj.324.7337.555
15. Ferguson T, Frydman G. The first generation of e-patients. BMJ. 2004 May 15;328(7449):1148-1149. doi: 10.1136/bmj.328.7449.1148
16. Fox S, Duggan M. Health online 2013. Pew Internet Center. January 15, 2013. URL: http://www.pewinternet.org/2013/01/15/health-online-2013 [accessed March 8, 2025]
17. Lee K, Hoti K, Hughes JD, Emmerton L. Dr Google is here to stay but health care professionals are still valued: an analysis of health care consumers' internet navigation support preferences. J Med Internet Res. 2017 Jun 14;19(6):e210. doi: 10.2196/jmir.7489
18. Greenfield S, Kaplan S, Ware JE Jr. Expanding patient involvement in care. Ann Intern Med. 1985 Apr 1;102(4):520-528. doi: 10.7326/0003-4819-102-4-520
19. Merner B, Schonfeld L, Virgona A, et al. Consumers' and health providers' views and perceptions of partnering to improve health services design, delivery and evaluation: a co-produced qualitative evidence synthesis. Cochrane Database Syst Rev. 2023 Mar 14;3(3):CD013274. doi: 10.1002/14651858.CD013274.pub2
20. Fox S. Rebel Health: A Field Guide to the Patient-Led Revolution in Medical Care. The MIT Press; 2024.
21. Greene J, Hibbard JH, Sacks R, Overton V, Parrotta CD. When patient activation levels change, health outcomes and costs change, too. Health Aff (Millwood). 2015 Mar;34(3):431-437. doi: 10.1377/hlthaff.2014.0452
22. Walker J, Leveille S, Bell S, et al. OpenNotes after 7 years: patient experiences with ongoing access to their clinicians' outpatient visit notes. J Med Internet Res. 2019 May 6;21(5):e13876. doi: 10.2196/13876
23. Steitz BD, Turer RW, Lin CT, et al. Perspectives of patients about immediate access to test results through an online patient portal. JAMA Netw Open. 2023 Mar 1;6(3):e233572. doi: 10.1001/jamanetworkopen.2023.3572
24. Woods SS, Forsberg CW, Schwartz EC, et al. The influence of digital inclusion factors on sustained patient portal use: a prospective cohort of enrolled users. J Med Internet Res. 2017 Oct 17;19(10):e345. doi: 10.2196/jmir.7895
25. Khosravi M, Azar G. Factors influencing patient engagement in mental health chatbots: a thematic analysis of findings from a systematic review of reviews. Digit Health. 2024 Apr 22;10:20552076241247983. doi: 10.1177/20552076241247983
26. Haque MDR, Rubya S. An overview of chatbot-based mobile mental health apps: insights from app description and user reviews. JMIR Mhealth Uhealth. 2023 May 22;11:e44838. doi: 10.2196/44838
27. Meskó B, deBronkart D. Patient design: the importance of including patients in designing health care. J Med Internet Res. 2022 Aug 31;24(8):e39178. doi: 10.2196/39178
28. The value of engagement in research. Patient-Centered Outcomes Research Institute. URL: https://www.pcori.org/engagement-research/value-engagement-research [accessed March 8, 2025]
29. UK Standards for Public Involvement. Google Sites. URL: https://sites.google.com/nihr.ac.uk/pi-standards/home [accessed March 8, 2025]
30. Adams L, Fontaine E, Lin S, Crowell T, Chung VCH, Gonzalez AA. Artificial intelligence in health, health care, and biomedical science: an AI code of conduct principles and commitments discussion draft. NAM Perspect. 2024 Apr 8. doi: 10.31478/202403a
31. AI rights for patients. The Light Collective. June 2024. URL: https://lightcollective.org/patient-ai-rights [accessed March 8, 2025]
