Abstract
Emergency Medical Services (EMS) is likely to experience transformative changes due to the rapid advancements in artificial intelligence (AI), such as OpenAI’s ChatGPT. In this short commentary, we aim to preliminarily explore some of the profound implications of AI advancements for EMS systems and practice.
Keywords: EMS, artificial intelligence, prehospital, machine learning, EMT, paramedic
As with most other health-care industries, Emergency Medical Services (EMS) is likely to experience transformative changes due to the rapid advancements in artificial intelligence (AI), such as OpenAI’s ChatGPT. These large conversational language models allow users to ask questions and receive answers from AI using everyday speech.1 Two components of programs like ChatGPT are generative AI and natural language processing (NLP), which have the potential to reshape EMS clinical practice and education – creating novel avenues for clinicians, educators, and administrators to redefine the landscape of EMS. Generative AI refers to a branch of AI that focuses on creating new content by learning patterns from training data. NLP, in turn, involves the ability of AI systems to understand and process human language, enabling greater efficiency in data entry, documentation, and real-time decision support.2 AI is not only becoming a topic of immense public interest but also one of unforeseeable risk, prompting regulatory scrutiny and controversial debate.3 In this short commentary, we aim to preliminarily explore some of the profound implications of AI advancements for EMS systems and practice.
The integration of generative AI into prehospital clinical practice holds immense promise for enhancing diagnostic capabilities, as the incorporation of NLP algorithms provides EMS clinicians with access to a vast repository of evidence-based guidelines.4 The breadth of medical knowledge encoded in OpenAI’s models is evidenced by their demonstrated capacity to pass several medical licensing examinations.5 AI may also support clinical decision-making and point-of-care triage. Box 1 depicts a sample prehospital treatment plan generated with ChatGPT for a fictional case of a 47-year-old female patient with symptomatic bradycardia.
Box 1.
Question and Answer Response Using OpenAI GPT-3.5 for a Fictional Prehospital Scenario
| [Question] “I am a paramedic. Create a prehospital patient care plan for a 47 year old female patient presenting with chest pain and symptomatic sinus bradycardia. Vital signs are BP 90/42, SPO2: 92%, HR: 44, RR: 22”. |
| [Answer] “As a paramedic, it is important to prioritize the care of a patient presenting with chest pain and symptomatic sinus bradycardia. Based on the provided vital signs, here’s a prehospital patient care plan for a 47-year-old female patient: 1. Primary Assessment: ● Ensure scene safety and universal precautions. ● Assess the patient’s airway, breathing, and circulation (ABCs). ● Provide supplemental oxygen via a nasal cannula or non-rebreather mask to maintain oxygen saturation above 94%. ● Initiate cardiac monitoring to continuously monitor the patient’s heart rate and rhythm. 2. Establish IV Access: ● Establish intravenous (IV) access with a large-bore catheter (eg, 18 gauges) for fluid administration and medication delivery. 3. Cardiac Monitoring: ● Continuously monitor the patient’s cardiac rhythm. ● Assess the 12-lead ECG to identify any ST-segment changes or signs of acute coronary syndrome. 4. Administer Medications: ● If the patient is symptomatic from bradycardia, administer atropine 0.5 mg IV bolus, repeated every 3–5 min up to a total dose of 3 mg, as per local protocols. ● Consider contacting medical control for further guidance on pharmacological interventions, such as transcutaneous pacing or additional medication administration (eg, epinephrine infusion or dopamine infusion). 5. Intravenous Fluid Administration: ● Administer a fluid bolus (eg, normal saline) if the patient is hypotensive (systolic blood pressure less than 90 mmHg) or shows signs of poor perfusion. 6. Pain Management: ● Administer appropriate analgesia, such as intravenous morphine or fentanyl, to relieve the patient’s pain and reduce anxiety. 7. Reassess Vital Signs and Symptoms: ● Regularly reassess the patient’s vital signs, cardiac rhythm, and level of pain. ● Monitor the patient’s oxygen saturation and provide supplemental oxygen as needed to maintain saturation above 94%. 8. Transport: ● Prepare the patient for immediate transport to the nearest appropriate cardiac facility capable of providing definitive care. ● Consider requesting a medical intercept with an advanced life support (ALS) unit if available. 9. Continual Monitoring and Support: ● Continuously monitor the patient’s vital signs, cardiac rhythm, and oxygen saturation during transport. ● Provide ongoing reassurance and emotional support to the patient. Remember, this care plan should be adjusted based on local protocols, medical director’s guidance, and the specific needs of the patient. Effective communication with the receiving hospital and early activation of the appropriate cardiac care team are crucial for timely intervention and optimal patient outcomes”. |
Notes: Adapted from OpenAI, GPT-3.5. Available from: chat.openai.com/.6
Immersive training simulations, enhanced through generative AI, can create dynamic virtual environments that closely replicate real-life emergency scenarios, especially in high-risk, low-occurrence (HALO) cases. These simulations offer educators and learners unparalleled opportunities to cultivate critical thinking skills, refine decision-making abilities, and enhance clinical competencies through experiential learning. NLP-powered interactive virtual assistants in EMS education also open doors to a new era of personalized learning. These intelligent companions could provide real-time guidance and feedback, addressing learner queries and fostering active engagement and knowledge retention.
A major cause for concern, however, is “hallucination”, in which AI convincingly generates inappropriate or inaccurate responses. A 2023 study assessing ChatGPT’s ability to provide cardiovascular disease prevention recommendations found that 18% of its responses were graded as inappropriate for the patient described.7 Additionally, the chatbot will at times continue to affirm false statements, or cite research that does not exist, even when questioned further.1 In the scenario featured in Box 1, ChatGPT erroneously recommends administering 0.5 mg of atropine for bradycardia, whereas the AHA has recommended a dose of 1 mg since 2020.8 This seemingly small error demonstrates how infrequent updates to a program’s underlying information can have a significant impact on evidence-based patient care.
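One pragmatic safeguard against dosing errors of this kind would be to cross-check any AI-suggested dose against a locally maintained, medically reviewed protocol table before it reaches the clinician. The sketch below is purely illustrative: the table values, function name, and tolerance are our assumptions, not any vendor’s implementation.

```python
# Illustrative sketch: cross-check an AI-suggested drug dose against a
# locally maintained protocol table. Values and names are hypothetical;
# a real system would use the agency's current, medically reviewed protocols.

# Hypothetical local protocol reference (single adult IV doses, in mg),
# reflecting the 2020 AHA atropine recommendation of 1 mg.
LOCAL_PROTOCOL_DOSES_MG = {
    "atropine": 1.0,
    "epinephrine": 1.0,
}

def check_suggested_dose(drug: str, suggested_mg: float) -> str:
    """Compare an AI-suggested dose with the local protocol reference."""
    expected = LOCAL_PROTOCOL_DOSES_MG.get(drug.lower())
    if expected is None:
        return f"{drug}: not in local protocol table; requires human review"
    if abs(suggested_mg - expected) < 1e-9:
        return f"{drug}: {suggested_mg} mg matches local protocol"
    return (f"{drug}: suggested {suggested_mg} mg differs from "
            f"protocol dose of {expected} mg; flag for review")

# The Box 1 output suggested 0.5 mg atropine; the check flags the mismatch.
print(check_suggested_dose("Atropine", 0.5))
```

A check like this does not make the model itself more current, but it routes known-stale recommendations to a human reviewer rather than letting them pass silently.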
Addressing inherent biases within AI algorithms is equally critical. Rigorous evaluation, iterative improvements, and ongoing monitoring are imperative to mitigate biases and ensure equitable health-care outcomes for all patients. There are also ongoing discussions about which regulatory authorities will ultimately oversee the use of AI in patient care.10 Furthermore, as technology advances, preserving essential human connection within EMS remains pivotal; the compassion and empathy integral to EMS care must harmonize with technological advancement. In situations where a provider’s clinical judgment disagrees with recommendations provided by AI, current tort law dictates that providers should default to the standard of care.9
The interface between human providers and machine learning tools, facilitated through NLP and generative AI, presents an interesting area of research, given recent advancements in open-source programs alongside the wider availability of large EMS datasets. While tools integrating NLP and generative AI are still far from suitable for clinical care, we can expect the private sector to expand the incorporation of these open-source technologies into EMS device research and development, with future models trained on clinically relevant and up-to-date data sets. AI features have become increasingly available in many mainstream email clients and word processors. EMS electronic patient care report (ePCR) systems in particular may be an area where NLP and generative AI prove helpful. AI chatbots are also already useful to educators as a tool for generating personalized educational materials.
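As a deliberately simplified illustration of the kind of structured data capture NLP could bring to ePCR documentation, the sketch below pulls vital signs out of a free-text narrative with regular expressions. The field names and patterns are assumptions made for this example; production ePCR systems would require far more robust language models and validation.

```python
import re

# Simplified sketch: extract vital signs from a free-text EMS narrative.
# Patterns and field names are illustrative only.
VITAL_PATTERNS = {
    "bp": r"BP\s*:?\s*(\d{2,3}/\d{2,3})",
    "hr": r"HR\s*:?\s*(\d{1,3})",
    "rr": r"RR\s*:?\s*(\d{1,2})",
    "spo2": r"SPO2\s*:?\s*(\d{1,3})\s*%?",
}

def extract_vitals(narrative: str) -> dict:
    """Return a dict of the vital signs found in the narrative text."""
    found = {}
    for field, pattern in VITAL_PATTERNS.items():
        match = re.search(pattern, narrative, flags=re.IGNORECASE)
        if match:
            found[field] = match.group(1)
    return found

narrative = ("47 y/o female, chest pain. Vital signs are BP 90/42, "
             "SPO2: 92%, HR: 44, RR: 22.")
print(extract_vitals(narrative))
# {'bp': '90/42', 'hr': '44', 'rr': '22', 'spo2': '92'}
```

Even this crude pattern-matching hints at the efficiency gains available when narrative documentation can auto-populate structured ePCR fields, which is where purpose-built clinical NLP would far exceed a handful of regular expressions.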
Generative AI and NLP advancements present an exciting frontier for EMS clinical practice and education. As clinicians, educators, and researchers within the EMS community, it is our collective responsibility to actively engage in the discourse surrounding these technologies. By embracing the potential of generative AI and NLP while navigating ethical considerations and integration challenges driven by research, we have the tools to revolutionize the delivery of prehospital emergency care.
Funding Statement
The work was not funded by any specific source.
Author Contributions
All authors made a significant contribution to the work reported, whether that is in the conception, study design, execution, acquisition of data, analysis and interpretation, or in all these areas; took part in drafting, revising or critically reviewing the article; gave final approval of the version to be published; have agreed on the journal to which the article has been submitted; and agree to be accountable for all aspects of the work.
Disclosure
The authors report no known conflicts of interest, financial or otherwise, in this work. The work is solely that of the authors and does not necessarily represent the views, policies, or opinions of their affiliated institutions, employers, or partners. It is an independent publication that was neither funded, reviewed, nor endorsed by any institution.
References
- 1. Eysenbach G. The role of ChatGPT, generative language models, and artificial intelligence in medical education: a conversation with ChatGPT and a call for papers. JMIR Med Educ. 2023;9:e46885. doi:10.2196/46885
- 2. Voytovich L, Greenberg C. Natural language processing: practical applications in medicine and investigation of contextual autocomplete. Acta Neurochir Suppl. 2022;134:207–214. doi:10.1007/978-3-030-85292-4_24
- 3. Vearrier L, Derse AR, Basford JB, Larkin GL, Moskop JC. Artificial intelligence in emergency medicine: benefits, risks, and recommendations. J Emerg Med. 2022;62(4):492–499. doi:10.1016/j.jemermed.2022.01.001
- 4. Mueller B, Kinoshita T, Peebles A, Graber MA, Lee S. Artificial intelligence and machine learning in emergency medicine: a narrative review. Acute Med Surg. 2022;9(1):e740. doi:10.1002/ams2.740
- 5. Kung TH, Cheatham M, Medenilla A, et al. Performance of ChatGPT on USMLE: potential for AI-assisted medical education using large language models. PLOS Digit Health. 2023;2(2):e0000198. doi:10.1371/journal.pdig.0000198
- 6. OpenAI, GPT-3.5. Available from: chat.openai.com/. Accessed May 17, 2023.
- 7. Sarraju A, Bruemmer D, Van Iterson E, Cho L, Rodriguez F, Laffin L. Appropriateness of cardiovascular disease prevention recommendations obtained from a popular online chat-based artificial intelligence model. JAMA. 2023;329(10):842–844. doi:10.1001/jama.2023.1044
- 8. Merchant RM, Topjian AA, Panchal AR, et al. Part 1: executive summary: 2020 American Heart Association guidelines for cardiopulmonary resuscitation and emergency cardiovascular care. Circulation. 2020;142(16_suppl_2):S337–S357. doi:10.1161/CIR.0000000000000918
- 9. Price WN 2nd, Gerke S, Cohen IG. Potential liability for physicians using artificial intelligence. JAMA. 2019;322(18):1765–1766. doi:10.1001/jama.2019.15064
- 10. Morley J, Machado CCV, Burr C, et al. The ethics of AI in health care: a mapping review. Soc Sci Med. 2020;260:113172. doi:10.1016/j.socscimed.2020.113172
