Abstract
Artificial intelligence is no longer confined to science fiction literature and films. Compared with most other aspects of life, however, its adoption in medical education and clinical patient care has progressed slowly. Recently, vast amounts of text from the internet have been used to build and train chatbots, most notably ChatGPT. The language model ChatGPT, created by OpenAI, has emerged as a valuable resource for medical research and education. Its capacity to produce human-like answers to challenging medical queries has made it a useful tool for researchers, students, and medical professionals. However, using ChatGPT also has significant drawbacks. One key concern is the possibility of spreading erroneous or biased information, which could have negative effects on patient care. Moreover, overreliance on technology in medical education could lead to a decline in critical thinking and clinical decision-making skills. Overall, ChatGPT has the potential to be a boon to medical education and research, but its use must be accompanied by caution and critical evaluation.
Keywords: research, medical education, OpenAI, ChatGPT, artificial intelligence, AI & robotics in healthcare
Introduction and background
Due to the rapid advancement of scientific knowledge and technology, experts now frequently use new artificial intelligence (AI) models for convenience and quick access to information. Companies such as Google and Meta have built complex language-model tools that read user input and generate responses [1]. These models rely on large amounts of data and computational methods to predict and combine words in meaningful ways. Similarly, in November 2022, the artificial intelligence company OpenAI unveiled a new and rapidly trending bot, ChatGPT. The bot has attracted millions of users and substantial investment, and some predict that it will eventually replace humans in certain tasks. By creating a user interface that allows the general public to interact with it directly, OpenAI made a revolutionary choice [1]. Since its launch, ChatGPT has undergone extensive testing across a range of disciplines to ensure that it operates naturally and conversationally. Technological breakthroughs and the creation of AI tools such as the language model ChatGPT have considerably helped medical education and research. However, whether these technological advancements are a boon or a bane depends on how they are utilized. In health care, ChatGPT has been used to provide medical assistance and information, for example by answering medical inquiries or suggesting differential diagnoses for various symptoms [2].
Large language models can help with clinical decision-making and medical education. According to recent research by Kung et al., ChatGPT performed at or near the passing threshold on all three steps of the United States Medical Licensing Examination (USMLE) [3]. How much ChatGPT can impact medical research is a crucial question. According to Dr. Biswas, this software can revolutionize medical writing by speeding up the process and saving time. It can gather data, help with literature searches, and produce a preliminary document that the medical writer can then refine [4]. It can be quite advantageous for the medical sector. It can aid in the efficient diagnosis of patients by summarizing their vast medical records based on family history, laboratory results, and symptoms. However, because ChatGPT lacks critical thinking and can provide information redundantly and arbitrarily, many scientists and journals are opposed to it [5].
According to numerous education experts, students in communication and philosophy courses frequently use ChatGPT to cheat on assignments and exams, although such use is often easy to detect. A growing worry is that people may eventually lose the ability to come up with fresh ideas and will be unable to provide sound justifications for their positions [5]. Similarly, accountability for the bot's content is a problem when ChatGPT is employed in scientific articles. This raises ethical questions, medical law issues, copyright issues, a dearth of original thought and reasoning, methodological biases, and content inaccuracies [6]. When using ChatGPT for research, it might be simple to generate a paper's content from search-engine data. However, as previously mentioned, it cannot conduct a thorough literature review or critically evaluate and discuss articles [7]. The typical output is rephrased text that depends on the precise instructions given to the bot and is not entirely free of plagiarism. According to Thomas Davenport and Nitin Mittal, ChatGPT can be misused in innumerable ways; overreliance on it might eventually keep the human mind from performing even the most basic tasks [8,9]. In addition, ChatGPT's training data, which extend only through 2021, were cited as one of the main barriers to adoption by John Halamka (President of the Mayo Clinic Platform) and Paul Cerrato (Senior Research Analyst and Communications Specialist, Mayo Clinic Platform) [10]. Because medical literature is updated frequently, there is growing concern that papers created using ChatGPT may lack clinical reasoning and up-to-date thinking.
ChatGPT can be a boon for medical education and research. It can provide instant access to a vast amount of medical information and can assist medical students and researchers in analyzing complex medical data. Additionally, ChatGPT can provide personalized learning experiences for medical students by tailoring information to their learning styles and preferences. However, ChatGPT also poses challenges and limitations that can be considered a bane. While it can provide vast amounts of information, that information may not always be accurate or reliable. Its responses are based on patterns and trends in the data it has been trained on, so they might not always offer a thorough or nuanced understanding of a specific medical concept or situation. Consequently, it is crucial to ensure that medical students and researchers are aware of ChatGPT's limitations and use it as a supplement to, rather than a substitute for, conventional learning and research techniques. All things considered, ChatGPT has the potential to be a useful tool in medical research and education, but it must be used carefully and with awareness of its limitations. This review article examines whether ChatGPT in medical education and research is a boon or a bane.
Review
Here are a few basic similes that could be used to describe ChatGPT:
· ChatGPT is like a virtual assistant, always ready to help you with your language-related tasks.
· ChatGPT is like a language genie, granting your wishes for information and conversation in the blink of an eye.
· ChatGPT is like a language chameleon, adapting to any language and communication style to help you communicate effectively.
· ChatGPT is like a language atlas, mapping out the vast landscape of human language and knowledge to help you navigate it more efficiently.
· ChatGPT is like a language encyclopedia, containing a wealth of information and knowledge on a wide range of topics to help you learn and communicate better.
Can ChatGPT influence health care?
ChatGPT, a cutting-edge language model created by OpenAI, has the potential to fundamentally alter how medical information is handled and communicated in the disciplines of clinical and translational medicine [11].
· Access to current data: ChatGPT can access a huge amount of medical data and respond to clinical queries in real time with precise answers.
· Increased patient engagement: Patients can track their health information with ChatGPT and obtain answers to their questions regarding medicine straightforwardly and rapidly.
· Reduced workload for medical professionals: ChatGPT can assist in lessening the administrative strain on medical professionals, allowing them to concentrate more on patient care.
· Improved accuracy: accuracy is likely to increase as more data are gathered and analyzed for ChatGPT.
· ChatGPT may be able to communicate with electronic health records (EHRs), facilitating a more fluid exchange of data between patients and healthcare professionals.
· Customized medicine: Based on unique patient information and health histories, ChatGPT has the potential to offer individualized medical recommendations.
· Drug discovery and personalized treatment: enormous volumes of health data, including patient genomes, can be analyzed to find new medication targets and create individualized treatment programs for patients.
· Optimized clinical trials: clinical studies can be optimized to identify the most promising participants.
ChatGPT and medical education
Technology breakthroughs are advancing medical education, and AI like ChatGPT can serve several beneficial functions, which are as follows [12]:
· Automatic scoring: The clarity, vocabulary, grammar, and sentence structure of student essays and papers can be evaluated using ChatGPT. Teachers and other educators who routinely deal with the enormous effort required for grading a large number of assignments will find this feature to be of particular use.
· Teaching assistance [13,14]: ChatGPT can be used to create exercises, tests, and scenarios that support practice and assessment in the classroom. Its capacity to produce translations, justifications, and summaries can also be used to simplify difficult academic material for students.
· Personalized learning [15]: ChatGPT can be used to create virtual tutors or assistants that can help students with their homework and answer their questions. Teachers may find it challenging to create individualized study plans and learning materials that account for the diverse learning styles and abilities of every student in a classroom setting.
· Research assistance [16]: ChatGPT can be used to help students with their academic work by responding to their questions and writing summaries. Moreover, it can be used to create outlines, bibliographies, and other types of research tools. ChatGPT's capacity to aid in the creation of outlines, as well as with literature reviews and data analysis, can make medical research simpler. Also, it can be used to summarize pertinent publications and highlight crucial discoveries, assisting medical researchers in making sense of a large amount of material available online.
· Fast access to information [17]: ChatGPT can quickly provide accurate and current information about medical topics. This could cover a wide range of issues, including illnesses, their treatment, and medical procedures. Students and medical professionals who require immediate access to information or clarification on a topic could find this useful.
· Creating case scenarios [18]: Medical students can hone and improve their diagnostic and treatment planning skills by using ChatGPT to create case studies and scenarios (a minimal sketch of such a prompt follows this list). This could aid students in developing their aptitude for clinical reasoning as well as preparing them for situations that may arise in actual clinical settings.
· Language translation [19]: By utilizing ChatGPT's language translation tools, medical professionals and educators may communicate with patients from various linguistic backgrounds and deliver the finest medical treatment.
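To make the teaching-assistance and case-scenario items above more concrete, the following is a minimal sketch of how an educator might request a practice case programmatically rather than through the chat interface. It assumes the openai Python package (v1 interface) and an OPENAI_API_KEY environment variable; the model name, topic, and prompt wording are illustrative assumptions, not a recommended configuration, and any generated case would need faculty review before classroom use.

```python
# Minimal sketch: asking ChatGPT for a draft practice case (illustrative only).
# Assumes the openai package (v1.x) is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

topic = "community-acquired pneumonia"  # hypothetical teaching topic

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    temperature=0.7,        # allow some variety between generated cases
    messages=[
        {"role": "system",
         "content": "You are a medical educator preparing practice material for students."},
        {"role": "user",
         "content": (f"Write a short clinical case vignette on {topic} for medical students, "
                     "followed by three multiple-choice questions with answers and "
                     "one-line explanations.")},
    ],
)

draft_case = response.choices[0].message.content
print(draft_case)  # a draft for faculty review, not ready-to-use teaching content
```

Because the output can contain factual errors, such material is best treated as a first draft that faculty verify and edit, in line with the cautions discussed later in this review.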
ChatGPT and medical research
One benefit of employing ChatGPT in research may be its capacity to analyze vast amounts of material rapidly and accurately, including patient records, medical reports, and scientific journals, all of which can offer unique insights into the causes, symptoms, and treatment possibilities of disease. According to ChatGPT itself, natural language processing techniques are utilized to extract crucial information from texts and present it in a structured way (a minimal sketch of such programmatic use appears at the end of this subsection). Another application of ChatGPT is assisting researchers in generating new hypotheses; by analyzing the current literature and identifying knowledge gaps, it may be able to come up with fresh suggestions for additional studies [20]. By examining patient records and finding common trends, ChatGPT may also be able to help with the creation of clinical decision support systems. Some authors have already acknowledged ChatGPT as a co-author on publications produced with its assistance [21,22]. However, not all scientists and authors agree, as Thorp recently stated that “ChatGPT is fun, but not an author” [6].
Because the authors do not produce the content themselves, ChatGPT may be used to generate articles that could be considered plagiarized. Another potentially hazardous application is when researchers use it to create material that is strikingly similar to passages or sections of previously published works, since ChatGPT may generate language that is almost identical to that found in earlier studies. This capability can be exploited to fudge study results or mislead readers and researchers. The question of whether human reviewers could distinguish authentic scientific abstracts from ChatGPT-generated ones was examined in a recent study posted on a preprint server [13,23]. Blinded reviewers acknowledged having trouble telling the difference between abstracts produced by humans and those produced by AI; in fact, 32% of the abstracts produced by the AI bot were mistaken for human-written ones. Emerging methodologies designed to distinguish human-authored content from generative AI output, such as the Japanese stylometric analysis described by Zaitsu and Jin, warrant prudent application due to their varying levels of accuracy across distinct scenarios [24]. In addition to the plagiarism issue, the generated writings can be erroneous, biased, and lacking in comprehension of the subtleties of medical science and language (ChatGPT is trained on a large dataset of text but may not have enough information about a specific case). Certain topics are covered by fewer articles, so the AI bot can undervalue their significance or distinctiveness; it might thus keep researchers from weighing the available evidence by quality and instead restrict them to a broader, shallower perspective. It is crucial to advise future ChatGPT users to use the tool responsibly and to always properly reference any materials utilized to avoid the problems highlighted above.
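As a concrete, hedged illustration of the structured extraction mentioned above, the sketch below sends a single abstract to the model and asks for a few summary fields in JSON. It is a minimal example rather than a validated pipeline: it assumes the openai Python package (v1 interface) and an OPENAI_API_KEY environment variable, and the field names, model choice, and prompt are illustrative assumptions. Any returned summary must still be checked against the source abstract, since the model can misstate or omit details.

```python
# Minimal sketch: extracting a structured summary from one abstract (illustrative only).
# Assumes the openai package (v1.x) is installed and OPENAI_API_KEY is set.
import json

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

abstract = """<paste the abstract text here>"""  # placeholder input

prompt = (
    "Summarize the abstract below as JSON with the keys "
    "'population', 'intervention', 'main_finding', and 'stated_limitations'. "
    "Use only information that appears in the abstract.\n\n" + abstract
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    temperature=0,          # reduce variation for extraction-style tasks
    messages=[{"role": "user", "content": prompt}],
)

raw = response.choices[0].message.content
try:
    summary = json.loads(raw)  # the model does not always return valid JSON
except json.JSONDecodeError:
    summary = {"unparsed_output": raw}

print(summary)  # every field should be verified against the original abstract
```

Even with a constrained prompt like this, the cautions above still apply: the output can paraphrase too closely, omit qualifiers, or invent details, so it is a starting point for the researcher rather than a citable summary.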
Advantages of ChatGPT
· Improved learning: ChatGPT can provide medical students with instant access to a vast amount of medical knowledge, which can help them learn quickly and efficiently. Giving students immediate feedback on their comprehension of the subject matter can also improve the learning experience.
· Time-saving [25]: With ChatGPT, medical professionals and researchers can save time in researching, writing papers, and studying as the information can be accessed quickly.
· Personalized learning: ChatGPT can provide personalized learning experiences, tailoring the content to the specific needs of each learner.
· Better medical diagnosis: With ChatGPT's ability to process large amounts of medical data, it can help medical professionals make better diagnoses and provide better treatment options for their patients.
Disadvantages of ChatGPT
· Risk of misinformation: Inaccurate or misleading information may harm patients, and ChatGPT is not a replacement for professional medical care.
· Technology dependence: ChatGPT might be used as a crutch by medical professionals, limiting their capacity to diagnose and treat patients without the use of technology.
· Limited to existing data: ChatGPT can only provide information based on the data that has been previously input into its system. This means that if there is a gap in knowledge or a lack of data, ChatGPT will not be able to provide accurate information.
· Lack of critical thinking [26]: ChatGPT is a machine and cannot think critically like a human, meaning it cannot interpret or analyze medical information beyond what is programmed into its system. It cannot provide judgment or discernment required for ethical or legal aspects of medical practice.
· Data privacy [27]: Data privacy and security are essential in medical research and practice. The use of ChatGPT may require the sharing of medical data, which can pose a risk of data breaches and privacy violations.
· Displacement of conventional resources: ChatGPT might replace conventional learning tools such as libraries and tutors [28].
· Bias in the training data: The quality of ChatGPT depends on the training data. The model could reinforce a bias if the training set of data is skewed.
Limitations of ChatGPT in medical education and research
Although the text produced by ChatGPT, an AI service, may be remarkably similar to content written by humans, it does make errors, just like any other statistical model. Its main faults are a deficiency in human-like comprehension and the absence of training data after 2021. These can cause it to miss the context of a prompt and produce irrelevant content or merely generic ideas and notions [28,29].
Recently, peer-reviewed publications listing ChatGPT as an author have appeared [30]. In light of the aforementioned limitations, the World Association of Medical Editors (WAME) has recommended against listing ChatGPT among the authors [31,32]. On the related issue of language translation, the correctness and cultural appropriateness of translations were not the focus of that investigation. The effectiveness of such apps in healthcare settings may be affected by the contextual appropriateness of translated words, the syntax of translated phrases (such as word order and grammar), and the ability to recognize different accents and dialects (when using free voice input). Past studies have shown that Google Translate's translation of non-Western languages is less accurate [33-35].
Remarkably, the first question, concerning the possible impact of ChatGPT on clinical and translational medicine, already seems close to a resolution [33]. However, given the widespread debate that has started around ChatGPT and its effect on health care, several issues need to be addressed. AI in medicine has indisputable advantages and benefits that have already impacted every element of the industry, from research to clinical applications [11]. AI-based algorithms are employed everywhere, from data-driven preclinical research to medical decision assistance in everyday clinical practice. For instance, standardized medical procedures in line with good clinical practice (GCP) already include pattern recognition, pre-interpretation of physiological and biophysical data such as ECG, EEG, and EMG, as well as medical image interpretation for patient management and diagnosis [36].
However, the difficulties and risks associated with ChatGPT inevitably prompt many questions, including pertinent ones about medical ethics [37,38], data security and privacy, and inaccurate or deceptive information that could harm patients. All other things being equal, patients who have consulted such tools may come across as certain and well prepared for the doctor's interview, especially regarding crucial clinical data such as disease-specific symptoms and medical history. Unfortunately, this frequently does not serve the attending physician's best interests and can result in misunderstandings. Also, if ChatGPT is utilized without permission, its variable accuracy can cause issues in the educational setting; the tool has already been prohibited at certain universities. Medical students and trainees may be negatively impacted by ChatGPT, leading them to misinterpret medical information or even make clinical decisions without exercising their own judgment. Therefore, before it can be utilized, for instance, in applications that accompany diagnosis and therapy, the ChatGPT algorithm needs to be trained and verified on relevant, evidence-based knowledge bases [39]. By facilitating access to current information, enhancing patient engagement, and lessening the burden on the medical staff, ChatGPT has the potential to make a substantial impact on the disciplines of clinical and translational medicine [40].
To ensure that ChatGPT is used securely and successfully, further study and development are required, and some difficulties and dangers must be taken into account and avoided. Its use should be guided by the desire and moral duty to enhance patients' quality of life. As an AI language model, ChatGPT could play a significant role in medical research in the future. Here are some ways that ChatGPT can be used:
· Data analysis: ChatGPT can be used to analyze large sets of medical data, such as electronic health records, medical images, and patient outcomes. It can identify patterns and trends that may be missed by human analysts, which can be useful in identifying potential risk factors and predicting health outcomes.
· Literature review: ChatGPT can assist in conducting a comprehensive literature review for a given research question, by analyzing a vast amount of medical literature and providing a summary of relevant information.
· Decision support: ChatGPT can be used to assist healthcare providers in making treatment decisions by analyzing patient data and providing personalized treatment recommendations based on the latest medical evidence (see the sketch after this list).
· Patient engagement: ChatGPT can be used to provide patients with accurate and reliable information about their health conditions, treatments, and preventive measures. It can also help patients make informed decisions about their health by answering their questions in real time.
· Drug discovery: ChatGPT can be used to predict drug candidates for a given target by analyzing the chemical and biological properties of thousands of compounds. This can help researchers save time and effort in identifying potential drugs for a given target.
· Clinical trial design: ChatGPT can assist in designing clinical trials by identifying potential inclusion and exclusion criteria, selecting appropriate endpoints, and analyzing the potential risks and benefits of the study.
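As a hedged illustration of the decision-support item above, the sketch below sends a brief, fictional, de-identified vignette and asks for differential diagnoses to discuss rather than act on. It assumes the openai Python package (v1 interface) and an OPENAI_API_KEY environment variable; the vignette, model, and prompt are invented for illustration. Real patient data should not be sent to such a service without appropriate privacy safeguards and governance, and any output would only ever be reviewed alongside clinical judgment.

```python
# Minimal sketch: discussion-only differential diagnosis prompts (illustrative only).
# Assumes the openai package (v1.x) is installed and OPENAI_API_KEY is set.
# Do not send identifiable patient data; this is a teaching illustration, not decision support.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

vignette = ("A 58-year-old with hypertension presents with two hours of pressure-like "
            "chest pain, diaphoresis, and nausea.")  # fictional, de-identified vignette

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    temperature=0,
    messages=[
        {"role": "system",
         "content": ("You are assisting a clinician-educator. Suggest possibilities to discuss; "
                     "do not give definitive diagnoses or treatment instructions.")},
        {"role": "user",
         "content": (f"Vignette: {vignette}\n"
                     "List five differential diagnoses, from most to least likely, "
                     "with a one-line rationale for each.")},
    ],
)

print(response.choices[0].message.content)  # a discussion aid for the clinician to review
```

The same pattern could, in principle, be adapted to the other items in the list, but the caveats below about accuracy, bias, and accountability apply to every such use.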
However, it is important to note that ChatGPT is an AI language model and not a substitute for medical professionals. It should always be used in conjunction with expert medical advice and clinical judgment. Additionally, as with any AI technology, there are potential risks and ethical concerns that need to be carefully considered and addressed, such as data privacy, bias, and accountability. ChatGPT can assist humans in many tasks, including in the field of medicine and medical research, but AI technologies like it are not designed to replace humans in these fields; rather, they augment and assist human capabilities.
While AI technologies like ChatGPT can perform some tasks more efficiently than humans, they are not capable of replacing human expertise, judgment, and empathy. Many aspects of healthcare and medical research require human skills, such as diagnosing complex medical conditions, interpreting medical imaging, and communicating with patients about their care. Furthermore, the use of AI technologies like ChatGPT in health care and medical research must be carefully regulated and monitored to ensure that they do not cause harm or perpetuate bias. Ethical considerations, such as data privacy, transparency, and accountability, must also be taken into account when using AI technologies in these fields.
Therefore, it is unlikely that ChatGPT or any other AI technology will completely replace humans in the field of medicine and medical research. Rather, these technologies will be used to assist and augment human capabilities, leading to more accurate diagnoses, personalized treatments, and better patient outcomes. Nor is ChatGPT a means of war. As an AI language model, ChatGPT is designed to assist humans in various tasks such as language translation, text summarization, and conversation generation. Its intended purpose is to improve the efficiency and accuracy of human communication and information processing, not to be used as a weapon of war or to cause harm to others.
While it is true that AI technologies, including chatbots, have been used in various military applications, such as autonomous weapons and surveillance systems, it is important to note that the development and use of such technologies are subject to strict ethical and legal regulations. The use of AI technologies in warfare must comply with international humanitarian law and the principles of proportionality, distinction, and military necessity. Furthermore, the development and use of AI technologies, including ChatGPT, must adhere to ethical and moral considerations, including transparency, accountability, and the protection of human rights. The responsible development and use of AI technologies can bring significant benefits to society, but it is important to ensure that they are used for beneficial purposes and do not cause harm to others.
Conclusions
Utilizing ChatGPT in the realms of clinical management, medical research, and education holds immense potential, but it should not supplant human expertise, given the inherent limitations of AI. As advancements in information technology, machine learning, and AI rapidly evolve, it is imperative to approach their integration into the medical sector with both eagerness and caution. ChatGPT can streamline the acquisition of up-to-date information, bolster patient engagement, and alleviate some demands on medical professionals, especially within clinical and translational medicine. For the ethical and effective employment of ChatGPT, it is crucial to continuously assess its applications, make necessary adaptations, and invest in thorough research. This will safeguard its use while maximizing the benefits it offers to the medical community.
Acknowledgments
MJ, SPK, NJ, AN, SY, and SKB conceptualized the research ideology, design, data acquisition and interpretation; MJ, SPK, NJ, AN, SY, and SKB drafted the article, and reviewed and finalized the manuscript. All authors, i.e., MJ, SPK, NJ, AN, SY, and SKB gave final approval to publish the manuscript. All authors, i.e., MJ, SPK, NJ, AN, SY, and SKB agreed to the integrity and accountability of the research investigated. MJ - Madhan Jeyaraman; SPK - Shanmuga Priya K; NJ - Naveen Jeyaraman; AN - Arulkumar Nallakumarasamy; SKB - Suresh K. Bondili; SY - Sankalp Yadav.
The authors have declared that no competing interests exist.
References
- 1. If you still aren’t sure what ChatGPT is, this is your guide to the viral chatbot that everyone is talking about. Business Insider. [Accessed: Aug 2023]. 2023. https://www.businessinsider.in/tech/news/if-you-still-arent-sure-what-chatgpt-is-this-is-your-guide-to-the-viral-chatbot-that-everyone-is-talking-about/articleshow/96990537.cms
- 2. What is ChatGPT, and its possible use cases? Garg A. [Accessed: Jul 2023]. 2022. https://www.netsolutions.com/insights/what-is-chatgpt/
- 3. Performance of ChatGPT on USMLE: Potential for AI-assisted medical education using large language models. Kung TH, Cheatham M, Medenilla A, et al. PLOS Digit Health. 2023;2. doi: 10.1371/journal.pdig.0000198.
- 4. ChatGPT and the future of medical writing. Biswas S. Radiology. 2023;307. doi: 10.1148/radiol.223312.
- 5. Students have started using ChatGPT to cheat in assignments, tests. How are professors catching them? Belagere C. [Accessed: Jul 2023]. 2023. https://thesouthfirst.com/karnataka/students-have-started-using-chatgpt-to-cheat-in-tests-exams-how-are-professors-catching-them/
- 6. ChatGPT is fun, but not an author. Thorp HH. Science. 2023;379:313. doi: 10.1126/science.adg7879.
- 7. ChatGPT is shaping the future of medical writing but still requires human judgment. Kitamura FC. Radiology. 2023;307. doi: 10.1148/radiol.230171.
- 8. Assessing the capability of ChatGPT in answering first- and second-order knowledge questions on microbiology as per competency-based medical education curriculum. Das D, Kumar N, Longjam LA, Sinha R, Deb Roy A, Mondal H, Gupta P. Cureus. 2023;15. doi: 10.7759/cureus.36034.
- 9. The future of medical education and research: Is ChatGPT a blessing or blight in disguise? Arif TB, Munaf U, Ul-Haque I. Med Educ Online. 2023;28:2181052. doi: 10.1080/10872981.2023.2181052.
- 10. Preparing for the world of generative AI. Halamka J. Mayo Clinic Platform. [Accessed: Jul 2023]. 2023. https://www.mayoclinicplatform.org/2023/02/01/preparing-for-the-world-of-generative-ai/
- 11. The potential impact of ChatGPT in clinical and translational medicine. Baumgartner C. Clin Transl Med. 2023;13. doi: 10.1002/ctm2.1206.
- 12. How does ChatGPT perform on the United States Medical Licensing Examination? The implications of large language models for medical education and knowledge assessment. Gilson A, Safranek CW, Huang T, Socrates V, Chi L, Taylor RA, Chartash D. JMIR Med Educ. 2023;9. doi: 10.2196/45312.
- 13. News from artificial intelligence: comparing scientific abstracts generated by ChatGPT to original abstracts using an artificial intelligence output detector, plagiarism detector, and blinded human reviewers. GMDP Academy. [Accessed: Jul 2023]. 2023. https://gmdpacademy.org/news/news-from-artificial-intelligence-comparing-scientific-abstracts-generated-by-chatgpt-to-original-abstracts-using-an-artificial-intelligence-output-detector-plagiarism-detector-and-blinded-human-re/
- 14. Why ChatGPT is such a big deal for education. Anders BA. C2C Digital Magazine. 2023;1:4. https://scalar.usc.edu/works/c2c-digital-magazine-fall-2022---winter-2023/why-chatgpt-is-bigdeal-education
- 15. ChatGPT utility in healthcare education, research, and practice: Systematic review on the promising perspectives and valid concerns. Sallam M. Healthcare (Basel). 2023;11:887. doi: 10.3390/healthcare11060887.
- 16. ChatGPT - Reshaping medical education and clinical management. Khan RA, Jawaid M, Khan AR, Sajjad M. Pak J Med Sci. 2023;39:605–607. doi: 10.12669/pjms.39.2.7653.
- 17. ChatGPT makes medicine easy to swallow: An exploratory case study on simplified radiology reports. Jeblick K, Schachtner B, Dexl J, et al. arXiv. 2022. doi: 10.1007/s00330-023-10213-1.
- 18. Promoting student case creation to enhance instruction of clinical reasoning skills: A pilot feasibility study. Chandrasekar H, Gesundheit N, Nevins AB, Pompei P, Bruce J, Merrell SB. Adv Med Educ Pract. 2018;9:249–257. doi: 10.2147/AMEP.S155481.
- 19. Language translation apps in health care settings: Expert opinion. Panayiotou A, Gardner A, Williams S, et al. JMIR Mhealth Uhealth. 2019;7. doi: 10.2196/11316.
- 20. Artificial intelligence bot ChatGPT in medical research: The potential game changer as a double-edged sword. Dahmen J, Kayaalp ME, Ollivier M, Pareek A, Hirschmann MT, Karlsson J, Winkler PW. Knee Surg Sports Traumatol Arthrosc. 2023;31:1187–1189. doi: 10.1007/s00167-023-07355-6.
- 21. Rapamycin in the context of Pascal's Wager: Generative pre-trained transformer perspective. Zhavoronkov A. Oncoscience. 2022;9:82–84. doi: 10.18632/oncoscience.571.
- 22. A conversation on artificial intelligence, chatbots, and plagiarism in higher education. King MR, ChatGPT. Cell Mol Bioeng. 2023;16:1–2. doi: 10.1007/s12195-022-00754-8.
- 23. Daily briefing: ChatGPT listed as author on research papers. Graham F. Nature. 2023. doi: 10.1038/d41586-023-00188-w.
- 24. Distinguishing ChatGPT(-3.5, -4)-generated and human-written papers through Japanese stylometric analysis. Zaitsu W, Jin M. PLoS One. 2023;18. doi: 10.1371/journal.pone.0288453.
- 25. Can artificial intelligence help for scientific writing? Salvagno M, Taccone FS, Gerli AG. Crit Care. 2023;27:75. doi: 10.1186/s13054-023-04380-2.
- 26. Opinion paper: “So what if ChatGPT wrote it?” Multidisciplinary perspectives on opportunities, challenges and implications of generative conversational AI for research, practice and policy. Dwivedi YK, Kshetri N, Hughes L, et al. Int J Inf Manage. 2023;71:102642.
- 27. Do AI chatbots like ChatGPT pose a major cybersecurity risk? Robson K. Verdict. [Accessed: Jul 2023]. 2023. https://www.verdict.co.uk/do-ai-chatbots-like-chatgpt-pose-a-major-cybersecurity-risk/
- 28. An era of ChatGPT as a significant futuristic support tool: A study on features, abilities, and challenges. Haleem A, Javaid M, Singh RP. BenchCouncil Transactions on Benchmarks, Standards and Evaluations. 2022;2:100089.
- 29. ‘I think this is the most disruptive technology’: Exploring sentiments of ChatGPT early adopters using Twitter data. Haque MU, Dharmadasa I, Sworna ZT, Rajapakse RN, Ahmad H. arXiv. 2022.
- 30. ChatGPT listed as author on research papers: Many scientists disapprove. Stokel-Walker C. Nature. 2023;613:620–621. doi: 10.1038/d41586-023-00107-z.
- 31. Readership awareness series - Paper 4: Chatbots and ChatGPT - Ethical considerations in scientific publications. Ali MJ, Djalilian A. Ocul Surf. 2023;28:153–154. doi: 10.1016/j.jtos.2023.04.001.
- 32. Chatbots, ChatGPT, and scholarly manuscripts - WAME recommendations on ChatGPT and chatbots in relation to scholarly publications. Zielinski C, Winker M, Aggarwal R, et al. Afro-Egypt J Infect Endem Dis. 2023;13:75–79. doi: 10.25259/NMJI_365_23.
- 33. Usage of multilingual mobile translation applications in clinical settings. Albrecht UV, Behrends M, Schmeer R, Matthies HK, von Jan U. JMIR Mhealth Uhealth. 2013;1. doi: 10.2196/mhealth.2268.
- 34. English and Mandarin translation using Google Translate software for pre-anaesthetic consultation. Beh TH, Canty DJ. Anaesth Intensive Care. 2015;43:792–793. https://pubmed.ncbi.nlm.nih.gov/26603812/
- 35. Use of Google Translate in medical communication: Evaluation of accuracy. Patil S, Davies P. BMJ. 2014;349. doi: 10.1136/bmj.g7392.
- 36. Artificial intelligence in disease diagnosis: A systematic literature review, synthesizing framework and future research agenda. Kumar Y, Koul A, Singla R, Ijaz MF. J Ambient Intell Humaniz Comput. 2023;14:8459–8486. doi: 10.1007/s12652-021-03612-z.
- 37. Red teaming ChatGPT via jailbreaking: Bias, robustness, reliability and toxicity. Zhuo TY, Huang Y, Chen C, Xing Z. arXiv. 2023.
- 38. The role of ChatGPT in data science: How AI-assisted conversational interfaces are revolutionizing the field. Hassani H, Silva ES. Big Data Cogn Comput. 2023;7:62.
- 39. Artificial intelligence discusses the role of artificial intelligence in translational medicine: A JACC: Basic to Translational Science interview with ChatGPT. Mann DL. JACC Basic Transl Sci. 2023;8:221–223. doi: 10.1016/j.jacbts.2023.01.001.
- 40. Design for risk control: The role of usability engineering in the management of use-related risks. van der Peijl J, Klein J, Grass C, Freudenthal A. J Biomed Inform. 2012;45:795–812. doi: 10.1016/j.jbi.2012.03.006.