2024 Sep 12;8:e56797. doi: 10.2196/56797

Table 3.

Free-text responses collected for 4 questions regarding participant use of and beliefs about large language models in clinical and nonclinical work.

Coded survey response (a), number of respondents (n), and example quotation:

How have you used ChatGPT to help you with your non-clinical work?
- Information search (n=12): "It is a helpful adjunct to online searching. Helps you quickly narrow what you are looking for (assuming you don't want a broad search)"
- Plan recreational activity (n=8): "Used for coming up with ideas for nonwork-related group activities"
- Summarize text (n=5): "Summarize review papers into usable notes."
- Revise communication (n=5): "Editing text for grammar"
- Literature review (n=3): "Triage/screen PubMed abstracts to identify references of interest"
- Generate title (n=3): "Generate catchy titles for manuscripts and presentations."
- Ideation (n=3): "Generate research ideas"
- Draft mass communication (n=3): "social media for my business"
- Creative writing (n=2): "Write poems (in English and other languages), generate ideas"
- Well-being programming (n=2): "Generate relaxation scripts."
- Workflow (n=1): "Write workflow proposals"
- Draft cover letter (n=1): "Write … cover letters"
- Translation (n=1): "translation of materials from English to another language"
- Task management (n=1): "organizing to-do lists"

What concerns do you have about using ChatGPT clinically?
- Perceived lack of utility (n=5): "Still not clear on how it would be used in healthcare"
- Potential bias in the data model (n=3): "At times, ChatGPT is confounded by the presence of wrong data and, therefore, presents clearly inaccurate statements."
- Legal (n=3): "legal concerns- I am so careful about my documentation, and I just don't think chat GPT will ever word things the way I need it to help me in medicolegal situations."
- Automation bias (n=2): "Worried about clinician interpretation of ChatGPT output … and cannot replace clinical reasoning"
- Plagiarism (n=2): "It is plagiarism on steroids."
- Learning curve (n=1): "Would not like to have to master a new technology in addition to the onslaught of requests for computer interface as it is"
- Depersonalization (n=1): "That it could take away from collaborative development of an illness explanation that provider and patient/family engage in together."
- Skill atrophy (n=1): "Humans writing reports allows clinicians to integrate data in a way that supports clinical decision making and patient counseling. I am already finding a lack of critical thinking skills in graduate students. Push button documentation would be efficient (and report writing is arduous) but we all still need to think."

How would you use the Boston Children's Hospital HIPAA (b)-compliant version of ChatGPT if it were available?
- Research (n=6): "My AI robot will ... learn to sort data in redcap"
- Translation (n=3): "I use it probably in the most simple of ways to translate patient handouts into their language."
- Summarize clinical narrative (n=3): "Summarization of complex patient medical history and relevant clinical information and other data aggregation tasks (e.g., ascertain primary/longitudinal care team members involved in patient's care)"
- Extract data from narratives (n=1): "Review patient charts and imaging reports to generate tabular data for research."
- Workflow (n=1): "Lots of potential nonclinical purposes, describing workflow, responsibility mapping"

If a HIPAA-compliant version of ChatGPT were available, what types of information would you feel comfortable entering?
- Demographic data (n=1): "Name of patient's school"
- Patient medications (n=1): "Info related creating a prior auth insurance, medication, dx, etc."

(a) The responses were analyzed and coded by a team of 3 physicians using formal qualitative methodology.

(b) HIPAA: Health Insurance Portability and Accountability Act.