Abstract
Background:
Physicians regularly use jargon in patient communication, which can lead to confusion and misunderstanding.
Objective:
To assess the general public’s understanding of names and roles of medical specialties and job seniority titles.
Methods:
We surveyed adults ≥18 years old without medical or nursing training at the 2021 Minnesota State Fair. Volunteers completed an electronic survey, filling in the blanks for 14 medical specialty descriptions (e.g., “pediatricians are doctors who take care of _____”). Participants also ranked physician titles in order of experience (medical student, intern, senior resident, fellow, attending). We used descriptive statistics to summarize the responses. Two researchers coded open-ended answers as correct, partially correct, or incorrect, with a third researcher resolving coding discrepancies.
Results:
204 participants completed the survey (55% female; mean age 43 years; 67% with a bachelor’s degree or higher). Of the 14 medical specialties listed on the survey, respondents most accurately identified dermatologists (94%) and cardiologists (93%). Six specialties were understood by fewer than half of the respondents: neonatologists (48%), pulmonologists (43%), hospitalists (31%), intensivists (29%), internists (21%), and nephrologists (20%). Only 12% of participants correctly identified all medical roles in rank order. Most participants (74%) correctly identified medical students as the least experienced. Senior residents were most often identified as the most experienced (44%), with just 27% of respondents correctly placing the attending in that position.
Conclusions:
When introducing themselves and their role in patient care, medical professionals should recognize that titles are a common source of misunderstanding among the general public and should describe their role to minimize confusion.
Introduction:
Minimizing jargon usage has been identified as an important target to optimize the ability of patients to participate in shared clinical decision-making.1 Beyond being a source of frustration for patients, confusion created by their clinicians’ use of jargon has the potential to impede adherence to therapy and worsen clinical outcomes.2–5 Yet despite recognition of the importance of minimizing medical jargon, it is well established that healthcare professionals frequently use terminology that is not understood by their patients.5–7 This disconnect between clinicians’ desire to avoid jargon and their continued use of it has been called “jargon oblivion,” and likely reflects the false assumption that patients share an understanding of the terminology used.8 To reduce jargon usage during clinical encounters, we must first recognize which terms and phrases are commonly misunderstood by patients.
Much of the research on jargon has aimed to quantify patient understanding of technical terminology via surveys gauging patient comprehension of terms or phrases they may hear in a medical visit.2,3,9,10 An important and under-studied area of misunderstanding is confusion over medical titles. In a survey of patients in a breast clinic, for example, only 43% correctly defined “oncologist” and 28% “radiologist,” even though these specialists were members of the patients’ care teams.11 We hypothesize that the names of other subspecialties may engender similar levels of confusion. Furthermore, we theorize that the role titles used to describe the stages of medical training may be an additional source of confusion for patients. Given that our first words to patients often include stating our specialty and role on the clinical teaching team, our introductions may represent the first insertion of poorly understood jargon into the clinical encounter. The aim of this study was to determine the degree to which members of the general public understand common specialty names and seniority titles used in clinical settings.
Methods
Study Design and Setting
This study was a cross-sectional survey of adult visitors to the Minnesota State Fair to assess their understanding of medical specialty names and seniority titles. The Minnesota State Fair is an annual statewide event held near Minneapolis and St. Paul, Minnesota, with the highest per-day attendance of any state fair in the United States.12 The University of Minnesota runs a “Driven to Discover” research pavilion at the fair, inviting fairgoers to participate in approved studies conducted by University researchers. This setting was selected to allow recruitment of a large sample of adults in an environment free of the medical context cues found in clinical settings. This study was selected for the 2021 Minnesota State Fair via a peer-reviewed proposal process and was approved by the University of Minnesota Institutional Review Board (STUDY00012955).
Survey Design
We developed a novel survey consisting of 14 free-response questions to assess respondents’ understanding of medical specialties. We chose a mix of common specialties and subspecialties, focusing on fields not previously reported in published studies of jargon. Each specialty question was structured as a sentence-completion item, with various phrasings used to allow for the possibility of an appropriate natural-language response to each question while minimizing the provision of contextual clues. For example, “Pulmonologists are doctors who specialize in diseases involving the ________” or “Hospitalists are doctors who take care of ________.” The fourteen specialties evaluated were: cardiologist, dermatologist, internist, intensivist, gastroenterologist, geriatrician, hospitalist, neonatologist, nephrologist, neurologist, ophthalmologist, pediatrician, pulmonologist, and rheumatologist.
Further, to evaluate layperson understanding of seniority titles used in medical training, we asked respondents to place the following titles in order, from least- to most-experienced: Medical Student, Intern, Senior Resident, Fellow, Attending.
The anonymous survey also included brief demographics, including gender, age, and level of education. Demographic data were collected in a standardized format stipulated by the Driven to Discover team. The survey was administered on an iPad via REDCap13 electronic data capture tools hosted at the University of Minnesota. Our novel survey instrument was developed by the study team and tested on selected laypeople for usability, comprehension, and timing. Given that the design was quite simple, consisting of fill-in-the-blank items (e.g., “Specialist X is a doctor who takes care of __________”) and the ordering of roles, we did not feel formal validity testing was necessary.
Data Collection
We surveyed fairgoers over the course of two days at the Minnesota State Fair in August and September 2021. Adults who passed by or visited the research building were invited to complete the brief survey; participants received a University of Minnesota branded string backpack as compensation for their time. To be eligible, prospective participants were required to have no self-reported significant personal medical or nursing background, to be 18 years of age or older, and to be comfortable participating in an English-language survey. The survey was voluntary and anonymous, and the incentive was provided regardless of whether the survey was completed in full.
Data and Statistical Analysis
Two researchers (E.H. and C.P.) coded the free-text responses to open-ended questions as correct, partially correct, incorrect, or did not know. They also performed a qualitative analysis of free-response questions. Discrepancies in coding were resolved through consultation with a third researcher. Descriptive statistics were used to summarize the responses, and the association of a completely correct response with demographics (including age, gender, and education) was examined with multivariable logistic regression models. Adjusted odds ratios (aORs) and 95% confidence intervals (CIs) were reported from these models. P-values less than 0.05 were considered statistically significant. SAS V9.4 (SAS Institute Inc., Cary, NC) was used for analysis.
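As an illustrative sketch of this modeling approach, the same type of multivariable logistic regression can be fit in Python with statsmodels. The study’s actual analysis was performed in SAS V9.4; the data frame, column names, and simulated values below are hypothetical stand-ins for the survey data.

```python
# Illustrative sketch only: the study's analysis was run in SAS V9.4.
# Column names and simulated data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 204  # number of survey respondents

# One row per respondent; "correct" flags a completely correct definition.
df = pd.DataFrame({
    "correct": rng.integers(0, 2, n),
    "age": rng.integers(18, 85, n),
    "gender": rng.choice(["female", "male"], n),
    "education": rng.choice(["less_than_bachelor", "bachelor", "graduate"], n),
})

# Multivariable logistic regression of a correct response on demographics.
model = smf.logit("correct ~ age + C(gender) + C(education)", data=df).fit()

# Exponentiate coefficients to obtain adjusted odds ratios (aORs) with 95% CIs.
aor = np.exp(model.params).rename("aOR")
ci = np.exp(model.conf_int()).rename(columns={0: "CI 2.5%", 1: "CI 97.5%"})
print(pd.concat([aor, ci], axis=1))
print(model.pvalues)  # p < 0.05 considered statistically significant
```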
Results:
The survey was completed by 204 volunteers with a mean age of 43 years; 55% were female. Two-thirds of respondents (67%) reported having earned a bachelor’s degree or higher. Of the 14 specialties included in the survey, only two were correctly understood by more than 90% of respondents: dermatologist (94% correct) and cardiologist (93%). Six specialties were understood by fewer than half of the respondents: neonatologist (48%), pulmonologist (43%), hospitalist (31%), intensivist (29%), internist (21%), and nephrologist (20%). Some free-text responses demonstrated consequential gaps in understanding, including 4% of participants indicating that nephrologists specialized in dead people and 12% reporting that internists were still in training. See Table 1 for full results. In some cases it was not clear what respondents meant by common incorrect answers (e.g., “hospitalists are doctors who take care of hospitals”). Partially correct answers were excluded from this analysis due to the heterogeneity of the partially correct fill-in-the-blank responses obtained.
Table 1:
Percent of respondents correctly describing each specialty

| Specialist | Correct N (%) (N=204) | Most common wrong answers |
|---|---|---|
| Dermatologist | 191 (94%) | No repeated wrong answer |
| Cardiologist | 190 (93%) | No repeated wrong answer |
| Pediatrician | 181 (89%) | Feet (2%) |
| Neurologist | 181 (89%) | No repeated wrong answer |
| Gastroenterologist | 182 (89%) | No repeated wrong answer |
| Geriatrician^a | 151 (74%) | Stomach (3%), Women (2%) |
| Ophthalmologist^a,g | 137 (67%) | Ear, Nose, or Throat (3%) |
| Rheumatologist^a | 114 (56%) | Nerves (3%) |
| Neonatologist^a,g | 97 (48%) | Fetuses (10%), Brain/nerves (8%) |
| Pulmonologist^g | 88 (43%) | Heart (17%), Blood vessels (13%) |
| Hospitalist | 63 (31%) | Hospitals (3%) |
| Intensivist | 59 (29%) | Emergency (3%) |
| Internist | 42 (21%) | Insides (15%), In training (12%) |
| Nephrologist | 41 (20%) | Death (4%) |

^a Older age associated with greater likelihood of answering correctly (p<0.05)
^g Odds of a correct response higher for female gender (p<0.05)
There were a few statistically significant differences in correct identification of specialties by demographic category. Female respondents were more likely than males to correctly identify neonatologists (62.8% correct vs 28.6%; aOR (95% CI) = 4.74 (2.54–8.86); p<0.0001), and ophthalmologists (76.1% vs 56.0%; aOR (95% CI) = 2.69 (1.40–5.18); p<0.005). Older age was associated with a slightly increased likelihood of correctly identifying geriatricians, neonatologists, ophthalmologists, and rheumatologists (Table 1). Increasing education was not associated with increased likelihood of correctly defining any of the specialist types.
When asked to rank medical team members by seniority, only 12% of study participants were able to correctly place the five seniority titles in order from least experienced to most experienced. Sixteen percent of study participants were not able to place any seniority title in the correct location. Most study participants (74%) correctly identified the medical student as the least experienced. However, the senior resident was identified (44%) more often than any other level as being the most experienced member of the team, with only 27% of respondents correctly placing the attending in that position. Full results from this item are displayed in Table 2.
Table 2:
Number and percent of respondents ranking each role by experience

| Role | LEAST experienced | 2nd least experienced | Middle | 2nd most experienced | MOST experienced |
|---|---|---|---|---|---|
| Medical Student | **150 (74%)**† | 26 (13%) | 8 (4%) | 18 (9%) | 2 (1%) |
| Intern | 32 (16%) | **128 (63%)**† | 32 (16%) | 10 (5%) | 2 (1%) |
| Senior Resident | 2 (1%) | 5 (3%) | 50 (25%)† | 58 (29%) | **89 (44%)** |
| Fellow | 10 (5%) | 27 (13%) | 53 (26%) | **59 (29%)**† | 55 (27%) |
| Attending | 10 (5%) | 18 (9%) | **61 (30%)** | 59 (29%) | 56 (27%)† |

**Bold** = most commonly ranked position for the role
† = correct position ranking for the role
In a multivariable logistic regression model, age (p=0.75), gender (p=0.21), and education (p=0.095) were not associated with correctly placing the five seniority titles. Those with more education more often correctly ranked all five titles in order (17% for those with a graduate or professional degree, 15% for those with a bachelor’s, 4% for those with less than a bachelor’s), but this finding was not statistically significant.
Discussion:
Clear understanding of clinician specialty names and seniority roles within the team upon introduction is an important starting point for effective communication between patients and their clinical teams. Prior studies on jargon understanding by patients have evaluated the comprehension of specific words and phrases in clinical settings, with some including one or two specialties among the terms examined.3,10,11 However, to our knowledge, this is the first dedicated study to survey a broad cohort of laypeople in a non-clinical setting regarding their understanding of an extended list of specialty names and the titles used for medical training levels.
In our sample, many specialties were understood by fewer than half of respondents, with nephrologist and internist understood by fewer than one-quarter of participants. There were some significant differences by demographic group, such as female respondents being more likely to correctly explain the role of a neonatologist. These differences may reflect different rates of relevant life experiences. For example, more women than men are likely to have had direct experience with a neonatologist, as not all infants in a neonatal intensive care unit have a male caregiver present, so it is not surprising that women may have greater familiarity with the title “neonatologist.” In some cases, we found that the wrong answers provided indicate a potentially serious communication gap. One can imagine that the belief that a geriatrician specializes in women’s health or that a pediatrician cares for feet could lead patients to feel unheard by their referring physicians and be less likely to make an appointment with those specialists. In addition, if patients believe that an internist only deals with internal organs (the definition provided by roughly 1 in 6 respondents), they might be less likely to discuss concerns about skin conditions with their internist, potentially delaying important care. Regardless of the degree of misunderstanding, the use of jargon increases the cognitive load placed on the patient.
Similarly, we found widespread confusion over the order of seniority of members of the medical team, with approximately 3 in 4 participants (73%) unaware that an attending is the most senior member. Uncertainty over the roles of members of a clinical team in an academic setting can potentially lead to frustration and incomplete disclosures. Further research is needed to determine the impact of this lack of understanding on the patient experience.
Reduction in jargon use overall, and explanation of technical terms when they must be used, have the potential to enhance communication between patients and their doctors. In particular, the results of this study suggest that many types of specialist clinicians should develop the habit of either assessing the patient’s understanding of their specialty or routinely including a brief explanation of it when they introduce themselves to new patients. For example, Rau, Basir, and Flynn showed that focus group participants better understood, and significantly preferred, the term “baby specialist” over “neonatologist” during a prenatal visit.3
In addition, clinicians working in practice settings that include trainees should be aware that patients may not automatically know that the attending is the leader of the team or that the senior resident is still in training. Clinicians should work to develop concise, respectful wording to explain their roles, such as “I’m Dr. X, a rheumatology fellow. That means that I am a physician training to specialize in diseases of inflammation, joints, and immune system overreactions.” Hospitalists in particular may have a unique role in reducing this confusion, as they often serve as the gatekeeper initiating the consultation of specialists and often work with medical trainees. Hospitalists should provide a clear explanation of trainee roles and consulting subspecialists; white boards and face cards may be helpful to support this work.14 Furthermore, specialties with particularly poor comprehension may benefit from public education campaigns, which could include improved branding in specialty office waiting rooms.
Limitations
Our study has several limitations to consider. First, our surveyed population demonstrated a higher education level than the United States average (67% of our sample had a bachelor’s degree or higher, vs. 35% in the US overall).15 This may reflect the higher proportion of Minnesotans with college degrees compared to the US average,16 coupled with selection bias from the fact that respondents were volunteers who chose to visit a university research building during their day at a fair. Additionally, during the 2021 fair, the research building had a mask mandate to mitigate the spread of SARS-CoV-2, while many other areas of the fair did not. Willingness to comply with this requirement may have further skewed our sample in an unmeasured direction. However, even in our highly educated sample, there was poor understanding of medical jargon, and we found no statistical association between increased education level and the ability to define any of the specialties. We also did not collect data regarding participants’ race, ethnicity, health literacy level, or native language; these topics would be appropriate targets for future study.
We intentionally chose a setting that was not associated with clinical care and used an open-ended survey format to more accurately reveal participant knowledge gaps, avoiding the contextual clues and waiting-room internet searches of medical terms that a clinical setting might invite. However, the open-ended format allowed for a wide variety of partially correct responses that could not be meaningfully analyzed within this study design; further studies may be warranted to better characterize these partial understandings. Finally, as we wanted to minimize survey fatigue, we did not examine the understanding of all specialties; this leaves room for further study.
Conclusions:
Use of jargon has been shown to be a barrier to effective communication between patients and clinicians. This study demonstrates that a large proportion of laypeople lack clear understanding of the names of medical specialties and the titles used to indicate seniority in medical training. Clinicians should consider taking time to explain their specialties and roles within the team in plain language to reduce patient misunderstanding from the moment of their introduction.
Acknowledgements:
The authors wish to acknowledge the Driven to Discover research faculty and coordinators: Logan Spector, PhD, Ellen Damerath, PhD, and Annie Hotop for enabling our research at the Minnesota State Fair.
Funding:
Funding of less than $4,500 was provided by the University of Minnesota Driven to Discover grant. This research was also supported by the National Institutes of Health’s (NIH) National Center for Advancing Translational Sciences, grant UL1TR002494. The NIH had no role in the design and conduct of the study. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health’s National Center for Advancing Translational Sciences.
Footnotes
To the authors’ knowledge, no conflict of interest, financial or other, exists.
Disclosures: None
Ethical approval: This study was selected to be conducted at the 2021 Minnesota State Fair via a peer-reviewed proposal process and was approved by the University of Minnesota Institutional Review Board (STUDY00012955)
Prior presentations: This material was presented in poster format at the Association of Pediatric Program Directors meeting in May 2022
Contributor Information
Emily Hause, Department of Pediatrics, University of Minnesota, Minneapolis, MN 55454.
Corinne Praska, University of Minnesota School of Medicine, Minneapolis, MN 55455.
Michael B Pitt, M Health Fairview Masonic Children’s Hospital, Minneapolis, MN 55454.
Marissa A Hendrickson, M Health Fairview Masonic Children’s Hospital, Minneapolis, MN 55454.
Victoria Charpentier, University of Minnesota School of Medicine, Minneapolis, MN 55455.
Katherine A. Allen, Department of Pediatrics, University of Minnesota, Minneapolis, MN 55454.
Rachael Gotlieb, University of Minnesota School of Medicine, Minneapolis, MN 55455.
Scott Lunos, Biostatistical Design and Analysis Center, Clinical and Translational Science Institute, University of Minnesota, Minneapolis, MN 55414.
Jordan Marmet, M Health Fairview Masonic Children’s Hospital, Minneapolis, MN 55454.
References
- 1. Killian L, Coletti M. The role of universal health literacy precautions in minimizing “Medspeak” and promoting shared decision making. AMA J Ethics. 2017;19(3). doi:10.1001/journalofethics.2017.19.3.pfor1-1703
- 2. Dua R, Vassiliou L, Fan K. Common maxillofacial terminology: Do our patients understand what we say? Surgeon. 2013;13(1):1–4. doi:10.1016/j.surge.2013.09.009
- 3. Rau NM, Basir MA, Flynn KE. Parental understanding of crucial medical jargon used in prenatal prematurity counseling. BMC Med Inform Decis Mak. 2020;20(1):1–7. doi:10.1186/s12911-020-01188-w
- 4. Lehmann LS, Brancati FL, Chen MC, Roter D, Dobs AS. The effect of bedside case presentations on patients’ perceptions of their medical care. N Engl J Med. 1997;336(16):1150–1155. doi:10.1056/NEJM199704173361606
- 5. Reddin G, Davis NF, Donald KM. Ward stories: lessons learned from patient perception of the ward round. Ir J Med Sci. 2019;188(4). doi:10.1007/s11845-019-01975-z
- 6. Charpentier V, Gotlieb R, Praska CE, Hendrickson M, Pitt MB, Marmet J. Say What? Quantifying and Classifying Jargon Use During Inpatient Rounds. Hosp Pediatr. 2021;11(4):406–410. doi:10.1542/hpeds.2020-002790
- 7. Howard T, Jacobson KL, Kripalani S. Doctor talk: Physicians’ use of clear verbal communication. J Health Commun. 2013. doi:10.1080/10810730.2012.757398
- 8. Pitt MB, Hendrickson MA. Eradicating Jargon-Oblivion—A Proposed Classification System of Medical Jargon. J Gen Intern Med. 2020;35(6):1861–1864. doi:10.1007/s11606-019-05526-1
- 9. LeBlanc TW, Hesson A, Williams A, et al. Patient understanding of medical jargon: a survey study of U.S. medical students. Patient Educ Couns. 2014;95(2):238–242. doi:10.1016/j.pec.2014.01.014
- 10. Pieterse AH, Jager NA, Smets EMA, Henselmans I. Lay understanding of common medical terminology in oncology. Psychooncology. 2013. doi:10.1002/pon.3096
- 11. O’Connell RL, Hartridge-Lambert SK, Din N, St John ER, Hitchins C, Johnson T. Patients’ understanding of medical terminology used in the breast clinic. Breast. 2013;22(5):836–838. doi:10.1016/j.breast.2013.02.019
- 12. The Biggest State Fairs in the United States. Readers.com. Updated 2020. Accessed April 23, 2022. https://www.readers.com/blog/biggest-state-fairs/
- 13. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2). doi:10.1016/j.jbi.2008.08.010
- 14. Goyal AA, Tur K, Mann J, Townsend W, Flanders SA, Chopra V. Do bedside visual tools improve patient and caregiver satisfaction? A systematic review of the literature. J Hosp Med. 2017;12:930–936.
- 15. Educational attainment distribution in the United States from 1960 to 2020. Statista. Accessed April 23, 2022. https://www.statista.com/statistics/184260/educational-attainment-in-the-us/
- 16. Educational Attainment Data. Minnesota Office of Higher Education. Updated 2017. Accessed April 23, 2022. http://www.ohe.state.mn.us/sPages/educ_attain.cfm#:~:text=Minnesota%20ranks%202nd%20(50%20percent,an%20associate%20degree%20or%20higher.
