Abstract
Physician–patient email communication is gaining popularity, yet a formal assessment of physicians' email communication skills has not been described. We hypothesized that the email communication skills of rheumatology fellows can be measured in an objective structured clinical examination (OSCE) setting using a novel 18-item email content analysis instrument. During an OSCE, we asked 50 rheumatology fellows to respond to a simulated patient email, and the content of their responses was assessed using our instrument. The majority of rheumatology fellows wrote appropriate responses, scoring a mean (±SD) of 10.6 (±2.6) points (maximum score 18), with high inter-rater reliability (0.86). Most fellows were concise (74%) and courteous (68%) but not formal (22%). Ninety-two percent of fellows acknowledged that the patient's condition required urgent medical attention, but only 30% took active measures to contact the patient, and none encrypted their messages. The objective assessment of email communication skills is thus possible using simulated emails in an OSCE setting. The variable email communication scores and the incidental patient safety gaps identified suggest a need for further training and defined proficiency standards for physicians' email communication skills.
Introduction
According to a recent survey, 74% of adult Americans use the internet regularly and a majority of them (89%) receive or send an email on a daily basis.1 Survey results also confirm that patients are increasingly looking for health information online (83%) and most of them express a desire to communicate with their providers via email (74%).2–4 Although the first patient–physician email communication2 was reported almost 20 years after the introduction of email into the public domain, email is gradually emerging as an alternative communication tool, replacing routine office visits or telephone conversation for non-emergency situations such as prescription refill requests. As the number of email users rises to 1.9 billion worldwide this year,1 it is clear that use of email is going to grow in medicine as well.4
Over the last decade several authors have discussed the advantages, disadvantages, and barriers to adoption of email communication in healthcare.5–7 The content of patient emails, and the time and resources required to implement emailing policies within institutions, have also been reported.8–10 The American Medical Informatics Association11 and the American Medical Association12 (AMIA/AMA) have published email communication guidelines for physicians (boxes 1 and 2).8 10 13 A decade later, however, awareness of and adherence to these guidelines remain poor: in a random cross-sectional survey of over 10 000 primary care physicians in Florida, Brooks et al reported that only 6.7% adhered to half of the 13 selected email communication guidelines.14
Box 1. Summary of communication guidelines (adapted from Kane and Sands11 and (YPS) AMA12).
Caution when using email for urgent matters.
Inform patient about privacy issues.
Retain electronic or paper copies of email communication.
Establish type of transactions (prescription refill, appointment scheduling, etc).
Establish sensitivity of subject matter (HIV, mental health, etc).
Acknowledge receipt of messages and send a new message to confirm completion of the request.
Develop archival and retrieval mechanisms.
Maintain a mailing list of patients, but do not send group mailings where recipients are visible to each other. Use blind copy feature in software.
Avoid anger, sarcasm, harsh criticism, and libelous references to third parties.
End each email message with a standard block of text containing the physician's full name, contact information, and reminders about security and the importance of alternative forms of communication for emergencies.
Remind the patients when they do not adhere to the guidelines.
Terminate the email communication relationship with patients who repeatedly do not adhere to the guidelines.
Physician should instruct the patients to do the following:
Put the category of transaction in the subject line of the message for filtering: prescription, appointment, medical advice, billing question.
Put their name and patient identification number in the body of the message.
Use the auto-reply feature to acknowledge reading the clinician's message.
Be concise in their email messages.
Box 2. Medicolegal and administrative guidelines (adapted from Kane and Sands11 and (YPS) AMA12).
Consider obtaining the patient's informed consent for use of email. Written consent forms should include the following and be retained in the patient's medical record. A copy should be given to patients.
Itemize terms in communication guidelines.
Provide instructions for when and how to escalate to phone calls and office visits.
Waive encryption requirement, if any, at patient's insistence.
Indemnify the healthcare institution for information loss due to technical failures.
Describe security mechanisms in place including:
Using a password-protected screen saver for all desktop workstations in the office, hospital, and at home.
Never forwarding patient-identifiable information to a third party without the patient's express permission.
Never using patient's email address in a marketing scheme.
Not sharing professional email accounts with family members.
Not using unencrypted wireless communications with patient-identifiable information.
Double-checking all ‘To’ fields prior to sending messages.
Perform at least weekly backups of mail onto long-term storage. Define ‘long-term’ as the term applicable to paper records.
Commit policy decisions to writing and electronic form.
Furthermore, certain skills, such as communication, teamwork, and procedures, are often assumed to be mastered by trainees until studies challenge this assumption.15–19 Although many physicians are nowadays comfortable with email communication, we cannot be sure that such communication is effective. To our knowledge, an objective assessment of physicians' email communication skills has not been described in the literature. Therefore, we have performed this pilot study with the following objectives: (1) to establish whether rheumatology fellows' email communication skills can be measured using simulated patient emails; and (2) to perform content analysis of these email responses for clarity, appropriateness, and professionalism.
Methods
ROSCE: rheumatology objective structured clinical examination
Objective structured clinical examination (OSCE)20 is now used routinely for the evaluation of rheumatology fellows (rheumatology OSCE, or ROSCE). ROSCE has been validated for training and objective assessment of communication skills and professionalism, Accreditation Council for Graduate Medical Education (ACGME) core competencies that cannot be measured effectively by in-service examination.21 Fellows perceive it as a valuable educational experience because it provides immediate feedback.22 23
Study participants
Fifty rheumatology fellows (first and second year) from six different fellowship programs across the Midwest participated in our ‘Highway I-70 ROSCE’ between 2005 and 2008. The fellows rotated through 10 to 12 stations designed to assess different skills such as history taking, physical examination, telephone communication, joint and soft-tissue injections, radiographs, and pathology slides.
Email communication station
One of the ROSCE stations was an email communication station (see online supplementary appendix 1, available at www.jamia.org). In this station, fellows responded to an email from a simulated patient: a patient with rheumatoid arthritis on immunosuppressant medications seeks advice about the sudden onset of knee swelling and pain with fever. Fellows were instructed to log on to their email accounts and compose a response. The responses were either saved or emailed to a specially prepared account and then printed for analysis. The patient email was designed to be an easy question for a rheumatology fellow to answer, so that the assessment would focus on the quality of the email response.
Novel physician email content analysis instrument
After a review of physician–patient email communication literature8 24–27 and the available guidelines,11 12 we developed a scoring instrument (18 items) that measured what we believe are the three broad domains of email communication skills that physicians must demonstrate: (1) understanding the role of email in the physician–patient relationship (as an extension of and to complement a visit and not for urgent matters); (2) awareness of the administrative, confidentiality, and medicolegal aspects of email (such as proper documentation and encryption); and (3) ability to write professional email responses (clear, concise, and courteous).
We used a binary scale (0 and 1) for each of the 18 items included in the instrument, with the pre-determined accepted behavior assigned one point. A fellow would receive no points if their answer was not clear, used medical jargon, or was too vague. Certain items, such as acknowledging the urgency of the issue or providing a contingency plan, received one point if the fellow included their contact details or suggested that the patient should go to the nearest emergency room. Evaluation of other items, such as conciseness and tone of the message including courtesy and formality, was more subjective. In order to determine what constitutes courtesy in email communication, we consulted recent books28 and articles about email communication. For example, in professional emails, some literature suggests that the salutation should be ‘Dear Mr/Mrs Surname’29 and the closing remark should be ‘Sincerely’ or ‘Regards’. The courtesy of the email was judged by presence of words such as ‘please’, ‘thank you’, ‘apologize’, and ‘appreciate’.8
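The keyword-based checks described above can be sketched in code; the snippet below is a minimal illustration of the binary scoring idea only, and the item names and word lists are our assumptions, not the authors' actual 18-item instrument.

```python
# Illustrative sketch of the binary (0/1) content-scoring approach.
# Item names and keyword lists below are assumptions for demonstration,
# not the authors' actual 18-item instrument.

COURTESY_WORDS = {"please", "thank you", "apologize", "appreciate"}
FORMAL_SALUTATIONS = ("dear mr", "dear mrs", "dear ms")

def score_email(text: str) -> dict:
    """Score a few automatable items; each accepted behavior earns 1 point."""
    lower = text.lower()
    last_line = lower.strip().splitlines()[-1]
    return {
        # Courtesy: presence of a courteous word or phrase
        "courteous": int(any(w in lower for w in COURTESY_WORDS)),
        # Formality: message opens with a formal salutation
        "formal_salutation": int(lower.lstrip().startswith(FORMAL_SALUTATIONS)),
        # Urgency: message acknowledges an urgent/emergency situation
        "urgency_acknowledged": int("emergency" in lower or "urgent" in lower),
        # Contact details: digits (e.g. a phone number) in the signature line
        "phone_number": int(any(ch.isdigit() for ch in last_line)),
    }

total = sum(score_email(
    "Dear Mr Smith,\nPlease go to the nearest emergency room.\n"
    "Sincerely, Dr Jones, 555-0100"
).values())  # all four illustrative items satisfied
```

In the study itself, of course, most items (conciseness, tone, contingency planning) required human judgment rather than keyword matching.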
The significance of each item was debated by the authors. Subsequently, four internal medicine residents were asked to voluntarily respond to similar emails to test the scoring instrument in an informal pilot study (data not included). The pilot testing gave us new insight into physicians' email communication skills. For example, we assumed that all physicians would include their name, address, and telephone number as a signature block. We discovered that this was not the case, so we added two separate points for the inclusion of name and telephone number, and eliminated address. We also added a point for discussing a back-up plan in the email because it was deemed significant by our study group. Finally, an 18-point novel email content analysis instrument was agreed upon and used by two independent, blinded observers. The pilot emails were re-evaluated and the differences in scores were discussed to establish consensus before the final evaluation of the ROSCE emails.
Email evaluations
Two independent observers (MK, CV) evaluated and scored the emails retrospectively after obtaining an exempt status from the University of Missouri-Columbia Institutional Review Board. Both observers have a background in education, instrument design, and assessment of performance. Observer 1 (MK) is a post doctoral research fellow and has recently completed a 1-year dedicated fellowship in clinical simulation and education. Observer 2 (CV) is a full-time clinician-educator rheumatologist and one of the authors of the ROSCE. Observers were blinded to the institution and demographic information of the fellows.
Statistical analysis
Scores are expressed as mean±SD, unless stated otherwise. We initially considered Cohen's kappa to assess chance-corrected agreement between raters but judged it less appropriate in this situation because of the sparseness of the data (50 emails×2 observers=100 observations distributed over 16×16=256 possible score pairs). Pearson's correlation coefficient is more meaningful in this case for determining inter-rater reliability. The Shapiro-Wilk test was used to assess the normality of the score distribution for each rater. To test whether scores varied by year or by rater, a two-way analysis of variance (ANOVA) was performed, with p<0.05 considered significant. The open-source R v 2.11.1 for Windows was used for statistical calculations.
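For readers who wish to reproduce the reliability calculation, Pearson's r can be computed directly from the paired rater scores. The sketch below is in pure Python with hypothetical scores, not the study data (the original analysis was performed in R).

```python
# Pearson's correlation coefficient computed from paired rater scores.
# The score lists are hypothetical examples, not the study data.
import math

def pearson_r(x, y):
    """Inter-rater reliability as Pearson's r between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

rater1 = [10, 12, 8, 14, 11]  # hypothetical scores, observer 1
rater2 = [11, 12, 9, 13, 10]  # hypothetical scores, observer 2
r = pearson_r(rater1, rater2)
```

A value near 1 indicates that the two observers rank and space the emails similarly, which is what the scatterplot in figure 1 conveys graphically.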
In compliance with IRB requirements, we did not collect personal information such as trainees' names, ages, sex, or institutions. This information was also not deemed necessary because all the rheumatology fellows went through the same station with the same preceptor over the 4-year period. No additional training in email communication skills was provided at any of the participating institutions. Hence, these factors were not included in the analysis.
Results
Email communication skills score
All 50 emails were assessed using the instrument discussed above. The average score (±SD) received by fellows was 10.6±2.6 (range 3–16; maximum achievable score 18). The inter-rater reliability between the two observers was 0.86 (Pearson's correlation coefficient). A scatterplot (figure 1) of the email scores from both observers further indicates a high correlation between the two raters.
Figure 1.
Scatterplot of CV's scores plotted against MK's scores. The straight line at 45 degrees is drawn as a reference to illustrate perfect agreement. As can be seen, the two raters are not in perfect agreement but do have a high correlation. MK and CV are the initials of individual observers.
The Shapiro-Wilk test confirmed that the email communication skills scores closely approximated a normal distribution. The p values of 0.2 and 0.6 for observers 1 and 2, respectively, suggest that the null hypothesis that the scores follow a normal distribution cannot be rejected. A quantile-quantile (Q-Q) plot of each observer's scores is included (figure 2), providing a graphical counterpart to the Shapiro-Wilk test.
Figure 2.
Gaussian distribution plots (top panels) and quantile-quantile (Q-Q) plots (bottom panels) of email communication skills scores given by both observers (MK and CV). Normally distributed data would lie on the straight line in the Q-Q plot. The email communication scores of MK and CV do not deviate significantly from normality.
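The points of a normal Q-Q plot like those in figure 2 are obtained by pairing the sorted observed scores with theoretical quantiles of a normal distribution fitted to the data. A minimal sketch, using made-up scores rather than the study data, might look like this:

```python
# How Q-Q plot points are derived: sorted observed scores are paired
# with theoretical quantiles of a normal distribution fitted to the data.
# The scores below are illustrative only, not the study data.
from statistics import NormalDist, mean, stdev

def qq_points(scores):
    """Return (theoretical quantile, observed score) pairs for a normal Q-Q plot."""
    s = sorted(scores)
    n = len(s)
    fitted = NormalDist(mean(s), stdev(s))
    # The plotting position (i + 0.5) / n maps the i-th order statistic to
    # a probability; inv_cdf converts it to a theoretical quantile.
    return [(fitted.inv_cdf((i + 0.5) / n), obs) for i, obs in enumerate(s)]

pts = qq_points([8, 9, 10, 10, 11, 11, 12, 13])
```

Plotting these pairs against a 45-degree reference line gives the visual normality check shown in the figure: points hugging the line indicate approximately normal data.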
The two-way ANOVA for the email communication scores confirmed that there was a significant difference in the email communication scores between years (highest mean scores in 2006, p=0.03) but not between observers (p=0.49). This corroborates the change over the years suggested in figure 3.
Figure 3.
A plot of the mean email communication scores given by each observer (MK and CV) over 4 years. The results confirm that the quality of email responses did not improve over time.
Clarity of email responses
Fellows used simple language to convey their message and avoided medical jargon (88% of emails), limited their discussion to the specific complaint of the patient (100%), and laid out a clear plan of action for the patient (90%). Most fellows performed well in explaining the medical information in the email.
Appropriateness of email responses
At the outset of our study we assumed that most fellows would identify this patient email as inappropriate, given the emergent situation, and make some effort to contact the patient either directly or through their office. Ninety-two percent of the fellows acknowledged the urgency of the medical situation in their text, but only 30% demonstrated awareness that an email response in this situation is inappropriate and took active measures to contact the patient. While 86% included their names, only 20% shared their telephone number with the patient. Seventy-two percent of fellows suggested that the patient go to the nearest urgent care facility or emergency room and 62% outlined a back-up plan, but only 28% requested a read receipt to confirm that the email advice was ever received by the patient. Twenty percent of fellows indicated in their plan that they would notify another healthcare professional to facilitate patient care, such as alerting the regional emergency room or their office assistant about the patient's expected arrival.
Professionalism of email responses
The majority of the responses were concise (74%) and courteous (68%), but only 22% were formal. A detailed breakdown of salutations reveals that most fellows preferred either ‘Mr Smith’ (38%) or ‘Dear Mr Smith’ (28%). Other salutations were used in 12% of emails and included ‘Dear Jake’ (six emails), ‘Hello Mr Smith’ (two), ‘Smith’ (one), ‘Patient 1’ (one), no salutation (one), and ‘Mrs Smith’ (one). The closing remarks included ‘Sincerely’ (46%), no closing (22%), ‘Thank you’ (12%), and others (20%).
Discussion
Our study demonstrated for the first time that writing an email to patients is a skill that can be measured objectively in the setting of an OSCE. The quality of the email responses varied significantly, which highlights the need for training and assessment of email communication skills. Given the increasing patient interest in email communication4 30 and growing physician familiarity with it, the use of email will probably increase, and rightly so: email communication is convenient, fast, asynchronous,30 and allows easy documentation. In addition, it permits communication with more than one person at a time, which is useful in the evolving multi-disciplinary care of patients.
The normal distribution of email communication skills scores (average score 10.6, SD 2.6) and the presence of outliers are evidence that the capacity to communicate professionally by email is not an innate skill, nor is it acquired during the participants’ medical training. Email communication with patients is expanding, and for this to happen safely, the correct policy, procedures, and training must be in place. Educators in medical school, residency, and fellowship programs should survey their needs and raise awareness of existing guidelines for email communication among residents, fellows, and faculty.
Poorly written or unclear emails may reflect badly on the sender and the organization5 but may also lead to communication errors,31 a common sentinel event according to The Joint Commission on the Accreditation of Healthcare Organizations. Understandably, the Institute of Medicine explicitly included patient–physician email communication as an integral component of patient-centered care in its report Crossing the Quality Chasm: A New Health System for the 21st Century.32 We therefore believe that our study will be of interest to all providers involved in patient-centered care, especially those who currently use office visits to manage non-urgent issues, which are ideally suited to patient–provider email communication.33
Although most emails are about non-urgent requests from patients,8 with the increasing use of email4 it is possible that serious situations may arise via email and cause liability issues. In our study, 92% of fellows identified the medical emergency but only a third made an active effort to resolve this issue by calling the patient either themselves or through their office. Less than a third offered their phone number or requested a read receipt to ensure seamless communication. None of the trainees used a secured or encrypted email to communicate with their patient. These results are not surprising given that medical trainees rarely receive formal training in effective email communication with patients. Our study reveals a potential patient safety gap, one that we have not thought about previously, which needs to be evaluated and addressed with appropriate training and increased awareness of AMIA/AMA email communication guidelines.11 12
Our study has certain limitations. First, although trainees were asked to respond to the email as if it were from a real patient, it is possible that they would respond differently when actually emailing real patients. However, simulation has been shown to predict real behavior in a variety of clinical settings.34 35 Our study involved only fellows from one subspecialty who responded to a single email and it is possible that their performance would vary with different, or more, emails. However, we believe our multi-institutional sample is representative and the results may be generalizable. Perhaps the major limitation of the study was the lack of a validated email communication evaluation instrument. Our instrument was developed de novo and has face validity given the normal distribution of the scores. We are currently conducting validity studies at our institution and are in the process of identifying the appropriate weight for each item on the instrument. Future studies will also help us define the ‘cut-off’ or the desired level of proficiency.
Conclusion
Objective assessment of physicians' email communication skills is possible in an OSCE setting using simulated emails. Although rheumatology fellows can communicate medical information with patients via email, they do so with variable efficacy. This suggests that email communication is not an innate skill and there may be a need for formal training and assessment. Further research is needed to explicitly define the email communication skills, outline the proficiency standards, and develop a structured curriculum for training as we move towards the widespread adoption of patient–physician email communication.
Supplementary Material
Acknowledgments
We are very grateful to the fellows and faculty who enthusiastically participated in the ROSCE and provided valuable feedback.
Footnotes
Competing interests: None.
Ethics approval: This study was conducted with the approval of the University of Missouri-Columbia, Columbia, Missouri (IRB project number: 1157417).
Provenance and peer review: Not commissioned; externally peer reviewed.
References
1. Generations 2009 report. http://www.pewinternet.org/Static-Pages/Trend-Data/Online-Activites-Total.aspx (accessed 5 Feb 2010).
2. Neill RA, Mainous AG 3rd, Clark JR, et al. The utility of electronic mail as a medium for patient-physician communication. Arch Fam Med 1994;3:268–71.
3. Kleiner KD, Akers R, Burke BL, et al. Parent and physician attitudes regarding electronic communication in pediatric practices. Pediatrics 2002;109:740–4.
4. Singh H, Fox SA, Petersen NJ, et al. Older patients' enthusiasm to use electronic mail to communicate with their physicians: cross-sectional survey. J Med Internet Res 2009;11:e18.
5. Granberry N. Email–from “to” to “send”. AAOHN J 2007;55:127–30.
6. Slack WV. A 67-year-old man who e-mails his physician. JAMA 2004;292:2255–61.
7. Sands DZ. Electronic patient-centered communication: managing risks, managing opportunities, managing care. Am J Manag Care 1999;5:1569–71.
8. White CB, Moyer CA, Stern DT, et al. A content analysis of e-mail communication between patients and their providers: patients get the message. J Am Med Inform Assoc 2004;11:260–7.
9. Stiles RA, Deppen SA, Kathleen Figaro M, et al. Behind-the-scenes of patient-centered care: content analysis of electronic messaging among primary care clinic providers and staff. Med Care 2007;45:1205–9.
10. Liederman EM, Morefield CS. Web messaging: a new tool for patient-physician communication. J Am Med Inform Assoc 2003;10:260–70.
11. Kane B, Sands DZ. Guidelines for the clinical use of electronic mail with patients. J Am Med Inform Assoc 1998;5:104–11.
12. (YPS) AMA Guidelines for Physician-Patient Electronic Communications. 2000. http://www.ama-assn.org/ama/pub/about-ama/our-people/member-groups-sections/young-physicians-section/advocacy-resources/guidelines-physician-patient-electronic-communications.shtml (accessed 30 Dec 2009).
13. Katz SJ, Moyer CA, Cox DT, et al. Effect of a triage-based e-mail system on clinic resource use and patient and physician satisfaction in primary care: a randomized controlled trial. J Gen Intern Med 2003;18:736–44.
14. Brooks RG, Menachemi N. Physicians' use of email with patients: factors influencing electronic communication and adherence to best practices. J Med Internet Res 2006;8:e2.
15. ElBardissi AW, Regenbogen SE, Greenberg CC, et al. Communication practices on 4 Harvard surgical services: a surgical safety collaborative. Ann Surg 2009;250:861–5.
16. Hicks CM, Gonzales R, Morton MT, et al. Procedural experience and comfort level in internal medicine trainees. J Gen Intern Med 2000;15:716–22.
17. Huang GC, Smith CC, Gordon CE, et al. Beyond the comfort zone: residents assess their comfort performing inpatient medical procedures. Am J Med 2006;119:71.e17–24.
18. Wickstrom GC, Kolar MM, Keyserling TC, et al. Confidence of graduating internal medicine residents to perform ambulatory procedures. J Gen Intern Med 2000;15:361–5.
19. Dyer P. Improving communication to reduce readmissions: physicians, home health providers must work as a team. J Ark Med Soc 2007;103:199–200.
20. Hodges B. Validity and the OSCE. Med Teach 2003;25:250–4.
21. Berman JR, Lazaro D, Fields T, et al. The New York City Rheumatology Objective Structured Clinical Examination: five-year data demonstrates its validity, usefulness as a unique rating tool, objectivity, and sensitivity to change. Arthritis Care Res 2009;61:1686–93.
22. Hassell AB. Assessment of specialist registrars in rheumatology: experience of an objective structured clinical examination (OSCE). Rheumatology (Oxford) 2002;41:1323–8.
23. Berman J, Fields T, Bass AR, et al. Immediate station feedback during the annual New York Rheumatology Objective Structured Clinical Examination (NY-ROSCE) increases intra-exam scores [abstract]. Arthritis Rheum 2009;60(Suppl 10).
24. Roter DL, Larson S, Sands DZ, et al. Can e-mail messages between patients and physicians be patient-centered? Health Commun 2008;23:80–6.
25. Roter DL, Hall JA. Health education theory: an application to the process of patient-provider communication. Health Educ Res 1991;6:185–93.
26. Levinson W, Roter D. The effects of two continuing medical education programs on communication skills of practicing primary care physicians. J Gen Intern Med 1993;8:318–24.
27. Beach MC, Roter DL. Interpersonal expectations in the patient-physician relationship. J Gen Intern Med 2000;15:825–7.
28. Shipley D, Schwalbe W. Send: Why People Email So Badly and How to Do It Better. 2nd edn. New York: Alfred A Knopf, 2008.
29. Safire W. Dear hunting. The New York Times, 2006.
30. Wallwiener M, Wallwiener CW, Kansy JK, et al. Impact of electronic messaging on the patient-physician interaction. J Telemed Telecare 2009;15:243–50.
31. Woolf SH, Kuzel AJ, Dovey SM, et al. A string of mistakes: the importance of cascade analysis in describing, counting, and preventing medical errors. Ann Fam Med 2004;2:317–26.
32. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. 2001.
33. Patt MR, Houston TK, Jenckes MW, et al. Doctors who are using e-mail with their patients: a qualitative exploration. J Med Internet Res 2003;5:e9.
34. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002;236:458–63.
35. Blum MG, Powers TW, Sundaresan S. Bronchoscopy simulator effectively prepares junior residents to competently perform basic clinical bronchoscopy. Ann Thorac Surg 2004;78:287–91.