Journal of the American Medical Informatics Association. 2023 Aug 12;30(12):1965–1972. doi: 10.1093/jamia/ocad158

Electronic health records and clinical documentation in medical residency programs: preparing residents to become master clinicians

Chad Anderson 1, Mala Kaul 2, Nageshwara Gullapalli 3, Sujatha Pitani 4
PMCID: PMC10654888  PMID: 37573135

Abstract

Objective

The ubiquity of electronic health records (EHRs) has made incorporating EHRs into medical practice an essential component of residents’ training. Patient encounters, an important element of practice, are affected by EHRs through factors that include growing documentation requirements. This research sheds light on the role of EHRs in resident clinical skills development, with emphasis on their role in patient encounters.

Materials and Methods

We conducted qualitative semistructured interviews with 32 residents and 13 clinic personnel at an internal medicine residency program in a western US medical school, focusing on the residents’ clinic rotation.

Results

Residents were learning to use the EHR to support and enhance their patient encounters, but one factor making that more challenging for many was the need to address quality measures. Quality measures could shift attention away from the primary reason for the encounter and addressing them consumed time that could have been spent diagnosing and treating the patient’s chief complaint. A willingness to learn on-the-job by asking questions was important for resident development in using the EHR to support their work and improve their clinical skills.

Discussion

Creating a culture where residents seek guidance on how to use the EHR and incorporate it into their work will support residents on their journey to become master clinicians. Shifting some documentation to the patient and other clinicians may also be necessary to keep from overburdening residents.

Conclusion

Residency programs must support residents as they develop their clinical skills to practice in a world where EHRs are ubiquitous.

Keywords: electronic health records, medical residents, patient encounters, quality measures

BACKGROUND AND SIGNIFICANCE

In order to become master clinicians, residents must develop the clinical skills necessary to effectively engage with patients to diagnose and treat their medical conditions. The use of electronic health records (EHRs) has become ubiquitous for the delivery and documentation of care in the United States, with 96% of hospitals and 78% of office-based physicians adopting a certified EHR as of 2021.1 Consequently, residents can now expect their careers as physicians to require EHR use,2,3 and learning to incorporate the EHR into medical practice has become a key component of resident training.4–8 Patient encounters are an essential part of a physician’s work, where they engage directly with the patient to establish trust and address the patient’s medical needs.9–11 Therefore, a key skill that residents must develop is using that encounter time effectively.4,12 EHRs can be supportive in patient encounters, but they can also draw the physician’s attention away from the patient, with negative consequences for communication, trust, and the quality of care.13,14 Early studies on resident perceptions of EHRs found that residents recognized potential benefits of EHRs, including increased availability of information, but they were often ambivalent about those benefits because of perceived challenges that included less effective interactions with patients, decreased documentation quality, and increased workload.15–17 Since then, information technology has become more advanced, yet those same challenges have persisted in studies of EHR use by residents.18–21

A trend contributing to the challenges residents and practicing physicians perceive with EHRs, particularly in the United States, is increasing clinical documentation requirements.22,23 That documentation growth has been enabled by EHRs because a digital record can expand the scope of data capture beyond what is feasible with a paper record.24,25 Consequently, the digitalization of medical records has coincided with a significant increase in clinical documentation, with much of the burden for that documentation falling to physicians.26 Quality measures are one component of those documentation requirements that physicians must address.27 Their implementation arose from the desire for systematic information on healthcare quality that would provide effective policy levers to improve care,28,29 but they have direct costs to physicians in terms of time and effort.27 Quality measures are typically documented during patient encounters, and EHRs are designed to support that documentation by prompting the physician when one or more measures need to be addressed, but those prompts can become a distraction from the primary focus of the encounter.11 Consequently, to become a master clinician, residents must learn to incorporate the EHR into patient encounters, which often includes the management of quality measure prompts, in a way that does not detract from their ability to diagnose and treat the patient’s medical needs. The research question we seek to answer is: how are electronic health records impacting resident-patient encounters, and what are the implications of those impacts for resident training and clinical skills development? In the following sections, we present our study and explain how EHRs are impacting resident-patient encounters and the development of residents’ clinical skills. We conclude by offering potential interventions to improve residents’ use of EHRs and mitigate the impact of increasing documentation requirements, which could ease the path to residents becoming master clinicians in a world where EHRs are now ubiquitous.

MATERIALS AND METHODS

To answer our research question, we conducted qualitative semistructured interviews with residents in an internal medicine residency program, along with program faculty and support staff at a clinic that was part of the residents’ rotation. A qualitative research approach was employed because it provided the flexibility necessary to pursue emergent avenues of inquiry as we progressed from interview to interview.30 We provide details on our research methods in Table 1.

Table 1.

Ancker et al31 guidelines for reporting qualitative research methods.

Sampling (Describe the sampling approach and report and justify the sample size)
Our purposive sampling approach was to draw participants from one rotation setting of a residency program with the expectation that we would achieve greater depth in our data by focusing on the residents, faculty, and staff in a single setting. Access to the setting was facilitated by 2 of the authors who are faculty in the program. One of the program rotations is an outpatient clinic operated by the university. The clinic was chosen as the venue for the study because all the residents rotate through the clinic and resident availability is more consistent during that rotation. One of the faculty authors identified days when resident schedules at the clinic were typically less busy, and the first author then arranged to be onsite those days to conduct interviews. On each scheduled interview day, the faculty author would speak to the residents at the clinic and invite them to sit for an interview during a break in their day. Some residents were not available during the days we conducted interviews, but our sampling process enabled us to interview 32 of the program’s 54 residents, with a mix of residents in each year of the program. We followed the same invitation process for faculty and clinic staff. The 2 faculty authors were not interviewed as study participants. We considered pursuing interviews with the residents and clinic staff who were not available on our scheduled interview days, but repetition of responses and minimal novel information in later interviews indicated we had reached sufficient saturation in our sample.32
Data collection (Report how data was collected and any methods for reducing bias)
The University of Nevada Reno IRB approved the study. All interviews were conducted in person by the first author with the second author also participating in 6 of the interviews. Both authors had significant prior experience with qualitative research methods and specifically with conducting semistructured interviews. Neither had prior connections with any of the study participants. The faculty authors refrained from participating in the interviews to reduce the potential for power differentials in their faculty/student relationship with the residents that might reduce participant candor. Participant recruitment was open to all residents, faculty, and staff at the clinic to ensure diversity. Interviews were conducted over a total of 8 days in January and March of 2022. Interviews were structured through interview guides included in Appendix S1. The interview guides were developed with input from all authors. We were interested in understanding the influence of electronic health records and other health information technologies on resident training and skill development. Consequently, we structured our interview guides to capture data on those influences prior to residency and during all major work activities the residents engaged in during their residency with special focus on work activity at the clinic. All interviews were audio-recorded with an average recording length of 23 minutes. Free text field notes were also taken by the first author to capture observations about the clinic setting in which the study participants were working.
Data analysis (Describe data analysis methods)
Interview recordings were initially transcribed using an automated AI-based web service. The transcripts were then cleaned by the first author and 3 paid assistants, with the first author conducting a final verification of transcripts cleaned by the assistants. The cleaned transcripts were loaded into the qualitative data analysis software NVivo for analysis. The analysis was conducted by the first author following Miles and Huberman’s33 guidelines for qualitative data analysis, which start with an initial round of coding to identify descriptive/interpretive codes (eg, dictation, MyChart, Epic training). All coding was data-driven, with no codes created a priori. A second round of coding was then conducted to aggregate the descriptive/interpretive codes into pattern codes (eg, expectations for technology, patient encounter engagement, required documentation). An example of our coding is provided below.
Source | Quote | Descriptive/interpretive codes | Pattern codes
Participant R20, page 4 | If there’s any lab work or any imaging and stuff, I pull up to show them. | Lab results and imaging | Patient encounter engagement
Participant F3, page 8 | You can see that the patient had a colonoscopy, because they had GI bleeding 3 months ago, but it’ll say please mark and review that it needs to be done. | Quality measures | Patient encounter engagement
Participant R26, page 8 | If patients are asking a complicated question, then you can’t completely answer with MyChart. | Patient questions, MyChart | Technology limitations
Participant MA16, page 8 | Just drilling that in their heads to communicate with us through there, either through Teams or Voalte. | Communication with other clinicians, Microsoft Teams, Voalte | Technologies in use

An analysis of our pattern codes identified common ideas and overlaps between pattern codes leading us to develop 3 key themes that are described in the results.

Research setting

The study clinic offers healthcare services to low-income members of the local community, with patients assigned to each resident during the course of their residency. EpicCare Ambulatory was the clinic’s EHR at the time of the study. Residents see their patients when they come to the clinic for scheduled visits. On arrival, the patient checks in at the front desk, and a medical assistant (MA) brings them back to an exam room. The MA takes vitals and gathers other preliminary information from the patient, including the chief complaint that prompted the visit. The MA then informs the resident, either in a digital message or in person, that the patient is ready. The resident goes into the exam room and interacts with the patient to address the chief complaint along with anything else that may need attention (eg, incomplete quality measures). Each exam room has a computer through which the EHR can be accessed. The computer monitor is attached to an articulating arm so the screen can be swiveled toward or away from the patient. The attending physician may accompany the resident in the visit, but often the resident will see the patient alone and then come out to discuss their assessment and plan with the attending. The resident will also inform the MA about any orders that need to be processed for the patient (eg, medications, labs). The resident will complete the visit note either right after the visit or later in the day. This process is repeated throughout the day until all scheduled patient visits have been completed. Residents may then have to finish any remaining documentation on their patient visits for that day.

RESULTS

We interviewed a total of 45 participants for our study. Twenty-four participants were female and 21 were male. Box 1 provides a breakdown of the participants by their roles.

Box 1.

Study participants.

Role | Number
Resident—First Year (PGY1) | 13
Resident—Second Year (PGY2) | 8
Resident—Third Year (PGY3) | 9
Chief Resident—Third Year | 1
Chief Resident—Fourth Year | 1
Program Faculty | 3
Medical Assistant (MA) | 9
Office Manager | 1

The residents in our study reported both expectations for using EHRs throughout their careers and developing patterns of significant EHR use in their residency. Most of the residents reported engaging with the EHR for more than half of their time in the clinic and many reported regularly accessing it at home. On one end of the use spectrum, a PGY1 (R13) said in response to how much time he spent on the EHR in the clinic, “basically as long as we’re here” and whether he uses the EHR at home, “Outside of the scheduled time, I’m not on the EHR. So, you know, the 80-hour work schedule, yeah, easily under that.” Similarly, a PGY2 (R26) stated, “95% of the time. I mean, you’re always on it, honestly…Even if your clinic is done, unless you’re a really experienced resident, you just have to be working on notes and cleaning up the inbox. So, it’s not an eight to five job. It’s more like six to eight.” On the other end of the use spectrum, a PGY3 (R10) stated in response to how much time she spends on the EHR, “Quite a bit. I would say at least two hours a day” and whether she takes work home, “I try and finish up while I’m here, but some days, very rarely, I take a couple of notes back home. But for the most part, I manage between eight and five.” While some PGY2 and PGY3 residents reported marked improvement in their EHR use from when they started the program, that was not a consistent pattern as some senior residents reported continued struggles to effectively incorporate the EHR into their work.

Based on the residents’ use of EHRs, we identified 3 overarching themes from our interview data: (1) patient encounters: the role of the EHR in the development of residents’ clinical skills; (2) quality measures: helpful encounter guide or distraction from what’s important; and (3) resident training: self-engagement in learning to use the EHR effectively.

Patient encounters: role of the EHR in the development of residents’ clinical skills

The EHR did not always support residents in meeting the goals of patient encounters. One faculty member (F2) explained his view on the challenge residents face with patient interactions that involve the computer: “The EMR has a way of being a distraction from the real thing we’re supposed to be doing, which is listening to a patient, taking care of them, doing a physical exam, making an assessment, and doing that stuff.” A second faculty member (F1) noted how some residents struggled in responding to that challenge: “There are some who initially said, I’m not going to do the computer in the room. I’m just going to write my notes and do it later so that I can interact with the patient. And that’s lovely, but that’s really not how the world is going to work down the road, so they really have to figure out how to do both.” Based on conversations with patients during check out, the MAs also reflected on the patients’ perspective. One MA (MA10) said, “patients often complain that they do not like the provider sitting there on the computer and typing while they’re having that interaction…So it’s a matter of the residents finding that happy medium of typing, making eye contact, active listening, going back to the computer.” Collectively, the faculty and staff saw effective EHR use in patient encounters as an essential component of resident development because the EHR could easily become an impediment to patient engagement and clinical care.

We explored that issue with the residents and categorized resident use of the EHR during patient encounters based on their reported engagement. Of the 31 residents who provided direct patient care at the clinic, 6 avoided or at least minimized their use of the EHR during encounters (minimal use), 11 used the EHR to review and show patients their lab results and other information (moderate use), and 14 used the EHR throughout the patient encounter including to write orders and notes (integral use). Table 2 presents quotes illustrating each category of EHR use.

Table 2.

Residents’ use of the EHR during patient encounters.

Resident | EHR use | Quote
R40 (PGY3) | Minimal | I’m sort of a person I don’t like to look at the screen when I’m talking to the patient. I just feel it’s disrespectful for the patient…So I was not using my computer on anything.
R38 (PGY1) | Minimal | I don’t really use the technology during the patient encounter. Occasionally I’ll have Epic open, if I want to show them something, like their labs or their imaging, but I try not to. I just feel like it kind of interferes with the flow of things.
R41 (PGY3) | Moderate | I know one of the residents I was talking to yesterday, they start the note…I’m not there yet, or that’s not my style. I focus more on the problem list and the orders and medications, because those are the things that need to be done before the patient leaves the room.
R27 (PGY2) | Moderate | In each room we have a computer, so we can go through notes or through labs, anything. Very useful. I don’t type notes. When I see my patients, I like to give them the most time possible because it really matters for the patient. Because you don’t want to be writing your note when you’re seeing the patient or they’ll feel like they’re ignored.
R17 (PGY1) | Integral | It’s so much easier for me to look up things and do it in real time. And the fact that I can show my patient the images right then and there, and we can look at it together just makes it easier…and then I write the orders there. I think the ability to write the orders right then and there, refill the medication right then and there, before I leave the room is super important.
R25 (PGY3) | Integral | I tend to put the screen to face them. And then I don’t sit behind it. So we’re both sitting parallel or in front of the screen, and then that way as I’m typing or we’re doing something else, they’re able to see what I’m doing and they don’t think I’m ignoring them behind the screen…And so they’re able to correct me as I’m writing notes. So it becomes more accurate.

Quality measures: helpful encounter guide or distraction from what’s important

While most of the residents in our study were comfortable using the EHR in their patient encounters on at least a moderate basis, one component of encounters that many of them felt challenged by was the requirement to address quality measures. The faculty expressed concern over the extent to which quality measure documentation takes time away from addressing the patient’s chief complaint and from the development of residents’ clinical skills. One faculty member (F3) lamented that “one of the disappointments of electronic health records is this tendency of all these people to piggyback on and get data, get data, get data, get data, get data…you can layer stuff on there, STIs, preventive health, mammography, BRCA for cancer. The list is indefinite. And then the patient comes in as a client and says, well, I had a stroke, why are they asking me if I’m sexually active? Isn’t the doctor worried about my stroke and my blood pressure. And that’s where the doctor really, really needs to have his expertise. But there’s this weight on him that to get to that, he’s got to do A-B-C-D or he’s bad.”

Multiple participants described how quality measures shift the resident’s focus away from diagnosing and treating the patient’s chief complaint. The same faculty member (F3) noted, “insurers have this onus that if they want to reimburse you and give you money for the visits that you have completed, they may have a series of questions or series of things to review above and beyond the chief complaints…[like] they didn’t ask about the mammography status of this lady or have a conversation about this person’s sexual orientation. See how it is. Those are really easy things to do, because you can look and see plus or minus, right. Switch on or off. But to understand the assessment and plan and give a doctor credit for being a master clinician does not exist at the insurance level…The residents shouldn’t be held under an ACO for production because that affects their ability to be master clinicians.”

Some residents found the quality measures helpful in guiding their interactions with patients (R11: “I find it really useful because it prompts the care gaps in a patient’s care”), but many were skeptical about the value of those measures (R39: “I think ultimately it does distract from the encounter”). Table 3 presents quotes from residents on their perceptions of the requirements to document quality measures, which we have categorized as helpful, neutral, and harmful.

Table 3.

Resident perceptions of quality measures.

Resident | View | Quote
R10 (PGY3) | Helpful | For me being an internal medicine physician, for me, the most important thing with technology would be through the EMR, like having pop-ups come up each time I see a patient saying, hey, you’re missing these out, you need to fill these things in, those kind of things will make a difference for a primary care physician or a hospitalist. So that’s what I feel, more and more of those, meeting your quality metrics basically.
R30 (PGY3) | Helpful | We have those quality measures and it can be really beneficial because sometimes we may not ask, you know, like a follow up or an acute visit, when was their last screening done. And so if it’s flagging those things, it’s nice to just quickly see, because maybe they had a mammogram 2 months ago and it just wasn’t in our system yet. But if they didn’t, then it’s a nice, easy way to order things, to kind of meet quality measures, make sure patients are getting all their proper screenings and everything. So, I think it’s helpful.
R37 (PGY1) | Helpful | The quality measures are important and I think that’s great. And so, yeah, overall, it’s something that gives you kind of like a background of everything you need to do. Did you get this checked off? You know, that helps a lot.
R28 (PGY1) | Neutral | They just have these yellow boxes and some administrators told our attendings that we absolutely have to click these yellow boxes. But it’s basically just redundant because I’m already doing it. But if I don’t do it their way, then it doesn’t give them a little green light in their system.
R34 (PGY3) | Harmful | There’s so many quality measures and buttons to click, I’m finding myself looking at the screen and clicking through the stuff. And then I think, probably not the best way, but before the end of the visit, I was like, okay, now I have to talk to you about what the computer wants me to talk to you about. And then we’ll go through either the depression screening or whatever screening.
R39 (PGY2) | Harmful | I think unfortunately because of the system that we’re in, we’ve become very focused on checking boxes and not treating people. And we check boxes with the intent of treating people. But I think in practice, what ends up happening is we check boxes so that quality measures are met, you know, in quotations. But it doesn’t necessarily improve the quality of care for the patient, because it may not be relevant to what they’re there for. And it may just be like a legal, protective thing. So as a whole, I mean, if I had to like pick a side on it, I actually dislike, I disapprove of the quality measures because I think that they add yet another thing that we are supposed to check off in addition to the 57 other things that we need to take care of already. It puts us in front of the computer for longer. It keeps us staying at clinic later than, you know, would be ideal. And it gives us less face to face time and talking time with the patient. It’s more computer time.
R26 (PGY2) | Harmful | If they send us a quality measure notification or care notification, then you have to do it. Even if like your patients may have more pressing issues. Because this is how we get evaluated as providers and how our clinic is evaluated. Like whether we are compliant with those quality measures. Colonoscopy, pap smear, breast cancer screening.
R32 (PGY2) | Harmful | We have a lot of metrics to go through in a short period of time. There’s a lot of things that the government…and states want for you to document on each encounter. And so, if you don’t meet these metrics, then you know, they’ll start coming after you and say, you’re getting dinged on this, this and this.
R33 (PGY2) | Harmful | I think sometimes you just feel like the job is reduced to clicking as many boxes as you can in one visit. And I think it takes away from the actual practice of medicine.

Whatever the residents’ perceptions, quality measure documentation does take time during the patient visit, which reduces the time available to address the patient’s chief complaint(s). This was noted by a PGY2 (R32): “These quality measures add up…if they come in for a 30-minute appointment, the MA could spend half the time and you might only have five or 10 minutes of that left and the patient hasn’t gotten to any of their concerns.” A faculty member (F2) also noted the effect of quality measure notifications in the EHR on residents’ time with their patients, saying, “it’s not uncommon to see these yellow boxes or red font or whatever it is telling you that you need to do something. And the challenge is that there is just not simply enough time to do all of those things.” The faculty saw this resulting in poorer outcomes for patients when their primary problem is not fully addressed, along with missed opportunities for diagnostic skill development and overall skill reduction for residents. Yet they felt constrained in how they could respond to those issues because of factors that include regulatory requirements for clinical documentation. One faculty member (F3) commented that, “GME doesn’t have any control in my mind whatsoever regarding documentation…we have no ability to delete those, reduce those, or increase those, or choose other metrics to evaluate the performance of the residents…our job is to try to bring these people up as a master clinician and it’s hard to do when there’s all this pressure about documentation.”

Resident training: self-engagement in learning to use the EHR effectively

While residency programs may be constrained by external requirements, training can be effective for addressing resident EHR use issues (eg, quality measures documentation).5,6 However, the effectiveness of any specific training modality is challenged by the fact that residents do not all enter residency with the same computer self-efficacy, EHR experience, or motivation to learn. One faculty member (F1) noted this, saying, “I think the EMR is real variable. There are people that have pretty good skill at navigating that stuff and then there’s other people who don’t. We have residents that come in and they really struggle for a long time with all aspects of it…and so they have to be trained over and over and over again.” The limited effectiveness of onboarding training and the necessity for on-the-job learning was also noted, with one faculty member (F2) saying, “they get their standard Epic user training, which is the same training I got and is frankly terrible and doesn’t really prepare you for the actual application or use of this software. So then you end up with just learning as you go, which is probably the most realistic way that we’ve trained residents.” Consequently, while the clinic conducted formal training sessions for residents when they first started at the clinic, the faculty and clinic staff had the goal of creating a culture for the residents where asking questions of others (eg, faculty mentors, fellow residents, medical assistants) was encouraged. Table 4 presents quotes from residents about the types of training they received.

Table 4.

Training opportunities and perceptions.

Resident | Quote
R31 (PGY1) | I go to the MAs or my peers. Sometimes the attendings, they know things that we don’t.
R26 (PGY2) | We did receive some training coming into residency, like a couple days into orientation. They usually throw a lot of information at you like within a few hours and I mean, you do keep some good things, but usually you kind of like learn those as you go along the way. It’s like trial and error. So honestly, getting EMR training itself, it’s kind of giving you a glance, but in my opinion, you just gotta jump into it and do it. We were mainly learning from our seniors.
R23 (PGY2) | If I mess something up, one of the MAs will tell me and I’ll kind of be like, okay, how could I have done that? I’ll start asking, so what should I have done to order this or that? And so usually, because they’re here 4 days a week, every week, they know how to do a lot better than I do. So, I usually just ask them like, hey, I don’t know how to order this. And I just kinda learn from them. It’s like asking them and then doing it myself.
R14 (PGY3) | There were some questions I asked my colleague. She was just a first-year resident, but I don’t think we have that inhibition that we only need to go to people above us. I think generally what I’ve noticed is people kind of prefer peers in their same class. That would probably be the way most of them would have preferred. But I’ve seen instances where they have gone to colleagues, like the juniors and stuff, and there were no inhibitions because everybody is learning.
R12 (PGY2) | I think initially when we came in, we just had an orientation meeting, and that was about 4 or 5 hours that we received. So the senior residents were important at that time and some attendings teach you.
R13 (PGY1) | There was no formal ladders, so to speak, but I just went up to my senior resident, like, hey, can you teach me how to do this? Hey, can you teach me how to do that? And sometimes there’s been a few residents who would always ask, hey, anything I can help you with? They would be proactive on their end before I got stuck or before all of us got stuck, they would offer to help.

One limitation the data indicated regarding the effectiveness of the training described in Table 4 is that residents must be motivated to seek out those types of training opportunities. Motivation affected, among other things, the extent to which residents sought help when they did not know how to accomplish a task with the EHR (eg, placing a particular type of order) or how to use the EHR more efficiently (eg, use of templates for note writing). In response to a question about resident motivation to seek guidance on using technology more effectively, one faculty member (F1) noted that “some people are more inclined to ask for help than others, you know, the way they’re wired.” She elaborated, saying, “You’d love to think every resident out there is like, I’m going to take care of every patient like it’s my mother or my father…[but] others don’t have that same drive or that professionalism or that sense of I’m owning this problem and this patient.”

DISCUSSION

Medical residency programs must prepare residents to be master clinicians, and the ubiquity of EHRs in medical settings has made training residents to effectively incorporate the EHR into their work an essential component of that process.4–8 Through our analysis, we identified 3 key themes for resident EHR use: (1) patient encounters: the role of the EHR in the development of residents’ clinical skills; (2) quality measures: helpful encounter guide or distraction from what’s important; and (3) resident training: self-engagement in learning to use the EHR effectively. We discuss below some suggestions for resident training and for easing the documentation burden on residents.

Regarding resident training, studies have found that providing enhanced EHR training to residents can improve their efficiency in using EHRs,34,35 which can enable residents to spend more time developing their clinical knowledge and improving the quality of their patient care.34 However, those efficiency gains may be offset by the increasing documentation requirements from payers and regulators, which impose significant time and effort costs on residents.23 Consequently, EHR training provided by residency programs must develop both general EHR use skills and documentation skills. Research on effective EHR training practices for residents remains relatively sparse; a recent systematic review of the literature found just 4 studies on training residents to properly engage with EHRs.6 One challenge noted in the review was balancing competing demands on time when providing specific learning events and experiential opportunities.6

In our study, we found that learning effective EHR use combines planned interventions on the part of residency programs with personal initiative to learn on the part of residents. Specifically, while onboarding training can provide a baseline of knowledge, ongoing opportunities for learning are essential to effective clinical skills development; such opportunities include formalized mentoring relationships between senior and junior residents and a culture where residents feel comfortable asking questions of any clinic personnel. For example, both the MAs and residents in our study noted that MAs have direct visibility into certain challenges residents experience in navigating and using the EHR to complete documentation requirements. Clinic rotations in residency programs should therefore foster closer relationships between MAs and residents so that MAs can guide and train residents as these issues arise. Encouraging residents to engage with each other to establish learning communities and share knowledge on best practices for EHR use can also lead to both skill gains and increased personal initiative to learn and to share that learning with peers. These and other training tactics can help address the EHR use issues that impede residents’ clinical skills development.

Regarding easing documentation burden, with most US hospitals and physicians’ offices using EHRs to manage patient data1 and payers moving to performance- and outcome-based reimbursement models, documentation pressures on physicians have increased substantially.36 For residents, the result is reduced time for learning and clinical skills development, less robust patient engagement, and an extended workday, potentially leading to frustration and burnout. Given that EHR systems are here to stay, residency programs need actionable solutions to address these challenges.

One option would be to empower and incentivize patients to take a more active role in their care. We found that residents in our study often felt pressured by alerts in the EHR to address and document quality measures during patient visits for acute problems. Quality measures should ideally be addressed during annual wellness visits, so prompting and incentivizing patients to schedule and keep annual wellness visits would enable residents to keep other visits focused on the focal problem(s) for those visits. That would have the added benefit of giving residents time to develop personalized preventive care plans for more of their patients,37 which could improve health outcomes for those patients and further develop the residents’ clinical skills. Enabling and prompting patients to provide information for quality measures documentation ahead of visits would be another way to shift some documentation burden from residents while empowering patients to take more ownership of their care. For that to work, patients would need clear guidance on the value of their documentation efforts and on how to complete the documentation, or there is a risk they would feel exploited rather than empowered.38 Patient-facing tools like MyChart could be used for this purpose: the patient could be sent a pre-visit questionnaire that includes quality measures questions, preceded by a video explaining both how to complete the documentation and why that activity has value, to improve engagement.

Medical assistants can also support residents in addressing certain documentation requirements (eg, quality measures) during patient encounters. In our study, for example, some MAs, when rooming the patient, would ask the patient to complete the depression screening questions on the exam room computer before notifying the resident that the patient was ready. MAs could also prepare patients for the questions the resident would be asking by explaining what the questions would be and why answering them was important. This would additionally provide an opportunity for MAs to reinforce the reasoning behind scheduling an annual visit, so that quality measures questions would not take up time during an acute visit.

This study has a few limitations. First, the findings are based on data collected at a single point in time. Consequently, our insights regarding the impact of time on resident skill development are limited to retrospective accounts from our participants. Future studies could use a longitudinal design to address this limitation. Second, the study included only one rotational setting within one internal medicine residency program, which may limit generalizability to the extent that documentation requirements vary from program to program and setting to setting. Third, interviews were the primary data collection method, so the responses could have been biased by the interview script.

CONCLUSION

Our study shows that residency programs need to consider multiple avenues for training and supporting residents in their use of EHRs as they prepare to become practicing physicians. In particular, effective use of EHRs in patient encounters is important both for enabling residents to make good decisions about the care their patients require and for developing the clinical skills that will make them master clinicians. Since EHR use and increased documentation requirements are ubiquitous in healthcare, residency programs must find ways to effectively support their residents in learning to incorporate EHRs into their work and to streamline documentation requirements so as to maximize the development of residents’ clinical skills.

Supplementary Material

ocad158_Supplementary_Data

Contributor Information

Chad Anderson, Department of Information Systems & Analytics, Miami University, Oxford, OH 45056, United States.

Mala Kaul, Department of Information Systems, University of Nevada Reno, Reno, NV 89557, United States.

Nageshwara Gullapalli, Department of Internal Medicine, University of Nevada Reno School of Medicine, Reno, NV 89557, United States.

Sujatha Pitani, Department of Internal Medicine, University of Nevada Reno School of Medicine, Reno, NV 89557, United States.

AUTHOR CONTRIBUTIONS

All authors contributed to the study conception, design, and data collection. C.A. was responsible for the data analysis and C.A. and M.K. were responsible for manuscript drafting. All authors agreed to be accountable for all aspects of the work.

SUPPLEMENTARY MATERIAL

Supplementary material is available at Journal of the American Medical Informatics Association online.

FUNDING

This study was not supported by external funding sources.

CONFLICTS OF INTEREST

None declared.

DATA AVAILABILITY

The data underlying this article will be shared on reasonable request to the corresponding author.

ETHICS APPROVAL

This study was approved by the University of Nevada Reno IRB.

REFERENCES

1. ONC. National Trends in Hospital and Physician Adoption of Electronic Health Records. Office of the National Coordinator for Health Information Technology; 2021.
2. Khairat S, Burke G, Archambault H, Schwartz T, Larson J, Ratwani RM. Focus section on health IT usability: perceived burden of EHRs on physicians at different stages of their career. Appl Clin Inform. 2018;9(2):336-347.
3. Holmgren AJ, Lindeman B, Ford EW. Resident physician experience and duration of electronic health record use. Appl Clin Inform. 2021;12(4):721-728.
4. Lanier C, Dao MD, Hudelson P, Cerutti B, Perron NJ. Learning to use electronic health records: can we stay patient-centered? A pre-post intervention study with family medicine residents. BMC Fam Pract. 2017;18(1):69.
5. Kim JG, Rodriguez HP, Estlin KA, Morris CG. Impact of longitudinal electronic health record training for residents preparing for practice in patient-centered medical homes. Perm J. 2017;21(3):16-122.
6. Rajaram A, Hickey Z, Patel N, Newbigging J, Wolfrom B. Training medical students and residents in the use of electronic health records: a systematic review of the literature. J Am Med Inform Assoc. 2020;27(1):175-180.
7. Chi J, Bentley J, Kugler J, Chen JH. How are medical students using the electronic health record (EHR)? An analysis of EHR use on an inpatient medicine rotation. PLoS One. 2019;14(8):e0221300.
8. Wenger N, Méan M, Castioni J, Marques-Vidal P, Waeber G, Garnier A. Allocation of internal medicine resident time in a Swiss hospital: a time and motion study of day and evening shifts. Ann Intern Med. 2017;166(8):579-586.
9. Zhang J, Chen Y, Ashfaq S, et al. Strategizing EHR use to achieve patient-centered care in exam rooms: a qualitative study on primary care providers. J Am Med Inform Assoc. 2016;23(1):137-143.
10. Duke P, Frankel RM, Reis S. How to integrate the electronic health record and patient-centered communication into the medical visit: a skills-based approach. Teach Learn Med. 2013;25(4):358-365.
11. Crampton NH, Reis S, Shachak A. Computers in the clinical encounter: a scoping review and thematic analysis. J Am Med Inform Assoc. 2016;23(3):654-665.
12. Asan O, Kushner K, Montague E. Exploring residents’ interactions with electronic health records in primary care encounters. Fam Med. 2015;47(9):722.
13. Campbell EM, Sittig DF, Ash JS, Guappone KP, Dykstra RH. Types of unintended consequences related to computerized provider order entry. J Am Med Inform Assoc. 2006;13(5):547-556.
14. Silverman J, Kinnersley P. Doctors’ non-verbal behaviour in consultations: look at the patient before you look at the computer. Br J Gen Pract. 2010;60(571):76-78.
15. Hier DB, Rothschild A, LeMaistre A, Keeler J. Differing faculty and housestaff acceptance of an electronic health record. Int J Med Inform. 2005;74(7-8):657-662.
16. Embi PJ, Yackel TR, Logan JR, Bowen JL, Cooney TG, Gorman PN. Impacts of computerized physician documentation in a teaching hospital: perceptions of faculty and resident physicians. J Am Med Inform Assoc. 2004;11(4):300-309.
17. Aaronson JW, Murphy-Cullen CL, Chop WM, Frey RD. Electronic medical records: the family practice resident perspective. Fam Med. 2001;33(2):128-132.
18. Aylor M, Campbell EM, Winter C, Phillipi CA. Resident notes in an electronic health record: a mixed-methods study using a standardized intervention with qualitative analysis. Clin Pediatr (Phila). 2017;56(3):257-262.
19. Caceres JW, DiCorcia MJ. The impact of technology on the development of core entrustable professional activities (EPAs). Med Sci Educ. 2018;28(1):247-249.
20. Tutty MA, Carlasare LE, Lloyd S, Sinsky CA. The complex case of EHRs: examining the factors impacting the EHR user experience. J Am Med Inform Assoc. 2019;26(7):673-677.
21. Friedberg MW, Chen PG, Van Busum KR, et al. Factors affecting physician professional satisfaction and their implications for patient care, health systems, and health policy. RAND Health Q. 2014;3(4):1.
22. Downing NL, Bates DW, Longhurst CA. Physician burnout in the electronic health record era: are we ignoring the real cause? Ann Intern Med. 2018;169(1):50-51.
23. Christino MA, Matson AP, Fischer SA, Reinert SE, DiGiovanni CW, Fadale PD. Paperwork versus patient care: a nationwide survey of residents’ perceptions of clinical documentation requirements and patient care. J Grad Med Educ. 2013;5(4):600-604.
24. Hunt LM, Bell HS, Baker AM, Howard HA. Electronic health records and the disappearing patient. Med Anthropol Q. 2017;31(3):403-421.
25. Kuhn T, Basch P, Barr M, Yackel T; Medical Informatics Committee of the American College of Physicians. Clinical documentation in the 21st century: executive summary of a policy position paper from the American College of Physicians. Ann Intern Med. 2015;162(4):301-303.
26. Wright AA, Katz IT. Beyond burnout—redesigning care to restore meaning and sanity for physicians. N Engl J Med. 2018;378(4):309-311.
27. Casalino LP, Gans D, Weber R, et al. US physician practices spend more than $15.4 billion annually to report quality measures. Health Aff (Millwood). 2016;35(3):401-406.
28. Burstin H, Leatherman S, Goldmann D. The evolution of healthcare quality measurement in the United States. J Intern Med. 2016;279(2):154-159.
29. Panzer RJ, Gitomer RS, Greene WH, Webster PR, Landry KR, Riccobono CA. Increasing demands for quality measurement. JAMA. 2013;310(18):1971-1980.
30. Mason J. Qualitative Researching. 2nd ed. SAGE Publications; 2002.
31. Ancker JS, Benda NC, Reddy M, Unertl KM, Veinot T. Guidance for publishing qualitative research in informatics. J Am Med Inform Assoc. 2021;28(12):2743-2748.
32. Morse JM. The significance of saturation. Qual Health Res. 1995;5(2):147-149.
33. Miles MB, Huberman AM. Qualitative Data Analysis. SAGE Publications; 1994.
34. Stroup K, Sanders B, Bernstein B, Scherzer L, Pachter LM. A new EHR training curriculum and assessment for pediatric residents. Appl Clin Inform. 2017;8(4):994-1002.
35. Vuk J, Anders ME, Mercado CC, Kennedy RL, Casella J, Steelman SC. Impact of simulation training on self-efficacy of outpatient health care providers to use electronic health records. Int J Med Inform. 2015;84(6):423-429.
36. Adler-Milstein J, Zhao W, Willard-Grace R, Knox M, Grumbach K. Electronic health records and burnout: time spent on the electronic health record after hours and message volume associated with exhaustion but not with cynicism among primary care clinicians. J Am Med Inform Assoc. 2020;27(4):531-538.
37. Bluestein D, Diduk-Smith R, Jordan L, Persaud K, Hughes T. Medicare annual wellness visits: how to get patients and physicians on board. Fam Pract Manag. 2017;24(2):12-16.
38. Essén A, Värlander SW, Liljedal KT. Co-production in chronic care: exploitation and empowerment. Eur J Mark. 2016;50(5/6):724-751.
