Abstract
Objective: We describe how electronic health record (EHR) audit files can be used to understand how time is spent in primary care (PC).
Materials/methods: We used audit file data from the Geisinger Clinic to quantify elements of the clinical workflow and to determine how these times vary by patient and encounter factors. We randomly selected audit file records representing 36 437 PC encounters across 26 clinic locations. Audit file data were used to estimate duration and variance of: (1) time in the waiting room, (2) nurse time with the patient, (3) time in the exam room without a nurse or physician, and (4) physician time with the patient. Multivariate modeling was used to test for differences by patient and by encounter features.
Results: On average, a PC encounter took 54.6 minutes, with 5 minutes of nurse time, 15.5 minutes of physician time, and the remaining 62% of the time spent waiting to see a clinician or check out. Older age, female sex, and chronic disease were associated with longer wait times and longer time with clinicians. Level of service and numbers of medications, procedures, and lab orders were associated with longer time with clinicians. Late check-in and same-day visits were associated with shorter wait time and clinician time.
Conclusions: This study provides insights on uses of audit file data for workflow analysis during PC encounters.
Discussion: Scalable ways to quantify clinical encounter workflow elements may provide the means to develop more efficient approaches to care and improve the patient experience.
Keywords: workflow, electronic health records, primary health care
BACKGROUND
Ambulatory care is expected to undergo substantial changes in the next decade. These changes are motivated by reduced reimbursements, the shift to accountable care models, a declining number of primary care physicians (PCPs), an aging population, and rapid growth in the number of insured patients, among other pressures.1,2 PCP panel sizes are expected to increase substantially, with an expectation that patient experience will also improve.3,4 One approach to achieving these ambitious changes is to understand how time is used, including the patient’s time, and to identify more efficient ways to address patient needs and organize health services.
The rapid adoption of electronic health records (EHRs) has placed considerable focus on the development of alerts, bundled protocols, and other health information technology tools to improve the quality of care under these increasing pressures.5 The Health Information Technology for Economic and Clinical Health (HITECH) Act triggered a rapid increase in EHR adoption, and EHR vendors are now beginning to make electronic health applications available to health systems.5 Improvements in the development, implementation, and use of health information technology in clinical workflows strongly depend on the ability to learn rapidly.6 Approaches to workflow analysis in service businesses, including health care, are dominated by time- and resource-intensive methods that depend on ethnographic research and in-depth interviewing.7,8 These methods are not scalable for use in health care, where service workflows will be transformed through accelerating growth in information technology solutions, data density and diversity, and other factors.
The demand to keep pace by rapidly learning what works best in health care will require scalable and automated approaches to measuring workflow elements. While data to support workflow analysis are widely available, only a minority of time and motion studies leverage these automatically computer-generated data.8 In part, the challenge is to identify a source of automated data that both facilitates rapid evaluation and is comprehensive enough to account for the key elements of the clinical workflow. Attributes of the Workflow Elements Model include the actors, the actions of the actors, the timing of the actions, how the actions interact, and the characteristics of these actions within the context of workspace and organizational factors.7
In health care, a wealth of time-stamped data are automatically generated in the EHR audit file to track user activity in the EHR and document all EHR-related transactions. Used for data security and privacy management, the audit file is also a valuable resource to understand how time is used in clinical practice. For every clinical encounter, the audit file tracks the actors (eg, doctors, nurses, patients), their actions (eg, medication orders, check-in), and the sequencing and duration of these actions. The audit file can be used to study workflow and to evaluate efforts to increase efficiency in less time and with fewer resources than are typically required with other approaches to workflow analysis.7 Previous studies have used audit files to measure clinical documentation time, detect patterns of EHR use, and monitor user activity.9–13 We describe another method of utilizing the audit file to understand clinical workflows and patient experience.
METHODS
Audit file data from the Geisinger Clinic (GC) EHR were used to understand the workflow of office visits in PC practice clinics. We used retrospective audit file data to quantify workflow elements7 from the time of patient check-in to check-out relevant to key actors (ie, patient, nurse, physician), actions, and timing. Specifically we measured: (1) who was involved in the ambulatory care of PC patients, (2) how much time each clinician spent with patients, and (3) how much time patients spent waiting to see clinicians. We linked audit data with EHR data to determine how these times varied by patient and encounter factors. This study was approved by the Geisinger Institutional Review Board.
Setting
GC serves over 400 000 primary care patients in more than 40 counties across Pennsylvania and includes 30 family practice clinics. We used the data from GC’s EpicCare EHR audit file for this study.
Source of data and variable definition
Data were merged from the Cadence Enterprise Scheduling system and the Epic EHR audit log file using the unique serial number associated with each encounter. Cadence is an Epic-integrated application used for outpatient scheduling. Cadence data files include, among other items, appointment times and check-in and check-out times. The Epic audit log file maintains a time-stamped record of each transaction that occurs when a patient record is in use. In a typical encounter workflow using an EHR, a clinician views a module in the record (eg, vital signs), accepts a module in the record (eg, saves changes to vital signs), and then exits the module (eg, exits the encounter). The audit log tracks any user who logs in to a patient record, each of the user’s actions (eg, “view,” “accept,” “cancel,” “print,” or “exit”) during an encounter, and the Epic modules in which the actions were taken (eg, encounter, medication lists, orders, etc.).
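To make this linkage concrete, the following minimal sketch (Python/pandas) illustrates how time-stamped audit transactions might be joined to Cadence check-in and check-out times on the encounter serial number. The file names, column names, and role labels are illustrative assumptions, not the actual Epic or Cadence schema.

```python
import pandas as pd

# Hypothetical illustration: link Cadence scheduling records to audit-log
# transactions on the encounter serial number. File and column names are
# assumptions, not the actual Epic/Cadence schema.
cadence = pd.read_csv("cadence_appointments.csv",
                      parse_dates=["appt_time", "check_in", "check_out"])
audit = pd.read_csv("epic_audit_log.csv", parse_dates=["timestamp"])

# Each audit row records who acted (and in what role), what they did,
# and which module of the chart they touched.
audit = audit[["encounter_id", "user_id", "user_role",  # eg, nurse, physician
               "action",                                 # eg, view, accept, exit
               "module",                                 # eg, encounter, orders
               "timestamp"]]

# Keep only transactions that fall between check-in and check-out.
merged = audit.merge(cadence, on="encounter_id", how="inner")
merged = merged[(merged.timestamp >= merged.check_in) &
                (merged.timestamp <= merged.check_out)]
```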
Selection of primary care encounters
Given the substantial volume of audit file data, we selected a stratified random sample from 1 212 446 PC encounters that occurred in GC PC practice clinics between January 1, 2009, and June 30, 2011. Encounter records were selected if there was at least 1 action recorded in the audit file between check-in and check-out. To reduce the processing time and computational memory required for analysis of the audit file, we then selected a simple random sample of 30 individual dates from within the study period. This yielded a 3% random sample of 36 437 encounters across the 26 family practice clinics active during the study period.
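The date-based sampling can be sketched in the same hypothetical terms; the seed, the date range handling, and the column names below are assumptions carried over from the illustration above, not the production selection code.

```python
import numpy as np
import pandas as pd

# Minimal sketch of the date-based sampling described above, continuing from
# the hypothetical `merged` frame; names and thresholds are illustrative.
rng = np.random.default_rng(0)

# Encounters are eligible only if at least one audit action falls between
# check-in and check-out (merged was already restricted to that window).
eligible_ids = merged["encounter_id"].unique()

# Draw a simple random sample of 30 calendar dates from the study period and
# keep every eligible encounter whose check-in occurred on one of those dates.
study_dates = pd.date_range("2009-01-01", "2011-06-30", freq="D")
sampled_dates = rng.choice(study_dates.to_numpy(), size=30, replace=False)
sample = merged[merged["encounter_id"].isin(eligible_ids) &
                merged["check_in"].dt.normalize().isin(sampled_dates)]
```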
Estimating duration of primary care encounter activity
The duration of PC encounters was calculated from sequential audit file records, and the workflow was defined by the order and number of physicians, nurses, and staff members who interacted with the patient. A total of 26 distinct workflows were identified, but 78% of the encounters followed the same workflow that involved a single nurse interaction followed by a single physician interaction. After excluding atypical encounters (eg, no-shows, nursing facility care) and encounters with data errors (eg, extreme outliers), we identified 22 523 encounters with 1 nurse and 1 doctor for 20 905 unique patients. We provide summary statistics on all workflows (Table 1) but confined the detailed analysis to the most common workflow (Figure 1).
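As an illustration of how workflows can be derived from ordered audit records, the sketch below collapses consecutive actions by the same user into a single interaction and summarizes each encounter as an ordered tuple of roles; it continues the hypothetical frames above, and all column names are assumptions.

```python
# Sketch: derive a workflow "signature" per encounter from the ordered sequence
# of user roles in the audit records (column names are assumptions).
def workflow_signature(encounter_rows):
    """Collapse consecutive actions by the same user into one interaction and
    return the ordered tuple of roles, eg ('nurse', 'physician')."""
    rows = encounter_rows.sort_values("timestamp")
    roles, last_user = [], None
    for _, row in rows.iterrows():
        if row.user_id != last_user:
            roles.append(row.user_role)
            last_user = row.user_id
    return tuple(roles)

signatures = sample.groupby("encounter_id").apply(workflow_signature)
# In this study, ('nurse', 'physician') was the dominant pattern (78% of encounters).
print(signatures.value_counts().head())
```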
Table 1.
Distribution of clinicians involved in a random sample of 36 437 primary care encounters and the level of service (LOS) of each encounter
| No. of actors | No. of nurses | No. of physicians | Encounters, N (%) | LOS 1–2 (%) | LOS 3–4 (%) | LOS 5 (%) | New (%) | Return (%) | Missing (%) | Row total (%) |
|---|---|---|---|---|---|---|---|---|---|---|
| 2 | 1 | 1 | 28 411 (78.0) | 1.9 | 85.0 | 1.7 | 1.1 | 6.7 | 3.5 | 100 |
| 3 | 2 | 1 | 5756 (15.8) | 1.8 | 83.8 | 2.2 | 1.4 | 8.3 | 2.6 | 100 |
| 3 | 1 | 2 | 598 (1.6) | 2.5 | 86.8 | 1.8 | 1.5 | 4.7 | 2.7 | 100 |
| 4 | 3 | 1 | 825 (2.3) | 2.1 | 80.6 | 3.5 | 1.7 | 7.8 | 4.4 | 100 |
| Nurse-only encountersa | | | 214 (0.6) | 8.4 | 61.2 | 0.9 | 0.9 | 5.1 | 23.4 | 100 |
| All other actor combinations | | | 633 (1.7) | 3.0 | 82.9 | 3.6 | 1.1 | 5.4 | 3.9 | 100 |
| Total | | | 36 437 (100) | 2.0 | 84.6 | 1.8 | 1.2 | 6.9 | 3.5 | 100 |
aNurse-only encounters: 166 encounters with 1 nurse, 41 encounters with 2 nurses, 6 encounters with 3 nurses, 1 encounter with 5 nurses.
Figure 1.
Common workflow representing 78% of workflows identified using the audit file data and confirmed during interviews with clinical and office staff working in primary care.
The common workflow was divided into 4 distinct phases: (1) waiting room time, defined by time of clinic check-in to nurse entry into the exam room; (2) nurse time with patient, defined by nurse entry to nurse exit from exam room; (3) wait time in exam room, defined by nurse exit to physician entry into the exam room; and (4) physician time with patient, defined by physician entry to physician exit from the exam room. Physician time included time with either a physician or physician assistant, and subsequent references to “physicians” include both physicians and physician assistants. Interviews with clinical and office staff confirmed that these audit file data reflected the most typical workflow pattern in primary care.
For this analysis, the total duration of the encounter was defined by the difference between the Cadence check-in and check-out times. Audit records were then used to define when a nurse or physician was with the patient. Because the audit file in the Geisinger system does not distinguish between activities with the patient that take place inside vs outside the exam room, we defined physician activity after check-in, but before the nurse put the patient in a room, as time logged in to the patient record outside the exam room. For the purposes of describing exam room–only activity, we excluded from analysis these outside-the-exam-room audit transactions. In addition, Geisinger uses a “secure and stay” function that allows a user to exit the EHR. When a nurse uses this function, the audit file does not record the nurse exiting the encounter until the physician logs in to the secured/stayed screen, resulting in matching time stamps for the nurse “exit encounter” and the physician “view encounter.” When this occurs, the duration of time that the nurse spent with the patient is overestimated. Therefore, in the event that these time stamps matched, we used the time stamp of the last nurse action prior to the matched time stamp as the nurse exit time.
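The sketch below shows one way the 4 interval durations and the secure-and-stay correction could be computed for the common 1-nurse/1-physician workflow; the field names and per-encounter data frame are illustrative assumptions rather than Geisinger's actual audit file structure.

```python
# Illustrative computation of the four interval durations for the common
# 1-nurse / 1-physician workflow, including the secure-and-stay correction
# described above; field names and structure are assumptions.
def encounter_intervals(enc):
    nurse = enc[enc.user_role == "nurse"].sort_values("timestamp")
    doc = enc[enc.user_role == "physician"].sort_values("timestamp")

    nurse_entry, nurse_exit = nurse.timestamp.iloc[0], nurse.timestamp.iloc[-1]
    doc_entry, doc_exit = doc.timestamp.iloc[0], doc.timestamp.iloc[-1]

    # Secure and stay: the nurse's exit is not logged until the physician opens
    # the secured screen, so matching time stamps would overstate nurse time.
    # Fall back to the nurse's last action before the matched time stamp.
    if nurse_exit == doc_entry and len(nurse) > 1:
        nurse_exit = nurse.timestamp.iloc[-2]

    check_in, check_out = enc.check_in.iloc[0], enc.check_out.iloc[0]
    minutes = lambda delta: delta.total_seconds() / 60.0
    return {
        "waiting_room": minutes(nurse_entry - check_in),
        "nurse_time": minutes(nurse_exit - nurse_entry),
        "exam_room_wait": minutes(doc_entry - nurse_exit),
        "physician_time": minutes(doc_exit - doc_entry),
        "total": minutes(check_out - check_in),
    }

# Example: intervals = sample.groupby("encounter_id").apply(encounter_intervals)
```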
Patient-level, encounter-level, and clinic-level predictors of encounter times
We used EHR data to determine characteristics of the patients and encounters included in our analysis. Patient factors included age, sex, and race. Encounter factors included the level of service, time of day of encounter, whether the encounter was a same-day encounter, and whether the patient checked in after the designated appointment time. We determined the numbers of procedures, lab orders, and medication orders associated with the encounter, and whether or not the patient had evidence of 1 of the following chronic conditions in his or her medical record: congestive heart failure, cardiovascular disease, diabetes mellitus, or hypertension.
Analysis
A random effect linear model was fit to log-transformed temporal measures of the most common workflow, 1 nurse followed by 1 doctor. Encounters were nested within clinic sites, with a random effect for each site. Analyses were completed to determine whether variance in temporal measures was related to patient and encounter features. Encounter intervals were included in the analysis if they occurred between check-in and check-out, were positive time intervals, and did not exceed 6 hours. The top and bottom 1% of encounters for time spent with the physician, check-in to nurse entry time, and nurse exit to provider entry time were removed, as was the bottom 1% of time spent with the nurse, because these extreme outliers represented implausible time intervals.
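A minimal sketch of this modeling step, shown for physician time only and written with statsmodels, is given below; the frame name, its columns, and the covariate list are assumptions meant to illustrate the approach (log-transformed outcome, random clinic intercept, exclusion rules, 1%-tail trimming), not the authors' exact specification.

```python
import numpy as np
import statsmodels.formula.api as smf

# Minimal sketch of the random-effects analysis, shown for physician time only.
# The frame `intervals`, its columns, and the covariates are assumptions.
df = intervals.copy()

# Keep plausible intervals: positive and no longer than 6 hours, then trim the
# extreme 1% tails as described above.
df = df[(df.physician_time > 0) & (df.physician_time <= 360)]
lo, hi = df.physician_time.quantile([0.01, 0.99])
df = df[df.physician_time.between(lo, hi)]

# Fit a linear mixed model to the log-transformed outcome with a random
# intercept for clinic site (encounters nested within clinics).
df["log_physician_time"] = np.log(df["physician_time"])
model = smf.mixedlm(
    "log_physician_time ~ age_group + sex + chronic_disease + level_of_service"
    " + n_procedures + n_labs + n_meds + same_day + late_check_in",
    data=df, groups=df["clinic_site"])
print(model.fit().summary())
```

In practice, the same pattern would be repeated for each of the other interval outcomes.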
RESULTS
Approximately 94% of all encounters occurred with either a single nurse and physician (78%) or 2 nurses and a single physician (15.8%). Four encounter types accounted for 97.7% of all encounters; the remaining 2.3% of encounters involved combinations of up to 8 nurses and physicians (Table 1). More than 80% of the encounters were for a level of service of 3–4. The distribution of levels of service did not differ among the 4 most common encounter types (Table 1).
For these encounters, an adult office visit took, on average, 54.6 minutes, and a majority of that time was spent in either the waiting room or the exam room without the nurse or doctor. For the remainder of the paper, these intervals are referred to as wait times. Time spent with the nurse, an average of 5.0 minutes, was relatively short compared to the average of 15.5 minutes spent with the physician (Table 2). Patients spent an average of 45% of the encounter time waiting, with 13.5 minutes in the waiting room and 11.2 minutes in the exam room. Time spent waiting was highly skewed, with the upper end of the interquartile range approximately twice the median (Table 2). Interval times varied considerably by patient (Table 3) and encounter features (Table 4). Patients spent an average of 9.4 minutes preparing for and checking out of the encounter. While a portion of this last interval may include waiting (eg, waiting to make a co-payment and check out), it also includes activities such as getting dressed, scheduling a follow-up appointment, and completing lab work. Given the range of typical activities during this last interval of the visit, we excluded it from the additional analysis described below.
Table 2.
Duration (in minutes) of primary care encounter intervals among a random sample of 22 523a encounters in 26 clinics, Geisinger, January 1, 2009 to June 30, 2011
| Encounter Interval | Mean (SD) | Median (IQR) |
|---|---|---|
| Check-in to nurse entrance | 13.5 (11.8) | 9.4 (5.2–18.0) |
| Nurse entrance to nurse exit | 5.0 (3.9) | 4.1 (2.7–6.2) |
| Nurse exit to provider entrance | 11.2 (9.1) | 8.7 (3.9–16.2) |
| Provider entrance to provider exit | 15.5 (9.7) | 13.3 (8.5–20.1) |
| Provider exit to patient check-out | 9.4 (26.5) | 3.5 (1.6–7.5) |
| Check-in to check-out | 54.6 (33.6) | 48.0 (35.0–66.0) |
aEncounters with a workflow of 1 nurse followed by 1 doctor.
Table 3.
Association of patient characteristics with mean and median primary care encounter interval times (in minutes) among a random sample of 22 523a encounters in 26 clinics, Geisinger, 2009–2011
| Variable | Encounters, % (N = 22 523) | Waiting room wait, mean | Waiting room wait, median | Waiting room wait, upper quartile | Nurse time, mean | Nurse time, median | Nurse time, upper quartile | Exam room wait, mean | Exam room wait, median | Exam room wait, upper quartile | Physician time, mean | Physician time, median | Physician time, upper quartile |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Patient age (years) | | | | | | | | | | | | | |
| ≤18 | 13.6 | 12.4* | 8.7 | 16.0 | 4.7* | 3.9 | 5.6 | 10.1 | 7.9 | 14.7 | 13.0* | 10.6 | 16.8 |
| 19–39 r | 20.6 | 12.0† | 8.4 | 15.4 | 4.5† | 3.4 | 5.5 | 10.4† | 7.9 | 15.2 | 14.4† | 12.1 | 18.5 |
| 40–59 | 30.8 | 13.2* | 8.9 | 17.5 | 4.9 | 4.1 | 6.1 | 11.0 | 8.5 | 15.8 | 16.2* | 13.9 | 21.0 |
| 60–79 | 26.6 | 14.9* | 10.6 | 20.3 | 5.4* | 4.5 | 6.8 | 12.1* | 9.7 | 17.5 | 16.7* | 14.7 | 21.4 |
| ≥80 | 8.4 | 15.9* | 11.6 | 21.6 | 5.6* | 4.6 | 7.0 | 12.3* | 10.1 | 17.9 | 16.0* | 14.2 | 20.4 |
| Sex | | | | | | | | | | | | | |
| Female | 57.7 | 13.6* | 9.4 | 18.1 | 5.1* | 4.2 | 6.3 | 11.2 | 8.7 | 16.3 | 15.7* | 13.6 | 20.5 |
| Male r | 42.3 | 13.4 | 9.3 | 17.9 | 4.9 | 4.0 | 6.0 | 11.1 | 8.7 | 16.0 | 15.2 | 13.0 | 19.7 |
| Race | | | | | | | | | | | | | |
| White | 97.1 | 13.5* | 9.3 | 17.9 | 5.0 | 4.1 | 6.2 | 11.2 | 8.7 | 16.3 | 15.5 | 13.3 | 20.1 |
| Non-White r | 2.9 | 13.9 | 10.4 | 18.7 | 4.9 | 4.1 | 5.8 | 9.6 | 7.1 | 13.7 | 15.2 | 13.2 | 19.6 |
| Disease status | | | | | | | | | | | | | |
| CHF, yes | 4.1 | 15.6 | 10.8 | 20.5 | 5.5* | 4.6 | 7.0 | 12.4 | 10.2 | 18.1 | 17.4* | 16.0 | 21.6 |
| CHF, no | 95.9 | 13.4 | 9.3 | 17.8 | 5.0 | 4.1 | 6.1 | 11.1 | 8.6 | 16.1 | 15.4 | 13.2 | 20.0 |
| CVD, yes | 6.3 | 15.9 | 11.4 | 21.1 | 5.5* | 4.7 | 6.9 | 11.9 | 9.5 | 17.2 | 17.0* | 15.0 | 21.6 |
| CVD, no | 93.7 | 13.4* | 9.2 | 17.8 | 5.0 | 4.1 | 6.1 | 11.1 | 8.6 | 16.1 | 15.4 | 13.2 | 20.0 |
| DM, yes | 16.2 | 14.8* | 10.7 | 20.2 | 5.9* | 4.9 | 7.4 | 11.7 | 9.2 | 17.0 | 16.6 | 14.8 | 21.3 |
| DM, no | 83.8 | 13.3 | 9.1 | 17.5 | 4.8 | 4.0 | 5.9 | 11.1 | 8.6 | 16.0 | 15.3 | 13.1 | 19.9 |
| HTN, yes | 38.9 | 14.5 | 10.3 | 19.7 | 5.4* | 4.5 | 6.7 | 11.8 | 9.5 | 17.1 | 16.3 | 14.4 | 21.0 |
| HTN, no | 61.1 | 12.9 | 8.8 | 16.9 | 4.8 | 3.9 | 5.8 | 10.7 | 8.2 | 15.6 | 15.0 | 12.6 | 19.5 |
| Late check-in | | | | | | | | | | | | | |
| Yes | 18.8 | 9.7* | 6.7 | 12.0 | 4.8* | 4.0 | 5.9 | 9.5* | 7.0 | 13.8 | 15.4* | 13.1 | 19.9 |
| No | 81.2 | 14.4 | 10.1 | 19.4 | 5.1 | 4.2 | 6.2 | 11.5 | 9.2 | 16.7 | 15.5 | 13.4 | 20.2 |
| Same-day visit | | | | | | | | | | | | | |
| Yes | 39.3 | 12.5* | 8.7 | 16.2 | 4.6* | 3.8 | 5.6 | 10.4* | 7.9 | 14.8 | 13.5* | 11.0 | 17.4 |
| No | 60.7 | 14.2 | 9.8 | 19.2 | 5.3 | 4.3 | 6.5 | 11.7 | 9.3 | 17.0 | 16.8 | 14.9 | 21.5 |
aEncounters with a workflow of 1 nurse followed by 1 doctor.
†P < .05 test for trend.
*P < .05 using a multilevel model adjusting for all other variables in the table.
r: reference group.
Table 4.
Association of encounter characteristics with mean and median primary care encounter interval times (in minutes) among a random sample of 22 523a encounters in 26 clinics, Geisinger, 2009–2011
| Characteristic | Encounters, % (N = 22 523) | Waiting room wait, mean | Waiting room wait, median | Waiting room wait, upper quartile | Nurse time, mean | Nurse time, median | Nurse time, upper quartile | Exam room wait, mean | Exam room wait, median | Exam room wait, upper quartile | Provider time, mean | Provider time, median | Provider time, upper quartile |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Level of service (LOS) | | | | | | | | | | | | | |
| LOS 1 r | 0.1 | 12.4† | 7.2 | 14.9 | 6.1† | 3.8 | 5.2 | 15.0† | 12.4 | 25.2 | 13.8† | 11.0 | 16.5 |
| LOS 2 | 1.8 | 12.2 | 7.3 | 15.9 | 4.6 | 3.7 | 5.5 | 9.6* | 7.2 | 14.0 | 11.1* | 8.6 | 14.9 |
| LOS 3 | 36.9 | 13.1 | 9.1 | 17.3 | 4.6 | 3.8 | 5.6 | 10.5 | 8.3 | 15.1 | 12.9 | 10.7 | 16.5 |
| LOS 4 | 49.8 | 13.9 | 9.6 | 18.5 | 5.1 | 4.3 | 6.4 | 11.8 | 9.2 | 17.1 | 16.5 | 14.6 | 21.1 |
| LOS 5 | 1.5 | 14.7 | 10.2 | 19.8 | 7.1 | 5.7 | 9.0 | 10.8* | 8.2 | 15.3 | 24.6* | 23.5 | 31.3 |
| Other | 9.8 | 14.7 | 12.3 | 20.3 | 5.3 | 4.5 | 6.6 | 11.8 | 9.9 | 16.6 | 19.3 | 17.5 | 25.4 |
| Check-in time | | | | | | | | | | | | | |
| <10 am r | 26.7 | 12.1† | 8.5 | 15.7 | 4.9† | 4.1 | 6.0 | 10.7† | 8.3 | 15.5 | 16.3† | 14.4 | 20.9 |
| 10 am–12:59 pm | 28.9 | 14.4* | 10.1 | 19.3 | 5.0 | 4.2 | 6.2 | 11.3* | 8.8 | 16.4 | 15.4* | 13.1 | 20.2 |
| 1 pm–3:59 pm | 33.3 | 14.1* | 9.7 | 19.0 | 5.1* | 4.2 | 6.4 | 11.5* | 9.1 | 16.8 | 15.3* | 13.1 | 19.9 |
| After 4 pm | 11.1 | 12.9 | 8.8 | 17.1 | 4.9 | 4.0 | 6.0 | 10.7 | 8.0 | 15.6 | 14.2* | 12.0 | 18.4 |
| Procedures | | | | | | | | | | | | | |
| 0 r | 45.4 | 13.4 | 9.3 | 17.8 | 4.6† | 3.9 | 5.7 | 11.1† | 8.6 | 16.2 | 12.7† | 10.9 | 16.5 |
| 1 | 22.5 | 13.9 | 9.6 | 18.5 | 5.1* | 4.2 | 6.3 | 11.0* | 8.5 | 15.9 | 16.0* | 14.0 | 20.6 |
| 2 | 12.2 | 13.5 | 9.2 | 18.1 | 5.6* | 4.5 | 6.8 | 11.0 | 8.5 | 15.9 | 18.1* | 16.0 | 23.6 |
| 3 | 6.7 | 13.2 | 9.4 | 17.6 | 5.9* | 4.6 | 7.3 | 11.5 | 9.1 | 16.4 | 19.0* | 17.0 | 24.6 |
| 4 or more | 13.2 | 13.4 | 9.3 | 17.9 | 5.5* | 4.4 | 6.7 | 11.5 | 9.2 | 16.8 | 20.0* | 17.9 | 25.6 |
| Labs ordered | | | | | | | | | | | | | |
| 0 r | 79.9 | 13.6 | 9.4 | 18.0 | 5.0 | 4.1 | 6.1 | 11.1 | 8.6 | 16.1 | 14.9 | 12.8 | 19.4 |
| 1 or more | 20.1 | 13.3 | 11.5 | 17.7 | 5.2* | 4.2 | 6.4 | 11.5 | 9.1 | 16.8 | 17.8* | 15.8 | 22.9 |
| Medications ordered | | | | | | | | | | | | | |
| 0 r | 34.9 | 13.7† | 9.4 | 18.6 | 4.9† | 4.0 | 6.0 | 11.2† | 8.7 | 16.2 | 15.6† | 13.6 | 20.3 |
| 1 | 29.2 | 13.3 | 9.2 | 17.4 | 4.9* | 4.0 | 6.0 | 11.0* | 8.6 | 16.0 | 14.8* | 12.6 | 19.2 |
| 2 | 17.6 | 13.1* | 9.2 | 17.1 | 4.9* | 4.1 | 6.1 | 11.0* | 8.5 | 16.1 | 14.8 | 12.6 | 19.1 |
| 3 | 8.0 | 13.5* | 9.2 | 17.7 | 5.1* | 4.3 | 6.2 | 11.3* | 8.8 | 16.6 | 16.2* | 13.9 | 20.9 |
| 4 or more | 10.3 | 14.1 | 10.0 | 18.5 | 5.8* | 4.8 | 7.4 | 11.7 | 9.3 | 16.5 | 17.8* | 15.6 | 22.8 |
aEncounters with a workflow of 1 nurse followed by 1 doctor.
†P < .05 test for trend.
*P < .05 using a multilevel model adjusting for all other variables in the table.
r: reference group.
Patient features
Time spent by patients in the first 4 intervals of the PC encounter varied by a number of patient-level factors. Total time was greater for older patients than for younger patients and for patients with specific diseases compared to those without (Table 3). In general, wait time increased with age, and the statistically significant trend with age was steeper for waiting room time (mean of 12.0 minutes for ages 19–39 vs 15.9 minutes for ages 80+) than for exam room time (mean of 10.4 minutes for ages 19–39 vs 12.3 minutes for ages 80+). Those with specific diseases had greater wait times than those who did not, with the greatest differences observed for congestive heart failure and other cardiovascular diseases. Time spent with the nurse, and especially with the physician, was also longer for older patients and for patients with specific diseases (Table 3).
Females tended to have slightly longer interval times with clinicians than males (Table 3). White patients spent less time in the waiting room than non-White patients, but race was not associated with variation in other encounter intervals. Patients with a late check-in, defined as a documented check-in time at least 1 minute later than the scheduled appointment time, had substantially shorter wait times in the waiting room (9.7 vs 14.4 minutes) and, to a lesser degree, shorter wait times in the exam room (9.5 vs 11.5 minutes). Late check-in was also associated with less time with the nurse and the physician. Acute or same-day visits required less time, on average, in all intervals.
Encounter features
Level 3 and level 4 visits accounted for almost 90% of all encounters (Table 4), and the trend in time spent with clinicians increased in relation to the level of service. Notably, waiting room time and, to a lesser degree, exam room time also had a statistically significant increasing trend with level of service. Physician time was approximately twice as long for a service level 5 visit compared to a service level 3 visit. These associations remained after controlling for check-in time and the numbers of procedures, labs, and medications ordered.
A total of 55.6% of encounters occurred before 1 pm. Wait times were shorter at the beginning (before 10 am) and at the end (after 4 pm) of the day. Patients with appointments between 10 am and 4 pm waited an average of 2.0–2.4 minutes longer in the waiting room than patients who had an encounter start time before 10 am. While nurse time did not differ by time of day, physician time with the patient decreased from the morning to the end of the day (Table 4).
In general, the numbers of orders for procedures, labs, and medications were not associated with either waiting room or exam room times, but there was a statistically significant increase in time with both the nurse and the physician in relation to numbers of orders of each type. The greatest differences were observed for procedure orders, where the time the patient spent with the physician was more than 50% longer, on average, when 3 or more procedures were ordered as compared to a visit when no procedures were ordered.
DISCUSSION
Relatively little is known about how patients, nurses, and physicians spend time during PC encounters. Historically, in a paper-based practice, the type of data required to measure time had to be obtained manually. With the rapid adoption of EHRs, time-stamped audit data are routinely captured and offer a means for clinical practices to better understand workflow and how much time patients spend waiting. These data can also be used to improve practice efficiency and the patient experience.
This is the first paper, to our knowledge, to use EHR audit file data to measure how time is spent during ambulatory care encounters. This source of data captures key attributes of clinical workflow, including who is involved, their activities, and the sequencing and duration of those activities.7 Our results are consistent with prior studies using more traditional approaches to workflow analysis.7,8 The construct validity of our findings is supported by the expected relationships between physician time and level of service, patient disease burden, and the number of procedures performed during an encounter, among other factors. Our study offers insight on a scalable approach to workflow analysis that makes use of readily available data and can support rapid learning.8
Our analysis of adult PC encounters reveals that patients had, on average, approximately 16 minutes of face-time with physicians, while the encounter overall averaged approximately 55 minutes. Our findings are consistent with other studies in which data were collected manually.14,15 We found that certain patient features (eg, advanced age, female sex, use of multiple medications) were associated with more physician time.14 Our results expand on previous findings by identifying patient characteristics associated with physician time and, separately, with nurse time and with waiting times. Audit file data also identify where actions take place (ie, clinic site/location), which could support research and rapid learning about how operational improvements (eg, Plan-Do-Study-Act cycles), structural design (eg, clinic size and layout), or organizational design (eg, staffing models) affect workflow elements, clinical outcomes, and patient, provider, and staff experience.
Our analysis expands ways in which audit files, available with all certified EHR systems, can be used to better understand the care delivery process.16 While the approach we took to summarizing data is informative and can be easily implemented by others, our work only scratches the surface of potential applications of these data. In general, audit file data can be used to understand factors that improve or disrupt clinical processes, informing care delivery and clinic operation decisions.
Opportunities to use these data may be found in simulation modeling. Simulation modeling, widely applied in other industries, opens opportunities to experiment with diverse approaches to delivering care without disrupting the running system.17 Simulation modeling is optimal when detailed time-stamped data are available and, in health care, enables low-cost experimentation with different staffing models, scheduling strategies, and other ideas for improving care delivery.
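As a rough illustration of the idea, the toy discrete-event simulation below (written with SimPy, an assumed tooling choice) draws nurse and physician service times from lognormal distributions that loosely echo the medians in Table 2 and compares average patient waits under 2 staffing levels; all parameters are illustrative rather than fitted to the study data.

```python
import random
import simpy

# Toy discrete-event simulation of the common clinic workflow. Service-time
# parameters loosely echo the medians in Table 2 but are illustrative only.
def visit(env, nurses, physicians, waits):
    arrive = env.now
    with nurses.request() as req:        # waiting room until a nurse is free
        yield req
        waits["waiting_room"].append(env.now - arrive)
        yield env.timeout(random.lognormvariate(1.4, 0.6))   # ~4 min nurse time
    roomed = env.now
    with physicians.request() as req:    # exam-room wait until the physician arrives
        yield req
        waits["exam_room"].append(env.now - roomed)
        yield env.timeout(random.lognormvariate(2.6, 0.5))   # ~13 min physician time

def run(n_nurses=2, n_physicians=2, arrivals_per_hour=6, horizon=480):
    env = simpy.Environment()
    nurses = simpy.Resource(env, capacity=n_nurses)
    physicians = simpy.Resource(env, capacity=n_physicians)
    waits = {"waiting_room": [], "exam_room": []}

    def arrivals():
        while True:
            yield env.timeout(random.expovariate(arrivals_per_hour / 60.0))
            env.process(visit(env, nurses, physicians, waits))

    env.process(arrivals())
    env.run(until=horizon)               # one simulated clinic day, in minutes
    return {k: sum(v) / max(len(v), 1) for k, v in waits.items()}

print(run())                  # baseline staffing
print(run(n_physicians=3))    # experiment: add a physician without touching the real clinic
```

Replacing the illustrative distributions with empirical interval distributions drawn from the audit file would let a clinic test staffing or scheduling changes before committing to them.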
Audit data reveal the substantial amount of time that patients wait and when and where waiting occurs, informing workflow strategies to improve efficiency and increase patient satisfaction.18,19 For example, efforts could be made to reduce patient wait time. Alternatively, digital patient data-capture protocols that leverage wait time could be implemented to both understand more about patient needs and reduce the time taken by nurses and physicians to obtain information during patient encounters. The ability to “put the patient to work” during the encounter is becoming a pressing issue, as providers are increasingly incentivized to incorporate patient-reported outcomes into care delivery.20–22
Audit file data revealed noteworthy patterns. For example, patients were rewarded with a shorter waiting room time if they had a late check-in. This is sensible, given the relatively long average duration of waiting room time, and suggests that notifying patients in real time (eg, by texting) could reduce wait time, likely improving patient satisfaction. Compared to physicians, nurses do not spend much time with patients. For more complex patients, nurses could be trained to gather and document more information, saving physician time and also allowing physicians to understand more at the beginning of the encounter. This type of workflow could substantially reduce the cost per total relative value unit.
Audit file data are likely to be important in devising more efficient care processes and tailoring care to patient-specific needs. We showed that physician and nurse time varied by disease burden and by what was ordered during an encounter. Known or predicted features of a scheduled encounter may provide the means to structure efficient approaches to scheduling and to connecting a patient with the most sensible care team. These data could be used to inform scheduling software that, for example, incorporates more exact predictions of provider visit length based on such known patient characteristics.14 Combined EHR clinical and audit data could be used to predict the most appropriate clinician for a patient encounter.
Limitations
First, this study was confined to a single health system and thus represents the workflow of only 1 institution. However, our estimates of time are consistent with national estimates.13,14 Furthermore, the audit log file is available to other health systems that use a certified EHR system. Second, we confined some of our analysis to encounters with a single nurse and a single doctor, as this pattern accounted for 78% of all encounters. However, audit file analysis can be broadly applied to a variety of clinical encounters. Finally, while we referred to time in the exam room when neither the physician nor the nurse was active in the EHR as wait time, the audit file data do not allow us to determine what proportion of this time is spent waiting vs on activities relevant to the clinical exam (eg, removing clothes for the exam, providing a urine sample).
CONCLUSIONS
The ability to study EHR audit data retrospectively makes this low-cost method of workflow analysis uniquely suited to evaluating rapid changes in operations and health policy. The longitudinal data available in an organization’s audit files allow for the comparison of workflow, wait time, and provider face-time before and after a range of changes, from the expansion of clinic hours to changes in Medicare reimbursement. More traditional methods of workflow analysis, such as time and motion studies, require a prospective study design and advance notice of pending policy changes in order to evaluate their impact. At a time when health systems are struggling to keep pace with health reform, such studies are generally not feasible.
FUNDING
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sector.
COMPETING INTERESTS
The authors have no competing interests to declare.
CONTRIBUTORS
A.H., J.J., W.S., and V.L. contributed to the conception and study design. D.C., V.L., X.T., and A.B. participated in the data acquisition and analysis. All authors participated in interpretation of the data. All authors participated in drafting or revising the manuscript and approved the final version.
REFERENCES
- 1. Ghorob A, Bodenheimer T. Sharing the care to improve access to primary care. N Engl J Med. 2012;366:1955–1957.
- 2. Bodenheimer T, Smith M. Primary care: proposed solutions to the physician shortage without training more physicians. Health Aff. 2010;29(5):799–805.
- 3. Dyrbye L, Shanafelt T. Physician burnout: a potential threat to successful health care reform. JAMA. 2011;305(19):2009–2010.
- 4. Shanafelt T, Boon S, Tan L, et al. Burnout and satisfaction with work-life balance among US physicians relative to the general US population. Arch Intern Med. 2012;172(18):1377–1385.
- 5. Weiner JP, Yeh S, Blumenthal D. The impact of health information technology and e-health on the future demand for physician services. Health Aff. 2013;32(11):1998–2004.
- 6. Sittig DF, Singh H. A new socio-technical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care. 2010;19(Suppl 3):i68–i74.
- 7. Unertl K, Novak L, Johnson K, Lorenzi N. Traversing the many paths of workflow research: developing a conceptual framework of workflow terminology through a systematic literature review. J Am Med Inform Assoc. 2010;17(3):265–273.
- 8. Lopetegui M, Yen P, Lai A, Jeffries J, Embi P, Payne P. Time motion studies in healthcare: what are you talking about? J Biomed Inform. 2014;49:292–299.
- 9. Chen E, Cimino J. Patterns of usage for a web-based clinical information system. Stud Health Technol Inform. 2004;170(Pt 1):18–22.
- 10. Hripcsak G, Vawdrey D, Fred M, Bostwick S. Use of electronic clinical documentation: time spent and team interactions. J Am Med Inform Assoc. 2011;18:112–117.
- 11. Vawdrey D, Wilcox L, Collins S, et al. Awareness of the care team in electronic health records. Appl Clin Inform. 2011;2(4):395–405.
- 12. Zhang W, Gunter C, Liebovitz D, Tian J, Malin B. Role prediction using electronic record system audits. AMIA Annu Symp Proc. 2011;2011:858–867.
- 13. Bowes W. Measuring use of electronic health record functionality using system audit information. Stud Health Technol Inform. 2010;160(Pt 1):86–90.
- 14. Blumenthal D, Causino N, Chang Y, et al. The duration of ambulatory visits to physicians. J Fam Pract. 1999;48:264–271.
- 15. Tai-Seale M, McGuire T, Zhang W. Time allocation in primary care office visits. Health Serv Res. 2007;42(5):1871–1894.
- 16. Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent Certification Program for Health Information Technology [Internet]. Federal Register: Department of Health and Human Services; 2012. http://www.gpo.gov/fdsys/pkg/FR-2012-09-04/pdf/2012-20982.pdf. Accessed December 1, 2015.
- 17. Thorwarth M, Arisha A. 2009. Application of discrete-event simulation in health care: a review. http://arrow.dit.ie/cgi/viewcontent.cgi?article=1002&context=buschmanrep. Accessed December 16, 2015.
- 18. Anderson R, Camacho F, Balkrishnan R. Willing to wait? The influence of patient wait time on satisfaction in primary care. BMC Health Serv Res. 2007;7:31.
- 19. Leddy K, Kaldenberg D, Becker B. Timeliness in ambulatory care treatment: an examination of patient satisfaction and wait times in medical practices and outpatient test and treatment facilities. J Ambul Care Manage. 2003;26(2):138–149.
- 20. Ralston J, Coleman K, Reid R, Handley M, Larson E. Patient experience should be part of meaningful-use criteria. Health Aff. 2010;29(4):607–613.
- 21. Wu A, Kharrazi H, Boulware L, Snyder C. Measure once, cut twice—adding patient-reported outcome measures to the electronic health record for comparative effectiveness research. J Clin Epidemiol. 2013;66:S12–S20.
- 22. Work Product of the HITPC Meaningful Use Workgroup—Meaningful Use Stage 3 Recommendations [Internet]. healthit.gov; 2014. https://www.healthit.gov/FACAS/sites/faca/files/hitpc_muwg_stage3_recs_2014_03_11.pdf. Accessed June 8, 2015.

