Abstract
Handoffs and cross-coverage are necessary to maintain the continuity of patient care, yet both are potential sources of error that may threaten patient safety and quality of care. Handoffs are the transfer of patient information and accountability from one provider to another. Cross-coverage is the overnight management of patients of whom the covering physician has little or no prior knowledge. We observed how physicians gave a sign-out after receiving a handoff, in a simulated session reproducing an evening handoff and the start of a nightshift. We collected data from thirty physicians from an academic medical center as they signed out six patients after responding to nurse calls. An error analysis of the sign-out data revealed 42 errors overall, comprising 28 omissions and 14 “erroneous data” errors. We then propose ways to prevent these errors through modification of the electronic medical record and support tools, and through greater awareness of human factors.
Introduction
Failures in communication between health care professionals and diagnostic reasoning errors pose a potential threat to patient safety and quality of care. Root cause analyses of sentinel events reported by the Joint Commission on Accreditation of Healthcare Organizations implicate communication issues in up to 70% of cases1. Communication is particularly important during transitions of care, or handoffs, which are necessary to maintain continuity of care, particularly among hospitalized patients. Handoffs are defined as the transfer of patient information and accountability from one healthcare provider to another2. Handoffs take place between day and night teams, or between wards or departments (e.g., between the emergency department and the internal medicine ward). Handoff practice varies widely between departments and institutions, and may consist of in-person verbal handoffs, phone calls, written handoffs, or a combination of these. Recommendations for improving handoffs have emphasized the need to standardize the process2, which has been successfully reported with the “I-PASS handoff bundle”3. Recent studies on handoffs have shown the benefits of improving the handoff process on medical errors4, 5.
Cross-coverage is the process used during nightshifts, when the night team continues the management of patients in the wards6. Night physicians face additional challenges for patient care: (1) they manage a larger number of patients, (2) they are unfamiliar with most of these patients and therefore need to rely on prior assessments and plans made by their day colleagues, (3) they work with smaller teams (both nursing and physician), and (4) they are less supervised at night than during the day. Furthermore, night physicians may also manage patients for whom they have not received a handoff. They therefore need to be able to rapidly extract pertinent information from the medical records and other available resources (nurses, or patient summaries). This rapid extraction of pertinent content from the medical records has been called “chart biopsy.”7
Paper summaries are an interesting support tool for handoffs, because they give the receiver an overview of the patients’ problems. Systems like UW Cores (the University of Washington computerized rounding and sign-out system)8 have shown their effectiveness in improving the workflow process and handoff. During focus groups from a prior study with residents and supervising physicians from our division9, discussions around support tools for handoffs raised concerns about errors and redundancy. Paper-based tools, even if auto-populated, are only an instantaneous snapshot of a patient’s medical chart, and the physicians were therefore concerned that they may become a source of error if changes occur after the summary is printed. Furthermore, since logging into the chart is required to see the latest vital signs or to prescribe, most participants did not see an immediate benefit in having a potentially unreliable paper print-out, even if it could allow quicker or more ubiquitous access to patient information. Unsurprisingly, many physicians actually use hand-written notes (on blank paper or on documents such as the admission notes) as support tools throughout the day, which may have an even higher rate of imprecision or error.
Prior studies on handoffs or cross-coverage have for the most part been observational or retrospective cohort studies. These designs have limitations, such as lack of comparability, lack of patient specificity, or recall bias. We therefore conducted a simulated handoff and chart biopsy study, an approach that allowed us to use standardized cases with pre-defined nursing cues, and direct observation of patient management and electronic health record (EHR) use.
The aim of the present study was to examine the handoff process among residents and supervising physicians, focusing on potential errors and their origins. We hypothesized that errors during the handoff simulations could arise from the handoff process itself, from data collection during the chart biopsy, or from clinical reasoning during cross-coverage. We then explore ways to help avoid these errors through EHR design and medical training. The analysis of handoff quality by type of support (paper summary, EHR, or none) will be presented in a separate report.
Handoffs in General Internal Medicine at our institution
At the University Hospitals of Geneva, each medical team (2 residents, 2 medical students, and 1 supervising physician) of the Division of General Internal Medicine manages about 15 to 20 patients. The teams are grouped into two zones of about 100 patients each, and one nightshift physician cross-covers each zone. A senior physician from the Emergency Department, who is not otherwise involved in the patients’ care, supervises nightshift residents. Night and day teams are therefore completely different, and rely on handoffs, notes in the medical charts, and interactions with the nursing teams to maintain the continuity of patient care.
Handoffs take place in a meeting room away from the wards, and often last less than 15 min for a whole zone. This is achieved by handing off only a selection of patients to the night team, chosen according to the following criteria9: (1) patients whose condition has recently deteriorated (unstable), or who have recently developed new symptoms of yet uncertain diagnosis, and who therefore require monitoring; (2) patients with pending actions (e.g., following up on lab or imaging results, or pending specialist consultations); (3) patients who are considered complex, or whose management is unusual; (4) patients receiving end-of-life care; or (5) patients for whom the day team anticipates potential complications overnight. Simple, stable patients with no expected complications are not handed off to the night team. A locally developed mnemonic in French (Code IDEALE for CODE status, Identity, main Diagnosis, Evolution, expectAtion, to-do List and quEstions) is sometimes used to introduce handoffs to students or new residents.
Methods
Participants: After approval from the local ethics committee, we recruited volunteer residents (R) and supervising physicians (SP) in General Internal Medicine from our academic hospital between January and November 2015 through convenience sampling.
Simulation: Our study design reproduced a Friday evening handoff session in our internal medicine wards and the beginning of the nightshift for the participant. The simulation had three phases: a handoff phase, a chart biopsy phase, and a sign-out phase. For clarity in this paper, we use the term “handoff” when the participant was the receiver of information, and the term “sign-out” when the participant was the information giver. The overall duration of the simulation was about 45 min.
In the handoff phase, one investigator (KB) gave a standardized handoff of four randomly chosen patient cases, allowing time for questions and interactions with the participant. Participants were randomly assigned to receive a paper summary tool, to have access to the EHR, or to be in the control group with only blank paper for note-taking (blank paper was available in all three groups).
In the chart biopsy phase, the participant was left alone to review the EHR as at the beginning of a nightshift. During this phase, the participant received four calls from nurses, two of which were about patients who had been handed off, and two about patients without handoffs. We then informed the participant that her shift needed to be interrupted, and asked her to sign out to a colleague. This phase lasted about 25 min, or ended earlier once the participant felt prepared to sign out, to avoid time-pressure bias.
In the sign-out phase, participants chose which patients they wanted to discuss, and presented them to an investigator (KB) who had no prior knowledge of the patients.
Data collection: We collected demographic data on the participants, such as level of experience and years of practice. We audio-recorded and transcribed the simulations. For this paper, we focused on the sign-outs, which were de-identified for subsequent analysis.
Scenario preparation: We created 8 fictitious clinical cases of diagnoses commonly encountered in internal medicine wards, inspired by real patient cases. A brief summary of the cases is provided in Table 1. For each scenario, we prepared a mock-up prototype based on our local EHR, a nursing call, and a paper summary. We chose to use mock-ups of the EHR rather than real patient charts for several reasons. First, we wanted to control all the information that participants had access to for each patient. Second, the mock-up helped avoid additional cognitive load for participants, by adjusting the chart dates for the study. In a prior study with a similar design10, participants found the date issue confusing, especially since each patient had different hospitalization dates. Third, we did not want any post-test date information to be visible in the patient chart that could affect EHR use or clinical reasoning (e.g., “patient deceased”, or test results from a later date). Although the mock-up restricted some EHR functionalities (e.g., the search engines were not functional, and CT scans were limited to a few screenshots), the prototype allowed all the main interactions with the EHR, and all the dates in the EHR were adjusted to the dates of the participants’ sessions to decrease confusion and to optimize the realism of the simulation. Seven of the cases were used in the handoff randomization, whereas the medication error scenario (Case E) was used solely in the randomization of nurse calls, and thus was never handed off.
Table 1.
Overview of patient cases and content of nurse calls
| Case | Main diagnoses | Nurse call |
|---|---|---|
| A | Inaugural decompensated heart failure due to ischemic heart disease. Diabetes with end-organ complications, perforating ulcer | Retrosternal chest pain |
| B | Liver failure in a patient with Child C cirrhosis. Presented hemoptysis at admission due to anticoagulation for recent pulmonary embolism | Abdominal pain |
| C | Pneumonia in a patient with metastatic prostate cancer, lung atelectasis of undetermined origin and sleep apnea. | Chest pain |
| D | Acute renal failure and electrolyte imbalance in a patient with an ileostomy, and severe chronic obstructive pulmonary disease | Shortness of breath |
| E* | Urosepsis in a hypertensive patient who tends to fall, currently has a broken right hand | Medication error |
| F | Fortuitous discovery of severe hyponatremia, patient with suspected cirrhosis | Fever |
| G | Pneumonia with empyema in a young smoker, who presents pain due to his chest tube | Skin rash |
| H | Decompensated Child C cirrhosis due to G-I bleed and sepsis in a patient with chronic alcoholism, liver failure and encephalopathy, recently transferred from the ICU. Currently still delirious. Awaiting paracentesis results for recurrent fever. | Low blood pressure |
* This case was not handed off during the study, and was solely used for nurse calls
Some explanations about the cases help to understand the difficulties encountered. In Case A, the participant received a call about acute chest pain in a patient admitted for an inaugural episode of heart failure due to ischemic heart disease. Patient management was quite simple in this case, but needed to be conducted urgently. The main difficulty resided in retrieving information about all the cardiology work-up the patient had undergone since admission.
Case B was about a patient admitted ten days earlier for liver failure, who presented bleeding (hemoptysis) shortly after admission. His anticoagulant therapy was interrupted for a few days, despite the recent (<3 months) diagnosis of pulmonary embolism, as discussed with the vascular medicine consultant. The plan was to resume the anticoagulation therapy shortly, since the bleeding had stopped.
Case C was a patient with a history of metastatic prostate cancer who was admitted for pneumonia. The nurse called about recurrent chest pain, which was of non-cardiac origin, with a range of differential diagnoses.
Case D was a patient whose past medical history list had not been updated in the patient summary (it did not include chronic obstructive pulmonary disease), whereas the results of the pulmonary function test were reported in the updated problem list and showed severe lung disease. This discordance was noticed rapidly, and led to questions during the initial handoff or to later verification in the EHR.
Case E was a patient admitted for sepsis, with no particular concerns. During the simulated shift, she unfortunately took her neighbor’s medication, leading to the ingestion of a high dose of a beta-blocker and an oral diabetes medication. The deliberately erroneous information for this case was the indication of the ward she was in.
The patient in Case F was admitted for hyponatremia, and presented new nocturnal agitation, with a wide range of possible diagnoses. The patient in Case G had a complicated pneumonia and an antibiotic switch on the day of the handoff, which had not been updated in the paper summary. All but one participant identified this error during the initial handoff. Finally, Case H was a complex scenario of a patient with liver failure and many complications, who presented recurrent fever. His hospital stay was the longest, since he had already stayed in the intensive care and intermediate care units.
Paper summaries were a novel feature for our participants, as they are not available in the current EHR system. We designed the summaries as if auto-populated from fields in the EHR. They included patient identity and age, room and bed, code status, comorbidities, problems, and a to-do list. We also included the duration of hospitalization, which is not easily visible in the EHR. These paper summary features were discussed and validated in focus groups with Rs and SPs in a prior study on handoffs in our institution9. The paper summaries were tested on pilot participants for this study, and were subsequently shortened. The dates on the summaries were also adjusted to the participants’ sessions.
To address concerns raised about the reliability of paper summaries, we included imprecisions and even discordant information in them. For example, we listed a patient as being on one antibiotic, which had actually been switched on the day of the simulated session. This type of error is commonly encountered as a missed update or an inappropriate copy-paste. In another case, the past medical history list (probable COPD, no pulmonary function testing) was not updated after the patient’s work-up, which had revealed severe COPD.
The verbal handoffs were also standardized for each case, with prepared answers for anticipated questions from the participant. Other questions received vague responses, meant to lead the participant to look up the information in the EHR. Although the paper summaries contained imprecisions or errors, all verbal handoffs contained the correct information.
Analysis: We analyzed all the elements given during the sign-out phase, focusing particularly on errors. Errors were defined as omissions or erroneous information given during the sign-out. We also analyzed imprecisions in the sign-out phase. The definitions of these terms for our study are presented in Table 2. An expert physician conducted a thematic analysis and coded the transcripts using Atlas.ti v1 for Mac, taking into consideration the handoff status and nursing call interventions. A second expert physician cross-checked the initial coding. Differences were discussed to reach a consensus.
Table 2.
Definitions
| Terms | Definitions |
|---|---|
| Error | |
| Omission | Missing information, such as fever that was not reported, with a clinical impact |
| Erroneous data | Facts that were mentioned in the sign-out, but were not provided in any of the data sources (handoff, nurses, paper summaries or EHR). |
| Imprecision | Facts that were incomplete or approximate, but with little or no impact on the comprehension or management of the case. For example, an age reported as “80-something” rather than 83 years would be an imprecision. |
We report the sign-out durations by level of expertise, and use descriptive statistics to present the errors and imprecisions. We compared the mean number of errors per case between the 3 arms with crude and adjusted (for experience level) linear regression models with robust standard errors. We then analyzed whether the sources of errors were related to the EHR data, paper summaries, annotations, or the initial verbal handoff. Finally, we studied how these errors could be prevented, through the content and design of the EHR or paper summaries, or through attention to human factors.
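To illustrate this comparison, the following is a minimal sketch of the crude and adjusted models in Python with pandas and statsmodels; the data frame, the column names (`mean_errors_per_case`, `arm`, `experience_years`) and the example values are hypothetical and do not represent the study data, and the HC3 robust-variance estimator is one possible choice among several.

```python
# Minimal sketch (hypothetical data): crude and adjusted comparisons of the mean
# number of errors per case across the three arms, using linear regression with
# robust (HC3) standard errors.
import pandas as pd
import statsmodels.formula.api as smf

# One row per participant, with the mean number of errors per signed-out case.
df = pd.DataFrame({
    "mean_errors_per_case": [0.17, 0.33, 0.00, 0.50, 0.17, 0.33],
    "arm": ["EHR", "Paper", "Control", "EHR", "Paper", "Control"],
    "experience_years": [4.0, 8.0, 2.5, 1.0, 12.0, 6.0],
})

# Crude model: errors ~ arm
crude = smf.ols("mean_errors_per_case ~ C(arm)", data=df).fit(cov_type="HC3")

# Adjusted model: errors ~ arm + experience level
adjusted = smf.ols(
    "mean_errors_per_case ~ C(arm) + experience_years", data=df
).fit(cov_type="HC3")

print(crude.summary())
print(adjusted.summary())
```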
Results
Participants
We recruited 30 participants, 16 residents and 14 supervising physicians (Table 3), with an average of 5.4 years of experience. Overall, SPs had 6.5 years more clinical experience than the Rs. Only two of the participants reported having received handoff training. Six of the residents and seven of the supervising physicians had heard of a locally developed handoff mnemonic. Table 3 shows that the randomization resulted in a slightly larger paper summary group, which also included more supervising physicians.
Table 3.
Study participant characteristics.
| | EHR | Paper | Control |
|---|---|---|---|
| n (residents) | 9 (6) | 11 (4) | 10 (6) |
| Female, n | 4 | 5 | 3 |
| Hand-off training, n | 1 | 1 | 0 |
| Median years of experience (range) | 4.0 (0.2-10) | 8.0 (0.2-20) | 4.0 (0.6-15) |
| Nightshift experience (%) | 89 | 91 | 80 |
Participants were all familiar with the local EHR system.
Sign-out description
The sign-out phase lasted 12.2 min on average (range 6.0–30.0), with a non-significant difference between SPs (11.3 min) and Rs (13.0 min; p=0.33). One SP participant, P12, chose to present only 3 patients during the sign-out, and therefore had the shortest duration. All other participants presented the six patients.
A junior SP participant, P17, was at the other extreme of sign-out duration. He was particularly thorough during the sign-out process, and took time to look up information in the EHR to complete his sign-out. He took nearly five times as long as the shortest six-patient sign-out (30 min vs. 6.5 min), despite some pressure from the investigator, and made 2 errors and 5 imprecisions. P17 explained that he could not be more concise because of the complexity of the cases.
Errors and imprecisions
According to our definitions, we found 42 errors overall among the 180 sign-outs, or 23.3 per 100 cases. Twenty-eight of the errors were omissions, and 14 were erroneous facts. There were 100 imprecisions in the 180 sign-outs. Eight discordances between coders (all concerning imprecisions) were discussed to reach a consensus. We present the errors and imprecisions per case in Table 4.
Table 4.
Errors and imprecisions by case
| | Case A | Case B | Case C | Case D | Case E* | Case F | Case G | Case H | Total |
|---|---|---|---|---|---|---|---|---|---|
| Omissions | 7 | 4 | 2 | 0 | 3 | 4 | 2 | 6 | 28 |
| Erroneous facts | 4 | 3 | 0 | 1 | 1 | 1 | 1 | 3 | 14 |
| Total errors | 11 | 7 | 2 | 1 | 4 | 5 | 3 | 9 | 42 |
| Imprecisions | 12 | 17 | 14 | 6 | 7 | 6 | 16 | 22 | 100 |
Cases A and H had more errors than the other cases, while cases B and G stood out mainly for their number of imprecisions. This can be visualized more easily in Figure 1.
Figure 1.

Errors and imprecisions by case
As the number of participants differed among the three support-tool groups, we present error and imprecision rates per case in Table 5. The mean of 0.23 total errors per case did not differ between the 3 arms, in either the crude (p=0.98) or the adjusted model (p=0.99).
Table 5.
Errors and imprecision rate per case for each type of support
| | EHR access | Paper summary | Control | Total |
|---|---|---|---|---|
| Omissions | 0.13 | 0.17 | 0.17 | 0.16 |
| Erroneous facts | 0.09 | 0.06 | 0.08 | 0.08 |
| Total errors | 0.22 | 0.23 | 0.25 | 0.23 |
| Imprecisions | 0.43 | 0.56 | 0.67 | 0.56 |
Overall, the small discordances or errors seeded in the paper summaries were identified by the participants, and did not cause errors in the sign-out.
Sources of errors
A closer study of the errors and their sources reveals that errors arose at different stages of the handoff process. During the verbal handoff, physicians may receive erroneous information from their colleague, or they may receive correct information but misunderstand it, or simply record it imprecisely. One physician, for example, had trouble giving his sign-out because he could not re-read his own notes. Although we standardized the handoff content, participants were given the opportunity to ask questions about each patient before moving to the next one. These interactions were important to clarify information, or to address issues that the receiver anticipated for the night. The questions ranged from medication specifications (“Which antibiotic is he on?”) to lab test results or contingency plans. For example, participants were asked to follow up on a lab test for ascites in Case H, and many asked whether the day team had discussed which antibiotic to prescribe if the fluid showed signs of peritonitis.
The analysis of the erroneous facts in our dataset revealed areas of weakness in the handoff and chart biopsy process. Four of the erroneous facts were related to comprehension of the timeline of events. For example, P21 mistook the bleeding complication in Case B, which had occurred at admission one week earlier, for an event of the day, and therefore found the work-up incomplete and made erroneous recommendations. In this example, the error was initiated during the verbal handoff, even though P21 had received paper summaries. Another participant mixed up data from two different patients during the sign-out, not realizing immediately that the EHR was open to another patient’s chart. The other erroneous facts concerned the recommendations given, and were due to clinical reasoning or medical knowledge deficits.
In the analysis of the omissions, we found that these also occurred at different stages of the handoff process. Some participants forgot to report nurse calls; others did not document key points, such as vital signs, when responding to nurse calls. Still other omissions related to the retrieval of reports in the EHR, when this information had not been integrated into the progress notes.
Sign-out structure
Overall, participants tended to present patients by ward during the sign-out, rather than by degree of urgency or by the order of the initial handoff. Some physicians were somewhat disorganized, and sometimes only remembered certain patient information after having moved on to the next patient. P1, for example, suddenly came back to the first patient near the end of the sign-out, asking whether he had reported that this patient had also presented fever and that a septic work-up needed to be done.
Discussion
Handoffs and cross-coverage have both been identified as moments of potential threat to the quality and safety of patient care. Observational studies have examined incident reports11, rapid response team interventions12, or medical errors in patient charts5. Our error rate of 23.3 per 100 cases is in the range of other studies on handoffs and errors. As a comparison, Starmer et al reported overall rates of medical errors of 33.8 per 100 admissions4 and 24.5 per 100 admissions5 in their studies on handoffs and medical errors.
Our statistical analysis of the errors did not show any differences between the three arms (support tools or no support). This could be due to the small number of errors, since our study was not powered for this outcome. In addition, the sources of error are not all related to information retrieval or interpretation: erroneous facts due to clinical reasoning or medical knowledge deficits may not necessarily be reduced by changing the type of support tool.
Our error and imprecision rates need to be interpreted with caution. Our analyses examine differences in facts compared with our data sources, but not all facts are relevant in a handoff. In fact, physicians need to select the pertinent features of a patient case to remain concise. Not all omissions are therefore harmful; the difficulty lies in selecting the pertinent features to disclose.
Furthermore, imprecisions are probably related to the high cognitive load of the handoff process13. Beyond case complexity in terms of clinical reasoning, the EHR and available resources can also create difficulties. As noted in the results, one of the cases required finding recent reports of the cardiology work-up (Case A). This step led to errors for our participants, because the pertinent documents were mixed in with all the other documents. Although the search function of our EHR mock-up was limited compared with the EHR participants were used to, we note that participants did not take the extra time to find the documents. When results were not easily retrievable from the progress notes, for example, some participants presumed that the results were not yet available and abandoned their search.
Human factors and training
Human factors play an important role in the handoff process. Although higher awareness of the challenges of the handoff process could be addressed through training, the participants of our study seemed to reflect the current state of handoff training. Only two participants (SPs) had received any handoff training prior to this study. Most participants had heard of a locally developed mnemonic for handoffs in our department14. Although none of the participants actually used this mnemonic, they found it useful to illustrate or to teach handoffs to students or peers.
Our findings show that errors generated early in the handoff process can persist throughout the sign-out, simply because there is little verification of the information received. Communication failures are very common in clinical care, present in up to two thirds of preventable events1. Verification of verbal information is therefore essential, and is emphasized in successful handoff training programs like I-PASS3. Moreover, handoff training should also raise awareness of the reliability of note-taking: handwriting style, speed, and comprehension can all affect the quality of the notes. This reliance on notes and paper support tools led one participant to state contradictory facts about a patient, reading from the past medical history list and then moving on to the problem list without realizing it. The other participants reported only the updated status from the problem list.
One implication for training, based on the previous example, would be to make sure physicians use their clinical reasoning skills to choose the pertinent elements when signing out. Some participants briefly explained their reasoning process when giving recommendations. Having a shared understanding, or at least the opportunity to clarify certain points of the handoff, is commonly included in handoff recommendations3. Another important way to prevent errors from the outset would be for the receiver to summarize the key elements for each patient, to ensure correct comprehension. This is the last component of the I-PASS handoff bundle3.
Support tools functionalities
The low verification of facts by the users emphasizes the need for any handoff support tool to present the most recent information available in the EHR, and to allow manually triggered updates. Although synchronizing support tool data with the EHR helps avoid transcription errors, it could also be a source of errors if updates override previous information, lack traceability, or present information that has not yet been processed by the physician in charge15. For example, if the support tool presents a management plan based on an initial set of lab results, the lack of a notification for new results may engender a misunderstanding about the reasoning underlying the plan. There should also be little need for repetitive documentation that could encourage copy-pasting, such as for problem lists or any other kind of list.
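As a rough illustration of these requirements, the sketch below shows one way a handoff support tool field could be refreshed from the EHR only on an explicit, manually triggered update, keep every prior version for traceability, and flag data that arrived after the last physician review. The class and field names are hypothetical and do not describe our EHR or any existing product.

```python
# Hypothetical sketch: a summary field that is updated only on explicit request,
# keeps all prior versions (traceability), and flags values pulled after the
# last physician review (i.e., "not yet processed" information).
from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable, List, Optional

@dataclass
class SummaryVersion:
    content: str
    pulled_at: datetime
    pulled_by: str  # who triggered the refresh

@dataclass
class HandoffSummaryField:
    fetch_from_ehr: Callable[[], str]  # supplied by the EHR integration layer
    versions: List[SummaryVersion] = field(default_factory=list)
    last_reviewed_at: Optional[datetime] = None

    def refresh(self, user: str) -> SummaryVersion:
        """Pull the latest value from the EHR; never overwrite silently."""
        version = SummaryVersion(self.fetch_from_ehr(), datetime.now(), user)
        self.versions.append(version)
        return version

    def mark_reviewed(self) -> None:
        """Record that the physician in charge has seen the current value."""
        self.last_reviewed_at = datetime.now()

    def has_unreviewed_update(self) -> bool:
        """True if data was pulled after the last physician review."""
        if not self.versions:
            return False
        return (self.last_reviewed_at is None
                or self.versions[-1].pulled_at > self.last_reviewed_at)

# Usage sketch
antibiotics = HandoffSummaryField(fetch_from_ehr=lambda: "ceftriaxone 2 g IV daily")
antibiotics.refresh(user="night_physician")
print(antibiotics.has_unreviewed_update())  # True until mark_reviewed() is called
```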
Having a patient’s information summarized in one place helps maintain a structured handoff. The structure of the handoff is important for sharing mental models of the patients, and for avoiding having to jump back to previous patients to provide additional information. While most participants were able to maintain a patient-by-patient sign-out, some became confused by their annotation methods, particularly when new events occurred. Having a designated place to write additional information, such as nursing calls, can help physicians stay organized. Although paper support tools are easier and often more efficient to annotate, the writing space is limited and not standardized, making it more difficult to share the annotations with the oncoming physician.
Design implications
In this section, we make some suggestions to help avoid errors during the handoff and chart biopsy process. Based on the errors identified in our analysis, we found several important measures to consider in the design and content of the EHR and the connected support tools. Our study design, with standardized handoffs and repeated simulations of the same cases, helped us identify and understand some of the potential sources of errors during handoffs and cross-coverage in more depth. Our findings also emphasize the importance of the initial verbal handoff, which can convey errors or imprecisions that are often not verified later in the process. In this manner, the handoff process resembled the children’s game of “telephone.” It is therefore particularly important to avoid introducing errors at any stage of the handoff or sign-out process.
Our EHR is a home-grown system, used by all healthcare providers in our hospital. Although data entry is specific to each type of professional, the content of the EHR is accessible to all. The EHR is structured in various tabs, which include the following:
- A list of reports, admission notes and discharge summaries
- Progress notes and consultation notes, which are unstructured, but usually written up as problem lists
- Vital signs and administered medications, presented in a chart
- Prescribed medications
- Labs
- Imaging
- Nursing notes
As in many EHR systems, most of the content of the tabs is presented in chronological order. The documents of each tab are presented separately, which means that physicians need to jump from one tab to another to piece together the information from the reports, admission notes, and progress and consultation notes. Collection of this data in the EHR, or “chart biopsy”7, may be facilitated when the day team compiles the information in the progress notes. An eye-tracking study of physicians reading electronic notes, however, showed that large sections are ignored, which emphasizes the need to revise the content of electronic notes16. Furthermore, reports issued at the end of the day shift may not be included in the daily summary if there is no “new report” alert (which can simply be bold or colored text, for example) in the EHR. One way to improve this process would be to have better traceability of ordered labs and work-up tests, and notifications when final reports become available. These notifications should reach the patient’s physicians, whether the day or night team, to help ensure that all results are acknowledged in a timely manner and interpreted in the patient’s context.
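A minimal sketch of such a notification, under the assumption of fixed day-shift hours and a single covering physician per patient, is shown below; the names, hours, and routing rule are purely illustrative and are not part of our EHR.

```python
# Hypothetical sketch: route a "final report available" notification to the
# physician currently responsible for the patient (day team during the day,
# night cross-cover otherwise). Hours and names are illustrative only.
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class Patient:
    name: str
    day_physician: str
    night_physician: str

def covering_physician(patient: Patient, at: datetime,
                       day_start: time = time(8, 0),
                       day_end: time = time(18, 0)) -> str:
    """Return who should receive notifications at the given moment."""
    if day_start <= at.time() < day_end:
        return patient.day_physician
    return patient.night_physician

def notify_final_report(patient: Patient, report_name: str, at: datetime) -> str:
    recipient = covering_physician(patient, at)
    return f"Notify {recipient}: final {report_name} for {patient.name} is available."

# Usage sketch: a report finalized at 21:30 reaches the night physician.
p = Patient("Case A", day_physician="day_resident", night_physician="cross_cover")
print(notify_final_report(p, "cardiology work-up report", datetime(2015, 6, 5, 21, 30)))
```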
Improving the design of the EHR system to better support the clinical workflow would be another way to help prevent errors. The EHR needs to serve several different goals, such as billing, quality monitoring, and clinical documentation for patient care. The structure of the clinical documentation component in our EHR is similar to the paper charts we initially used, with various sections (i.e., tabs in the EHR). A handoff support tool needs to support the workflow and reasoning process of physicians, pulling information from different areas of the EHR into a coherent overview of a patient’s clinical state and trajectory. Although a “snapshot” of the patient’s current state is supported in many EHRs via a dashboard summary page, the challenge lies in tracing the patient’s evolution during the hospital stay, which may not be well described in the progress notes17.
When we consider these requirements, mobile solutions seem to offer opportunities for safer care, owing to their ubiquity and the possibility of personalized alerts. Handoff support tools need to be readily available at all times, to provide the relevant information needed to review patient care throughout a shift. A thoughtfully designed mobile app can pull together information from different parts of the EHR, individualized according to user preferences and each patient’s clinical context. Mobile solutions can also target alerts to the most appropriate user, which helps avoid alert fatigue.
Understanding the sequence of events that occur during a hospital stay can be difficult when relying solely on the EHR. Our EHR provides a detailed graph of vital signs and medications over time, but tracing the history of events can be tricky. Users may try different approaches from different tabs, using search engines in the prescription history, or skimming through the progress notes. Improving the timeline view, in particular with event-based visualization, could help address this issue18. Furthermore, some authors have looked at handoffs as narratives19 that can help care providers understand the patient’s story and the sequence of events that occurred during the hospitalization.
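To make the idea of event-based visualization concrete, here is a small, hypothetical sketch that groups chart entries from different tabs under clinical events and prints them chronologically. The event labels would in practice have to be assigned manually or by rules, and the example entries are loosely inspired by Case B rather than taken from the study data.

```python
# Hypothetical sketch of an event-based timeline: entries from different EHR
# "tabs" (notes, labs, prescriptions, imaging) are grouped under clinical events
# and listed chronologically, rather than being read tab by tab.
from collections import defaultdict
from dataclasses import dataclass
from datetime import date
from typing import Dict, List

@dataclass
class ChartEntry:
    when: date
    tab: str      # e.g., "labs", "imaging", "progress note"
    event: str    # clinical event the entry belongs to
    summary: str

def timeline_by_event(entries: List[ChartEntry]) -> None:
    groups: Dict[str, List[ChartEntry]] = defaultdict(list)
    for entry in entries:
        groups[entry.event].append(entry)
    # Order events by their earliest entry, then list entries chronologically.
    for event, items in sorted(groups.items(),
                               key=lambda kv: min(e.when for e in kv[1])):
        print(f"== {event} ==")
        for e in sorted(items, key=lambda e: e.when):
            print(f"  {e.when} [{e.tab}] {e.summary}")

# Usage sketch, loosely inspired by Case B
timeline_by_event([
    ChartEntry(date(2015, 5, 26), "progress note", "Hemoptysis at admission", "Bleeding noted"),
    ChartEntry(date(2015, 5, 26), "prescriptions", "Hemoptysis at admission", "Anticoagulation held"),
    ChartEntry(date(2015, 6, 4), "progress note", "Anticoagulation plan", "Plan to resume shortly"),
])
```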
Improving the progress notes in the EHR could also improve information retrieval20. Progress notes are often structured as a problem list and plan, and allow physicians to prioritize the patient’s problems, describe the plan, gather the results, and provide contingency statements if needed. Currently, the tab structure in our EHR makes sharing information across tabs tedious, requiring physicians to switch back and forth from one tab to another. Building the medical chart around the problem list21, 22, which integrates the important findings in a single place, could help avoid excessive switching. The clinical reasoning process itself blends parts of the history taking, the physical examination, and the lab and imaging results into each problem.
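As an illustration of problem-oriented charting, the hypothetical sketch below attaches findings from several tabs to the problems they inform and renders them in one place; the problem names, findings, and rendering are illustrative, not a description of references 21 and 22 or of our EHR.

```python
# Hypothetical sketch of a problem-oriented view: findings from several tabs are
# attached to the problems they inform, so the reader does not have to switch
# back and forth between tabs. Content is illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Finding:
    source_tab: str  # e.g., "labs", "imaging", "physical exam"
    text: str

@dataclass
class Problem:
    name: str
    plan: str
    findings: List[Finding] = field(default_factory=list)

def render_problem_list(problems: List[Problem]) -> str:
    lines = []
    for problem in problems:
        lines.append(f"# {problem.name}")
        for finding in problem.findings:
            lines.append(f"  - ({finding.source_tab}) {finding.text}")
        lines.append(f"  Plan: {problem.plan}")
    return "\n".join(lines)

# Usage sketch
print(render_problem_list([
    Problem(
        name="Decompensated heart failure",
        plan="Continue diuretics; follow up on the cardiology report",
        findings=[
            Finding("physical exam", "Bilateral crackles, leg edema"),
            Finding("labs", "Elevated NT-proBNP"),
            Finding("imaging", "Chest X-ray: pulmonary congestion"),
        ],
    ),
]))
```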
The analysis of the “erroneous facts” revealed difficulties with timelines and event sequencing. During the verbal handoff, the receiver needs to combine many activities: listening to and understanding the case, taking notes, anticipating complications, and dealing with potential interruptions (phone calls, for example). Although receiving a summary is helpful, care must be taken to provide a simple outline with minimal administrative information, to avoid cognitive overload. When participants need to read long paragraphs about each problem, their attention to the handoff giver drops, which can lead to misunderstandings.
Strengths and limitations
Handoffs, in particular verbal handoffs, are difficult to study in a real clinical setting. Differences in patient cases, or changes in patient state, do not allow for comparisons of performance. We therefore chose a simulation setting to reproduce the handoff and cross-coverage context. Although the number of cases was lower than the average number of cases handed off in the wards, we believe that six cases allowed us to address concerns of case specificity. This number was also sufficient to challenge the participants’ cognitive load, and allowed for a realistic, albeit busy, beginning of a nightshift. Standardization of the initial handoff also facilitated comparisons among participants. Although participants reported that the complexity of the cases was similar to that of real patients, standardization limits the variance and sources of complexity that complicate communication during handoffs.
The analysis for errors was based mainly on facts reported during the sign-out process. We did not include an analysis of the clinical reasoning process during patient management for this report.
Finally, this study focuses on internal medicine physicians, so the findings may not be generalizable to other medical specialties. The wide variation in handoff practices also limits generalizability to other institutions or departments.
Conclusion
Based on a simulated handoff and chart biopsy study, we analyzed the errors and imprecisions observed during the final sign-out, and found no significant differences in error rate by type of support tool. Our findings point to several ways to help avoid errors in the handoff process, in the design of EHRs and of support tools, as well as by addressing human factors in future training programs. We propose facilitating the cross-coverage process by providing event-based medical charts, which group documents from related events together. Support tools for handoffs should present updated patient data in a concise way and should facilitate note-taking. Furthermore, handoff training should raise awareness of overreliance on support tools and of low verification of the collected information. We plan to apply these human factors training suggestions, and to emphasize the narrative aspect of handoffs, in our next study. Future studies on the handoff process should also include an analysis of the written handoff process.
Acknowledgements
We acknowledge V. Soulier and R. Wipfli for their contribution to this project.
References
- 1. The Joint Commission. Sentinel Event Data: Root Causes by Event Type. 2013 [cited March 7, 2013]. Available from: http://www.jointcommission.org/Sentinel Event Statistics/
- 2. Arora V, Johnson J. A model for building a standardized hand-off protocol. Jt Comm J Qual Patient Saf. 2006;32(11):646–55. doi: 10.1016/s1553-7250(06)32084-3.
- 3. Starmer AJ, Spector ND, Srivastava R, Allen AD, Landrigan CP, Sectish TC; I-PASS Study Group. I-PASS, a mnemonic to standardize verbal handoffs. Pediatrics. 2012;129(2):201–4. doi: 10.1542/peds.2011-2966.
- 4. Starmer AJ, Sectish TC, Simon DW, Keohane C, McSweeney ME, Chung EY, Yoon CS, Lipsitz SR, Wassner AJ, Harper MB, Landrigan CP. Rates of medical errors and preventable adverse events among hospitalized children following implementation of a resident handoff bundle. JAMA. 2013;310(21):2262–70. doi: 10.1001/jama.2013.281961.
- 5. Starmer AJ, Spector ND, Srivastava R, West DC, Rosenbluth G, Allen AD, Noble EL, Tse LL, Dalal AK, Keohane CA, Lipsitz SR, Rothschild JM, Wien MF, Yoon CS, Zigmont KR, Wilson KM, O’Toole JK, Solan LG, Aylor M, Bismilla Z, Coffey M, Mahant S, Blankenburg RL, Destino LA, Everhart JL, Patel SJ, Bale JF Jr, Spackman JB, Stevenson AT, Calaman S, Cole FS, Balmer DF, Hepps JH, Lopreiato JO, Yu CE, Sectish TC, Landrigan CP; I-PASS Study Group. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803–12. doi: 10.1056/NEJMsa1405556.
- 6. Kakarala K, Jain SH. The “cross-cover” mindset. J Patient Saf. 2012;8(1):1–2. doi: 10.1097/PTS.0b013e318242ad70.
- 7. Hilligoss B, Zheng K. Chart biopsy: an emerging medical practice enabled by electronic health records and its impacts on emergency department-inpatient admission handoffs. J Am Med Inform Assoc. 2013;20(2):260–7. doi: 10.1136/amiajnl-2012-001065.
- 8. Van Eaton EG, Horvath KD, Lober WB, Pellegrini CA. Organizing the transfer of patient care information: the development of a computerized resident sign-out system. Surgery. 2004;136(1):5–13. doi: 10.1016/j.surg.2004.04.018.
- 9. Blondon KS, Wipfli R, Nendaz MR, Lovis C. Physician handoffs: opportunities and limitations for supportive technologies. AMIA Annu Symp Proc. 2015;2015:339–48.
- 10. Kendall L, Klasnja P, Iwasaki J, Best JA, White AA, Khalaj S, Amdahl C, Blondon K. Use of simulated physician handoffs to study cross-cover chart biopsy in the electronic medical record. AMIA Annu Symp Proc. 2013;2013:766–75.
- 11. Thomas MJ, Schultz TJ, Hannaford N, Runciman WB. Failures in transition: learning from incidents relating to clinical handover in acute care. J Healthc Qual. 2013;35(3):49–56. doi: 10.1111/j.1945-1474.2011.00189.x.
- 12. Kaplan LJ, Maerz LL, Schuster K, Lui F, Johnson D, Roesler D, Luckianow G, Davis KA. Uncovering system errors using a rapid response team: cross-coverage caught in the crossfire. J Trauma. 2009;67(1):173–8; discussion 178–9. doi: 10.1097/TA.0b013e31819ea514.
- 13. Young JQ, Ten Cate O, O’Sullivan PS, Irby DM. Unpacking the complexity of patient handoffs through the lens of cognitive load theory. Teach Learn Med. 2016;28(1):88–96. doi: 10.1080/10401334.2015.1107491.
- 14. Dayal N BK, Lazarou I, Savoldelli G, Perrier A, Nendaz M, Gerstel E, editors. A framework for a systematic approach for hospital ward emergencies in internal medicine. Swiss Society of Internal Medicine, annual meeting; 2011.
- 15. Yackel TR, Embi PJ. Unintended errors with EHR-based result management: a case series. J Am Med Inform Assoc. 2010;17(1):104–7. doi: 10.1197/jamia.M3294.
- 16. Brown PJ, Marquard JL, Amster B, Romoser M, Friderici J, Goff S, Fisher D. What do physicians read (and ignore) in electronic progress notes? Appl Clin Inform. 2014. doi: 10.4338/ACI-2014-01-RA-0003.
- 17. Bowman S. Impact of electronic health record systems on information integrity: quality and safety implications. Perspect Health Inf Manag. 2013;10:1c.
- 18. Park H, Choi J. V-Model: a new perspective for EHR-based phenotyping. BMC Med Inform Decis Mak. 2014;14:90. doi: 10.1186/1472-6947-14-90.
- 19. Hilligoss B, Moffatt-Bruce SD. The limits of checklists: handoff and narrative thinking. BMJ Qual Saf. 2014;23(7):528–33. doi: 10.1136/bmjqs-2013-002705.
- 20. Mamykina L, Vawdrey DK, Stetson PD, Zheng K, Hripcsak G. Clinical documentation: composition or synthesis? J Am Med Inform Assoc. 2012;19(6):1025–31. doi: 10.1136/amiajnl-2012-000901.
- 21. Hodge CM, Kuttler KG, Bowes WA, Narus SP. Problem management module: an innovative system to improve problem list workflow. AMIA Annu Symp Proc. 2014;2014:661–70.
- 22. Chowdhry SM, Mishuris RG, Mann D. Problem-oriented charting: a review. Int J Med Inform. 2017;103:95–102. doi: 10.1016/j.ijmedinf.2017.04.016.
