Abstract
Introduction
The Accreditation Council for Graduate Medical Education requires competence in systems-based practice (SBP), demonstrating an understanding of the complex interactions between systems of care and their impact upon care delivery. Patient safety is a useful vehicle to facilitate learning about these interactions.
Aim
To develop an educational tool, the Outcomes Card (OC), to reinforce core concepts of SBP.
Setting
Urgent Care Center at Louis Stokes Cleveland Department of Veterans Affairs Medical Center.
Program Description
Pilot study of an educational intervention for residents that included patient safety didactic sessions and analysis of 2 self-identified clinical cases using the OC. Residents entered the following information on the OC: case description, type of event (error, near miss, and/or adverse event), error type(s), systems, and system failures.
Program Evaluation
Two reviewers independently analyzed 98 cards completed during 60 two-week trainee rotations (81.7% return rate). Interrater reliability for error types between the residents and the physician supervisor and between the two reviewers was excellent (κ=0.88 and 0.95, respectively), and for system identification was good (κ=0.66 and 0.68, respectively). The self-assessment survey (56.6% return rate) suggests that residents improved their knowledge of patient safety and held positive attitudes toward the curriculum.
Discussion
This pilot study suggests that OCs are feasible and reliable educational tools for enhancing competence in SBP.
Keywords: patient safety, medical education, program evaluation
Traditionally, graduate medical training has emphasized knowledge and skill development for the diagnosis and treatment of an individual patient. However, residents operate in complex health care delivery systems and have not been trained to analyze clinical environments and continually improve patient care.1 Addressing these gaps, the Accreditation Council for Graduate Medical Education (ACGME) endorsed Practice-Based Learning and Improvement (PBLI) and systems-based practice (SBP) competencies.2,3 Systems-based practice competency requires awareness of and responsiveness to the larger context and system of health care and the ability to effectively utilize system resources to provide optimal care.3,4
A system, as defined in the Institute of Medicine report To Err Is Human,5 is a set of interdependent elements (human and nonhuman) interacting to achieve a common aim. This interdependence renders systems vulnerable to discontinuities (i.e., system failures) that result in suboptimal care. In contrast to PBLI's emphasis on self-reflection about one's individual practice, SBP involves understanding complex system interactions, systems' vulnerabilities, and their impact upon care delivery.3,4,6
The Accreditation Council for Graduate Medical Education has not established specific guidelines for teaching or assessing these competencies, but rather has encouraged local experimentation. Consequently, residency programs have struggled to meet the requirements. Ensuring the safety of patients for whom residents are responsible is a useful vehicle for learning about complex interactions between systems in the context of clinical care, while highlighting the vulnerabilities of systems that permit errors to occur. Previous studies focusing upon error reporting7,8 and linkages to system redesign8 did not incorporate teaching and assessing the SBP competency. We describe the development of an educational tool that fosters this competency.
PROJECT DESCRIPTION
A pilot educational intervention combining didactic and experiential learning, including a patient safety curriculum9 and a tool designated the Outcomes Card (OC) (Appendix A available online), was incorporated into the Internal Medicine Residency Program at University Hospitals of Cleveland and the Louis Stokes Cleveland Department of Veterans Affairs Medical Center (LSCDVAMC). The intervention site was the Urgent Care Center (UCC) at the 300-bed tertiary care LSCDVAMC, which provides emergency services for approximately 30,000 visits annually. Six to nine trainees rotate through the UCC in 2-week blocks; annually, interns rotate 1 to 2 times and residents 3 to 5 times.
All UCC residents (July 1 to November 2, 2003) participated. In the first of two 1-hour interactive teaching sessions, residents were oriented to the goals and objectives of the patient safety curriculum and the evaluation process, introduced to the OC, and given the following: Leape's error classification guide,10 a systems classification guide, safety definitions,5 and instructions on entering the information into a password-protected database on the hospital's computer system (Appendices A to D available online). To achieve SBP competency for the rotation, residents were required to complete ≥2 OCs on self-identified cases seen during their UCC rotation and to receive a satisfactory score. They were asked to choose cases they believed were associated with errors or system failures.
Session 1 began with the presentation of an actual LSCDVAMC case involving multiple system failures, followed by discussion triggered by questions of how and why the events of the case occurred. Concepts introduced included basic epidemiology and terminology associated with patient safety, an introduction to systems thinking and human factors engineering, and a description of safety culture. These concepts were linked to the initial case and reinforced by other examples given by faculty and volunteered by residents. Teaching session 2 took place 1 week later; residents presented cases in a structured format modeled on the OC, and faculty and participating residents provided feedback.
Dr. Ernest Codman, who pioneered the recording of patients' “end results,” or outcomes of care, inspired this intervention.11 The OC was designed to trigger reflection about the relationships between errors and associated systems, the interactions between systems, and systems' vulnerabilities, in contrast to the traditional belief in health care that errors result solely from an individual's actions. Residents were asked to consider the interface between people and the environment (human factors engineering) when identifying error types and system failures (gaps in care). The OC builds on these constructs by capturing the following patient safety components: patient identifiers; a case description; acknowledgement of an error (yes/no), near miss (yes/no), and/or adverse event (yes/no) in reference to the case; all relevant error types based on Leape's classification10 (Appendix B available online); up to 2 different systems (e.g., radiology, intensive care unit, administrative services) (Appendix C available online); and 1 system failure (e.g., inadequate staffing or supervision) associated with each selected error type. Patient safety experts reviewed the OC for face and content validity.
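As a concrete illustration of the card's structure, the sketch below models these components as a simple record. It is a minimal sketch with hypothetical field names and validation, not the actual OC form or database schema (Appendix A available online).

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ErrorEntry:
    """One error type selected on the card, with its associated systems."""
    error_type: str                                   # from Leape's classification, e.g., "failure in communication"
    systems: List[str] = field(default_factory=list)  # up to 2 systems, e.g., ["radiology"]
    system_failure: Optional[str] = None              # e.g., "inadequate staffing or supervision"

    def __post_init__(self):
        if len(self.systems) > 2:
            raise ValueError("the OC allows at most 2 systems per error type")

@dataclass
class OutcomesCard:
    """Illustrative record of the patient safety components an OC captures."""
    patient_id: str
    case_description: str
    error: bool          # acknowledgement of error (yes/no)
    near_miss: bool      # near miss (yes/no)
    adverse_event: bool  # adverse event (yes/no)
    entries: List[ErrorEntry] = field(default_factory=list)
```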
LSCDVAMC Quality Management reviewed OCs to ensure appropriate follow-up action. Because the OCs were reviewed as quality assurance data, they were protected and not releasable. This protection, along with a secure database, facilitated buy-in from all stakeholders (administration, program directors, residents).
PROGRAM EVALUATION
Methods
Demographic information was obtained from residency program files. The pilot included sixty 2-week trainee rotations and 45 residents (18% postgraduate year 1 [PGY1], 40% PGY2, 35% PGY3, and 7% PGY4); 35% of the residents were women and 18% held second graduate degrees. Two expert reviewers independently reviewed the medical record and the case description provided on each OC. One reviewer, the UCC clinical manager who supervises the residents, had completed a 2-year fellowship in quality improvement.12 The second reviewer, a health systems researcher, had expertise in quality assessment. After reviewing the medical record and the resident's case description, the reviewers identified error, near miss, and/or adverse event in reference to the case and selected all relevant error types and up to 2 different systems associated with each error type.
Outcomes Card data were deidentified and analyzed in aggregate. The mean number of errors per case, the percentage of adverse events, and the frequency of error types identified by the resident, physician supervisor, and health systems researcher were assessed separately. Interrater reliabilities were measured with the κ statistic (SPSS v. 12.0; Chicago, IL).
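For reference, the κ statistic corrects the observed proportion of agreement, \(p_o\), for the agreement expected by chance, \(p_e\):

\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]

A κ of 1 indicates perfect agreement and a κ of 0 indicates agreement no better than chance; by the commonly used benchmarks of Landis and Koch,13 values of 0.61 to 0.80 represent substantial (good) agreement and values above 0.80 almost perfect (excellent) agreement.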
Cards were scored by comparing the resident's and physician supervisor's acknowledgement of an error, near miss, and/or adverse event, selection of specific error types, and identification of associated systems. Percent agreement was averaged when a resident returned multiple cards. This assessment of SBP competence served as the summative evaluation for the rotation (0% to 33.3%, unsatisfactory; 33.4% to 66.6%, satisfactory; 66.7% to 100%, highly satisfactory). Scores were sent to the Residency Program Director.
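For illustration (the numbers here are hypothetical), a resident whose two cards agreed with the physician supervisor's review on 75% and 60% of items would receive

\[ \frac{75\% + 60\%}{2} = 67.5\%, \]

a highly satisfactory score.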
Upon finishing the rotation, residents completed an anonymous self-assessment survey (Appendix E available online) that included 3 domains: improved knowledge of patient safety concepts and case analysis (12 items), comfort with using OC skills (4 items), and attitudes related to value of this SBP curriculum (3 items). Residents rated each item using a 5-point Likert scale: (1=very poorly; 5=very well) for improved knowledge, (1=very uncomfortable; 5=very comfortable) for skills, and (1=strongly disagree; 5=strongly agree) for attitudes. Ratings for the items in each domain were summed, and Cronbach's α, a measure of internal consistency, and descriptive statistics (mean, SD) were calculated for the 3 scales.
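For reference, Cronbach's α for a scale of \(k\) summed items with item variances \(\sigma_i^2\) and total-score variance \(\sigma_t^2\) is

\[ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_t^{2}}\right), \]

with values closer to 1 indicating greater internal consistency.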
RESULTS
Ninety-eight OCs were completed (81.7% return rate). The residents, physician supervisor, and health systems researcher identified adverse events in 22.7%, 15.5%, and 18.5% of cases, respectively, and a mean of 2.3, 2.8, and 3.0 errors per case (range 1 to 6), respectively. Interrater reliabilities for error types between the residents and physician supervisor and between the two reviewers were excellent13 (κ=0.88 and 0.95, respectively) and were good for system identification (κ=0.66 and 0.68, respectively). There was no significant difference in interrater reliability on error types or system identification by resident PGY level. The 4 most frequently identified error types reported by residents and reviewers were error or delay in diagnosis, failure in communication, avoidable delay in treatment or in responding to an abnormal test, and inadequate monitoring or follow-up of treatment (Table 1).
Table 1. Most Frequently Identified Error Types

| Error Type | Residents (%) | Physician Supervisor (%) | Systems Researcher (%) |
| --- | --- | --- | --- |
| Error or delay in diagnosis | 19 | 17 | 17 |
| Failure in communication | 19 | 24 | 22 |
| Avoidable delay in treatment or responding to an abnormal test | 13 | 17 | 17 |
| Inadequate monitoring or follow-up of treatment | 11 | 10 | 12 |
End-of-rotation self-assessment surveys were completed by 57% of residents. Cronbach's α ranged from 0.74 to 0.88 for the 3 scales (Appendix E). Mean scores (±SD) on the scales were 48.38±3.98 (potential range 12 to 60) for improved knowledge in patient safety and case analysis; 15.94±1.59 (potential range 4 to 20) for personal comfort with use of OC; and 12.29±1.68 (potential range 3 to 15) for importance of curriculum.
DISCUSSION
Our pilot study indicates that residents were able to identify medical errors and adverse events associated with daily clinical practice, consistent with the findings of other studies.7,8 This observation and the high level of agreement between the residents and reviewers on error types suggest the reliability and utility of the OC as an educational tool that reflects core components of SBP. Determining error types, and thus demonstrating an understanding of the patient care context, is the first step in analyzing a case from a systems perspective. In contrast, the physician supervisor and residents, as well as the two reviewers, disagreed on system identification in more than 30% of cases. This suggests not simply a gap in knowledge of this concept but perhaps limitations in this approach to evaluating knowledge of systems. Future studies will involve developing better instruments, starting with qualitative analysis of the residents' descriptions of system failures.
The implications of this pilot study are multiple. Residents can benefit directly from this educational program by gaining knowledge about patient safety and complex interactions among systems. They can gain a better understanding of the health care system in which they practice and its resources, two core concepts required to attain competency in SBP. The self-assessment results suggest that the residents gained this knowledge from participating in the curriculum and using the OC, and that they valued these curricular efforts. We suspect that the curriculum reinforces the concepts of patient safety and systems thinking, that the OC and reference cards enable the application and understanding of these concepts, and that the peer and faculty feedback included in the curriculum assists in modifying behavior. This type of curriculum creates an opportunity for open discussion of medical errors and adverse events that is lacking in residency training.14 Although not formally evaluated, changes (system redesign) at our facility resulting from this pilot have included the following: modification of the computerized medical record to permit inclusion of vital signs only from the day of the visit; development of a centralized hospital admission and transfer process; use of Failure Mode Effects Analysis to evaluate the handling of critical laboratory values; creation of a Quality Action Team to review high-risk elopements; and acquisition of a secure medication delivery system in the UCC.
This pilot study has several limitations. Although the OC appears to have content validity, further validation is required. The study design prevented distinguishing the contribution of each component of the intervention (curriculum and OC) to the residents' self-reported attainment of knowledge and skills in patient safety, systems thinking, and human factors engineering. Additionally, improved knowledge was assessed only through self-report after the curriculum. The short duration (2 weeks) of the residents' rotation limited the time available to apply patient safety and systems concepts to the OC and the opportunities to evaluate the intervention's impact upon residents' behavior. Finally, it is important to recognize the commitment of faculty and Quality Management to ensuring that appropriate action is taken in response to identified events. For a program to adopt this educational intervention successfully, the following components are essential: faculty members with expertise in patient safety and curriculum development; a residency program that supports a patient safety curriculum; and the support of Quality Management in assisting with case reviews. Despite these limitations, our pilot project has forged new ground in attempting to meet the challenge presented by the ACGME's inclusion of SBP as a core competency for all graduate medical residents. The OC appears to be a useful tool with which clinician educators experienced in patient safety and systems thinking can train residents to identify the medical errors, systems, and system failures associated with a clinical case. Future studies will include natural controls and a pre–post assessment of the residents' ability to apply concepts on the OC, further analysis of the systems and system failures associated with each error type, and assessment of the educational intervention's impact upon residents' attitudes and behavior related to patient safety, as well as upon the health care delivery system.
Supplementary Material
Appendices A to E, referenced throughout the text, are available online.
References
- 1. Aron DC, Headrick LA. Educating physicians prepared to improve care and safety is no accident: it requires a systematic approach. Qual Saf Health Care. 2002;11:168–73. doi: 10.1136/qhc.11.2.168.
- 2. Leach DC. Evaluation of competency: an ACGME perspective. Am J Phys Med Rehabil. 2000;79:487–9. doi: 10.1097/00002060-200009000-00020.
- 3. ACGME Outcomes Project. General competencies. Available at: http://www.acgme.org/outcomes/com/comFull.asp#3. Accessed March 5, 2003.
- 4. Swing SR. Assessing the ACGME general competencies: general considerations and assessment methods. Acad Emerg Med. 2002;9:1278–88. doi: 10.1111/j.1553-2712.2002.tb01588.x.
- 5. Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 1999.
- 6. Ziegelstein RC, Fiebach NH. “The Mirror” and “The Village”: a new method for teaching practice-based learning and improvement and systems-based practice. Acad Med. 2004;79:83–8. doi: 10.1097/00001888-200401000-00018.
- 7. Weingart SN, Callanan LD, Ship AN, Aronson MD. A physician-based voluntary reporting system for adverse events and medical errors. J Gen Intern Med. 2001;16:809–14. doi: 10.1111/j.1525-1497.2001.10231.x.
- 8. Plews-Ogan ML, Nadkarni MM, Forren S, et al. Patient safety in the ambulatory setting: a clinician-based approach. J Gen Intern Med. 2004;19:719–25. doi: 10.1111/j.1525-1497.2004.30386.x.
- 9. Gosbee JW. A patient safety curriculum for residents and students: the VA healthcare systems pilot project. ACGME Bull. 2002:2–6.
- 10. Leape LL, Lawthers AG, Brennan TA, Johnson WG. Preventing medical injury. Qual Rev Bull. 1993;19:144–9. doi: 10.1016/s0097-5990(16)30608-x.
- 11. Mallon WJ. Ernest Amory Codman: The End Result of a Life in Medicine. Philadelphia: W.B. Saunders Company; 2000. pp. 47–69.
- 12. Splaine ME, Aron DC, Dittus RS, et al. A curriculum for training quality scholars to improve the health and health care of veterans and the community at large. Qual Manag Health Care. 2002;10:10–8. doi: 10.1097/00019514-200210030-00006.
- 13. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–74.
- 14. Pierluissi E, Fischer MA, Campbell AR, Landefeld CS. Discussion of medical errors in morbidity and mortality conferences. JAMA. 2003;290:2838–42. doi: 10.1001/jama.290.21.2838.