Abstract
OBJECTIVE
In 2003, duty-hour regulations (DHR) were initially implemented for residents in the United States to improve patient safety and protect residents’ well-being. The effect of DHR on patient safety remains unclear. The study objective was to evaluate the effect of DHR on patient safety.
DESIGN
Using an interrupted time series analysis, we analyzed selected patient safety indicators (PSIs) for 376 million discharges in teaching (T) vs nonteaching (NT) hospitals before and after implementation of DHR in 2003 that restricted resident work hours to 80 hours per week. The PSIs evaluated were postoperative pulmonary embolus or deep venous thrombosis (PEDVT), iatrogenic pneumothorax (PTx), accidental puncture or laceration, postoperative wound dehiscence (WD), postoperative hemorrhage or hematoma, and postoperative physiologic or metabolic derangement. Propensity scores were used to adjust for differences in patient comorbidities between T and NT hospitals and between discharge quarters. The primary outcomes were differences in the PSI rates before and after DHR implementation. The PSI differences between T and NT institutions were the secondary outcome.
SETTING
T and NT hospitals in the United States.
PARTICIPANTS
Participants were 376 million patient discharges from 1998 to 2007 in the Nationwide Inpatient Sample.
RESULTS
Declining rates of PTx in both T and NT hospitals preintervention slowed only in T hospitals postintervention (p = 0.04). Increasing PEDVT rates in both T and NT hospitals increased further only in NT hospitals (p = 0.01). There were no differences in the PSI rates over time for hemorrhage or hematoma, physiologic or metabolic derangement, accidental puncture or laceration, or WD. T hospitals had higher rates than NT hospitals both preintervention and postintervention for all the PSIs except WD.
CONCLUSIONS
Trends in rates for 2 of the 6 PSIs changed significantly after DHR implementation, with PTx rates worsening in T hospitals and PEDVT rates worsening in NT hospitals. Lack of consistent patterns of change suggests no measurable effect of the policy change on these PSIs.
Keywords: patient safety, duty hours, internship and residency, quality indicators
COMPETENCIES: Patient Care, Practice-Based Learning and Improvement, Systems-Based Practice
INTRODUCTION
Now a decade into the era of work-hour regulations for all resident physicians in the United States, with initial national regulations enacted in 2003 and additional mandates in 2011, the effect of these policies on patient safety remains unclear. Duty-hour regulations (DHR) were initially implemented for U.S. medical trainees by the Accreditation Council for Graduate Medical Education (ACGME) in July 2003 as a result of public pressure to achieve greater safety for both patients and residents.1,2 Responding to continued concerns and specifically to the Institute of Medicine’s report “Resident Duty Hours: Enhancing Sleep, Supervision, and Safety,” the ACGME proposed additional requirements for resident duty hours in 2009, which were implemented in 2011, stating “patient safety always has been, and remains our prime directive.”3,4 This was explicitly defined as the safety of patients being cared for by physicians in training and the safety of future patients who will be cared for by physicians after they complete their residency training.4
It is not apparent, however, that the duty-hour reform has achieved the ACGME’s primary goal of improving patient safety. Existing literature describes potential benefits of improvements in resident lifestyle, sleep, mood, operative case volume for surgical residents, and higher in-service testing scores.5-8 Data regarding effects of work-hour regulations on patient safety are equivocal. A systematic review by Fletcher noted no significant difference in patient safety–related outcomes for most of the included studies.9 It is noteworthy that most studies included in that review were limited by study size and inability to adequately control for comorbid conditions in their patient populations. Our group previously used time series analyses with adjustment for comorbidities to evaluate the effect of New York State residents’ work-hour regulations on surgical patient safety indicators (PSIs) and found increased rates in 2 out of the 6 surgical PSIs after the intervention in teaching (T) hospitals, which were not observed in the control group of nonteaching (NT) hospitals.10 Historically, New York State has implemented patient safety–oriented policies much earlier and more readily than other states, including mandatory reporting of outcomes after coronary artery bypass grafting in 1989 and the previously studied resident work-hour restrictions, which were also enacted in 1989, so the patient safety culture in New York may differ from the national culture. A nationwide study examined the effect of DHR on selected PSIs in a population of Medicare patients and Veterans, finding no difference in composite PSIs.11 Although those results may be more generalizable, they are limited by the inherently older age and greater comorbidity burden of that study population. We sought to evaluate the long-term effect of DHR on a nationally representative sample of inpatients using these standardized measures of patient safety.
We examined nationwide trends in standardized PSIs among adult inpatients associated with the 2003 DHR. We hypothesized that the 2003 DHR would result in decreased rates of selected PSIs in T hospitals, but no change in the PSI rates in NT hospitals would be observed.
METHODS
Design Overview
To assess effects of the national 2003 implementation of resident work-hour regulations in T hospitals, we used weighted discharges from the Nationwide Inpatient Sample (NIS) 1998 to 200712 and selected PSIs as objective outcome measures. We performed an interrupted time series analysis that included 5 years before and 5 years after the initial duty-hour mandates. Interrupted time series analysis is a suitable method for examining both the immediate effects of policy change, as well as the effects of the intervention over time.13,14 NIS NT hospitals served as a concurrent control group.
Data Sources and Study Groups
NIS data were obtained from the Agency for Healthcare Research and Quality (AHRQ) Healthcare Cost and Utilization Project for the years 1998 to 2007. The NIS provides a large, publicly available database from which investigators can derive patient safety data. The NIS comprises data on 6 to 8 million discharges per year from approximately 1000 hospitals (Table 1). This represents 20% of all nonfederal inpatient discharges, which are then weighted to produce national estimates.12 We defined the study group as weighted discharges from those hospitals categorized as “T hospitals” and the control group as weighted discharges from “NT hospitals” based on American Hospital Association survey data. The NIS defined T hospitals based on the following criteria: American Medical Association–approved residency programs, hospitals with a ratio of full-time equivalent interns and residents to beds of at least 0.25, or those with a membership in the Council of Teaching Hospitals.12 PSI data were compiled for study and control groups by discharge quarter.
TABLE 1.
Summary Statistics for Teaching and Nonteaching Hospitals From 1998 to 2007
| Variable | Teaching | Nonteaching | p Value |
|---|---|---|---|
| Total weighted discharges (million) | 171.6 | 204.9 | n/a |
| Hospitals (mean no. sampled/y) | 181.8 | 825.5 | n/a |
| Propensity scores (mean, 95% CI) | | | |
| Preintervention | 0.44 (0.43-0.46) | 0.44 (0.43-0.46) | 0.997 |
| Postintervention | 0.46 (0.45-0.48) | 0.46 (0.45-0.48) | 0.998 |
Intervention
Full compliance with DHRs was expected by July 1, 2003 across the nation. These DHRs limited resident work hours to a total of 80 hours per week (averaged over a 4-week period) with several stipulations. In-house call was limited to 24 hours plus an additional 6-hour period to allow for transfer of patient care, didactics, or other educational activities. Residents were expected to have 10 hours of rest between duty periods and 1 full day in 7 free from all clinical duties.2 For this study, we defined the “postintervention” period as beginning immediately following the full implementation of duty-hour mandates, i.e., the third quarter of 2003.
Control Group
NT hospitals served as the concurrent control group for the interrupted time series analysis. NIS sampling uses strata defined by hospital teaching status, bed size, and urban-rural status. Therefore, the sampling design helped ensure that T and NT hospitals made up representative proportions of the sample.
Outcome Measures
PSIs are metrics developed by the AHRQ for use with large, administrative data sets to allow examination of potential complications that may occur as a result of treatment within the health care system.15 PSIs were calculated as quarterly rates per 10,000 patient discharges. A total of 6 PSIs were selected as likely to be sensitive to the DHR policy change. We hypothesized that DHR may affect technical errors, which could be reflected in rates of iatrogenic pneumothorax (PTx), postoperative hemorrhage or hematoma, postoperative wound dehiscence (WD), or accidental puncture or laceration. Postoperative physiologic and metabolic derangements and postoperative pulmonary embolus or deep venous thrombosis (PEDVT) may reflect lapses in postoperative patient care in which residents are likely to be involved in T hospitals. Trends in rates of each PSI over time were compared preintervention and postintervention in the study group of T hospitals and the concurrent control group of NT hospitals in participating states.
Comorbidity Adjustment
Propensity scores were used to balance the 2 comparison groups on a panel of comorbidity variables described by Elixhauser et al.16 The presence or absence of 29 comorbid conditions (Appendix B) was identified for each NIS discharge record. This information was then used to calculate a propensity score, defined as the probability of being admitted to a T vs a NT hospital for each discharge.17 A mean propensity score for each discharge quarter and hospital teaching status was then calculated. Thus, the calculated mean propensity score represents a summary measure of the comorbidity burden at a particular time point for T vs NT hospitals. The mean propensity score was included in the time series model to risk adjust each PSI.
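The propensity score construction described above can be sketched as follows. This is a minimal illustration on simulated data, not the authors' code (the study analyses were run in SAS and Stata); all variable names, sample sizes, and values here are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated discharge records: 29 binary comorbidity flags (the Elixhauser
# panel of Appendix B) and a teaching-hospital indicator. Purely illustrative.
n_discharges = 5000
comorbidities = rng.integers(0, 2, size=(n_discharges, 29))
teaching = rng.integers(0, 2, size=n_discharges)
quarter = rng.integers(0, 40, size=n_discharges)  # 40 discharge quarters

# Propensity score: modeled probability of admission to a teaching hospital
# given the comorbidity profile of the discharge.
model = LogisticRegression(max_iter=1000).fit(comorbidities, teaching)
propensity = model.predict_proba(comorbidities)[:, 1]

# Mean propensity score per (discharge quarter, teaching status): the summary
# comorbidity-burden covariate entered into the time series model.
mean_ps = {}
for q in range(40):
    for t in (0, 1):
        mask = (quarter == q) & (teaching == t)
        if mask.any():
            mean_ps[(q, t)] = propensity[mask].mean()
```

Averaging the individual scores within each quarter and hospital group yields one risk-adjustment covariate per time point, which is what allows the score to enter an aggregate-level time series model.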
Statistical Analysis
We used an interrupted time series analysis with an autoregressive integrated moving average model to estimate the effect of the implementation of DHR on the PSI rates. Time series analyses involve sequential assessments of the outcome, in this case, the rates of the selected PSIs, before and after an intervention. Multiple assessments during the preintervention and postintervention periods are required to determine the change in rates, or trend, in the outcome before and after the intervention. The preintervention and postintervention trends, or changes in rates over time, are compared. This method is preferred over comparison of a single preintervention rate and a single postintervention rate because it incorporates trends over multiple assessments before and after the intervention.18 Interrupted time series analyses have been shown to yield reliable estimates of intervention effects; when data from the study arms of randomized controlled trials have been reanalyzed post hoc using time series methods, the results have been comparable, so interrupted time series analysis can be an acceptable, albeit less rigorous, alternative when a randomized controlled trial is not feasible.19
In our time series analysis, we adjusted for secular trends preintervention and postintervention where appropriate. We also adjusted for patient case mix using weighted mean propensity scores as described earlier. The time series included 22 discharge quarters before the intervention and 18 discharge quarters after the intervention for a total of 40 discharge quarters. We designed the model without a transition period because full compliance was expected beginning July 1, 2003, recognizing that some centers began regulating duty hours before this date. Each PSI series was examined for autocorrelation, and where significant autocorrelation was found, appropriate adjustments were made within the time series model. Adequacy of the autocorrelation adjustment was confirmed by plotting residual values against time; these values were randomly arrayed around zero, indicating that autocorrelation had been adequately addressed.
An appropriate regression model was constructed for each PSI to compare the preintervention and postintervention rates over time. A change in slope, either a rise or a decline from the initial (preintervention) slope, was considered statistically significant at p < 0.05. Secular trends were accounted for where present. If there was no detectable change in the PSI rate over the time series, no effect of the intervention was inferred. Analyses and graphical processing were performed using SAS version 9.2 and STATA version 11.
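The segmented structure of the model described above can be illustrated with a short sketch. The study fit an autoregressive integrated moving average model in SAS/Stata; this numpy example, run on simulated data with hypothetical coefficients, shows only the core interrupted time series design: a baseline slope plus level-change and slope-change terms at the intervention quarter.

```python
import numpy as np

rng = np.random.default_rng(1)

# 22 preintervention + 18 postintervention discharge quarters, as in the study.
quarters = np.arange(40).astype(float)
post = (quarters >= 22).astype(float)           # level-change indicator
time_after = post * (quarters - 22)             # quarters since intervention

# Simulated PSI rate per 10,000 discharges: a declining baseline trend that
# flattens after the intervention. Coefficients are invented for illustration.
rate = 8.0 - 0.19 * quarters + 0.14 * time_after + rng.normal(0, 0.1, 40)

# Design matrix: intercept, baseline slope, level change, slope change.
# A significant slope-change coefficient corresponds to the pre/post trend
# comparisons reported for each PSI.
X = np.column_stack([np.ones(40), quarters, post, time_after])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
pre_slope, slope_change = beta[1], beta[3]
```

In the full model, the propensity score covariate and autoregressive error terms would be added to this design; the sketch omits them to keep the segmented-regression logic visible.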
RESULTS
For the study sample of greater than 375 million weighted patient discharges, T and NT groups were similar in number of weighted patient discharges (Table 1 and Appendix A). Propensity scores were similar between T and NT hospitals and were higher in the postintervention phase than in the preintervention phase for both the groups (Table 1). In comorbidity-adjusted analyses (Table 2), there were statistically significant differences in the trends in rates of PTx and PEDVT comparing preintervention with postintervention (Table 3). The preintervention rates for PTx were noted to be declining in both T and NT institutions. Postintervention, rates of PTx in NT institutions continued to decline on an unchanged trajectory, whereas rates stopped declining in T hospitals after the mandate (p = 0.04). Before the ACGME policy change, PEDVT rates were increasing in both T and NT institutions. After DHR implementation, the rates for PEDVT continued to increase on an unchanged trajectory in T hospitals, whereas in NT hospitals, rates increased more steeply after the intervention (p = 0.01). Trends in rates of accidental puncture or laceration, hemorrhage or hematoma, physiologic and metabolic derangements, and WD were not significantly altered after the implementation of the 80-hour workweek (Fig.).
TABLE 2.
Mean PSI Rates During Preintervention and Postintervention Time Periods for Teaching vs Nonteaching Hospitals
| PSI | Institution Type | Unadjusted, Pre | Unadjusted, Post | Adjusted, Pre | Adjusted, Post |
|---|---|---|---|---|---|
| Iatrogenic pneumothorax | NT | 6.00 | 4.92 | 6.01 | 4.92 |
| | T | 7.94 | 7.14 | 7.97 | 7.12 |
| Pulmonary embolus or deep venous thrombosis | NT | 65.00 | 84.56 | 64.96 | 89.50 |
| | T | 92.01 | 126.93 | 92.07 | 126.89 |
| Hemorrhage or hematoma | NT | 26.07 | 84.56 | 26.05 | 24.56 |
| | T | 29.45 | 27.94 | 29.37 | 28.03 |
| Physiologic or metabolic derangement | NT | 6.02 | 6.70 | 6.00 | 6.72 |
| | T | 8.18 | 10.04 | 8.16 | 10.05 |
| Accidental puncture or laceration | NT | 30.27 | 28.63 | 30.27 | 28.62 |
| | T | 42.16 | 43.77 | 42.27 | 43.65 |
| Wound dehiscence | NT | 15.25 | 14.09 | 15.29 | 14.08 |
| | T | 16.06 | 13.99 | 16.89 | 14.10 |
PSI, patient safety indicator. Rates are per 10,000 discharges, shown unadjusted and adjusted by propensity scores.
Pre, preintervention (January 1998 to June 2003); Post, postintervention (July 2003 to December 2007).
TABLE 3.
Change in Rates of Patient Safety Indicators Preintervention vs Postintervention in Teaching and Nonteaching Hospitals
Propensity score adjusted change in rate (events per 10,000 discharges per quarter).

| PSI | Institution Type | Pre | Post | p Value |
|---|---|---|---|---|
| Iatrogenic pneumothorax | NT | −0.13 | 0.08 | 0.14 |
| | T | −0.19 | 0.14 | 0.04 |
| Pulmonary embolus or deep venous thrombosis | NT | 0.65 | 0.59 | 0.01 |
| | T | 1.67 | 0.14 | 0.75 |
| Hemorrhage or hematoma | NT | −0.12 | 0.04 | 0.71 |
| | T | −0.20 | 0.15 | 0.33 |
| Physiologic or metabolic derangement | NT | 0.05 | 0.15 | 0.18 |
| | T | 0.18 | 0.07 | 0.49 |
| Accidental puncture or laceration | NT | −0.13 | 0.12 | 0.56 |
| | T | 0.20 | −0.53 | 0.17 |
| Wound dehiscence | NT | 0.05 | 0.55 | 0.20 |
| | T | −0.01 | 0.72 | 0.07 |
Pre, preintervention; Post, postintervention; p values in bold denote statistically significant differences in rates before and after the intervention.
FIGURE.
Adjusted rates of PSIs by teaching status. PSI rates per 10,000 admissions per discharge quarter. Vertical lines indicate implementation of DHR (third quarter, 2003).
When comparing rates of PSIs in T and NT hospitals (Fig.), all the evaluated PSIs were higher in T institutions, except for WD. The rate for WD in both types of institutions was erratic and without an apparent trend or detectable difference between T and NT institutions.
DISCUSSION
Increased patient safety has been cited as a primary aim for a series of resident DHRs that have been implemented in the United States over the past 3 decades, from New York State’s sentinel Code 405 in 1989 to the ACGME’s landmark mandates in 2003 and most recently amendments to these rules in 2011.9 This study used an interrupted time series analysis to describe changes in patient safety relative to adoption of DHRs for the decade surrounding the 2003 mandates. To our knowledge, this is the only national study that includes the full range of adult inpatients and addresses the effect of resident DHRs on patient safety as measured using the selected PSIs.
We found no consistent changes in the rates of the selected surgical PSIs after implementation of the 2003 ACGME duty-hour mandates. Only 2 of the 6 PSIs were associated with a change in rates following the DHR; one represented an unfavorable change in rates in T hospitals and the other represented an unfavorable change in rates in the control NT hospitals. For PTx, the decline in rates that was evident before DHR slowed in T hospitals, whereas it continued to stably decline in NT hospitals. For PEDVT, the increase in rates that was evident before DHR accelerated in NT hospitals, whereas it remained unchanged in T hospitals. No change in rates over time was observed for the other 4 PSIs evaluated. The lack of consistent patterns in PSI rate changes associated with DHR suggests that DHRs did not have a substantial effect on these PSIs.
It is surprising that the incidence of PEDVT appeared to be increasing nationwide in both T and NT hospitals, albeit more precipitously in NT hospitals. This could be explained by increased surveillance for or coding of this clinical condition or both given the recent nationwide focus on improving patient safety.
The results of this study are in alignment with most prior similar analyses, which overall show no consistent convincing effects of the 2003 DHRs on patient safety. Rosen et al.11 used similar methodology to study the effect of the 2003 DHR on PSIs in Veterans Affairs and Medicare patients, finding essentially no difference in their composite PSIs, prereform and postreform. Others have assessed the effect of DHR on broader measures of patient morbidity and mortality. Shetty and Bhattacharya20 evaluated outcomes for a similar population of patients using the NIS, finding small improvements in mortality for medical, but not surgical patients a year after the 2003 DHR. De Virgilio et al.21 found no difference in morbidity or mortality for level I trauma patients before or after the intervention in 2003.
Volpp et al.22 found no difference in mortality rates in Medicare patients in the early years after implementation of the 2003 DHR, but improvements in mortality were noted 4 to 5 years after the reforms. The authors note that many other temporal factors may have improved overall mortality rates in their population during the study period, making it difficult to discern the true effect of DHR on patient safety outcomes. Privette et al. evaluated mortality as well as provider-related complications for more than 14,000 surgical patients at a single institution, noting improvement in mortality and perioperative complications post-DHRs. Interestingly, the authors noted that these changes coincided with increased attending physician involvement in clinical care, as determined by increased clinical care hours logged by attending physicians and increased use of billing modifiers indicating that no qualified resident was available. It is difficult to know whether this trend of increased attending physician involvement occurred nationwide.23
The use of PSIs as a single metric of patient safety is a potential limitation of the study. Other outcome measures representative of safe patient care that could be affected by DHR include mortality, readmission, length of stay, and hospital infection rates. As noted earlier, others have previously assessed the effect of DHR on mortality. We selected PSIs because we felt they would be most sensitive to changes in resident involvement in patient care. Specifically, the 6 selected PSIs are clearly delineated events of care involving procedures and perioperative management in which residents are likely to be involved in T hospitals. We recognize that many factors outside of resident involvement may affect occurrence of PSIs even in T hospitals. However, we believe that metrics, such as mortality, may be confounded by an even broader range of factors including hospital and provider-level variation, as well as a potentially greater effect of underlying differences in patient severity of illness, which can be very difficult to risk adjust.
A further limitation of measuring patient safety using PSIs is the methodology for defining and reporting these events. The PSI algorithms used in this study were based on the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) coding. Each PSI numerator and denominator comprises multiple ICD-9-CM codes. Additionally, these ICD-9-CM codes can cover many different clinical situations. Persons responsible for coding may have different interpretations of clinical conditions, resulting in variations by coder or institution. Screening for certain conditions may lead to increased rates of PSIs; similarly, underreporting may lead to decreased rates of PSIs. Additionally, the use of PSIs themselves has been evaluated in the past and been found to have variable positive predictive values ranging from 22% to 72%.24-26 Regardless, these algorithms remain one of the only methods by which adverse event rates can be easily calculated without performing costly and lengthy manual record abstraction, physician surveys, or patient interviews.
We used the NIS definition for T hospital, which is relatively broad. This is likely to capture all facilities affected by DHR, but may fail to detect differences in PSIs in facilities with higher vs lower resident-to-bed ratios attributable to DHR. It is not clear whether patient safety outcomes, such as PSIs and mortality, are subject to a “dose response,” in which hospitals with higher resident-to-bed ratios are more susceptible to the effects of changes in resident work hours. Available research on the effect of DHR on mortality suggests the lack of a dose-response effect, with 2 recent studies finding no difference in the degree to which outcomes changed in more vs less teaching intensive facilities.20,22 This is a topic for further research.
When using an interrupted time series analysis, the intervention time is incorporated into the model. Variability in actual implementation of DHRs may affect the validity of these results. Preemptive changes to resident work hours began before 2003 in many institutions, most notably New York institutions, which implemented duty-hour rules in 1989. Additionally, true compliance with the duty-hour mandates has been a concern since implementation in 2003. There are few data in the literature regarding compliance of residency programs with the 80-hour workweek, and most studies focus on single institutions. Landrigan et al. published rates of noncompliance with DHR in a national cohort of interns in the first year after policy implementation based on self-reported survey data. They found that 80% of interns failed to comply during at least 1 month of the study period. In total, noncompliance was described in 44% of all intern months.27 They did not discuss whether noncompliance among interns was seen across all programs or if there were specific programs that had more or less compliant residents. By contrast, in a recent study of internal medicine physicians at a single institution, self-reported compliance was greater than 95% for all major DHRs and correlated well with individual resident parking garage entry and exit data.28 In light of the variability in implementation and ongoing ambiguity regarding compliance, interpretation of the data in this study is limited by its use of a model in which the intervention was a single event in time (July 2003) without a transition period. The effect of this variability in the study exposure would be to bias our results toward the null hypothesis that the 2003 DHR did not affect the selected PSIs.
The results of large longitudinal studies of duty-hour mandates and patient safety need to be understood within the context of global changes in health care provision. Resident work-hour regulations are one of many contemporary interventions to improve patient safety and quality, which were prompted in part by publication of the Institute of Medicine’s “To Err Is Human” report.29 Other coincident interventions have included electronic medical records, computerized order entry, preprocedure checklists, hiring of physician extenders, and, as noted previously, increased involvement of attending physicians in direct patient care. Hence, it is difficult to truly isolate the effect of resident work-hour standards from these other far-reaching interventions. The use of NT hospitals as a control group decreases the influence of these confounding interventions on the primary outcome because the aforementioned interventions have been implemented in both T and NT settings nationwide. Further, the most recent changes implemented in 2011 are not reflected in this analysis, as their implementation is relatively new, but will need to be included in longitudinal analyses in the future.
Our assessment of the effect of the 2003 DHR is limited to its effect on the PSIs. However, we must recognize other potential effects of the regulations on residents’ well-being, perceived competence, and ultimately the ability of surgical residents to matriculate into the professional role of primary physician. In general, DHR has resulted in improved quality of life and decreased in-hospital work hours for surgical residents. However, there has been significant variability in the effect of DHR on operative volume, with some studies noting no difference, some noting decreased operative exposure, and others noting increased operative exposure after DHR.7 More longitudinal studies are needed to evaluate the long-term effect of DHR on postresidency performance, professionalism, and well-being for surgeons.
Despite these limitations, and realizing that multiple factors may confound any large-scale analysis of a broad intervention’s effect on patient safety, this study provides a robust analysis of a large, national database. Contrary to our hypothesis that DHR would improve patient safety as measured using AHRQ PSIs, we were not able to find a consistent, measurable effect of the 2003 DHRs on the selected PSIs. However, recognizing that there have been many positive effects of these regulations, we encourage further study of their effects and open, honest discussion of the findings as we continue to strive toward the primary goal of improving patient safety.
Acknowledgments
This study was supported by AHRQ Health Services Training Grant #T32 HSO 13833-08. B.K.P. and J.S. had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Drs Holzman and Poulose have ongoing grant support from CR Bard and pending grant funding from Storz. Dr Nealon is involved as a board member for Ruconest. Neither grant nor board membership relates to the content of this work.
APPENDIX A
See Table A1
TABLE A1.
Study Population by Teaching Status
| Calendar Year | Participating States | Sample Hospitals, T | Sample Hospitals, NT | Raw Discharges, T | Raw Discharges, NT | Weighted Discharges, T | Weighted Discharges, NT |
|---|---|---|---|---|---|---|---|
| 1998 | 22 | 195 | 787 | 3,132,644 | 3,668,527 | 16,815,005 | 17,938,471 |
| 1999 | 24 | 198 | 786 | 3,385,811 | 3,813,118 | 17,170,794 | 18,296,919 |
| 2000 | 28 | 172 | 821 | 3,195,481 | 4,243,926 | 15,795,383 | 20,569,365 |
| 2001 | 33 | 172 | 814 | 3,167,539 | 4,285,188 | 16,452,295 | 20,735,351 |
| 2002 | 35 | 178 | 817 | 3,530,994 | 4,322,988 | 17,130,943 | 20,673,111 |
| 2003 | 37 | 175 | 817 | 3,507,256 | 4,466,593 | 16,890,021 | 21,310,686 |
| 2004 | 37 | 175 | 828 | 3,598,070 | 4,406,501 | 17,266,656 | 21,395,130 |
| 2005 | 37 | 164 | 890 | 3,293,582 | 4,701,466 | 16,360,911 | 22,802,923 |
| 2006 | 38 | 198 | 844 | 3,804,142 | 4,254,173 | 18,920,907 | 20,450,041 |
| 2007 | 40 | 191 | 851 | 3,758,898 | 4,273,251 | 18,760,902 | 20,728,565 |
APPENDIX B.
See Table B1
TABLE B1.
Nationwide Inpatient Sample Comorbidities Used to Calculate Propensity Scores
| AIDS | Lymphoma |
|---|---|
| Alcohol abuse | Fluid and electrolyte disorders |
| Deficiency anemias | Metastatic cancer |
| Arthritis/collagen vascular diseases | Neurologic diseases |
| Chronic blood loss anemia | Obesity |
| Congestive heart failure | Paralysis |
| Chronic pulmonary disease | Peripheral vascular disorders |
| Coagulopathy | Psychoses |
| Depression | Pulmonary circulation disorders |
| Diabetes | Renal failure |
| Diabetes with complications | Solid tumor without metastases |
| Drug abuse | Peptic ulcer disease, excluding bleeding |
| Hypertension | Valvular disease |
| Hypothyroidism | Weight loss |
| Liver disease | |
Footnotes
CONFLICTS OF INTEREST
The authors have no other conflicts of interest to report.
Author contributions
Study concept and design: Shelton, Kummerow, Arbogast, Griffin, and Poulose
Acquisition of data: Shelton and Poulose
Analysis and interpretation of data: Shelton, Phillips, Arbogast, Holzman, Nealon, and Poulose
Drafting of manuscript: Shelton and Poulose
Critical revision of the manuscript for important intellectual content: Shelton, Kummerow, Griffin, and Poulose
Statistical analysis: Shelton, Phillips, and Arbogast
Administrative, technical, and material support: Poulose and Holzman
Study supervision: Griffin and Poulose
REFERENCES
- 1.Philibert IAS. The ACGME 2011 Duty Hour Standards: Enhancing Quality of Care, Supervision, and Resident Professional Development. 2011.
- 2.Philibert I, Friedmann P, Williams WT. New requirements for resident duty hours. J Am Med Assoc. 2002;288(9):1112–1114. doi: 10.1001/jama.288.9.1112.
- 3.Institute of Medicine, Board on Healthcare Services. Resident Duty Hours: Enhancing Sleep, Supervision, and Safety. 2008.
- 4.Nasca T. An open letter to the GME community. 2008.
- 5.Zare SM, Galanko J, Behrns KE, et al. Psychological well-being of surgery residents before the 80-hour work week: a multiinstitutional study. J Am Coll Surg. 2004;198(4):633–640. doi: 10.1016/j.jamcollsurg.2003.10.006.
- 6.Zare SM, Galanko JA, Behrns KE, et al. Psychologic well-being of surgery residents after inception of the 80-hour workweek: a multi-institutional study. Surgery. 2005;138(2):150–157. doi: 10.1016/j.surg.2005.05.011.
- 7.Fletcher KE, Underwood W, 3rd, Davis SQ, et al. Effects of work hour reduction on residents’ lives: a systematic review. J Am Med Assoc. 2005;294(9):1088–1100. doi: 10.1001/jama.294.9.1088.
- 8.Ferguson CM, Kellogg KC, Hutter MM, Warshaw AL. Effect of work-hour reforms on operative case volume of surgical residents. Curr Surg. 2005;62(5):535–538. doi: 10.1016/j.cursur.2005.04.001.
- 9.Fletcher KE, Davis SQ, Underwood W, et al. Systematic review: effects of resident work hours on patient safety. Ann Intern Med. 2004;141(11):851–857. doi: 10.7326/0003-4819-141-11-200412070-00009.
- 10.Poulose BK, Ray WA, Arbogast PG, et al. Resident work hour limits and patient safety. Ann Surg. 2005;241(6):847–856. doi: 10.1097/01.sla.0000164075.18748.38. [discussion 856-60]
- 11.Rosen AK, Loveland SA, Romano PS, et al. Effects of resident duty hour reform on surgical and procedural patient safety indicators among hospitalized Veterans Health Administration and Medicare patients. Med Care. 2009;47(7):723–731. doi: 10.1097/MLR.0b013e31819a588f.
- 12.Healthcare Cost and Utilization Project (HCUP). Nationwide Inpatient Sample. 1998-2007.
- 13.Biglan A, Ary D, Wagenaar AC. The value of interrupted time-series experiments for community intervention research. Prev Sci. 2000;1(1):31–49. doi: 10.1023/a:1010024016308.
- 14.Matowe LK, Leister CA, Crivera C, Korth-Bradley JM. Interrupted time series analysis in clinical research. Ann Pharmacother. 2003;37(7-8):1110–1116. doi: 10.1345/aph.1A109.
- 15.AHRQ. Patient Safety Indicators Overview. Rockville, MD: Agency for Healthcare Research and Quality.
- 16.Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8–27. doi: 10.1097/00005650-199801000-00004.
- 17.Luellen JK, Shadish WR, Clark MH. Propensity scores: an introduction and experimental test. Eval Rev. 2005;29(6):530–558. doi: 10.1177/0193841X05275596.
- 18.Matowe LK, Leister CA, Crivera C, Korth-Bradley JM. Interrupted time series analysis in clinical research. Ann Pharmacother. 2003;37(7-8):1110–1116. doi: 10.1345/aph.1A109.
- 19.Fretheim A, Soumerai SB, Zhang F, et al. Interrupted time-series analysis yielded an effect estimate concordant with the cluster-randomized controlled trial result. J Clin Epidemiol. 2013;66(8):883–887. doi: 10.1016/j.jclinepi.2013.03.016.
- 20.Shetty KD, Bhattacharya J. Changes in hospital mortality associated with residency work-hour regulations. Ann Intern Med. 2007;147(2):73–80. doi: 10.7326/0003-4819-147-2-200707170-00161. [Epub 2007, June 4]
- 21.de Virgilio C, Yaghoubian A, Lewis RJ, et al. The 80-hour resident workweek does not adversely affect patient outcomes or resident education. Curr Surg. 2006;63(6):435–439. doi: 10.1016/j.cursur.2006.03.006. [discussion 440]
- 22.Volpp KG, Rosen AK, Rosenbaum PR, et al. Did duty hour reform lead to better outcomes among the highest risk patients? J Gen Intern Med. 2009;24(10):1149–1155. doi: 10.1007/s11606-009-1011-z. [Epub 2009, May 20]
- 23.Privette AR, Shackford SR, Osler T, et al. Implementation of resident work hour restrictions is associated with a reduction in mortality and provider-related complications on the surgical service: a concurrent analysis of 14,610 patients. Ann Surg. 2009;250(2):316–321. doi: 10.1097/SLA.0b013e3181ae332a.
- 24.White RH, Sadeghi B, Tancredi DJ, et al. How valid is the ICD-9-CM based AHRQ patient safety indicator for postoperative venous thromboembolism? Med Care. 2009;47(12):1237–1243. doi: 10.1097/MLR.0b013e3181b58940.
- 25.Kaafarani HM, Rosen AK. Using administrative data to identify surgical adverse events: an introduction to the Patient Safety Indicators. Am J Surg. 2009;198(suppl 5):S63–S68. doi: 10.1016/j.amjsurg.2009.08.008.
- 26.Romano PS, Mull HJ, Rivard PE, et al. Validity of selected AHRQ patient safety indicators based on VA National Surgical Quality Improvement Program data. Health Serv Res. 2009;44(1):182–204. doi: 10.1111/j.1475-6773.2008.00905.x.
- 27.Landrigan CP, Barger LK, Cade BE, et al. Interns’ compliance with accreditation council for graduate medical education work-hour limits. J Am Med Assoc. 2006;296(9):1063–1070. doi: 10.1001/jama.296.9.1063.
- 28.Chadaga SR, Keniston A, Casey D, Albert RK. Correlation between self-reported resident duty hours and time-stamped parking data. J Grad Med Educ. 2012;4(2):254–256. doi: 10.4300/JGME-D-11-00142.1.
- 29.Institute of Medicine. To Err is Human: Building a Safer Health System. National Academy of Sciences; Washington, DC: 2000.

