Abstract
Background
In October 2012, the Centers for Medicare and Medicaid Services (CMS) began publicly reporting American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) surgical outcomes on its public reporting website, Hospital Compare. Participation in this CMS-NSQIP initiative is voluntary. Our objective was to compare CMS-NSQIP participating hospitals with ACS NSQIP hospitals that elected not to participate.
Study Design
Hospital Compare and American Hospital Association Annual Survey data were merged to compare CMS-NSQIP participants with non-participants. Regression models were developed to assess predictors of participation and to determine whether hospitals differed on 32 process, 10 patient experience (HCAHPS), and 16 outcome (Hospital Compare and AHRQ) measures. Additionally, performance on two waves of publicly reported ACS NSQIP surgical outcome measures was compared.
Results
Of the 452 ACS NSQIP hospitals, 80 (18%) participated in CMS-NSQIP public reporting. Participating hospitals had more beds, admissions, and operations, and were more often Commission on Cancer accredited and Council of Teaching Hospitals (COTH) members (P<0.05). Only COTH membership remained significant in adjusted analyses (OR 2.45, 95% CI 1.12–5.35). Hospital performance on process, HCAHPS, and outcome measures was not associated with CMS-NSQIP participation for 54 of 58 measures examined. Hospitals with “better-than-average” performance were more likely to publicly report the Elderly Surgery measure (P<0.05). In wave two, an increased proportion of new participants reported “worse-than-average” outcomes.
Conclusions
There were few measurable differences between CMS-NSQIP participating and non-participating hospitals. The decision to voluntarily publicly report may be related to the hospital’s culture of quality improvement and transparency.
INTRODUCTION
In response to payers, purchasers, patients, and professional organizations, public reporting of healthcare outcomes has increased rapidly over the past decade.1–3 Despite early successful initiatives in cardiac surgery in the 1990s,4 there has been little national public reporting of surgical outcomes.
Hospital Compare is a public reporting program operated by the Centers for Medicare and Medicaid Services (CMS) that reports process-of-care, patient satisfaction, and outcome measure performance for more than 4000 Medicare-certified hospitals in the United States.5 Currently, some postoperative complications are publicly reported, but these are based on administrative data and have been shown to be relatively inaccurate.6, 7 The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) is a quality assessment and improvement program in which clinical data are used to provide hospitals with risk- and case-mix-adjusted, nationally benchmarked, 30-day postoperative outcomes.8 This standardized data collection and detailed risk adjustment approach offers hospital quality comparisons that are far more accurate than those provided by administrative data.6, 9
In October 2012, ACS NSQIP partnered with CMS to promote public reporting and transparency of surgical outcomes. ACS NSQIP hospitals were offered the opportunity to voluntarily publicly report three of their ACS NSQIP risk-adjusted surgical outcomes on Hospital Compare (CMS-NSQIP initiative).8 This represents the first national initiative to publicly report postoperative outcomes based on clinical registry data. The first wave of participation in this voluntary pilot initiative began in October 2012, with a second opportunity for hospitals to join in April 2013.
Our objectives were to compare hospitals that chose to participate in the CMS-NSQIP public reporting initiative with those that did not by examining (1) structural characteristics; (2) performance on publicly reported process, patient experience, and outcome measures; and (3) performance on the three ACS NSQIP surgical care outcomes that each hospital could choose to publicly report on Hospital Compare. We hypothesized that ACS NSQIP hospitals with more structural characteristics reflecting quality and better performance on publicly reported Hospital Compare measures (process, outcome, and patient experience) would be more likely to participate in the CMS-NSQIP initiative.
METHODS
Sample
Hospitals were given the opportunity to review their ACS NSQIP outcomes before deciding to publicly report them to CMS. Hospitals that participated in the initial reporting of ACS NSQIP outcomes were identified through the Hospital Compare October 2012 release, as were participants in the April 2013 release (wave two).
Data Sources
Three data sources were used in this study. First, the 2010 American Hospital Association (AHA) Annual Survey was used to ascertain hospital-level structural characteristics for each ACS NSQIP hospital. Second, the 2010 release of the CMS Hospital Compare dataset was used to obtain 58 measures of hospital quality: 6 risk-adjusted outcomes, 32 process-of-care measures, 10 patient experience measures (Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS)), and 10 Agency for Healthcare Research and Quality (AHRQ) risk-adjusted Patient Safety Indicators (PSI). Lastly, two consecutive waves of ACS NSQIP surgical outcomes of death and serious morbidity after Elderly Surgery, Colon Surgery, and Lower Extremity Bypass were examined.
Measures
Using data from the AHA Annual Survey, 20 hospital characteristics were compared between CMS-NSQIP participants and non-participants, including hospital ownership/control (government, non-governmental non-profit, and for-profit), number of hospital beds (<200, 200–299, 300+), number of hospital admissions and inpatient surgical operations, number of operating rooms, Commission on Cancer accreditation, Joint Commission (JC) accreditation, and membership in the Council of Teaching Hospitals (COTH). Finally, two other quality-related measures present in the AHA Annual Survey were included: (1) whether the hospital tracked and communicated clinical/health information, and (2) whether the hospital disseminated reports to the community on quality and cost of service.
From the Hospital Compare dataset, 6 risk-adjusted outcome measures were examined: death and readmission related to heart attack, heart failure, and pneumonia. Thirty-two process-of-care measures were examined: 11 heart attack or chest pain measures, 4 heart failure measures, 6 pneumonia measures, and 11 Surgical Care Improvement Project (SCIP) measures. Ten HCAHPS patient experience measures were examined. Finally, 10 AHRQ risk-adjusted PSI were compared, each categorized by Hospital Compare as better than, no different than, or worse than the U.S. national rate.
Three different ACS NSQIP surgical measures were made available for hospitals to voluntarily publicly report: death or serious complication after (1) elderly surgery, (2) colorectal surgery, or (3) lower extremity bypass surgery.10 For each hospital that reported ACS NSQIP measures on Hospital Compare, performance was reported as “better-than-average,” “average,” “worse-than-average,” or “not available” if the number of cases was too small to report reliably or the hospital did not perform the measure. The proportion of “better” and “worse” performers was compared between public reporting participants and non-participants for both waves of data. In addition, the proportion of hospitals reporting each outcome in wave two was compared with the proportion reporting each outcome in wave one. Hospitals that reported an outcome as data “not available” were excluded from the comparisons for that specific outcome measure.
Statistical Analysis
Bivariate analyses were completed using chi-squared and t-tests to compare AHA hospital characteristics and Hospital Compare measures by CMS-NSQIP participation status. Multivariable logistic regression models were estimated to identify hospital characteristics associated with participation in CMS-NSQIP public reporting (vs. non-participation), including number of hospital beds, total surgical operations, JC accreditation, ACS cancer program accreditation, and COTH membership. A second group of multivariable logistic regression models was developed to examine the association between poor performance (defined as performing below the 25th percentile for a given Hospital Compare measure) and participation in CMS-NSQIP public reporting. Significance levels were adjusted using the Bonferroni correction for multiple comparisons.11 Each of these models was adjusted for the number of hospital beds, COTH membership, and type of hospital. As a final step, for both waves of data, we compared the percentages of “better-than-average,” “average,” and “worse-than-average” performers for the three ACS NSQIP publicly reported surgical outcomes between CMS-NSQIP participants and the ACS NSQIP sample as a whole, using t-tests for two proportions. Analyses were completed using SAS 9.3 (SAS Institute Inc., Cary, NC, USA). As only de-identified, hospital-level secondary data were used, the Northwestern University IRB deemed this study exempt from human subjects research approval.
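The analyses were run in SAS 9.3; for readers who want to reproduce the general approach in an open-source environment, the sketch below illustrates the two modeling steps with pandas and statsmodels. It is a minimal illustration only, assuming a hypothetical hospital-level dataset: the file name and column names (participated, beds_cat, total_operations, jc_accredited, cancer_accredited, coth_member, control, and the poor_* indicators) are placeholders, not the study's actual variables.

```python
# Minimal sketch of the modeling approach described above (not the study code).
# Assumes a hypothetical hospital-level CSV with placeholder column names.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

df = pd.read_csv("hospital_level_data.csv")  # one row per ACS NSQIP hospital

# Step 1: multivariable logistic regression of CMS-NSQIP participation (0/1)
# on hospital structural characteristics.
structural_fit = smf.logit(
    "participated ~ C(beds_cat) + total_operations + jc_accredited"
    " + cancer_accredited + coth_member",
    data=df,
).fit()
print(structural_fit.summary())  # exponentiate coefficients to obtain odds ratios

# Step 2: one adjusted model per Hospital Compare measure, testing whether
# poor performance (below the 25th percentile, coded as a 0/1 indicator)
# is associated with participation, then a Bonferroni correction across
# the measure-level comparisons (58 in the study).
measure_cols = [c for c in df.columns if c.startswith("poor_")]
pvals = []
for col in measure_cols:
    fit = smf.logit(
        f"participated ~ {col} + C(beds_cat) + coth_member + C(control)",
        data=df,
    ).fit(disp=0)
    pvals.append(fit.pvalues[col])

reject, adjusted_p, _, _ = multipletests(pvals, alpha=0.05, method="bonferroni")
```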
RESULTS
Of the 452 ACS NSQIP hospitals included in this analysis, 17.5% (n=80) participated in the CMS-NSQIP initiative by publicly reporting outcomes to Hospital Compare, even if the outcome was reported as data “not available” (i.e., some hospitals agreed to report but did not have sufficient cases for a particular measure to qualify for public reporting). Of the 80 hospitals that participated, 73 (91.3%) reported the Elderly Surgery outcome, 69 (86.3%) reported the Colon Surgery outcome, and 54 (67.5%) reported the Lower Extremity Bypass outcome on Hospital Compare. Of the 80 hospitals, 56.3% reported all three measures, 23.7% reported two measures, and 10.0% reported just one outcome, while an additional 8 hospitals (10.0%) chose to participate in CMS-NSQIP even though their outcomes for all three measures were reported as “not available.”
Hospital Structural Characteristics
Overall, ACS NSQIP hospitals (n=452) had a mean of 237 (standard deviation (SD) = 157) general medical and surgical adult beds, 21,145 (SD = 14,806) annual admissions, and 6,488 (SD = 5,140) inpatient surgical operations (Table 1). The majority were JCAHO accredited (92.4%) and Commission on Cancer accredited (70.8%), and many had ACGME residency training programs (61.3%) and/or COTH membership (34.3%). In addition, 89.1% reported the “dissemination of reports to the community on quality and cost of service,” while 95.9% reported “tracking and communicating clinical health information across their organization.”
Table 1.
Differences in Hospital Characteristics among CMS-NSQIP Public Reporting Participants and Non-Participants
| Hospital characteristic | All hospitals (n=452) | Participants (n=80) | Non-participants (n=372) | p Value |
|---|---|---|---|---|
| General medical and surgical adult beds, mean (SD) | 237 (157) | 261 (165) | 231 (155) | 0.13 |
| Intensive care beds, mean (SD) | 29.0 (24.4) | 31.5 (20.5) | 28.9 (24.4) | 0.40 |
| No. of hospital beds, mean (SD) | 421 (298) | 492 (296) | 405 (297) | 0.02 |
| <200, n (%) | 30 (9.5) | 3 (6.7) | 27 (10.1) | 0.84 |
| 200–299 | 67 (21.5) | 10 (22.2) | 57 (21.4) | |
| 300+ | 215 (70.0) | 32 (71.1) | 183 (68.5) | |
| No. of admissions, mean (SD) | 21,145 (14,806) | 24,625 (13,395) | 20,391 (15,005) | 0.02 |
| No. of Medicare discharges, mean (SD) | 8,331 (5,549) | 9,713 (5,111) | 8,032 (5,601) | 0.01 |
| Inpatient surgical operations, mean (SD) | 6,488 (5,140) | 8,596 (5,917) | 6,031 (4,845) | <0.001 |
| Outpatient surgical operations, mean (SD) | 9,360 (8,465) | 12,873 (11,150) | 8,599 (7,568) | <0.001 |
| Total surgical operations, mean (SD) | 15,848 (12,778) | 21,469 (16,013) | 14,630 (11,636) | <0.001 |
| No. of operating rooms, mean (SD) | 22.2 (16.1) | 27.8 (18.6) | 20.8 (15.1) | <0.001 |
| Hospitalists FTE, %, mean (SD) | 17.8 (17.5) | 18.0 (14.3) | 17.8 (18.4) | 0.92 |
| Intensivists FTE, %, mean (SD) | 8.5 (9.3) | 8.8 (8.6) | 8.4 (9.5) | 0.78 |
| Medical/surgical intensive care, n (%) | 386 (97.7) | 76 (97.4) | 310 (97.8) | 0.85 |
| JCAHO Accreditation, n (%) | 415 (92.4) | 72 (90.0) | 343 (93.0) | 0.37 |
| ACS-cancer program accreditation, n (%) | 318 (70.8) | 64 (80.0) | 254 (68.8) | <0.05 |
| ACGME-residency training approval, n (%) | 275 (61.3) | 57 (71.3) | 218 (59.1) | <0.05 |
| Member of Council of Teaching Hospitals, n (%) | 154 (34.3) | 42 (52.5) | 112 (30.3) | 0.0002 |
| Critical access hospital, n (%) | 7 (1.6) | 0 (0) | 7 (1.9) | 0.21 |
| Does hospital track and communicate clinical/health information across organizations? n (%) | 371 (95.9) | 75 (96.2) | 296 (95.8) | 0.87 |
| Does the hospital disseminate reports to the community on quality and cost of service? n (%) | 343 (89.1) | 73 (93.6) | 270 (87.9) | 0.15 |
In comparing CMS-NSQIP public reporting participants and non-participants, CMS-NSQIP participating hospitals had more hospital beds, admissions, Medicare discharges, inpatient and outpatient surgical operations, and operating rooms than non-participating hospitals (P<0.05; Table 1). Furthermore, CMS-NSQIP participating hospitals were more frequently accredited by the Commission on Cancer, had an ACGME residency training program, and were COTH members compared with non-participating hospitals (P<0.05). However, on multivariable analysis, only COTH membership remained a significant predictor of initial CMS-NSQIP public reporting participation (OR 2.45, 95% CI 1.12–5.35; Table 2).
Table 2.
Association between Hospital Characteristics and CMS-NSQIP Public Reporting Participation
| Hospital characteristic | CMS-NSQIP participation, OR (95% CI) |
|---|---|
| No. of hospital beds | |
| <200 | REF |
| 200–299 | 1.27 (0.28, 5.76) |
| 300+ | 1.13 (0.27, 4.69) |
| JCAHO Accreditation | 0.39 (0.14, 1.09) |
| ACS-cancer program accreditation | 1.08 (0.48, 2.41) |
| COTH-teaching hospital | 2.45 (1.12, 5.35) |
| Does the hospital disseminate reports to the community on quality and cost of service? | 1.52 (0.48, 4.75) |
| Control Code | |
| Government non-federal/federal | REF |
| Non-governmental non-profit | 1.22 (0.45, 3.30) |
| Investor owned (for-profit) | 0.93 (0.16, 5.49) |
COTH, Council of Teaching Hospitals.
Hospital Compare Measures
After adjusting for multiple comparisons in bivariate analyses for each process, HCAHPS, and outcome measure, only the “heart attack patient given aspirin at arrival” and “accidental puncture or laceration” measures differed significantly between participants and non-participants (P<0.001; Appendix 1, online only). The remaining 56 (96.6%) of these 58 process, HCAHPS, and outcome measures showed no significant differences between CMS-NSQIP participating and non-participating hospitals.
After adjusting for number of beds, COTH membership, and hospital control/ownership, performance on 54 of the 58 quality measures was also not associated with participation in CMS-NSQIP public reporting (Appendix 1, online only). There were four measures for which poor performance was associated with participation: (1) “accidental puncture or laceration” (OR 2.11, 95% CI 1.04–4.30); (2) “heart attack patient given aspirin at arrival” (OR 0.26, 95% CI 0.10–0.64); (3) “heart attack patient given aspirin at discharge” (OR 0.38, 95% CI 0.16–0.87); and (4) “definitely recommending the hospital to friends and family” (OR 0.28, 95% CI 0.11–0.71).
ACS NSQIP Outcome Measures
Next, the performance of CMS-NSQIP participating hospitals was compared with that of non-participating hospitals to determine whether only the better-performing hospitals chose to publicly report their ACS NSQIP outcomes. After excluding, for each outcome, hospitals that reported data as “not available,” 71 hospitals reported the Elderly Surgery outcome, 62 reported the Colon Surgery outcome, and 31 reported the Lower Extremity Bypass outcome (Table 3). A greater proportion of CMS-NSQIP public reporting participants (18.3%) were “better-than-average” performers on the Elderly Surgery outcome than the proportion of “better-than-average” performers in the overall ACS NSQIP hospital population (9.2%, P<0.05; Table 3). However, no differences were observed between CMS-NSQIP participants and non-participants for the Colon Surgery or Lower Extremity Bypass outcome measures.
Table 3.
Proportion of Public Reporting Participating and Non-Participating Hospitals with Better, Average, and Worse-than-Average Designations for ACS NSQIP Surgical Outcomes.
| Outcome | ACS NSQIP hospitals | | CMS-NSQIP hospitals | | p Value |
|---|---|---|---|---|---|
| | n | % | n | % | |
| Better than average | |||||
| Elderly surgery >65 y | 29/316 | 9.2 | 13/71 | 18.3 | <0.05 |
| Colon | 7/306 | 2.3 | 1/62 | 1.6 | 0.73 |
| Lower extremity bypass | 0/270 | 0 | 0/31 | 0 | NA |
| Average | |||||
| Elderly surgery >65 y | 244/316 | 77.2 | 56/71 | 78.9 | 0.76 |
| Colon | 294/306 | 96.1 | 61/62 | 98.4 | 0.46 |
| Lower extremity bypass | 270/270 | 100 | 31/31 | 100 | NA |
| Worse than average | |||||
| Elderly surgery >65 y | 43/316 | 13.6 | 2/71 | 2.8 | 0.01 |
| Colon | 5/306 | 1.6 | 0/62 | 0 | 0.32 |
| Lower extremity bypass | 0/270 | 0 | 0/31 | 0 | NA |
A two-sample t-test between proportions was performed to determine whether there was a significant difference between CMS-NSQIP and ACS NSQIP hospitals for each outcome.
For the Elderly Surgery outcome, fewer CMS-NSQIP participants (2.8%) reported “worse-than-average” outcomes compared to the ACS NSQIP hospital population as a whole (13.6%, P<0.01). No differences were seen on “worse-than-average” performance for the Colon Surgery outcome or the Lower Extremity Bypass outcome (Table 3).
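To make the Table 3 comparison concrete, the sketch below applies a two-proportion z-test (essentially equivalent to the two-sample t-test between proportions described in the Table 3 footnote at these sample sizes) to the “better-than-average” Elderly Surgery counts from Table 3; the statsmodels call is used here only as a stand-in for the procedure the authors ran in SAS.

```python
# Two-proportion comparison using the "better-than-average" Elderly Surgery
# counts from Table 3: 13/71 CMS-NSQIP participants vs 29/316 ACS NSQIP
# hospitals overall. A z-test stands in for the t-test between proportions.
from statsmodels.stats.proportion import proportions_ztest

stat, p_value = proportions_ztest(count=[13, 29], nobs=[71, 316])
print(f"z = {stat:.2f}, p = {p_value:.3f}")  # p falls below 0.05, as in Table 3
```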
In wave two of CMS-NSQIP public reporting, an additional 22 hospitals elected to participate (n=102 hospitals). Similar to initial wave one participation, CMS-NSQIP participants in wave two had a higher proportion of hospitals reporting “better-than-average” performance (18.2%) than was found among all ACS NSQIP hospitals (9.2%, P<0.05) for the Elderly Surgery measure (results not shown). Accordingly, fewer CMS-NSQIP participants reported “worse-than-average” outcomes (5.7%) than the proportion reporting “worse-than-average” in the ACS NSQIP hospital population as a whole (13.6%; P<0.05). No differences were found between CMS-NSQIP hospitals and ACS NSQIP hospitals for the Colon Surgery or Lower Extremity Bypass outcomes. Importantly, in comparing wave one and wave two for the Elderly Surgery group, a larger percentage of new wave two CMS-NSQIP participating hospitals reported “worse-than-average” performance (17.6%) than was reported during initial wave one participation (2.8%).
DISCUSSION
Public reporting and transparency are being demanded by multiple stakeholders.1, 3 Until recently, very few data regarding surgical outcomes were reported publicly. With the recent option for ACS NSQIP hospitals to voluntarily report their outcomes on Hospital Compare, we were able to study the characteristics of the hospitals that initially opted to participate. We found that only 17.5% of hospitals chose to publicly report, but there were very few differences in hospital characteristics and other publicly reported measures between hospitals that chose to participate and those that did not. This suggests that some unmeasured factor, such as the institution’s culture and interest in transparency, likely drives the decision to publicly report outcomes.12
Although hospitals are increasingly releasing information about the quality of care that they provide,13, 14 there has been a general reluctance to release too much information for a variety of reasons.12, 14 These concerns include questioning the validity and accuracy of the data reported, inadequate risk adjustment, and the fear of unintended consequences, such as “cherry picking” (hospitals selecting lower-risk patients to avoid poorer outcomes) or losing patients to better performing hospitals.12, 15–18 Moreover, our work has suggested that surgeons are also reluctant to publicly report their outcomes, or their hospital’s outcomes, because of fear that the patient will misinterpret the data, fear of personal/professional vulnerability, and fear of “dishonest gaming” of the public reporting system.19
Much of this reluctance may have been due to issues with administrative data.6, 20 Hospitals have long been compared on outcomes using data collected for billing, and these data have been shown to be inaccurate.6, 7, 20 These inaccuracies may have created mistrust, which may be partly responsible for the low initial participation rate that we observed. However, ACS NSQIP data are clinically collected and thought to be far more accurate than administrative data with respect to postoperative outcomes, so hospital concerns regarding the validity of these data should be lessened. This study is the first, to our knowledge, to examine differences in hospital characteristics and quality of care between hospitals that voluntarily chose to report their surgical outcomes on Hospital Compare and those that did not.
Little is known about what drives some hospitals to participate in public reporting, particularly when there are no financial incentives involved, as with the CMS-NSQIP initiative. We found that, of all possible hospital structural characteristics, only teaching hospital status predicted participation in the CMS-NSQIP public reporting initiative. It is unclear why teaching hospitals would have increased participation, but a culture of research, innovation, and first-mover leadership may lead to an earlier interest in transparency and public reporting.12 This first-mover leadership and increased interest in quality improvement and transparency among teaching hospitals is evidenced by the recent uptake of the NSQIP Quality In-Training Initiative (QITI), which aims to integrate outcomes data into surgical education during general surgery residency.21
Although public reporting has been a key factor in stimulating hospital quality improvement initiatives and improving process metrics,22–24 debate remains about the true measurable effect of public reporting on outcome measures.14, 25 We hypothesized that hospitals with better publicly reported performance on existing Hospital Compare measures would be more likely to participate in the CMS-NSQIP initiative. Our unadjusted results show that some differences exist between participating and non-participating hospitals on process and outcome measures. However, after controlling for confounders, the majority of these associations disappeared, leaving only a few inconsistent differences between participants and non-participants. This suggests that the decision to publicly report is likely not based solely on a hospital’s knowledge of its current or prior performance, but rather is related to the hospital’s culture and attitudes surrounding transparency.12
Finally, we examined whether a larger proportion of “better-performing” hospitals were CMS-NSQIP participants compared with ACS NSQIP hospitals as a whole. First, we found that a greater proportion of CMS-NSQIP hospitals had “better-than-average” performance for the Elderly Surgery outcome than the ACS NSQIP population overall. This higher percentage of better-performing hospitals reporting may be due to the fact that hospitals were able to see their outcomes before deciding to participate. However, we did not see this same trend for the Colon Surgery outcome. Second, analyses of the second wave of CMS-NSQIP participants showed an interesting trend. In the second wave of CMS-NSQIP public reporting, a larger proportion of hospitals with a “worse-than-average” designation for the Elderly Surgery and Colon Surgery outcomes chose to participate compared with the first wave. Even though these hospitals were aware that their outcomes would be reported as “worse-than-average,” they still elected to publicly report these data. This increase in transparency from wave one to wave two is interesting and encouraging.12
Certain limitations should be considered. First, the sample of hospitals used, ACS NSQIP hospitals, may not be representative of all hospitals in the United States. Hospitals that participate in ACS NSQIP do so voluntarily and pay a fee for their participation, making them a self-selected group with an interest in quality improvement and possibly public reporting. This may also lead to less variation in the measures examined than if all U.S. hospitals could be assessed. Second, the small number of hospitals reporting better-than-average and worse-than-average performance for ACS NSQIP surgical outcomes makes comparisons difficult. However, the proportion of hospitals found to be “better” or “worse” than average is a function of the ACS NSQIP modeling technique itself, which shrinks the majority of hospitals toward the mean and classifies them as average.9 Nonetheless, it is interesting to note which hospitals participate in public reporting, particularly when a “worse-than-expected” hospital voluntarily participates and publicly reports its poor outcomes. Finally, our comparisons are limited to the available hospital characteristics and outcome measures currently collected. There may be other unknown factors that better explain why hospitals choose to participate in public reporting (e.g., engagement of the Board of Directors in quality measurement).2
Conclusion
Our study explored possible reasons for initial participation in the public reporting of surgical outcomes on Hospital Compare. We found few differences between CMS-NSQIP public reporting participants and non-participants in terms of hospital characteristics or performance on other publicly reported quality measures. The reasons why hospitals elect to participate remain unclear, but past and current performance generally does not appear to drive participation. Thus, a hospital culture that focuses on quality measurement and promotes transparency may explain these results.
Supplementary Material
Acknowledgments
Supported by: the Agency for Healthcare Research and Quality grant (R21HS021857) entitled “Engaging Patients and Hospitals to Expand Public Reporting in Surgery.”
Footnotes
Disclosure Information: Nothing to disclose.
Presented at the American College of Surgeons National Surgical Quality Improvement Program National Conference, San Diego, CA, July 2013.
REFERENCES
- 1. Birkmeyer NJ, Birkmeyer JD. Strategies for improving surgical quality--should payers reward excellence or effort? N Engl J Med. 2006;354:864–870. doi:10.1056/NEJMsb053364.
- 2. Public Reporting on Quality and Costs. Do Report Cards and Other Measures of Providers' Performance Lead to Improved Care and Better Choices by Consumers? 2012. Available at: http://healthaffairs.org/healthpolicybriefs/brief_pdfs/healthpolicybrief_65.pdf. Accessed April 15, 2013.
- 3. Institute of Medicine (IOM). Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
- 4. Chassin MR. Achieving and sustaining improved quality: lessons from New York State and cardiac surgery. Health Affairs. 2002;21:40–51. doi:10.1377/hlthaff.21.4.40.
- 5. U.S. Department of Health & Human Services. Centers for Medicare & Medicaid Services: Hospital Compare. Available at: http://www.hospitalcompare.hhs.gov/. Accessed March 22, 2013.
- 6. Lawson EH, Louie R, Zingmond DS, et al. A comparison of clinical registry versus administrative claims data for reporting of 30-day surgical complications. Ann Surg. 2012;256:973–981. doi:10.1097/SLA.0b013e31826b4c4f.
- 7. Iezzoni L. Risk Adjustment for Measuring Healthcare Outcomes. 3rd ed. Chicago, IL: Health Administration Press; 2003.
- 8. American College of Surgeons National Surgical Quality Improvement Program. Join Leading Hospitals in CMS National Surgical Quality Pilot. 2012. Available at: http://facs.org/hospitalcompare/. Accessed May 1, 2013.
- 9. Cohen ME, Ko CY, Bilimoria KY, et al. Optimizing ACS NSQIP modeling for evaluation of surgical quality and risk: patient risk adjustment, procedure mix adjustment, shrinkage adjustment, and surgical focus. J Am Coll Surg. 2013;217:336–346.e331. doi:10.1016/j.jamcollsurg.2013.02.027.
- 10. Merkow RP, Hall BL, Cohen ME, et al. Validity and feasibility of the American College of Surgeons colectomy composite outcome quality measure. Ann Surg. 2013;257:483–489. doi:10.1097/SLA.0b013e318273bf17.
- 11. Simple Interactive Statistical Analysis (SISA). Bonferroni. Available at: http://www.quantitativeskills.com/sisa/calculations/bonhlp.htm. Accessed April 2, 2013.
- 12. Makary M. Unaccountable: What Hospitals Won't Tell You and How Transparency Can Revolutionize Health Care. 1st U.S. ed. New York: Bloomsbury Press; 2012.
- 13. Informed Patient Institute. 2008–2013. Available at: http://www.informedpatientinstitute.org/index.php. Accessed June 24, 2013.
- 14. Colmers J. Public Reporting and Transparency. The Commonwealth Fund, Commission on a High Performance Health System; 2007.
- 15. Werner RM, Asch DA. The unintended consequences of publicly reporting quality information. JAMA. 2005;293:1239–1244. doi:10.1001/jama.293.10.1239.
- 16. Marshall MN, Shekelle PG, Leatherman S, Brook RH. The public release of performance data: what do we expect to gain? A review of the evidence. JAMA. 2000;283:1866–1874. doi:10.1001/jama.283.14.1866.
- 17. Robinowitz DL, Dudley RA. Public reporting of provider performance: can its impact be made greater? Annu Rev Public Health. 2006;27:517–536. doi:10.1146/annurev.publhealth.27.021405.102210.
- 18. Shahian DM, Edwards FH, Jacobs JP, et al. Public reporting of cardiac surgery performance: Part 1--history, rationale, consequences. Ann Thorac Surg. 2011;92:S2–S11. doi:10.1016/j.athoracsur.2011.06.100.
- 19. Sherman KL, Gordon EJ, Mahvi DM, et al. Surgeons' perceptions of public reporting of hospital and individual surgeon quality. Med Care. 2013. In press. doi:10.1097/MLR.0000000000000013.
- 20. Klabunde CN, Harlan LC, Warren JL. Data sources for measuring comorbidity: a comparison of hospital records and Medicare claims for cancer patients. Med Care. 2006;44:921–928. doi:10.1097/01.mlr.0000223480.52713.b9.
- 21. Sellers MM, Reinke CE, Kreider S, et al. American College of Surgeons NSQIP: quality in-training initiative pilot study. J Am Coll Surg. 2013;217:827–832. doi:10.1016/j.jamcollsurg.2013.07.005.
- 22. Fung CH, Lim YW, Mattke S, et al. Systematic review: the evidence that publishing patient care performance data improves quality of care. Ann Intern Med. 2008;148:111–123. doi:10.7326/0003-4819-148-2-200801150-00006.
- 23. Lee TH. Eulogy for a quality measure. N Engl J Med. 2007;357:1175–1177. doi:10.1056/NEJMp078102.
- 24. Agency for Healthcare Research and Quality (AHRQ). Public Reporting as a Quality Improvement Strategy: A Systematic Review of the Multiple Pathways Public Reporting May Influence Quality of Health Care. 2012. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/343/763/CQG-Public-Reporting_Protocol_20110817.pdf. Accessed April 16, 2013.
- 25. Agency for Healthcare Research and Quality (AHRQ). Closing the Quality Gap Series: Public Reporting as a Quality Improvement Strategy. 2012. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/343/1198/Evidencereport208_CQG-PublicReporting_ExecutiveSummary_20120724.pdf. Accessed August 1, 2013.