Author manuscript; available in PMC: 2009 Aug 10.
Published in final edited form as: Stroke. 2008 Sep 4;39(12):3367–3371. doi: 10.1161/STROKEAHA.108.518738

Public Reporting of Quality Data for Stroke: Is it Measuring Quality?

Adam Kelly 1, Joel P Thompson 1, Deborah Tuttle 1, Curtis Benesch 1, Robert G Holloway 1
PMCID: PMC2723834  NIHMSID: NIHMS97496  PMID: 18772446

Abstract

Background and Purpose

Public reporting of quality data is becoming more common and is increasingly used to inform the choices of patients, providers, and payers. We reviewed the scope and content of stroke quality data being reported to the public and how well these data capture the quality of stroke care.

Methods

We performed a cross-sectional survey of all report cards within the Agency for Healthcare Research and Quality (AHRQ) Report Card Compendium. Stroke quality data were categorized into one of five groups: structure, process, outcomes, utilization, and finances. We also determined the congruence of mortality ratings of New York hospitals provided by two different report cards.

Results

Of 221 available report cards, 19 (9%) reported quality information regarding stroke, and 17 specifically addressed the quality of hospital-based stroke care. The most frequently reported data were utilization measures (n = 15 report cards) and outcome measures (n = 14 report cards). Data regarding finances (n = 4), structure of care (n = 2), and process of care (n = 1) were reported infrequently. Ratings were incongruent for 61 of the 157 hospitals (39%); in 44 of these cases, the same hospital was rated below average by one report card and average by the other.

Conclusions

Publicly reported quality data pertaining to patients with stroke are incomplete, confusing, and inaccurate. Without further improvements and a better understanding of the needs and limitations of the many stakeholders, targeted transparency policies for stroke care may lead to worse quality and large economic losses.

Keywords: Quality of Care, Stroke, Mortality

Introduction

Public reporting of health care quality data has become a common policy strategy to improve transparency, accountability, and quality. It is hoped that the power of information will increase trust and drive better choices by patients, referring physicians, and purchasers of health care. Public reporting of hospital quality data is incorporated into federal law and many states have a mandatory public reporting requirement.1,2 Transparency policies, however, can decrease quality and lead to large economic losses if the information provided is incomplete, confusing, inaccurate or distorted.3,4

Stroke care is of high priority on the national quality agenda. Outcome measures are readily available (using administrative data), process measures have been developed from research evidence, and the Joint Commission and state health departments have encouraged the adoption of primary stroke centers.5–11 Quality initiatives have also shown that improvement on some of the measures is achievable.12,13 Despite these observations, the scope and accuracy of publicly reported quality data regarding stroke care is unknown.

In November 2006, the Agency for Healthcare Research and Quality (AHRQ) released its Report Card Compendium, which assembled all publicly available websites reporting health quality data at a single internet site.14 We reviewed the individual report cards to assess the content and scope of publicly available quality data regarding stroke. We also investigated the degree to which different sites provided similar estimates of quality for the same hospital.

Materials and Methods

All report cards in the Agency for Healthcare Research and Quality Report Card Compendium were accessed from a central summary webpage.14 From this central web page, links were followed and each website was reviewed for the presence of stroke-specific quality data. For each report card that contained stroke information, we identified the sponsor(s), the geographic scope of the quality report, the number of hospitals included in the report card, and the timeliness of the data. In addition, we identified the data sources used, how patients with stroke were defined and the methods of risk-adjustment, if used.15

Each quality measure was categorized as a measure of structure, process, outcome, utilization, or finance.16,17 Structural measures include the availability and quality of resources, management systems, and designation as a Primary Stroke Center. Process measures (also termed performance or effectiveness measures) evaluate the activities of physicians and other health care providers to determine whether evidence-based recommendations are followed. Outcome measures evaluate the end result of health care and include measures of mortality, readmissions, complications, and patient/caregiver satisfaction scores. Utilization measures include data pertaining to the frequency of service use, including length-of-stay. Finance measures include economic data pertaining to the provision of stroke care.

We identified the ratings of all New York State hospitals provided by two separate report card sites: the Niagara Health Quality Coalition (NHQC), a not-for-profit corporation, and Health Grades®, a for-profit, nationwide evaluation program.18,19 Both sites report on inpatient stroke mortality, use ICD-9-CM diagnosis codes to define their population at risk, and exclude patients transferred to another acute care hospital. Minor differences in coding rules exist between the two reporting systems (eg, Health Grades® includes ICD-9 code 436 and excludes patients with a palliative care code, V66.7). Neither report card rated hospitals with low stroke volumes, defined as fewer than 30 cases per year. New York State was chosen for this analysis because two report cards were available that separately evaluated nearly all hospitals statewide.
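To make these eligibility rules concrete, the sketch below assembles a stroke-mortality cohort from discharge records. It is a minimal illustration under stated assumptions: the ICD-9-CM code list is simplified and the record field names (principal_dx, secondary_dx, disposition, hospital_id) are hypothetical, not the published specification of either report card.

```python
# Sketch of an administrative stroke-mortality cohort. The code list and
# field names are illustrative assumptions, not either site's actual rules.
STROKE_CODES = {"430", "431", "434.01", "434.11", "434.91", "436"}
PALLIATIVE_CODE = "V66.7"   # excluded by Health Grades, per the text
MIN_ANNUAL_VOLUME = 30      # both sites suppress low-volume hospitals

def in_cohort(record, include_436=True, exclude_palliative=True):
    """Return True if a discharge record enters the mortality denominator."""
    dx = record["principal_dx"]
    if dx not in STROKE_CODES or (dx == "436" and not include_436):
        return False
    if record["disposition"] == "transfer_acute":  # transfers excluded by both
        return False
    if exclude_palliative and PALLIATIVE_CODE in record["secondary_dx"]:
        return False
    return True

def rated_hospitals(discharges):
    """Group eligible discharges by hospital; drop low-volume hospitals."""
    by_hospital = {}
    for rec in discharges:
        if in_cohort(rec):
            by_hospital.setdefault(rec["hospital_id"], []).append(rec)
    return {h: recs for h, recs in by_hospital.items()
            if len(recs) >= MIN_ANNUAL_VOLUME}
```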

The NHQC accepts no advertising, consulting, or other funding from the providers it grades and uses statewide hospital administrative data (age 18 years and older) for the calendar year 2005. NHQC uses software developed by the Agency for Healthcare Research and Quality to compare each hospital’s inpatient mortality to a risk-adjusted national average (AHRQ Inpatient Quality Indicators).6 This measure is risk-adjusted using a linear model estimated from a nationwide data set that includes age, gender, and All Patient Refined Diagnosis Related Groups (APR-DRGs) as developed by 3M.20 NHQC uses a 95% confidence interval to identify which hospitals are better (3-star rating), worse (1-star rating), or not significantly different from the statewide average (2-star rating).

Health Grades® uses Medicare inpatient billing data (age 65 years and older) for the period 2003 to 2006 and a proprietary “disease-specific and outcome-specific” risk-adjustment methodology based on demographic characteristics, co-morbid diagnoses, and specific procedures. For each hospital, Health Grades® calculates a predicted mortality, to which the actual mortality is compared; the statistical method for determining whether the actual and predicted rates differ significantly is not readily available at the website. Hospitals whose mortality was lower than predicted were assigned a 5-star rating, and hospitals whose mortality was higher than predicted were assigned a 1-star rating. Hospitals whose actual performance was not significantly different from predicted received a 3-star rating (2-star and 4-star ratings are not assigned). According to methodology found on the company’s website, for each diagnosis or procedure approximately 70–80% of hospitals should receive 3-star (average) ratings, while 10–15% should receive 1-star and 5-star ratings.
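The two star schemes can be summarized in a short sketch. This is a simplification under stated assumptions: a normal-approximation 95% confidence interval on the observed mortality rate stands in for each site’s actual significance test (NHQC relies on the AHRQ software, and the Health Grades® procedure is proprietary).

```python
# Simplified star assignment for both rating systems; the confidence-interval
# test below is our assumption, standing in for the sites' own procedures.
from math import sqrt

def ci95(deaths, cases):
    """Normal-approximation 95% CI for an observed mortality proportion."""
    p = deaths / cases
    half = 1.96 * sqrt(p * (1 - p) / cases)
    return max(0.0, p - half), min(1.0, p + half)

def nhqc_stars(deaths, cases, statewide_rate):
    """NHQC: 3 = better than statewide average, 1 = worse, 2 = not different."""
    lo, hi = ci95(deaths, cases)
    if hi < statewide_rate:
        return 3
    if lo > statewide_rate:
        return 1
    return 2

def healthgrades_stars(deaths, cases, predicted_rate):
    """Health Grades: 5 = lower than predicted, 1 = higher, 3 = not different."""
    lo, hi = ci95(deaths, cases)
    if hi < predicted_rate:
        return 5
    if lo > predicted_rate:
        return 1
    return 3

# Example: 12 deaths among 120 cases against a 9% reference rate.
print(nhqc_stars(12, 120, 0.09), healthgrades_stars(12, 120, 0.09))  # 2 3
```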

For each hospital in New York State that was included in both report cards, we assessed agreement between the two reporting systems for the following categorical ratings: 1) not significantly different than the state average or from what was predicted (i.e., 2-star rating from NHQC and 3-star rating from Health Grades®), 2) better than the state average or lower than predicted (i.e., 3-star rating from NHQC and 5-star rating from Health Grades®), and 3) worse than the state average or higher than predicted (i.e., 1-star rating from NHQC and 1-star rating from Health Grades®). We performed similar analyses for hospitals designated by the Joint Commission as Primary Stroke Centers and hospitals designated by New York State as Designated Stroke Centers.8,10 Weighted Kappa was used to assess level of agreement.
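For readers who wish to reproduce the agreement statistic, the sketch below computes a weighted kappa from a 3 × 3 table of paired ratings. Linear weights are our assumption (the weighting scheme is not stated in the Methods), but applied to the counts reported later in Table 3 they recover the kappa of 0.163 given in the Results.

```python
# Linearly weighted kappa for a k x k contingency table of paired ratings.
def weighted_kappa(table):
    """table[i][j] = hospitals rated category i by one site, j by the other."""
    k = len(table)
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]
    # Linear agreement weights: 1 on the diagonal, decreasing with distance.
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    p_obs = sum(w[i][j] * table[i][j]
                for i in range(k) for j in range(k)) / n
    p_exp = sum(w[i][j] * row_tot[i] * col_tot[j]
                for i in range(k) for j in range(k)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Rows: Health Grades (below average, average, above average);
# columns: NHQC (below average, average, above average); counts from Table 3.
table3 = [[11, 44, 1],
          [5, 84, 8],
          [0, 3, 1]]
print(round(weighted_kappa(table3), 3))  # 0.163
```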

This study was exempt from review by the University of Rochester’s Research Subject Review Board.

Results

A total of 221 online quality reports were included in the AHRQ compendium as of December 1, 2007. Of these, 5 could not be accessed from the central AHRQ compendium site and 11 were proprietary or restricted to health plan members. Of the remaining 205 report cards, 19 (9%) reported quality information regarding stroke care. Of these 19 report cards, 17 were hospital-based,18,19,21–35 one provided physician self-reported measures regarding stroke prevention care,36 and one indicated the presence of stroke management services for health plans.37 Of the 205 accessible reports, 76 contained data regarding hospital-based measures; thus, 17 of 76 hospital quality reports (24%) included stroke-specific quality measures.

The 17 hospital-based report cards are listed in Table 1. The report cards were produced by state health departments,26,29,30,32,34,35 independent research organizations or private-public partnerships,18,25,27 insurance companies,22,23,33 for-profit companies,19,21 hospital associations,24,28 and health systems.31 All sites used hospital administrative data that were 1 to 3 years old, and 13 sites used the Inpatient Quality Indicator risk-adjustment software provided free by the Agency for Healthcare Research and Quality.6 One site created separate reports for non-hemorrhagic and hemorrhagic strokes.34 All other sites grouped subarachnoid hemorrhages (SAH), intracerebral hemorrhages (ICH), and ischemic strokes into one measurement cohort. Two sites included supplemental survey information about the structure and process of care.25,26

Table 1.

Summary of Hospital-Based Report Cards Including Stroke Quality Data

| Sponsor(s) | Title of Report Card | Geographic Area | Number of Hospitals | Year of Data |
| --- | --- | --- | --- | --- |
| About.com, Inc.21 | UCompare Health Care | Nationwide (US) | 5500+ | 2005 |
| Blue Cross Blue Shield of Minnesota22 | Healthcare Facts | Multi-state (MN, eastern ND, western WI) | 33 | 2007 |
| Blue Cross Blue Shield of Tennessee23 | Hospital Quality Comparison | Statewide (TN) | 77 | 2004–05 |
| Colorado Health and Hospital Association Performance and Quality Group24 | Colorado Hospital Quality | Statewide (CO) | 65 | 2004–06 |
| Dr Foster Ltd25 | Hospital Guide | Nationwide (UK) | 167 | 2006 |
| Florida Agency for Health Care Administration26 | Florida Compare Care | Statewide (FL) | 206 | 2006 |
| Fraser Institute27 | Ontario Hospital Report Card | Province-wide (ON) | 136 | 2005 |
| Health Grades19 | Free Hospital Rankings | Nationwide (US) | 4776 | 2004–06 |
| Kentucky Hospital Association28 | Quality Data | Statewide (KY) | 66 | 2006 |
| Maryland Health Care Commission29 | Hospital Guide | Statewide (MD) | 47 | 2006 |
| Massachusetts Executive Office of Health and Human Services30 | Healthcare Quality and Cost Information | Statewide (MA) | 71 | 2004–05 |
| Niagara Health Quality Coalition; Alliance for Quality Health Care18 | 2007 New York State Hospital Report Card | Statewide (NY) | 231 | 2005 |
| Norton Healthcare31 | Quality Report | Multi-state (IN, KY) | 4 | 2006–07 |
| Office for Oregon Health Policy & Research32 | Oregon Hospital Quality Indicators | Statewide (OR) | 57 | 2005 |
| PacifiCare33 | Hospital Performance | Multi-state (CA, WA, OR, TX, OK, NV, AZ) | 200 | 2004 |
| Pennsylvania Health Care Cost Containment Council34 | Interactive Hospital Performance Report | Statewide (PA) | 162 | 2005–06 |
| Texas Health Care Information Council; Texas Department of State Health Services35 | Indicators of Inpatient Care in Texas Hospitals | Statewide (TX) | 287 | 2004 |

Table 2 shows the types of quality measures reported. The most frequently reported measures were utilization (n=15) and outcome (n=14) data. Risk-adjusted inpatient stroke mortality was the most commonly reported outcome measure. Two sites included quality indicators about the structure of stroke care services, and only one site (from the UK) addressed the process of stroke care. Four sites included economic information with various methods of presentation (eg, costs, average charges, median charges).

Table 2.

Specific Content of Quality Data included in Report Cards

Structure (2 sites):
- Presence of a stroke unit25
- Number of beds in stroke unit25
- Presence of senior stroke clinician in hospital25
- Integration of care with community health and social service organizations25
- Designation as a Primary Stroke Center26

Process (1 site):
- Number of admissions given CT scans within 24–48 hours25
- Arrangement for continued physiotherapy upon discharge25
- Number of patients taking aspirin on admission who were given a non-aspirin anti-platelet at discharge25
- Review of stroke cases by a neurologist25
- Number of patients nursed in acute stroke unit25

Outcomes (14 sites):
- Risk-adjusted in-hospital mortality rate18,19,21,24,26–35
- Risk-adjusted complication rate33
- Number of in-hospital deaths24
- Risk-adjusted in-hospital + 30-day mortality rate19
- Risk-adjusted in-hospital + 180-day mortality rate19
- Risk-adjusted 15-day re-admission rate29
- 30-day re-admission rate34
- 30-day re-admission rate for complications and/or infection34

Utilization (15 sites):
- Case volume18,19,21–26,28–30,32–35
- Risk-adjusted length of stay23,26,33,34
- Average length of stay29,30

Financial (4 sites):
- Average charge34
- Risk-adjusted average charge26
- Cost30
- Median charge28

The results of comparing inpatient mortality ratings for the same hospitals are summarized in Table 3. A total of 157 of 214 New York State hospitals were evaluated by both the NHQC and Health Grades® (most of the 57 unrated hospitals were excluded for low volume). Health Grades® rated 56 of the 157 hospitals below average, compared with only 16 rated below average by NHQC. The two sites provided congruent ratings (average/average, below average/below average, above average/above average) for 96 of these 157 hospitals (agreement 61.1%, weighted kappa 0.163, slight agreement above that expected by chance). Ratings were incongruent in 61 of 157 cases (38.9%), including one case in which a hospital was rated above average by one site and below average by the other. The most common disagreement was an average rating by NHQC and a below average rating by Health Grades® (44 of the 157 hospitals, 28%). Only one hospital was rated above average by both sites. Agreement was similarly low for the 10 Primary Stroke Centers (agreement 40%, weighted kappa 0.118) and for the 107 of 116 Designated Stroke Centers evaluated by both sites (agreement 61.6%, weighted kappa 0.173).

Table 3.

New York State Hospital Inpatient Mortality Ratings Using Two Different Report Cards

|  | Niagara Coalition: Below average | Niagara Coalition: Average | Niagara Coalition: Above average | Total |
| --- | --- | --- | --- | --- |
| HealthGrades: Below average | 11 | 44 | 1 | 56 |
| HealthGrades: Average | 5 | 84 | 8 | 97 |
| HealthGrades: Above average | 0 | 3 | 1 | 4 |
| Total | 16 | 131 | 10 | 157 |

Discussion

The science of quality measurement is maturing at a frenetic pace. In evaluating health care delivery, good quality is no longer assumed; on the contrary, there is an increasing expectation that it be measured, compared, and paid for if good results are to be achieved. Our study provides a window into what is currently being publicly reported regarding the quality of stroke care, although the results are limited to the report cards included in the AHRQ Report Card Compendium. Many more organizations, health systems, and hospitals are likely reporting stroke quality data on the internet, and the amount and content of data available in countries outside the U.S. remain uncertain. The results are concerning for several reasons.

First, the data are incomplete. Although well-established process measures for stroke exist, they were reported by only one site, which covered hospitals in the United Kingdom.8,9,11 Few sites reported on the structural elements of quality (stroke unit, accredited facility, designated stroke center), an easy potential addition given the published guidelines for establishing both primary and comprehensive stroke centers.38,39 No sites reported on the quality dimensions of patient-centeredness (eg, patient satisfaction) or health disparities. One reason for this focus on outcomes and utilization is the availability of administrative data, which generally do not include process or structural information.

Second, the data are poorly defined. The most commonly reported outcome measure is the risk-adjusted in-hospital mortality rate, but it is not clear what this rate actually measures. Short-term mortality correlates poorly with process measures and is likely related to unsafe care in fewer than 10% of all deaths.40,41 In fact, the majority of stroke deaths occur after deliberate decisions by patients and their families not to pursue unwanted life-prolonging treatments.42 Short-term mortality, therefore, may be more indicative of “good quality” deaths, particularly since better-informed patients are more inclined to want less aggressive care (ie, better quality decision-making leading to higher short-term mortality).43 The tremendous variability in how mortality and other outcome data are reported only compounds the confusion. It is also unclear whether the average user knows how to interpret and use other frequently reported measures, such as utilization data (eg, length of stay) or financial information (eg, charges vs. costs).

Third, the data are unreliable. We found that two separate report cards provided disparate hospital ratings in 39% of comparisons. Disagreement was also observed among Primary and Designated Stroke Centers, a subset of hospitals selected for the capacity and quality of the stroke care they provide. A recent study showed inconsistent hospital ratings among several sites for surgical procedures but did not quantify the degree of disagreement.44 It is not clear why the report card ratings disagree so frequently. Potential reasons include different sample eligibility criteria, inconsistent methods of risk-adjustment, and variable thresholds for defining statistically significant deviations from average or expected results. The potential for systematic bias should also be explored, particularly given the skew toward below average ratings found in one of the report cards and its deviation from a pre-defined distribution of outlier status.

Unreliable and invalid publicly reported stroke quality data may have unintended consequences.3,4,43,45,46 Patients may choose the wrong providers, payers may reward or punish providers inappropriately, providers may “game” to improve rankings, hospital leaders may divert resources from worthy improvement efforts, and intermediary companies may profit by stoking fears of losing reputation and market share among affected hospitals. In the end, the public loses trust.

We provide three recommendations. First, efforts are needed to develop a standardized “dossier” of stroke quality measures that meaningfully align with the six worthy aims of health care: effective, safe, patient-centered, equitable, timely, and efficient.17 This objective will require harmonizing existing stroke process measures (efforts that are in progress) and developing consensus metrics for stroke outcomes that capture “good quality” deaths as well as unexpected “never ever” deaths, for which organizations should be held accountable.8,9,15,47,48 In addition, we need to develop and standardize new measures that focus on patient-centered, efficient, and equitable care. Collaborative public-private partnerships with the several organizations currently committed to providing stroke quality data for internal quality improvement could facilitate such efforts.9,49

Second, there should be more organized skepticism about the AHRQ stroke inpatient quality indicator as a primary measure of quality of care.6 The increasing appetite for health care quality data and the ready availability of administrative data will likely guarantee the continued use of mortality as a marker of quality; in the short run, this will placate stakeholders. Fundamental questions remain, however, about the appropriateness of combining all types of stroke (SAH, ICH, ischemic) into one indicator and about the impact such measures may have on the delivery of high-quality palliative care. The inpatient time-horizon is confounded by hospital practice patterns and the capacity of non-hospital services, and it ignores the longitudinal accountability needed to improve the quality of a chronic condition. Finally, despite its “public access,” the risk-adjustment methodology remains proprietary.20

Third, further national efforts are needed to develop standardized reporting requirements with explicit rules to reduce bias and to ensure a minimum standard of measurement and reporting conduct.50 Much can be learned from the transparency systems that help govern corporate financing, restaurant hygiene, and mortgage lending practices.51 As the quality field continues to mature, there will be increasing efforts to cherry-pick measures for marketing purposes. All measures, good or bad, should be reported; there is no substitute for playing by the rules and working with integrity. Discussion is also needed regarding mandatory vs. voluntary reporting, internal reporting with feedback vs. public reporting, and how to finance a sustainable and effective transparency system that is responsive, interactive, and customized to stakeholder preferences and public concerns.

Summary

The modest amount of stroke quality data currently available does not accurately portray the quality of care being provided, and inconsistencies in these data further undermine their effective use. The future of stroke quality measurement and reporting is uncertain, but broad improvements in the science and infrastructure are needed to realize its potential to mobilize choices and market forces to improve stroke care. Providers must take a leading role in these efforts and focus on the needs of our patients and the public at large.

Acknowledgments

Robert Holloway consults for Milliman Guidelines reviewing practice guidelines and Maximus, Inc.

References

1. Centers for Medicare and Medicaid Services. Reporting Hospital Quality Data for Annual Payment Update. http://www.cms.hhs.gov/HospitalQualityInits/20_HospitalRHQDAPU.asp. Accessed November 28, 2007.
2. Steinbrook R. Public report cards – cardiac surgery and beyond. N Engl J Med. 2006;355:1847–1849. doi: 10.1056/NEJMp068222.
3. Werner RM, Asch DA. The unintended consequences of publicly reporting quality information. JAMA. 2005;293:1239–1244. doi: 10.1001/jama.293.10.1239.
4. Lilford R, Mohammed MA, Spiegelhalter D, Thomson R. Use and misuse of process and outcome data in managing performance of acute medical care: avoiding institutional stigma. Lancet. 2004;363:1147–1154. doi: 10.1016/S0140-6736(04)15901-1.
5. Iezzoni LI, Shwartz M, Ash AS, Mackiernan YD. Predicting in-hospital mortality for stroke patients: results differ across severity-measurement methods. Med Decis Making. 1996;16:348–356. doi: 10.1177/0272989X9601600405.
6. Agency for Healthcare Research and Quality. AHRQ Quality Indicators. http://www.qualityindicators.ahrq.gov/. Accessed November 28, 2007.
7. Holloway RG, Vickrey BG, Benesch CG, Hinchey JA, Bieber J. Development of performance measures for acute ischemic stroke. Stroke. 2001;32:2058–74. doi: 10.1161/hs0901.94620.
8. The Joint Commission. Primary Stroke Center Certification. http://www.jointcommission.org/CertificationPrograms/PrimaryStrokeCenters. Accessed October 10, 2007.
9. American Heart Association/American Stroke Association. Get With the Guidelines – Stroke. http://www.strokeassociation.org/presenter.jhtml?identifier=3002728. Accessed October 10, 2007.
10. New York State Department of Health. NYS DOH Designated Stroke Centers. http://www.health.state.ny.us/nysdoh/ems/stroke/stroke.htm. Accessed November 28, 2007.
11. Centers for Medicare and Medicaid Services. Physician Quality Reporting Initiative (PQRI). http://www.cms.hhs.gov/pqri/. Accessed November 28, 2007.
12. Jacobs BS, Baker PL, Roychoudhury C, Mehta RH, Levine SR. Improved quality of stroke care for hospitalized Medicare beneficiaries in Michigan. Stroke. 2005;36:1227–31. doi: 10.1161/01.STR.0000166026.14624.29.
13. LaBresh KA. Quality of acute stroke care improvement framework for the Paul Coverdell National Acute Stroke Registry: facilitating policy and system change at the hospital level. Am J Prev Med. 2006;31(6 Suppl 2):S246–50. doi: 10.1016/j.amepre.2006.08.012.
14. Agency for Healthcare Research and Quality. Report Card Compendium. http://www.talkingquality.gov/compendium. Accessed December 11, 2007.
15. Krumholz HM, for the American Heart Association Quality of Care and Outcomes Research Interdisciplinary Writing Group. Standards for statistical models used for public reporting of health outcomes: an American Heart Association scientific statement from the Quality of Care and Outcomes Research Interdisciplinary Writing Group. Circulation. 2006;113:456–62. doi: 10.1161/CIRCULATIONAHA.105.170769.
16. Donabedian A. The quality of care. How can it be assessed? JAMA. 1988;260:1743–8. doi: 10.1001/jama.260.12.1743.
17. Institute of Medicine, Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
18. Niagara Health Quality Coalition/Alliance for Quality Healthcare. 2007 New York State Hospital Report Card. http://www.myhealthfinder.com. Accessed December 11, 2007.
19. Health Grades®. Free Hospital Rankings. http://www.healthgrades.com. Accessed December 11, 2007.
20. 3M. 3M APR-DRGs. http://www.3mhis2007.com/apr/. Accessed November 28, 2007.
21. About.com: Health. UCompare Health Care. http://www.ucomparehealthcare.com/hospital_start.html. Accessed December 11, 2007.
22. Blue Cross/Blue Shield of Minnesota. Healthcare Facts. http://www.healthcarefacts.org. Accessed December 11, 2007.
23. Blue Cross/Blue Shield of Tennessee/Tennessee Hospital Association. Hospital Quality Comparison. http://www.bcbst.com/tools. Accessed December 11, 2007.
24. Colorado Health and Hospital Association Performance and Quality Group. Colorado Hospital Quality. http://www.hospitalquality.org. Accessed December 11, 2007.
25. Dr. Foster, Ltd. Hospital Guide. http://www.drfoster.com/guides. Accessed December 11, 2007.
26. Florida Agency for Healthcare Administration. Florida Compare Care. http://www.floridacomparecare.gov. Accessed December 11, 2007.
27. Fraser Institute. Hospital Report Card: Ontario 2006. http://www.hospitalreportcards.ca/. Accessed December 11, 2007.
28. Kentucky Hospital Association. Kentucky Hospital Association Quality Data. http://www.kyha.com/QualityData/IQISite/default.htm. Accessed December 11, 2007.
29. Maryland Healthcare Commission. Hospital Guide. http://mhcc.maryland.gov. Accessed December 11, 2007.
30. Massachusetts Executive Office of Health and Human Services. Healthcare Quality and Cost Information. http://www.mass.gov/healthcareqc. Accessed December 11, 2007.
31. Norton Healthcare. Quality Report. http://www.nortonhealthcare.com/about/qualityreport. Accessed December 11, 2007.
32. Office for Oregon Health Policy and Research. Oregon Hospital Quality Indicators. http://egov.oregon.gov/DAS/OHPPR/HQ. Accessed December 11, 2007.
33. PacifiCare. Hospital Performance. http://pacificare.com. Accessed December 11, 2007.
34. Pennsylvania Healthcare Cost Containment Council. Interactive Hospital Performance Report. http://www.phc4.org/hpr. Accessed December 11, 2007.
35. Texas Healthcare Information Council/Texas Department of State Health Services. Indicators of Inpatient Care in Texas Hospitals. http://www.dshs.state.tx.us/thcic/publications/hospitals/HospitalReports.shtm. Accessed December 11, 2007.
36. National Committee for Quality Assurance. Recognized Physician Directory. http://recognition.ncqa.org/. Accessed December 11, 2007.
37. Missouri Department of Health and Senior Services. Managed Care Performance Monitoring. http://www.dhss.mo.gov/ManagedCare/Publications.html. Accessed December 11, 2007.
38. Alberts MJ, for the Brain Attack Coalition. Recommendations for the establishment of primary stroke centers. JAMA. 2000;283:3102–9. doi: 10.1001/jama.283.23.3102.
39. Alberts MJ, for the Brain Attack Coalition. Recommendations for comprehensive stroke centers: a consensus statement from the Brain Attack Coalition. Stroke. 2005;36:1597–616. doi: 10.1161/01.STR.0000170622.07210.b4.
40. Peterson ED, Roe MT, Mulgund J, DeLong ER, Lytle BL, Brindis RG, Smith SC Jr, Pollack CV Jr, Newby LK, Harrington RA, Gibler WB, Ohman EM. Association between hospital process performance and outcomes among patients with acute coronary syndromes. JAMA. 2006;295:1912–1920. doi: 10.1001/jama.295.16.1912.
41. Bradley EH, Herrin J, Elbel B, McNamara RL, Magid DJ, Nallamothu BK, Wang Y, Normand SL, Spertus JA, Krumholz HM. Hospital quality for acute myocardial infarction: correlation among process measures and relationship with short-term mortality. JAMA. 2006;296:72–78. doi: 10.1001/jama.296.1.72.
42. Jaren O, Selwa L. Causes of mortality on a university hospital neurology service. Neurologist. 2006;12:245–8. doi: 10.1097/01.nrl.0000240859.97587.48.
43. Holloway RG, Quill TE. Mortality as a measure of quality: implications for palliative and end-of-life care. JAMA. 2007;298:802–4. doi: 10.1001/jama.298.7.802.
44. Leonardi MJ, McGory ML, Ko CY. Publicly available hospital comparison web sites. Arch Surg. 2007;142:863–869. doi: 10.1001/archsurg.142.9.863.
45. Marshall MN, Shekelle PG, Leatherman S, Brook RH. The public release of performance data. What do we expect to gain? A review of the evidence. JAMA. 2000;283:1866–74. doi: 10.1001/jama.283.14.1866.
46. Marshall MN, Romano PS, Davies HTO. How do we maximize the impact of the public reporting of quality of care? Int J Qual Health Care. 2004;16:i57–i63. doi: 10.1093/intqhc/mzh013.
47. National Quality Forum. National Framework and Preferred Practices for Palliative and Hospice Care Quality: A Consensus Report. Washington, DC: National Quality Forum; 2006. www.qualityforum.org/pdf/reports/palliative/txPHreportPUBLIC01-29-07.pdf. Accessed May 22, 2007.
48. National Quality Forum. Serious Reportable Events in Healthcare 2006. http://www.qualityforum.org/projects/completed/sre/. Accessed May 22, 2007.
49. University HealthSystem Consortium. http://www.uhc.edu/. Accessed December 11, 2007.
50. Pronovost PJ, Miller M, Wachter RM. The GAAP in quality measurement and reporting. JAMA. 2007;298:1800–1802. doi: 10.1001/jama.298.15.1800.
51. Fung A, Graham M, Weil D. Full Disclosure: The Perils and Promise of Transparency. New York: Cambridge University Press; 2007.
