JAMA Intern Med. 2019 May 6;179(8):1133–1135. doi: 10.1001/jamainternmed.2019.0398

Assessing the Quality of Public Reporting of US Physician Performance

Jun Li, Anup Das, Lena M Chen
PMCID: PMC6503567  PMID: 31058912

Abstract

This analysis of current data from the Physician Compare website assesses the volume of quality data submitted and whether the information is comprehensive enough to differentiate the performance of included physicians.


To empower patients and their caregivers, federal policymakers have recommended greater transparency about the quality of care delivered by the health care sector.1 Physician Compare is the Centers for Medicare & Medicaid Services’ (CMS’s) flagship website about the quality of care provided by US physicians and other clinicians. The website is in its final phase of expansion, the focus of which has been the addition of clinician-level performance data to existing practice-level data, to further help patients and their caregivers choose high-quality clinicians.2 However, it is unclear whether Physician Compare is comprehensive: reporting is voluntary (although failure to submit data incurs a 2% reimbursement penalty3); clinicians choose which measures to report4; and CMS displays only a subset of the submitted measures.2 Therefore, using current data from Physician Compare, we addressed 3 questions: How many and what types of clinicians have quality information? How comprehensive are the quality data? How well do the measures differentiate the performance of included clinicians?

Methods

We included 1 025 015 US clinicians caring for Medicare beneficiaries. We created this sample from the Physician Compare National Downloadable File (https://data.medicare.gov/data/physician-compare) (1 023 552 individuals), supplemented with the 2015 Medicare Data on Provider Practice and Specialty database to capture individuals missing from the National Downloadable File (1463 individuals). Because clinicians report as individuals and/or as part of a group practice, we obtained quality information from Physician Compare’s 2016 individual and group files. We estimated the prevalence of clinicians with quality information across specialties, the number of quality domains reported, and the distribution of quality performance. The University of Michigan Institutional Review Board approved the study and, because the study involved secondary analysis of existing data, waived the requirement for informed consent under the Health Insurance Portability and Accountability Act. All analyses were conducted using Stata version 15 SE (StataCorp).
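To make the sample construction concrete, the sketch below outlines the merge logic in Stata, the software used for our analyses. It is a minimal sketch only: every file and variable name (npi, specialty, and the file paths) is a hypothetical stand-in, and the actual layouts of the Physician Compare and Medicare Data on Provider Practice and Specialty files differ.

```stata
* Sketch of the sample assembly (hypothetical file and variable names).

* Clinicians listed in the Physician Compare National Downloadable File;
* clinicians can appear once per practice location, so deduplicate by NPI.
import delimited using "physician_compare_national.csv", clear
keep npi specialty
duplicates drop npi, force
tempfile ndf
save `ndf'

* Union with the 2015 Medicare Data on Provider Practice and Specialty file
* to capture clinicians missing from the National Downloadable File.
import delimited using "mdppas_2015.csv", clear
keep npi specialty
duplicates drop npi, force
merge 1:1 npi using `ndf', nogenerate   // unmatched rows from both sides kept

* Attach an indicator for having any 2016 quality information.
merge 1:1 npi using "quality_2016.dta", keep(master match) generate(mq)
generate byte has_quality = (mq == 3)
tabulate specialty has_quality, row     // prevalence by specialty
```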

Results

Although 238 936 of 1 025 015 clinicians (23.3%) had quality information on Physician Compare, only 2563 (0.3%) had individual quality information (Table 1). Among clinicians with quality data, individual reporters had a median performance score of 98.0% (interquartile range [IQR], 94.5%-100.0%), and group reporters had a median performance score of 68.1% (IQR, 64.5%-72.2%) (Table 1). Scores were based on performance in few domains (individual reporters: median, 2 of 6 possible domains; group reporters: median, 3 of 7 possible domains). Within each domain, clinicians reported on few of the available measures (Table 2). For example, 72.2% of individual reporters had patient safety data, but the median individual reporter had information on only 1 of 15 (6.7%) available patient safety measures. Results were generally similar for group reporting.
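For readers tracing the numbers above, a minimal sketch of the descriptive calculations follows, again with hypothetical dataset and variable names (perf_score, n_domains, and reporter_type are stand-ins, not fields from the actual files):

```stata
* Sketch of the descriptive statistics, assuming one row per clinician
* with quality data (hypothetical variable names).
use "clinician_quality.dta", clear

* Median and IQR of the composite performance score, by reporter type.
tabstat perf_score, by(reporter_type) statistics(p50 p25 p75)

* Number of quality domains reported, by reporter type.
tabstat n_domains, by(reporter_type) statistics(p50 p25 p75)

* Share of available measures reported within a domain, eg, the median
* individual reporter had 1 of 15 patient safety measures.
display %4.1f 100 * (1/15)   // 6.7
```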

Table 1. Characteristics of Clinicians With Quality Data on Physician Compare Website for Fiscal Year 2016^a

| Specialty | Has Quality Information (Individual or Group Reporting), No. (%) | Individual Reporting:^b Has Quality Information, No. (%) | Individual Reporting: Quality Domains, No., Median (IQR) | Individual Reporting: Quality Performance, %, Median (IQR)^d | Group Reporting:^c Has Quality Information, No. (%) | Group Reporting: Quality Domains, No., Median (IQR) | Group Reporting: Quality Performance, %, Median (IQR)^d |
|---|---|---|---|---|---|---|---|
| All specialties (N = 1 025 015) | 238 936 (23.3) | 2563 (0.3) | 2 (1-2) | 98.0 (94.5-100) | 236 508 (23.1) | 3 (1-4) | 68.1 (64.5-72.2) |
| Primary care (n = 189 048) | 49 703 (20.8) | 109 (0.1) | 2 (1-2) | 92.7 (77.4-97.7) | 49 693 (26.3) | 2 (1-4) | 68.1 (64.5-71.9) |
| Medical (n = 117 350) | 38 763 (16.2) | 1194 (1.0) | 2 (2-2) | 96.3 (93.3-98.3) | 37 689 (32.1) | 3 (1-4) | 68.0 (64.5-71.8) |
| Surgical (n = 99 220) | 26 626 (11.1) | 105 (0.1) | 1 (1-2) | 90.0 (71.0-96.3) | 26 551 (26.8) | 2 (1-4) | 68.1 (64.5-73.6) |
| OB/gyn (n = 31 566) | 8899 (3.7) | 0 | 0 | NA | 8900 (28.2) | 3 (1-4) | 68.1 (64.3-73.6) |
| Hospital-based (n = 124 823) | 32 829 (13.7) | 471 (0.4) | 1 (1-1) | 100 (99.0-100) | 32 507 (26.0) | 3 (1-4) | 68.0 (64.5-72.3) |
| Psychiatry (n = 22 298) | 4705 (2.0) | 0 | 0 | NA | 4705 (21.1) | 3 (1-4) | 67.8 (64.0-71.8) |
| Other physician (n = 1099)^e | 146 (0.1) | 21 (1.9) | 1 (1-1) | 100 (100-100) | 128 (11.6) | 3 (1-4) | 67.5 (64.2-67.8) |
| Nonphysician (n = 441 335)^f | 77 265 (32.3) | 666 (0.2) | 1 (1-1) | 100 (98.0-100) | 76 639 (17.4) | 2 (1-4) | 68.3 (64.5-72.9) |

Abbreviations: ACO, accountable care organization; CMS, Centers for Medicare & Medicaid Services; IQR, interquartile range; NA, not applicable; OB/gyn, obstetrics and gynecology.

a Group reporters belonged to an organization that had group-level quality information on the Physician Compare website. A total of 135 clinicians had both individual- and group-level information. We did not examine ACO-level quality because CMS does not allow patients to link individual clinicians to ACOs on Physician Compare.

b Among individual reporters, 3 clinicians were each associated with 2 specialties.

c Among group reporters, 304 clinicians were each associated with 2 specialties.

d Quality performance is the mean of a clinician’s publicly displayed measure rates, where a higher percentage means better performance (formalized in the notation following these footnotes).

e Other physician reflects an undefined physician type.

f Nonphysician refers to anesthesiology assistant, certified nurse midwife, certified registered nurse anesthetist, chiropractor, clinical nurse specialist, clinical psychologist, clinical social worker, dentist, maxillofacial surgeon, nurse practitioner, occupational therapist, optometrist, oral surgeon, physical therapist, physician assistant, podiatrist, registered dietitian or nutrition professional, and speech language pathologist. The most common nonphysician reporters were nurse practitioners, physician assistants, and certified registered nurse anesthetists.
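As footnote d describes, a clinician’s quality performance is the mean of that clinician’s publicly displayed measure rates. In notation (ours, not a published CMS formula):

$$\text{performance}_i = \frac{1}{M_i} \sum_{m=1}^{M_i} r_{im},$$

where $M_i$ is the number of measure rates publicly displayed for clinician $i$ and $r_{im}$ is the rate (in percent) achieved on measure $m$; higher values indicate better performance.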

Table 2. Characteristics of the Quality Data on Physician Compare Website.

| Quality Domain | Individual Reporting (n = 2563):^a Clinicians, No. (%) | Individual: No. of Measures | Individual: No. of Measures per Clinician, Median (IQR) | Group Reporting (n = 236 508):^a Clinicians, No. (%) | Group: No. of Measures | Group: No. of Measures per Clinician, Median (IQR) |
|---|---|---|---|---|---|---|
| Communication and care coordination | 652 (25.4) | 22 | 1 (1-1) | 18 589 (7.9) | 12 | 1 (1-1) |
| Community and population health | 43 (1.7) | 13 | 7 (6-8) | 162 075 (68.5) | 7 | 4 (1-4) |
| Effective clinical care | 1351 (52.7) | 25 | 2 (1-2) | 123 384 (52.2) | 17 | 2 (2-2) |
| Efficiency and cost reduction | 0 | 9 | 0 | 0 | 9 | 0 |
| Person and caregiver-centered experience and outcomes | 78 (3.0) | 10 | 5 (1-7) | 0 | 2 | 0 |
| Patient safety | 1851 (72.2) | 15 | 1 (1-1) | 119 857 (50.7) | 10 | 1 (1-1) |
| Consumer assessment of health care clinicians and systems | NA | NA | NA | 181 697 (76.8) | 8 | 8 (8-8) |

Abbreviations: CMS, Centers for Medicare & Medicaid Services; IQR, interquartile range; NA, not applicable; QCDRs, qualified clinical data registries.

a There were 2563 clinicians with individual information and 236 508 clinicians with group information displayed on Physician Compare for performance in fiscal year 2016. The quality measures on Physician Compare come from the Physician Quality Reporting System. For individual reporters, the measures were collected, calculated, and submitted to CMS through QCDRs; for group practice reporters, the measures were submitted to CMS through QCDRs, registries, and web interface.

Discussion

In this study, 76.7% of clinicians had no performance data on Physician Compare, 99.7% had no clinician-level performance data, and among clinicians with data, performance reflected only a few measures and the quality performance was generally high. As currently configured, Physician Compare fell short of its goal of providing information that is widely useful to patients and their caregivers for choosing clinicians.

Several factors may explain our findings. First, only 50.1% of eligible clinicians submitted quality data (at the individual or practice level) that might have been used in Physician Compare.4 This may reflect weak incentives for clinicians to submit quality data3; CMS could consider providing larger incentives for reporting. Second, although 27.8% of clinicians submitted individual-level data to CMS, only 0.3% had individual data on the Physician Compare website.4 This difference may reflect CMS’s selection of measures for public reporting, which was based on criteria such as measure reliability and utility to patients and caregivers.5 The Centers for Medicare & Medicaid Services does not release specific information about why many individual reporters are excluded from the website; such information would help policymakers assess the feasibility of clinician-level public reporting. Third, clinicians are not required to report on all of their patients and can choose which measures to submit to CMS.6 These factors may have contributed to the high performance rates observed on Physician Compare for individual clinicians.

Physician Compare’s weaknesses suggest that it will be important for policymakers to consider whether and how major revisions to the website could help achieve the Department of Health and Human Services’ goals of increased transparency, or whether a different approach is needed altogether.

References

