Author manuscript; available in PMC: 2025 Dec 7.
Published in final edited form as: Pediatrics. 2025 Dec 1;156(6):e2024069283. doi: 10.1542/peds.2024-069283

Achievable benchmarks of care in low-value care delivery in children’s hospitals

Alaina Shine 1, Matt Hall 2, Heidi G De Souza 2, Wade Harrison 3, Eric R Coon 1, Alan R Schroeder 4, Mario Reyes 5, Shawn L Ralston 1, Samantha A House 6
PMCID: PMC12680974  NIHMSID: NIHMS2123911  PMID: 41232813

Abstract

Background and Objectives:

Achievable Benchmarks of Care (ABCs) utilize performance data to derive objective and attainable targets for improvement initiatives. We applied the Pediatric Health Information Systems (PHIS) Low-Value Care (LVC) Calculator to describe variation in LVC across hospitals and identify measures with the greatest improvement potential.

Methods:

We applied the 16 LVC Calculator measures applicable to hospitalized patients <18 years to PHIS hospitalizations from 7/1/2022–6/30/2024. We utilized hospital-level data to assess LVC variation using interquartile ranges and calculate ABCs, defined as the average performance attained by top performing hospitals. We then compared median hospital-level performance to ABCs to derive measure-level performance gaps, signifying objective improvement potential. Finally, we performed a quartile analysis identifying hospitals with consistently high or low LVC delivery across measures.

Results:

401,683 hospitalizations at 43 children’s hospitals were eligible for included measures. LVC delivery varied widely across hospitals for many measures. Ten measures demonstrated performance gaps of >10%; the greatest performance gaps were observed for C-reactive protein and/or erythrocyte sedimentation rate for community acquired pneumonia (39%), electrolyte testing in patients with febrile seizure (38%), and blood cultures in community acquired pneumonia (35%). Five measures demonstrated ABCs of <5%. Quartile analyses demonstrated small cohorts of hospitals with consistently high or low performance across all measures.

Conclusions:

This analysis suggests measurable improvement potential for several low-value services and offers measure-specific deimplementation targets. Further study of high and low performing hospitals may identify hospital-level drivers of LVC trends.

Introduction:

Low-value care (LVC), defined as care that should usually be avoided given lack of benefit or potential for harm in most scenarios,1 persists in the pediatric hospital setting,2,3 contributing to unnecessary healthcare costs4,5 and direct and downstream patient harms.6–9 While recent research has focused on measuring the prevalence of LVC in pediatric hospitals,2,3,10 literature describing granular patterns of LVC delivery in this clinical context remains limited. In particular, studies exploring variation in LVC across hospitals for multiple conditions are lacking.

Analysis of such comparative performance data may aid in deimplementation through several mechanisms. First, individual hospital data can be used to calculate Achievable Benchmarks of Care (ABCs). Use of ABCs is an established method of performance measurement; ABCs represent the average performance attained by a group of top performing clinicians or hospitals.11 Striving to eradicate a particular low-value service may seem desirable but may be unrealistic, as nuanced features of individual clinical encounters may support service delivery in some scenarios;12 ABCs offer attainable deimplementation targets that are more stringent than those derived using median or mean performance and reflect actual clinical practice.12–14 Prior pediatric studies have set improvement targets using ABC methodology in several areas, including respiratory conditions,12,15,16 readmission rates,17 and a limited set of LVC measures.2

Derivation of ABCs also facilitates identification of measures for which median LVC delivery is markedly greater than a calculated ABC. Termed performance gaps, such differences suggest opportunities for learning from high-performing centers to bring overall LVC rates toward this goal.12 Identification of ABCs and performance gaps at the measure level may aid hospitals in identifying deimplementation priorities and sustaining improvement efforts toward realistic targets. Further, hospitals demonstrating low rates of LVC on a measure may disseminate deimplementation strategies to centers with less optimal performance.

Additionally, assessing overall performance across measures and identifying hospitals with a particularly high or low propensity for LVC delivery may facilitate improved understanding of institutional characteristics associated with these patterns. Such data may encourage hospitals with higher rates of LVC delivery than peer institutions to consider local drivers, invest in deimplementation efforts, and incorporate strategies used by institutions with better performance.

In this study, we applied the Pediatric Health Information System (PHIS; Children’s Hospital Association, Lenexa, KS) LVC Calculator3 to hospital encounters from July 2022–June 2024. The LVC calculator has previously been used to measure prevalence and cost of LVC,3 trends over time in LVC delivery,18 and differences in LVC rates by neighborhood opportunity.19 The current analysis represents the first to assess hospital-level variation in LVC delivery. In our study we aimed to 1) describe variation in performance across hospitals on a suite of LVC measures applicable to hospitalized patients, 2) derive ABCs and performance gaps, and 3) explore hospital-level performance to identify hospitals with consistently high or low performance across all included LVC measures.

Methods:

Data source

This study utilized the PHIS LVC Calculator, an integrated PHIS tool measuring the delivery of low-value services in the emergency department (ED) and/or inpatient settings.3 PHIS is an administrative database containing demographic, diagnostic, procedural, and billing data from 47 children’s hospitals. Data are deidentified at the time of submission and quality is jointly ensured by the Children’s Hospital Association (Lenexa, KS) and participating hospitals.

The methodology of the PHIS LVC Calculator’s development and measure specifications have been previously published.3 The calculator includes 30 LVC measures representing delivery of medications, imaging, procedures, and laboratory studies in clinical scenarios in which literature does not support their use. Four of the 30 included measures are applicable to ED encounters only; the remaining 26 measures can be applied to hospitalizations. The tool was developed by a multidisciplinary stakeholder group with measures derived from previously published measures and recommendations identifying low-value services.2,10,20–22 Measure definitions (Supplemental Table 1) were created using International Classification of Disease, 10th Revision, Clinical Modification (ICD-10-CM) diagnosis and procedural codes with an intent to create narrow, specific measure definitions. This approach utilizes multiple restrictions to best capture care that is likely to be truly low-value while avoiding misclassification of justified care practices. The tool excludes encounters for patients ≥18 years, those with ICD-10-CM codes indicating complex chronic conditions or neurologic impairment,23,24 those with All-Payer Refined Diagnosis Related Groups (3M Healthcare) indicating extreme severity of illness, and those with intensive care admission (with the exception of neonatal intensive care-specific measures). Measure-specific exclusions are applied to remove encounters with diagnostic codes that may justify delivery of a particular service.3 For example, encounters with documentation consistent with complicated pneumonia are excluded from community acquired pneumonia measures, given that they may, in some circumstances, warrant services described as low-value for more routine pneumonia cases.

Study Design

We performed a retrospective cohort study applying the PHIS LVC Calculator to eligible hospitalizations (inpatient or observation) from July 1, 2022 to June 30, 2024. This study period offers a multi-year analysis subsequent to the impact of the COVID-19 pandemic, which is known to have influenced inpatient care patterns.18,25 We anticipate that this analysis closely reflects current care patterns, thus allowing establishment of baseline ABCs to serve as comparators for ongoing assessment. Encounters were eligible if they met inclusion criteria for ≥1 LVC Calculator measure applicable to hospitalizations. Of note, for hospitalized children, care delivered following hospitalization cannot be distinguished from care delivered in the ED during the same encounter; as such, LVC delivery for the hospitalized cohort includes services rendered in the ED prior to hospitalization. We therefore excluded hospitals not submitting ED data to PHIS, as rates of utilization at these hospitals are likely not comparable to those reporting service delivery in both the ED and inpatient settings. Hospitals not contributing complete data for the entire study period were also excluded, as were encounters resulting from outside transfer, as the delivery of LVC prior to transfer could not be ascertained.

Outcome measures and statistical analysis

We first assessed aggregate rates of LVC delivery across hospitals for the entire study period by measure. We elected a priori to exclude measures from analyses and ABC calculations if the overall rate of LVC among eligible encounters was <5%, as we felt establishing further improvement goals for those measures would be unrealistic.

For each measure demonstrating LVC delivery in ≥5% of eligible encounters across the study period, we assessed the median, range, and interquartile range of LVC delivery across hospitals. Next, we utilized hospital-specific performance data to calculate ABCs using methodology described by Kiefe et al. in 1998,11,26 which has subsequently been applied within several PHIS analyses.2,12,15,16 For each measure, we first ranked included hospitals based on performance. The top-performing hospitals that together accounted for a cumulative 10% of eligible encounters were included in the ABC calculation. If the single top-performing hospital accounted for at least 10% of encounters, its performance alone was used to calculate the ABC; otherwise, hospitals were serially added until the 10% threshold was exceeded. ABCs were then calculated by dividing the number of encounters with LVC by the total number of encounters in this cohort, thereby representing the weighted average performance on each measure from these top-performing centers.26 Using this methodology, the number of hospitals contributing to the ABC varies based on relative contribution to denominator encounters.
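
The benchmarking procedure described above can be sketched as follows. This is a minimal Python illustration with made-up hospital counts; the published analysis was performed in SAS, and the function and variable names here are hypothetical, not the authors’ implementation.

```python
# Sketch of the Kiefe et al. ABC methodology: rank hospitals from lowest to
# highest LVC rate, accumulate top performers until they cover >=10% of all
# eligible encounters, then take the weighted average LVC rate of that group.

def achievable_benchmark(hospitals):
    """hospitals: list of (eligible_encounters, lvc_encounters) per hospital."""
    total = sum(eligible for eligible, _ in hospitals)
    ranked = sorted(hospitals, key=lambda h: h[1] / h[0])  # best performers first
    top_eligible = top_lvc = 0
    for eligible, lvc in ranked:
        top_eligible += eligible
        top_lvc += lvc
        if top_eligible >= 0.10 * total:  # stop once the 10% threshold is met
            break
    # Weighted average LVC rate among the included top-performing hospitals.
    return top_lvc / top_eligible

# Hypothetical data: four hospitals with LVC rates of 5%, 10%, 20%, and 40%.
# The single best hospital already covers 200/1000 = 20% of encounters, so the
# ABC equals that hospital's rate.
hospitals = [(200, 10), (300, 30), (250, 50), (250, 100)]
print(achievable_benchmark(hospitals))  # 0.05
```

Note that, as in the study, the number of hospitals included depends entirely on how encounter volume is distributed across centers.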

For each measure, we calculated the difference between median performance across all hospitals and the calculated ABC to establish the performance gap. The larger the performance gap, the greater the potential for improvement for an individual measure.12
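
As a worked arithmetic example of this subtraction, using the median and ABC reported in Table 1 for C-reactive protein and/or erythrocyte sedimentation rate testing in community acquired pneumonia:

```python
# Performance gap = median hospital-level LVC rate minus the ABC.
median_rate = 42.0  # median LVC delivery across hospitals, % (Table 1)
abc = 3.4           # achievable benchmark of care, % (Table 1)
gap = round(median_rate - abc, 1)
print(gap)  # 38.6, the largest performance gap in the study
```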

Finally, we sought to explore hospital performance consistency across measures. We utilized the range of LVC delivery across hospitals for each measure to identify performance quartiles, with 1st quartile performance indicating the lowest LVC delivery (best performance) and 4th quartile performance indicating the highest LVC delivery (worst performance). We then assigned each hospital separate performance scores by tallying the number of measures for which it fell into the first or fourth quartiles. We utilized a scatter plot to visualize hospital-based performance aggregated across all measures and to identify apparent performance outliers. Analyses were performed using SAS version 9.4 (SAS Institute, Cary, NC) and Microsoft Excel version 2408. This study was deemed non-human subjects research by Dartmouth College.
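
One plausible reading of this quartile tally can be sketched in Python. This is a hypothetical illustration, not the published SAS implementation: it assumes quartiles are defined over the observed range of hospital rates for each measure, and all hospital names and cut-point details are invented for the example.

```python
# For each measure, bin hospitals into quartiles of the observed LVC-rate range
# (Q1 = lowest LVC, best; Q4 = highest LVC, worst), then score each hospital by
# how many measures place it in Q1 or Q4.
from collections import defaultdict

def quartile_scores(rates_by_measure):
    """rates_by_measure: {measure: {hospital: lvc_rate}} -> {hospital: (q1, q4)}"""
    q1 = defaultdict(int)
    q4 = defaultdict(int)
    for rates in rates_by_measure.values():
        lo, hi = min(rates.values()), max(rates.values())
        span = hi - lo
        for hospital, rate in rates.items():
            if rate <= lo + 0.25 * span:       # bottom quarter of the range
                q1[hospital] += 1
            elif rate >= lo + 0.75 * span:     # top quarter of the range
                q4[hospital] += 1
    hospitals = {h for rates in rates_by_measure.values() for h in rates}
    return {h: (q1[h], q4[h]) for h in hospitals}

# Toy data: hospital A is consistently best, C consistently worst.
data = {
    "chest_xray_bronchiolitis": {"A": 0.10, "B": 0.30, "C": 0.45},
    "blood_culture_cap":        {"A": 0.05, "B": 0.40, "C": 0.55},
}
scores = quartile_scores(data)
print(scores["A"], scores["C"])  # (2, 0) (0, 2)
```

A scatter plot of these (Q1 count, Q4 count) pairs, as in Figure 2, then makes consistently high or low performers visually apparent.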

Results:

Of the 26 measures in the LVC Calculator applicable to hospitalized patients, 10 were excluded because their overall LVC rates were <5% over the study period. Two hospitals were excluded because they lacked data for all included measures; one hospital was excluded because it does not submit ED data to PHIS. There were 401,683 encounters eligible for ≥1 of the remaining 16 measures across 43 hospitals.

LVC Delivery Rates

Box plots depicting median rates, ranges, and interquartile ranges of LVC delivery for all measures are shown in Figure 1. The highest median LVC delivery was noted for administration of broad-spectrum antibiotics (defined as those broader than amoxicillin or ampicillin alone or in combination with beta-lactamase inhibitors) in patients with community acquired pneumonia (60%), while the lowest median LVC delivery was for blood cultures in bronchiolitis (5%). For many measures, wide variation in LVC was noted across hospitals. Electrolyte testing for febrile seizures demonstrated the largest IQR (31%), followed by complete blood count for febrile seizures (29%), blood cultures for community acquired pneumonia (25%), and C-reactive protein and/or erythrocyte sedimentation rate for community acquired pneumonia (24%).

Figure 1.

Variation in hospital-level LVC delivery

a. Yellow “X” indicates median; red “∘” indicates Achievable Benchmark of Care

b. Abbreviations used: CAP = Community Acquired Pneumonia; CBC = Complete Blood Count; GERD = Gastroesophageal Reflux Disease; CRP = C-reactive protein; ESR = erythrocyte sedimentation rate; CT = Computed Tomography

ABCs and Performance Gap

ABCs and performance gaps for each measure are shown in Table 1. The number of hospitals contributing to the ABC ranged from one (for a single measure in which one hospital accounted for >10% of eligible encounters) to 12. ABCs were highest for broad-spectrum antibiotics for community acquired pneumonia (36%), chest x-ray for asthma (28%), and complete blood count for febrile seizure (26%). Five measures had an ABC <5%, indicating the potential to achieve very low rates of LVC delivery. Performance gaps were greatest for C-reactive protein and/or erythrocyte sedimentation rate for community acquired pneumonia (39%), electrolyte testing in patients with febrile seizure (38%), and blood cultures in community acquired pneumonia (35%). The remaining community acquired pneumonia measure, broad-spectrum antibiotics, also had a relatively large performance gap at 24%. The performance gap for acid suppression therapy for gastroesophageal reflux in infants (29%) was also among the highest.

Table 1.

Achievable Benchmarks of Care and Performance Gaps Across Hospitals

Measure Eligible Encounters Median* 25th Percentile 75th Percentile Hospitals contributing encounter data to calculate Achievable Benchmark of Care** (N) Achievable Benchmark of Care Performance Gap (Rank)
Broad spectrum antibiotics for CAP 12252 60.3 47.3 67.1 6 36.0 24.3 (6)
CBC for simple febrile seizure 1107 50.0 33.3 62.5 9 25.6 24.4 (5)
Acid suppression for GERD 14213 47.9 38.3 58.6 6 19.0 28.8 (4)
Electrolytes for simple febrile seizure 1107 47.2 31.3 62.1 9 9.4 37.8 (2)
Chest x-ray for asthma 47774 42.0 31.4 45.8 8 27.6 14.4 (9)
CRP and/or ESR in CAP 12273 42.0 34.2 58.5 1 3.4 38.6 (1)
Blood culture for CAP 12298 41.3 27.8 52.4 5 6.7 34.6 (3)
Bronchodilators for bronchiolitis 45701 36.4 30.9 45.1 4 13.5 23.0 (7)
Chest x-ray for bronchiolitis 45701 32.3 24.6 37.1 5 17.1 15.1 (8)
Concurrent antipsychotic medications 41530 15.8 11.2 21.4 12 8.8 6.9 (13)
CT for abdominal pain 14923 14.8 11.3 18.8 5 6.0 8.7 (12)
Corticosteroids for bronchiolitis 45701 14.5 11.4 17.5 8 8.4 6.1 (14)
CT for first generalized afebrile atraumatic seizure 10523 13.3 7.9 19.3 2 3.0 10.4 (10)
CT for simple febrile seizure 1323 9.5 0.0 14.3 9 0 9.5 (11)
Antibiotics for viral upper respiratory infection 60701 5.7 4.1 8.1 3 2.2 3.4 (16)
Blood cultures for bronchiolitis 40411 5.3 3.5 8.3 7 1.5 3.8 (15)
*

Medians are ranked in descending order, with the highest median listed first.

**

Represents the number of hospitals whose encounters were included in the ABC calculation (i.e. encompassed the top 10% performance).

Abbreviations used: CAP = Community Acquired Pneumonia; CBC = Complete Blood Count; GERD = Gastroesophageal Reflux Disease; CRP = C-reactive protein; ESR = erythrocyte sedimentation rate; CT = Computed Tomography

Hospital Performance Across Measures

Figure 2 demonstrates the relationship between measures with top-quartile performance and bottom-quartile performance for each included hospital. One hospital demonstrated top-quartile performance, indicating low rates of LVC, for 12 measures and bottom-quartile performance for none, while 6 additional hospitals demonstrated top-quartile performance for ≥8 measures and bottom-quartile performance for ≤2 measures. Six hospitals demonstrated bottom-quartile performance for ≥8 measures and top-quartile performance for ≤3 measures.

Figure 2:

Scatter Plot of 1st and 4th Quartile Performance by Hospital

a. Larger diamonds indicate that more than one hospital had the same performance; the accompanying number indicates how many hospitals share that point

Discussion:

In this cross-sectional study applying the PHIS LVC Calculator to pediatric hospitalizations, we identified wide variation in LVC delivery across hospitals for some services. The derived ABCs and associated performance gaps can be used to inform deimplementation efforts. We identified some measures with large performance gaps, signifying that deimplementation of these services across PHIS hospitals may be achieved through application of strategies utilized at high-performing centers. Initial exploration of hospital performance across measures identified some hospitals with consistently high or low performance, warranting further assessment of drivers and outcomes associated with these patterns.

Assessing and applying comparative performance data has long been identified as a key element of QI initiatives in healthcare. Such data allow identification of high performing clinicians or hospitals, which can facilitate knowledge sharing to optimize success for others. Additionally, those demonstrating suboptimal performance may be motivated to improve as they recognize their standing among peers.27 Such data are felt to be particularly critical to successful deimplementation efforts, as “normalizing” change through data sharing and peer benchmarking are proposed tactics to address the unique challenges associated with discontinuing well-established practices.2830 However, limitations in data sources and analytic resources are often barriers to continual data provision supporting improvement work.

Unwarranted variation in care delivery, or variation unrelated to patient need or preferences, may signal concern for suboptimal care quality and opportunity for standardization.31,32 By quantifying the extent of variation present across hospitals and establishing attainable performance targets, the current analysis could help to guide and prioritize evidence-based improvement efforts to reduce LVC. Measures with notable performance gaps included laboratory testing in febrile seizures and community acquired pneumonia, as well as acid suppression use in gastroesophageal reflux disease, among others. Notably, bronchodilators and chest x-ray in bronchiolitis also had relatively large performance gaps, at 23% and 15%, respectively. Guidelines recommending against routine use of these services have been in place for over a decade, and associated deimplementation efforts have been robust.33–36 Despite this, our data suggest that targeted deimplementation efforts informed by the experiences of hospitals with strong performance could realistically yield meaningful reductions in these high-volume services.

Our analysis additionally establishes data-driven aims for this set of LVC measures. While setting a specific aim is a central part of QI design, performance targets are often set subjectively. Without objective, realistic targets, a team may begin to make progress in a particular area and shift focus before full improvement potential is reached, or alternatively may aim for an unrealistic goal and experience dissatisfaction if that goal cannot be achieved. ABCs serve as clinically relevant, achievable targets to guide QI efforts. While eradication is not a realistic expectation for most low-value services, 5 measures included in our analysis had ABCs <5%, indicating that very low rates of delivery may be attainable. Measures with high ABCs, such as broad-spectrum antibiotics in community acquired pneumonia (ABC 36%), reflect higher overall rates of LVC delivery across the network for these services and few hospitals demonstrating superior performance. A high ABC should not suggest that performance cannot, or should not, be improved for such measures; rather, it suggests that novel methods not currently employed by PHIS hospitals will likely be needed to alter performance. Continued measurement of ABCs over time should aid in assessment of performance changes and ongoing adjustment of achievable deimplementation goals.

The current analysis offers an initial exploration of hospital-level LVC delivery patterns. This preliminary analysis identified hospitals whose top- and bottom-quartile performance on included measures diverged from the remainder of the cohort. These findings warrant next steps. First, development of a more refined approach to measuring aggregate hospital-level performance on the LVC Calculator will aid in establishing and tracking broader hospital performance in this domain of healthcare value. While the development of a valid composite performance measure is a complex undertaking in the setting of variable LVC prevalence and encounter volumes, such a measure may foster information sharing from high-performers and motivate those with lower performance to focus on improvement opportunities. Additionally, future quantitative and qualitative analyses may further elucidate structural or cultural drivers of high and low LVC performance. While some hospital-based characteristics, such as geographic location, teaching status, and payor mix have been implicated as potential influences on LVC delivery,37,38 variation in performance among this relatively similarly structured set of PHIS-participating hospitals raises questions surrounding other local influences, which may include provider attitudes surrounding LVC and availability of deimplementation resources, among others.

Our study has several limitations. The LVC Calculator relies on administrative data, which are limited in their ability to assess nuanced drivers of LVC delivery. While the tool applies stringent exclusion criteria to reduce misclassification of appropriate care, it is possible that some warranted care has been identified as low-value; conversely, our narrow measure definitions may underestimate true rates of LVC.

While ABCs serve as objective performance targets, they may not represent the ideal rate of delivery for all measures. Ongoing assessment of clinical outcomes associated with reduction in delivery of the services assessed in this analysis is warranted. Additionally, inherent to the methodology of ABCs, the number of hospitals contributing to measure-specific ABCs is variable and dependent on encounter volumes. In our cohort, the ABCs for three measures incorporated ≤3 hospitals; this group included the measure with the largest performance gap (CRP and/or ESR in community acquired pneumonia), which was determined by the performance of a single hospital. It is possible that patient- or hospital-level differences may exist between these centers and others in the PHIS network, which may influence peer hospital ability to reach this performance level. Particularly for measures with ABCs derived from a small cohort of hospitals, clinicians may consider selection of a less stringent initial performance target, such as 25th percentile performance, while monitoring progression of network-level ABCs over time.

While we propose that performance variation is one characteristic that should be considered in prioritizing deimplementation efforts, other characteristics not considered in this analysis, including absolute volume of LVC delivered, degree of potential harm and costs associated with a service, and strength of the literature supporting the recommendation should also be considered. Our analysis of consistency in hospital performance considered only measures in which hospitals were top- or bottom-quartile performers and did not account for the disparate eligible encounter volumes; a more comprehensive approach to overall hospital performance determination, including the development of a structured composite performance index, is an important next step in characterizing hospital-level LVC performance and may yield differing results. We did not evaluate associations between specific hospital characteristics and LVC performance; such an analysis may offer further insight into performance drivers. Finally, the LVC Calculator is applicable only to data from US children’s hospitals participating in PHIS; our findings may not be generalizable to other settings.

Conclusions:

We identified variation in performance on a suite of LVC measures between hospitals, with some measures demonstrating a median performance across included PHIS hospitals well above achievable benchmarks, suggesting ample improvement opportunity. Considering performance gaps when prioritizing areas for deimplementation and utilizing strategies employed by high-performing hospitals may aid in successful deimplementation across the PHIS network. For measures with persistent LVC and high ABCs, novel deimplementation methods should be considered. Focused qualitative and quantitative investigations of the structure and culture of hospitals with notably strong or weak performance across measures may elucidate drivers and outcomes associated with these patterns.

Supplementary Material

Supplemental Table 1

Supplemental Table 1 Alternative text

A table listing each low-value care calculator measure with contextual information regarding measure development, including encounter inclusion and exclusion criteria.

Article Summary:

This analysis used hospital-level performance data to calculate Achievable Benchmarks of Care for 16 low-value care measures and assessed how median performance compared to benchmarks.

What’s Known on This Subject:

Low-value care remains prevalent in children’s hospitals. Utilization of hospital-level performance data on low-value care measures may aid in identification of services with realistic improvement potential and in setting performance improvement goals.

What this Study Adds:

We calculated Achievable Benchmarks of Care and performance gaps (i.e. difference between median performance and achievable benchmarks) for 16 low-value care measures. Large performance gaps for several measures suggest improvement potential. Across-measure performance consistency identified for some hospitals warrants investigation.

Funding/Support:

The work reported in the manuscript was not funded. Dr. House receives support from the National Institute of General Medical Sciences of the National Institutes of Health under Award Number P20GM148278.

Role of Funder:

The National Institutes of Health had no role in the design or conduct of this study. The content presented in this manuscript is solely the responsibility of the authors.

Abbreviations:

LVC

Low-value care

PHIS

Pediatric Health Information System

ABC

Achievable Benchmarks of Care

ICD-10-CM

International Classification of Disease, 10th Revision, Clinical Modification

ED

Emergency Department

QI

Quality improvement

Footnotes

Conflicts of Interest: The authors have no conflicts of interest to disclose.

References:

  • 1.Verkerk EW, Tanke MAC, Kool RB, van Dulmen SA, Westert GP. Limit, lean or listen? A typology of low-value care that gives direction in de-implementation. Int J Qual Health Care. Nov 1 2018;30(9):736–739. doi: 10.1093/intqhc/mzy100 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Reyes M, Paulus E, Hronek C, et al. Choosing Wisely Campaign: Report Card and Achievable Benchmarks of Care for Children’s Hospitals . Hospital pediatrics. Nov 2017;7(11):633–641. doi: 10.1542/hpeds.2017-0029 [DOI] [PubMed] [Google Scholar]
  • 3.House SA, Hall M, Ralston SL, et al. Development and Use of a Calculator to Measure Pediatric Low-Value Care Delivered in US Children’s Hospitals. JAMA Netw Open. Dec 1 2021;4(12):e2135184. doi: 10.1001/jamanetworkopen.2021.35184 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Berwick DM, Hackbarth AD. Eliminating waste in US health care. Jama. Apr 11 2012;307(14):1513–6. doi: 10.1001/jama.2012.362 [DOI] [PubMed] [Google Scholar]
  • 5.Shrank WH, Rogstad TL, Parekh N. Waste in the US Health Care System: Estimated Costs and Potential for Savings. Jama. Oct 7 2019;doi: 10.1001/jama.2019.13978 [DOI] [PubMed] [Google Scholar]
  • 6.Butler AM, Brown DS, Durkin MJ, et al. Association of Inappropriate Outpatient Pediatric Antibiotic Prescriptions With Adverse Drug Events and Health Care Expenditures. JAMA Netw Open. May 2 2022;5(5):e2214153. doi: 10.1001/jamanetworkopen.2022.14153 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Chalmers K, Gopinath V, Brownlee S, Saini V, Elshaug AG. Adverse Events and Hospital-Acquired Conditions Associated With Potential Low-Value Care in Medicare Beneficiaries. JAMA Health Forum. Jul 2021;2(7):e211719. doi: 10.1001/jamahealthforum.2021.1719 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8. Ganguli I, Simpkin AL, Lupo C, et al. Cascades of Care After Incidental Findings in a US National Survey of Physicians. JAMA Netw Open. Oct 2 2019;2(10):e1913325. doi: 10.1001/jamanetworkopen.2019.13325
  • 9. Brownlee SM, Korenstein D. Better understanding the downsides of low value healthcare could reduce harm. BMJ. Mar 23 2021;372:n117. doi: 10.1136/bmj.n117
  • 10. Chua KP, Schwartz AL, Volerman A, Conti RM, Huang ES. Use of Low-Value Pediatric Services Among the Commercially Insured. Pediatrics. Dec 2016;138(6). doi: 10.1542/peds.2016-1809
  • 11. Weissman NW, Allison JJ, Kiefe CI, et al. Achievable benchmarks of care: the ABCs of benchmarking. J Eval Clin Pract. Aug 1999;5(3):269–81. doi: 10.1046/j.1365-2753.1999.00203.x
  • 12. Ralston SL, House SA, Harrison W, Hall M. The Evolution of Quality Benchmarks for Bronchiolitis. Pediatrics. Sep 2021;148(3). doi: 10.1542/peds.2021-050710
  • 13. Ralston S, Parikh K, Goodman D. Benchmarking Overuse of Medical Interventions for Bronchiolitis. JAMA Pediatr. Sep 2015;169(9):805–6. doi: 10.1001/jamapediatrics.2015.1372
  • 14. Parikh K, Agrawal S. Establishing superior benchmarks of care in clinical practice: a proposal to drive achievable health care value. JAMA Pediatr. Apr 2015;169(4):301–2. doi: 10.1001/jamapediatrics.2014.3580
  • 15. Parikh K, Hall M, Mittal V, et al. Establishing benchmarks for the hospitalized care of children with asthma, bronchiolitis, and pneumonia. Pediatrics. Sep 2014;134(3):555–62. doi: 10.1542/peds.2014-1052
  • 16. Reyes MA, Etinger V, Hronek C, et al. Pediatric Respiratory Illnesses: An Update on Achievable Benchmarks of Care. Pediatrics. Aug 1 2023;152(2). doi: 10.1542/peds.2022-058389
  • 17. Montalbano A, Quinonez RA, Hall M, et al. Achievable Benchmarks of Care for Pediatric Readmissions. J Hosp Med. Sep 2019;14(9):534–540. doi: 10.12788/jhm.3201
  • 18. House SA, Marin JR, Coon ER, et al. Trends in Low-Value Care Among Children’s Hospitals. Pediatrics. Jan 1 2024;153(1). doi: 10.1542/peds.2023-062492
  • 19. Ugalde I, McCoy E, Marin JR, Hall M, Goyal MK, Molloy MJ, Stephens JR, Steiner MJ, Tchou MJ, Markham JL, Cotter JM, Noelke C, Morse R, House SA. Association of Childhood Opportunity Index with Low-Value Care in Children’s Hospitals.
  • 20. ABIM Foundation. Choosing Wisely. Accessed October 1, 2018. https://www.choosingwisely.org/clinician-lists/
  • 21. House SA, Coon ER, Schroeder AR, Ralston SL. Categorization of National Pediatric Quality Measures. Pediatrics. Apr 2017;139(4). doi: 10.1542/peds.2016-3269
  • 22. Agency for Healthcare Research and Quality. Pediatric Quality Measures Program. Accessed October 1, 2018. https://www.ahrq.gov/pqmp/measures/all-pqmp-measures.html
  • 23. Feudtner C, Feinstein JA, Zhong W, Hall M, Dai D. Pediatric complex chronic conditions classification system version 2: updated for ICD-10 and complex medical technology dependence and transplantation. BMC Pediatr. Aug 8 2014;14:199. doi: 10.1186/1471-2431-14-199
  • 25.Schroeder AR, Dahlen A, Purington N, et al. Healthcare utilization in children across the care continuum during the COVID-19 pandemic. PloS one. 2022;17(10):e0276461. doi: 10.1371/journal.pone.0276461 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25. Schroeder AR, Dahlen A, Purington N, et al. Healthcare utilization in children across the care continuum during the COVID-19 pandemic. PLoS One. 2022;17(10):e0276461. doi: 10.1371/journal.pone.0276461
  • 26. Kiefe CI, Weissman NW, Allison JJ, Farmer R, Weaver M, Williams OD. Identifying achievable benchmarks of care: concepts and methodology. Int J Qual Health Care. Oct 1998;10(5):443–7. doi: 10.1093/intqhc/10.5.443
  • 27. Harris AHS, Hagedorn HJ, Finlay AK. Delta Studies: Expanding the Concept of Deviance Studies to Design More Effective Improvement Interventions. J Gen Intern Med. Feb 2021;36(2):280–287. doi: 10.1007/s11606-020-06199-x
  • 28. McDaniel CE, House SA, Ralston SL. Behavioral and Psychological Aspects of the Physician Experience with Deimplementation. Pediatr Qual Saf. Jan-Feb 2022;7(1):e524. doi: 10.1097/pq9.0000000000000524
  • 29. Gangathimmaiah V, Drever N, Evans R, et al. What works for and what hinders deimplementation of low-value care in emergency medicine practice? A scoping review. BMJ Open. Nov 9 2023;13(11):e072762. doi: 10.1136/bmjopen-2023-072762
  • 30. Schondelmeyer AC, Bettencourt AP, Xiao R, et al. Evaluation of an Educational Outreach and Audit and Feedback Program to Reduce Continuous Pulse Oximetry Use in Hospitalized Infants With Stable Bronchiolitis: A Nonrandomized Clinical Trial. JAMA Netw Open. Sep 1 2021;4(9):e2122826. doi: 10.1001/jamanetworkopen.2021.22826
  • 31. Sirovich BE, Gottlieb DJ, Welch HG, Fisher ES. Regional variations in health care intensity and physician perceptions of quality of care. Ann Intern Med. May 2 2006;144(9):641–9. doi: 10.7326/0003-4819-144-9-200605020-00007
  • 32. Bronner KK, Goodman DC. The Dartmouth Atlas of Health Care - bringing health care analyses to health systems, policymakers, and the public. Res Health Serv Reg. Jul 27 2022;1(1):6. doi: 10.1007/s43999-022-00006-2
  • 33. Tyler A, Krack P, Bakel LA, et al. Interventions to Reduce Over-Utilized Tests and Treatments in Bronchiolitis. Pediatrics. Jun 2018;141(6). doi: 10.1542/peds.2017-0485
  • 34. Berg K, Nedved A, Richardson T, Montalbano A, Michael J, Johnson M. Actively Doing Less: Deimplementation of Unnecessary Interventions in Bronchiolitis Care Across Urgent Care, Emergency Department, and Inpatient Settings. Hosp Pediatr. May 2020;10(5):385–391. doi: 10.1542/hpeds.2019-0284
  • 35. Mussman GM, Lossius M, Wasif F, et al. Multisite Emergency Department Inpatient Collaborative to Reduce Unnecessary Bronchiolitis Care. Pediatrics. Feb 2018;141(2). doi: 10.1542/peds.2017-0830
  • 36. Ralston S, Comick A, Nichols E, Parker D, Lanter P. Effectiveness of quality improvement in hospitalization for bronchiolitis: a systematic review. Pediatrics. Sep 2014;134(3):571–81. doi: 10.1542/peds.2014-1036
  • 37. Ganguli I, Morden NE, Yang CW, Crawford M, Colla CH. Low-Value Care at the Actionable Level of Individual Health Systems. JAMA Intern Med. Nov 1 2021;181(11):1490–1500. doi: 10.1001/jamainternmed.2021.5531
  • 38. Hoshiko S, Smith D, Fan C, Jones CR, McNeel SV, Cohen RA. Trends in CT scan rates in children and pregnant women: teaching, private, public and nonprofit facilities. Pediatr Radiol. May 2014;44(5):522–8. doi: 10.1007/s00247-014-2881-8

Associated Data


Supplementary Materials

Supplemental Table 1

A table listing each low-value care calculator measure with contextual information regarding measure development, including encounter inclusion and exclusion criteria.
