Health Services Research. 2013 Jul 5;49(1):52–74. doi: 10.1111/1475-6773.12085

Partial and Incremental PCMH Practice Transformation: Implications for Quality and Costs

Michael L Paustian, Jeffrey A Alexander, Darline K El Reda, Chris G Wise, Lee A Green, Michael D Fetters
PMCID: PMC3922466  PMID: 23829322

Abstract

Objective: To examine the associations between partial and incremental implementation of the Patient Centered Medical Home (PCMH) model and measures of cost and quality of care.

Data Source: We combined validated, self-reported PCMH capabilities data with administrative claims data for a diverse statewide population of 2,432 primary care practices in Michigan. These data were supplemented with contextual data from the Area Resource File.

Study Design: We measured medical home capabilities in place as of June 2009 and change in medical home capabilities implemented between July 2009 and June 2010. Generalized estimating equations were used to estimate the mean effect of these PCMH measures on total medical costs and quality of care delivered in physician practices between July 2009 and June 2010, while controlling for potential practice, patient cohort, physician organization, and practice environment confounders.

Principal Findings: Based on the observed relationships for partial implementation, full implementation of the PCMH model is associated with a 3.5 percent higher quality composite score, a 5.1 percent higher preventive composite score, and $26.37 lower per member per month medical costs for adults. Full PCMH implementation is also associated with a 12.2 percent higher preventive composite score, but no reductions in costs for pediatric populations. Incremental improvements in PCMH model implementation yielded similar positive effects on quality of care for both adult and pediatric populations but were not associated with cost savings for either population.

Conclusions: Estimated effects of the PCMH model on quality and cost of care appear to improve with the degree of PCMH implementation achieved and with incremental improvements in implementation.

Keywords: PCMH, medical home, cost, quality


Policy makers and payers have begun to engage with primary care providers in testing an alternative model of practice organization and orientation under the rubric of the “Patient Centered Medical Home (PCMH).” The PCMH is defined as a holistic and integrated model of primary care designed to improve the processes and outcomes of health care, including increasing value, improving responsiveness to patients, and improving outcomes in the areas of access, timeliness, patient centeredness, safety, equity, and efficiency (Patient-Centered Primary Care Collaborative 2007). The recently passed federal health care reform bill, the Patient Protection and Affordable Care Act (HR3590), includes federal PCMH demonstration programs, and PCMH implementation is under way in a wide variety of practice settings across the country (Patient-Centered Primary Care Collaborative 2009; Fields, Leshen, and Patel 2010; The Commonwealth Fund 2011). Despite enthusiasm to rapidly pilot and scale up the PCMH model, including efforts to incorporate PCMH into state Medicaid programs, attempts to empirically examine the effects of PCMH on quality and cost-related outcomes are still in the early stages and are often focused on small and self-selected samples of physician practices.

A central problem in PCMH-effectiveness studies is determining when, or to what degree, PCMH has been implemented (Alexander and Bae 2012; Hoff, Weller, and DePuccio 2012). In some studies implementation is assumed but not measured, whereas in other cases, it is measured in terms of degree but only in “check the box” fashion. Because of the high potential for measurement error, both are problematic for studying the effects of PCMH. In our study we assume that the complex, multicomponent nature of PCMH increases the probability that components of the PCMH model will be implemented incrementally rather than at a single point in time. We further assume that the “path” to full implementation of PCMH will likely vary among physician practices. For example, some practices might initially implement extended access and then add individual care management, whereas others might start with developing a patient registry followed by e-prescribing, etc. As the PCMH model assumes its constituent components will function as a system of care, it is not clear how cost and quality outcomes will be affected if complete implementation is not achieved, or if different paths to implementation will result in similar effects on quality and cost outcomes.

Our study attempts to advance PCMH research by examining the following research questions: (1) Is partial implementation of the PCMH model associated with outcomes related to primary care cost and quality of care? (2) Are incremental changes in implementation of PCMH associated with outcomes related to primary care cost and quality of care? and (3) Do different approaches to PCMH implementation show similar associations with outcomes related to primary care cost and quality of care?

Background

The current body of research concerning implementation of PCMH among physician practices is limited, although several recent studies have examined the adoption of components of the PCMH model (Rittenhouse et al. 2008, 2011). These studies suggest that PCMH elements such as whole-person orientation and continuity of care through use of personal physicians have been more readily adopted (Goldberg and Kuzel 2009). However, processes related to care coordination and integration, enhanced access, team-based care, and support from appropriate information systems have not been adopted as broadly (Audet, Davis, and Schoenbaum 2006; Rittenhouse et al. 2008; Friedberg et al. 2009).

Several studies have linked medical homes with improved health indicators. In general, these studies suggest that medical homes are associated with increased patient satisfaction (Palfrey et al. 2004; Gill et al. 2005; DeVoe et al. 2008; Reid et al. 2009) and decreased hospitalizations and emergency room visits (Martin et al. 2007; Cooley et al. 2009; Rankin et al. 2009). Studies have also identified improvements in quality (Rankin et al. 2009; Reid et al. 2009) and reduced clinician burnout (Reid et al. 2009). Findings have been more mixed regarding PCMH impact on reducing medical costs (McBurney, Simpson, and Darden 2004; Damiano et al. 2006).

Although promising, conclusions from these PCMH-effectiveness evaluations warrant further discussion, given differences in PCMH measurement approaches. Some studies directly measured the degree of “medical homeness,” whereas others simply assumed the presence of a medical home (e.g., Cooley et al. 2009). Other studies measured PCMH by using questions from the Consumer Assessment of Healthcare Providers and Systems Survey or by measuring elements of PCMH defined by the specific study. The variation and limitations in current PCMH measurement approaches, coupled with the relatively small sample sizes of many PCMH studies, make it difficult to compare findings across studies and to draw conclusions about PCMH effects. Additional investigation using larger samples and more granular PCMH measurement is necessary to assess how the degree of PCMH implementation affects patient care outcomes, including whether this care delivery model will lead to better processes and outcomes across a variety of practice types.

In this study we address the question of whether PCMH is an “all or nothing” form of care delivery by examining the hypothesis that more extensive implementation of the PCMH model will be positively associated with indicators of lower cost and higher quality of care. Underlying this hypothesis are the assumptions that partial benefit of PCMH on patient outcomes will be obtained even if total implementation of all components of the model is not in place, and that the approach (or sequencing) to implementing components of the PCMH model can be tailored to the circumstances of each primary care practice. We also test the proposition that incremental progress in the implementation (regardless of the initial level of implementation) of the PCMH model will be associated with lower cost and higher quality of care.

Methods

Sample

The physician practice sample in this study comes from the population of 4,192 practices in Michigan with at least one physician affiliated with a physician organization participating in the Blue Cross Blue Shield of Michigan (BCBSM) Physician Group Incentive Program (PGIP) in 2010. Within PGIP, primary care practices are eligible for two PCMH-related incentives: (1) partial reimbursement for PCMH capability implementation and (2) 10 percent office visit and preventive visit fee enhancements for practices that achieve significant PCMH capability implementation in combination with delivering high quality of care (Share and Mason 2012). Among these practices, 2,494 self-reported at least one primary care physician (PCP). These practices account for 5,750 primary care physicians or 65 percent of all primary care physicians practicing in Michigan (Michigan Department of Community Health 2010), and operate in 82 of Michigan’s 83 counties. Because our research questions focus on the effects of PCMH transformation on primary care outcomes, a practice needed at least one primary care physician and must have reported medical home capabilities in both June 2009 and June 2010 to be included. Practices were excluded if more than one half of the physicians in the practice were nonprimary care specialists. Application of inclusion and exclusion criteria resulted in a study group of 2,432 practices providing primary care to approximately 1.5 million BCBSM members.
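The inclusion and exclusion criteria above amount to a straightforward practice-level filter. The sketch below illustrates that logic in Python on a hypothetical practice roster; the column names and example values are illustrative placeholders, not the actual BCBSM Self-Reported Database fields.

```python
import pandas as pd

# Hypothetical practice roster, one row per practice. Column names are invented
# for illustration and do not reflect the real BCBSM data schema.
practices = pd.DataFrame({
    "practice_id": [1, 2, 3, 4],
    "n_pcp": [2, 0, 1, 3],             # self-reported primary care physicians
    "n_physicians": [2, 4, 5, 3],      # all physicians in the practice
    "reported_june_2009": [True, True, False, True],
    "reported_june_2010": [True, True, True, True],
})

eligible = practices[
    (practices["n_pcp"] >= 1)                                            # at least one PCP
    & practices["reported_june_2009"] & practices["reported_june_2010"]  # capabilities reported at both time points
    & (practices["n_pcp"] / practices["n_physicians"] >= 0.5)            # not majority non-primary-care specialists
]
print(eligible["practice_id"].tolist())
```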

Data Sources

Data for this research were obtained from (1) the BCBSM Self-Reported Database (SRD); (2) BCBSM member enrollment data files; (3) BCBSM claims data; and (4) the Health Resources and Services Administration (HRSA) Area Resource File 2009–2010 (ARF 2010). Physician organizations report practice and physician information to BCBSM through the SRD semiannually. This database includes physician demographic information, physician organization and practice affiliation (i.e., which physicians make up each practice), a primary care physician indicator, practice PCMH capabilities based on interpretive guideline case definitions (Blue Cross Blue Shield of Michigan 2010a), and approximate date of implementation for each capability. BCBSM field staff trained in process improvement conducted 235 practice site visits between June 2009 and June 2010 to validate capability reporting and provide educational guidance. To address potential overreporting that may be encouraged through the incentive design, practices were required to demonstrate functional use of capabilities during the site visit, such as a populated patient registry usable at the point of care.

We used enrollment information to obtain demographic data on members who received care at these practices and administrative claims data for the services received. An administrative claims-based retrospective primary care member attribution algorithm was applied to determine the BCBSM commercial member cohort, denominator population, for each practice. Members were assigned based on 24 months of outpatient and office-based claims history, beginning July 1, 2008 and ending June 30, 2010, with attribution assigned to the physician who had the most evaluation and management office or preventive visits with that member; ties were resolved by assigning members to the physician with the most recent evaluation and management office or preventive visit with that member or to the physician who rendered the most total services for that member.
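As a concrete illustration of the attribution rule described above, the following sketch ranks each member's physicians by evaluation and management visit count, breaking ties first by the most recent visit and then by total services rendered. It is a simplified reading of the algorithm with hypothetical column names and data, not the production BCBSM attribution code.

```python
import pandas as pd

# Hypothetical claim-level visit history for two members.
claims = pd.DataFrame({
    "member_id":    [1, 1, 1, 2, 2],
    "physician_id": ["A", "A", "B", "B", "C"],
    "em_or_preventive_visits": [2, 1, 3, 1, 1],
    "most_recent_visit": pd.to_datetime(
        ["2010-03-01", "2009-11-15", "2010-05-20", "2010-01-10", "2010-01-10"]),
    "total_services": [5, 2, 4, 3, 6],
})

# Aggregate the 24-month history to one row per member-physician pair.
summed = (claims
          .groupby(["member_id", "physician_id"], as_index=False)
          .agg(visits=("em_or_preventive_visits", "sum"),
               recent=("most_recent_visit", "max"),
               services=("total_services", "sum")))

# Rank physicians within each member: most visits, then most recent visit,
# then most total services; keep the top-ranked physician per member.
attributed = (summed
              .sort_values(["member_id", "visits", "recent", "services"],
                           ascending=[True, False, False, False])
              .groupby("member_id", as_index=False)
              .first())
print(attributed[["member_id", "physician_id"]])
```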

We incorporated items from HRSA’s Area Resource File to address potential confounding from market area socioeconomic or demographic sources not available through the SRD or through BCBSM administrative data.

Measures

PCMH Implementation Score

We developed a practice-level PCMH implementation score to reflect the degree of PCMH implementation across 13 domains of function (Table 1) by combining SRD PCMH capability information in a multistage process. In the first stage, capabilities reported as “fully in place” were assigned a value of 1, whereas capabilities reported as “not in place” were assigned a value of 0. When capabilities had multiple gradients, the capability score was calculated as a proportion of the maximum gradient. For example, the Patient–Provider Agreement domain (Appendix A) asked respondents to identify the percentage of their patient population with documented patient–provider agreements from the following options: 10, 30, 50, 60, 80, or 90 percent. A response of 30 percent for this capability was assigned a value of 0.33 (0.3/0.9). In the second stage, domain-specific scores were calculated by summing all capability scores within the domain and dividing by the maximum possible score, that is, the number of distinct capabilities within that domain. Lastly, we calculated the overall PCMH implementation score as the mean of the 13 domain-specific scores. Thus, a one-unit change in the continuous overall PCMH implementation score corresponds to the difference between no implementation (0) and full implementation (1), although almost all practices in our study group fall along the continuum between these two endpoints. This method intentionally gives equal scoring weight to each PCMH domain to reflect the unknown relative importance of each domain, and thereby avoids giving greater weight to domains with a greater number of capabilities.
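The scoring logic lends itself to a compact illustration. The sketch below walks through the three stages in Python with an invented handful of domains and capability values; the real instrument spans the 13 domains and 126 capabilities listed in Table 1.

```python
# Stage 1: each list entry is one distinct capability's score (1 = fully in place,
# 0 = not in place, a fraction of the maximum gradient for graded capabilities).
# The domains and values below are invented for illustration.
domain_capabilities = {
    "patient_provider_agreement": [1.0, 0.33, 0.0],       # e.g., 30%/90% gradient -> 0.33
    "patient_registry":           [1.0] * 10 + [0.0] * 7, # 10 of 17 capabilities in place
    "e_prescribing":              [1.0],                  # single distinct capability
}

def domain_score(capability_scores):
    # Stage 2: sum of capability scores divided by the maximum possible score,
    # i.e., the number of distinct capabilities in the domain.
    return sum(capability_scores) / len(capability_scores)

domain_scores = {name: domain_score(vals) for name, vals in domain_capabilities.items()}

# Stage 3: the overall score is the unweighted mean of the domain scores, so each
# domain counts equally regardless of how many capabilities it contains.
overall_score = sum(domain_scores.values()) / len(domain_scores)
print({k: round(v, 2) for k, v in domain_scores.items()}, round(overall_score, 2))
```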

Table 1.

Patient-Centered Medical Home (PCMH) Domains of Function, Number of Total Capabilities, and Distinct Capabilities by Domain

Domain | Description | No. of Capabilities | Maximum Distinct Capabilities
Patient–provider agreement | Practice has developed and is using PCMH-related communication tools | 8 | 3
Patient registry | An all-payer registry is used to manage established patients in the practice | 17 | 17
Performance reporting | Performance reports are generated that allow tracking and comparison of results for the established population of patients in the practice | 12 | 12
Individual care management | Practice has the ability to deliver coordinated care management services with an integrated team of multidisciplinary providers, and a systematic approach is in place to deliver comprehensive care that addresses patients’ full range of health care needs | 15 | 15
Extended access | Patients have 24-hour access to a clinical decision maker by phone, and the clinical decision maker has a feedback loop within 24 hours or the next business day to the patient’s PCMH | 9 | 7
Test results tracking and follow-up | Practice has a documented test tracking process in place that requires tracking and follow-up for all tests and results, with identified timeframes for notifying patients of results | 9 | 9
E-prescribing | Practice has adopted and uses electronic prescribing and clinical decision support tools to improve the safety, quality, and cost effectiveness of the prescription process | 2 | 1
Preventive services | A primary prevention program is in place that focuses on identifying and educating patients about personal health behaviors to reduce their risk of disease and injury | 8 | 8
Linkage to community services | A comprehensive review of, and linkage to, community resources has been completed | 8 | 8
Self-management support | A systematic approach is in place to empower patients to understand their central role in effectively managing their illness, making informed decisions about care, and engaging in healthy behaviors | 8 | 8
Patient web portal | A patient web portal is in use by the practice to allow for electronic communication between patients and physicians, and provide greater access to medical information and technical tools | 12 | 12
Coordination of care | For patients with selected chronic conditions, a mechanism is established for being notified of each patient admission and discharge or other type of encounter, and appropriate transition plans are in place | 9 | 7
Specialist referral process | Procedures are in place to guide each phase of the specialist referral process | 9 | 9
Total capabilities | | 126 | 116

Baseline PCMH Implementation and Change in PCMH Implementation

Although the PCMH program began in January 2008, practices needed adequate time to establish enough capabilities before their impact could be assessed. The PCMH implementation score was calculated for both the June 2009 and June 2010 reporting cycles. Capability implementation dates from the June 2010 reporting cycle were used to correct for any overreporting of capabilities in the June 2009 time period due to differences in interpretation of capability case definitions from the interpretive guidelines and to account for any interpretive guideline changes made between reporting time periods. This correction creates greater comparability of the two measurement time points for the degree of PCMH implementation. Change in PCMH implementation, or incremental implementation, was recorded as the difference between the June 2010 PCMH implementation score and the June 2009 PCMH implementation score. Thus, the estimated value for incremental implementation is dependent on the baseline level of PCMH implementation previously attained. The maximum combined value of the partial implementation (baseline) and the incremental implementation (change) is 1.

Cost and Quality Outcomes

For this study, we examined one practice-level measure of cost and four composite measures of quality of care for the July 2009 to June 2010 time period (Jaen et al. 2010; Higgins et al. 2011; The Commonwealth Fund 2012). Total combined medical and surgical allowed cost per member per month (PMPM) was calculated separately for the adult and pediatric primary care–attributed member populations. Composite measures of quality were selected over individual quality measures because few physicians have sufficient numbers of patients on any individual quality measure for comparative purposes (Scholle et al. 2009) and because of concerns about heterogeneity in physician performance across individual quality measures (Parkerton et al. 2003). Four overall-percentage composite quality and preventive measures (Reeves et al. 2007) were generated from HEDIS-defined and BCBSM-defined individual quality and preventive measures (Blue Cross Blue Shield of Michigan 2010b): adult quality, adult preventive, pediatric quality, and pediatric preventive (see Appendix B). The pediatric quality composite measure was log transformed after assessment of its normality.
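An overall-percentage composite of this kind pools successes and opportunities across the individual measures rather than averaging the individual measure rates. The short sketch below shows that calculation in Python under that assumption; the measure names and counts are invented, and the actual HEDIS- and BCBSM-defined component measures are listed in Appendix B.

```python
# Hypothetical per-measure counts for one practice: "met" is the number of
# opportunities satisfied, "eligible" is the number of opportunities.
measures = {
    "breast_cancer_screening": {"met": 45, "eligible": 60},
    "hba1c_testing":           {"met": 30, "eligible": 35},
    "lipid_screening":         {"met": 50, "eligible": 70},
}

# Overall percentage: total successes divided by total opportunities, pooled
# across all component measures (so larger-denominator measures weigh more).
met = sum(m["met"] for m in measures.values())
eligible = sum(m["eligible"] for m in measures.values())
composite = 100 * met / eligible
print(f"Composite score: {composite:.1f}%")   # 75.8%
```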

Practice Characteristics

Four practice characteristics were used in the analysis: practice size, primary care focus, BCBSM market share, and inpatient service provision. Practice size was evaluated categorically based on total number of physicians, including specialists, in the practice as reported in the SRD. We used the primary care physician indicator in the SRD to classify practices as “primary care focus” if only primary care physicians were present and “mixed specialty” if both primary care and nonprimary care specialty physicians were present. Total BCBSM-paid services per PCP delivered between July 2009 and June 2010 were calculated for each practice as a proxy for BCBSM volume within the practice. We also calculated the proportion of these services provided within an inpatient setting to ensure comparability of physician practice patterns across practices.

Patient Cohort Characteristics

We used the primary care–attributed member cohort for each practice and their member enrollment information to estimate the following five practice-level patient characteristics: proportion of members under 18 years, proportion of members over 64 years, proportion of members who were female, and mean Ingenix® Symmetry version 6 prospective risk score (Ingenix® 2008) for both adult and pediatric members in the practice. The prospective risk score employs a large national database of aggregated claims and membership information to derive a numerical, diagnosis-based episode assessment used to predict future medical costs.

Market Characteristics

We examined five county-level market characteristics and one physician organization characteristic to address additional sources of variation that might influence cost and quality outcomes. County-level data from the ARF 2010 were used for median household income, percent of residents who were nonwhite or Hispanic, total primary care physicians per 1,000 population, and total county population. We also calculated county-level BCBSM market share based on member subscriber addresses from BCBSM member enrollment information and total estimated county population from the ARF 2010. County-level measures were weighted for each practice to account for the proportion of their care provided to members residing in each county. The weighted total population estimate for the practice was converted into a categorical measure of urbanicity based on 1990 census metropolitan statistical area classifications. Physician organization size was measured as the total number of practices with at least one primary care physician.
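The member-weighted aggregation of county-level measures to the practice level benefits from a concrete example. The sketch below shows one way to compute it in Python; the counties, values, and column names are hypothetical placeholders rather than the actual ARF or BCBSM fields.

```python
import pandas as pd

# Hypothetical counts of one practice's attributed members by county of residence.
members = pd.DataFrame({
    "practice_id": [1, 1, 1],
    "county":      ["Washtenaw", "Wayne", "Oakland"],
    "n_members":   [120, 60, 20],
})

# Hypothetical county-level market measures (e.g., drawn from the ARF).
county_data = pd.DataFrame({
    "county": ["Washtenaw", "Wayne", "Oakland"],
    "median_income": [63000, 41000, 66000],
    "pcp_per_1000":  [1.4, 0.9, 1.2],
})

merged = members.merge(county_data, on="county")

# Weight each county's values by the share of the practice's members living there,
# then sum within practice to obtain practice-level weighted averages.
weights = merged["n_members"] / merged.groupby("practice_id")["n_members"].transform("sum")
practice_level = (merged[["median_income", "pcp_per_1000"]]
                  .mul(weights, axis=0)
                  .assign(practice_id=merged["practice_id"])
                  .groupby("practice_id")
                  .sum())
print(practice_level.round(2))
```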

Analytic Approach

The unit of analysis for this study was the primary care physician practice, the functional unit at which the PCMH program was targeted. We analyzed relationships between level of PCMH implementation, change in PCMH implementation, and our outcome measures in multivariable models using both generalized estimating equations and random intercept generalized linear mixed models. Generalized estimating equations were used to account for clustering of practices within physician organizations and to estimate the mean population-level practice effects of PCMH implementation on measures of cost and quality (Hubbard et al. 2010). Random intercept linear mixed models were fitted to address potential heterogeneity in outcomes conditional on the physician organization of the practice (Singer 1998). To assess the effects of previously implemented PCMH capabilities in combination with newly implemented PCMH capabilities, the models for each of the six outcomes included both the level of partial PCMH implementation measured at baseline and the incremental PCMH implementation that occurred during the study time period. All analyses were performed in SAS® version 9.2 (SAS Institute Inc 2011).

Outcome-specific exclusion criteria were applied prior to constructing the multivariable models. Pediatric practices (N = 296), defined as practices with at least 80 percent of attributed members below 18 years, were excluded from adult outcome analyses. Practices that failed to meet the minimum sample threshold of 50 members for cost measures or 30 events for composite quality and preventive outcomes were excluded from their respective analyses. Practices whose cost outcome measure exceeded three interquartile range (IQR) units from the median, or whose composite outcome measures exceeded two IQR units from the median, were excluded from their respective analyses to minimize the impact of the tails of the distributions on parameter estimates. Measures of standard influence on predicted value, a standard regression diagnostic, were calculated to identify influential observations that might bias parameter estimates (Fox 1991). After applying exclusion criteria, we compared excluded practices to retained practices to assess differences that might affect interpretation or generalizability of the results.
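For readers who want to see the modeling setup concretely, the sketch below fits a population-average (GEE) model with practices clustered within physician organizations, using statsmodels in Python. The variable names, the subset of covariates, and the exchangeable working correlation are assumptions for illustration; the study's models were fit in SAS and include the full covariate set described in the Table 3 note.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# `df` would hold one row per eligible practice with hypothetical column names;
# for example: df = pd.read_csv("practice_level_analytic_file.csv")

def fit_gee(df: pd.DataFrame, outcome: str):
    # Practice-level outcome regressed on baseline and incremental PCMH scores
    # plus a few illustrative covariates; C() treats practice size as categorical.
    formula = (f"{outcome} ~ pcmh_baseline + pcmh_change + mean_risk_score"
               " + pct_female + services_per_pcp + C(practice_size)")
    model = smf.gee(formula,
                    groups="physician_org_id",                 # clustering unit
                    data=df,
                    cov_struct=sm.cov_struct.Exchangeable(),   # working correlation
                    family=sm.families.Gaussian())
    return model.fit()

# Example usage (assuming df exists):
# result = fit_gee(df, "adult_pmpm_cost")
# print(result.summary())
```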

We examined potential collinearity of predictor variables using Pearson correlation coefficients and then evaluated variable collinearity in the multivariable models. Percentage of members over 64 years and percent of services provided in the inpatient setting were dropped due to collinearity with the mean risk score.

Results

As of June 2009, 1,647 study practices (67.7 percent of the study group) had implemented at least one PCMH capability. The mean baseline PCMH implementation score across study practices was 0.11 (SD: 0.13), whereas the mean PCMH implementation score among practices with at least one capability was 0.17 (SD: 0.13). By June 2010, 2,219 study practices (91.2 percent) had implemented at least one PCMH capability. The mean PCMH implementation score had risen to 0.34 (SD: 0.24) and was 0.37 (SD: 0.22) among practices with at least one capability. The mean change in PCMH implementation between the two time periods in study practices was 0.22 (SD: 0.20).

Table 2 presents the patient cohort, practice, and market characteristics, along with the outcome distributions, of the study practice population. On average, solo physician practices had 334 BCBSM members, practices with two or three physicians had 771 members, practices with four or five physicians had 1,482 members, and practices with six or more physicians had 2,122 members. Solo physician practices accounted for 1,411 (58.0 percent) of study practices but were more likely to be excluded from each analysis due to the sample size criteria.

Table 2.

Practice, Patient Cohort, and Market Characteristics of Physician Practices with at Least One Primary Care Physician Participating in the BCBSM Physician Group Incentive Program, July 2009 to June 2010

Variable | Adult and Family Practices (N = 2,136) | Pediatric Practices (N = 296)
Outcomes, median (IQR)
 Adult preventive composite | 74.3% (66.5–80.0%) | NA
 Adult quality composite | 70.4% (63.6–76.6%) | NA
 Pediatric preventive composite | 33.7% (21.2–46.7%) | 64.0% (56.6–70.6%)
 Pediatric quality composite | 76.2% (50.0–100.0%) | 86.8% (78.7–92.5%)
 Adult cost PMPM | $296.67 ($252.40–$357.56) | NA
 Pediatric cost PMPM | $80.19 ($56.50–$105.17) | $96.82 ($81.44–$114.42)
Continuous variables, median (IQR)
 PCMH score June 2009 | 0.06 (0–0.19) | 0.06 (0–0.15)
 PCMH change to June 2010 | 0.19 (0.05–0.35) | 0.23 (0.08–0.38)
 Median household income | $48,363 ($44,843–$58,332) | $50,666 ($43,929–$55,321)
 Total practices in PO with a PCP | 111 (59–710) | 104 (55–177)
 Services per PCP | 1,979 (1,132–3,209) | 3,054 (1,870–5,071)
 PCPs per 1,000 population | 0.98 (0.71–1.26) | 1.04 (0.77–1.40)
 Mean prospective risk score (adult) | 1.60 (1.41–1.87) | 0.67 (0.58–0.77)
 Mean prospective risk score (pediatric) | 0.45 (0.38–0.54) | 0.44 (0.40–0.50)
 Percent of nonwhite attributed members | 20.5% (12.1–26.8%) | 21.3% (14.0–25.9%)
 Percent of female attributed members | 50.8% (45.9–58.2%) | 48.6% (46.7–50.7%)
 Percent BCBSM market share | 31.1% (25.7–34.4%) | 31.3% (26.0–34.7%)
Categorical variables, N (%)
Practice size
 Solo physician practice | 1,274 (59.6%) | 137 (46.3%)
 2–3 physicians | 500 (23.4%) | 87 (29.4%)
 4–5 physicians | 189 (8.8%) | 43 (14.5%)
 6 or more physicians | 173 (8.1%) | 29 (9.8%)
Practice specialty
 Mixed | 83 (3.9%) | 13 (4.4%)
 Primary care only | 2,053 (96.1%) | 283 (95.6%)
Metropolitan statistical area status
 Metropolitan: 1,000,000 or more persons | 754 (35.3%) | 121 (40.9%)
 Metropolitan: 250,000–999,999 persons | 514 (24.1%) | 84 (28.4%)
 Metropolitan: 100,000–249,999 persons | 406 (19.0%) | 49 (16.6%)
 Metropolitan: below 100,000 persons | 69 (3.2%) | 5 (1.7%)
 Micropolitan | 208 (9.7%) | 23 (7.8%)
 Rural | 177 (8.3%) | 14 (4.7%)

Note: IQR, interquartile range; NA, not applicable; PCMH, patient-centered medical home; PMPM, per member per month.

Table 3 shows the PCMH effect estimates from the multivariable models adjusting for member, practice, and market characteristics. Full multivariable model results are available in appendices C–E.

Table 3.

Multivariable Generalized Estimating Equation Results: Medical Home Implementation and Adult and Pediatric Composite Quality of Care Measures, Composite Preventive Measures, and Per Member Per Month (PMPM) Medical and Surgical Costs

Outcome Variable | Baseline PCMH Score, June 2009: Beta (95% CI), p-value | PCMH Change, June 2009 to June 2010: Beta (95% CI), p-value
Preventive composite scores
 Adult preventive composite | 5.1% (0.5% to 9.7%), p = .0316 | 3.3% (1.1% to 5.4%), p = .0028
 Pediatric preventive composite | 12.2% (5.1% to 19.3%), p = .0008 | 4.9% (0.6% to 9.3%), p = .0260
Quality composite scores
 Adult quality composite | 3.5% (−0.4% to 7.4%), p = .0806 | 5.2% (2.6% to 7.8%), p < .0001
 Pediatric quality composite* | 0.10 (−0.05 to 0.25), p = .1745 | 0.05 (−0.07 to 0.17), p = .4113
Medical and surgical PMPM costs
 Adult PMPM cost | −$26.37 (−$53.08 to $0.33), p = .0529 | −$3.08 (−$18.07 to $11.91), p = .6868
 Pediatric PMPM cost | −$1.72 (−$13.13 to $9.70), p = .7682 | $7.45 (−$1.33 to $16.24), p = .0964

Note: Full multivariable model results are available in appendices C–E. Covariates included the following practice characteristics: mean prospective risk score, percent female attributed members, paid services per PCP, practice size based on total physicians in the practice, whether the practice was primary care only or mixed primary and specialty care, and whether the practice was a pediatric practice; and the following market characteristics: total primary care practices in the physician organization, median household income, PCPs per 1,000 population, BCBSM market share, percent nonwhite residents, and metropolitan statistical area status.

*Log transformed.

BCBSM, Blue Cross Blue Shield of Michigan; CI, confidence interval; PCMH, patient-centered medical home.

Preventive Care Measures

The multivariable model for the adult prevention composite measure included 1,636 study practices after practice exclusions: insufficient sample size (303), missing predictors (42), and exceeding IQR thresholds (155). Practices included in the model had a median of 194 prevention opportunities per practice, and the mean adult preventive composite score was 74.8 percent, ranging from 38.7 to 94.5 percent. After multivariable adjustment, a practice that achieved full PCMH implementation would have a 5.1 percent higher adult preventive composite score compared with a practice that never achieved any PCMH implementation (p = .0316). A practice without preexisting PCMH infrastructure that implemented all PCMH capabilities during the study time period would have a 3.3 percent higher adult preventive composite score compared with that same practice with no incremental PCMH implementation (p = .0028). Effect estimates were slightly higher for baseline implementation (6.3 percent) and incremental implementation (3.8 percent) when practices exceeding the IQR thresholds were included in the model as a sensitivity test.

The pediatric preventive composite measure multivariable model included 1,218 study practices after practice exclusions: insufficient sample size (1,151), missing predictors (39), and exceeding IQR thresholds (24). The median number of preventive opportunities was 115 per practice included in the model, and the mean pediatric preventive composite score was 44.5 percent, ranging between 10.1 and 85.2 percent. After multivariable adjustment, a practice that achieved full PCMH implementation would have a 12.2 percent higher pediatric preventive composite score compared with a practice that never achieved any PCMH implementation (p = .0008). A practice without preexisting PCMH infrastructure that implemented all PCMH capabilities during the study time period would have a 4.9 percent higher pediatric preventive composite score compared with that same practice with no incremental PCMH implementation (p = .0260). While the random intercept model resulted in an increased effect of full baseline PCMH implementation from 12.2 to 15.7 percent, it did not improve model fit (AIC: −1,577) over the model without a random intercept (AIC: −1,719) for the pediatric preventive composite measure. Random intercept models did not identify any noticeable changes in parameters or model fit for any other outcome.

Quality of Care Measures

The multivariable model for the adult quality composite measure consisted of 1,590 study practices after practice exclusions: insufficient sample size (498), missing predictors (43), and exceeding IQR thresholds (5). For practices included in the model, the median number of quality opportunities per practice was 166, and the mean adult quality composite score was 70.2 percent, ranging between 33.8 and 96.4 percent. After multivariable adjustment, a practice that achieved full PCMH implementation would have a 3.5 percent higher adult quality composite compared with a practice that never achieved any PCMH implementation (p = .0806). A practice without preexisting PCMH infrastructure that implemented all PCMH capabilities during the study time period would have a 5.2 percent higher adult quality composite score compared with that same practice with no incremental PCMH implementation (p < .0001).

A total of 337 study practices were included in the multivariable model for the pediatric quality measure after practice exclusions: insufficient sample size (2,077), missing predictors (7), and exceeding IQR thresholds (11). Practices included in the multivariable model had a median of 61 quality opportunities per practice, and the mean untransformed pediatric quality measure was 80.4 percent, ranging between 41.8 and 100 percent. Although positively associated, neither baseline nor incremental PCMH implementation was significantly associated with a higher pediatric quality score after multivariable adjustment (p = .1745 and p = .4113, respectively).

Cost Measures

A total of 1,787 study practices were used for the adult PMPM cost multivariable model after the following practice exclusions: insufficient sample size (286), missing predictors (45), exceeding IQR thresholds (15), and influential observations (3). Practices included in the model had a median population of 303 adult members, and their overall mean adult PMPM cost was $310.79, ranging between $102.72 and $1,053.45. After multivariable adjustment, a practice that achieved full PCMH implementation would have a $26.37 lower adult PMPM cost compared with a practice that never achieved any PCMH implementation (p = .0529). Based on a practice with median population characteristics, this difference corresponds to a 7.7 percent (95 percent CI: −0.1 to 15.4 percent) lower adult PMPM cost. Incremental PCMH implementation was also associated with lower cost, although the association was not statistically significant (p = .6868).

The pediatric PMPM cost multivariable model included 956 practices after practice exclusions: insufficient sample size (1,438), missing predictors (28), exceeding IQR thresholds (5), and influential observations (5). Practices included in the multivariable model had a median of 172 pediatric members per practice, and their mean pediatric PMPM cost was $91.40, ranging from $21.59 to $236.14. After multivariable adjustment, baseline PCMH implementation was not associated with cost (p = .7682). However, a practice without preexisting PCMH infrastructure that implemented all PCMH capabilities during the study time period would have a $7.45 higher pediatric PMPM cost compared with that same practice with no incremental PCMH implementation (p = .0964).

Discussion

Our study demonstrated positive associations between baseline PCMH implementation and composite measures of quality of care for both adults and children, and we observed these associations for both primary and secondary preventive care. PCMH implementation was also associated with lower overall medical and surgical costs for adults, but not for children. Given the higher chronic disease burden in adults, this cost finding is consistent with the underlying PCMH principles of improved chronic care management and care coordination.

Notably, these associations between PCMH and cost and quality measures were observed 2 years into the program, even though practices were still expanding their PCMH infrastructure. These findings suggest that partial implementation of the PCMH model may have quality and cost benefits well before full PCMH implementation has been achieved, and these benefits likely increase as practices progress toward full implementation. Although the PCMH is intended to operate as a system of care, our findings suggest that its constituent elements may have independent benefits and that substantial progress on all or most elements is not needed to significantly improve care.

Composite measures of quality and preventive care were also positively associated with incremental PCMH implementation, controlling for baseline levels of PCMH implementation. However, cost measures were not associated with incremental PCMH implementation. These observations may reflect the proximal impacts of recent implementation, whereby initial improvements in quality are followed later by reductions in cost. Although not statistically significant, the increased costs for pediatric members associated with incremental PCMH implementation may result from an increased emphasis on, and use of, the preventive services reflected in the pediatric preventive composite measure of well-child visits and immunizations.

The measures of association between PCMH implementation and quality and cost-related outcomes are intended to estimate the potential PCMH effects when full implementation is eventually achieved. However, extrapolation of results observed at partial implementation to estimate full implementation effects should be interpreted cautiously, as the relationship between these outcomes and PCMH implementation may change at higher levels of implementation. While the effect could diminish at higher levels, we believe that synergistic interaction of PCMH elements will more likely result in greater effects on outcomes as practices achieve higher levels of PCMH implementation. Although the measurement instrument developed for this program may not capture all elements crucial to creating a fully functional PCMH, this instrument was developed with significant physician input, incorporated the PC-PCC guiding principles, and was specifically designed to capture partial PCMH implementation and incremental PCMH changes under multiple implementation scenarios (Alexander et al. 2013).

A major strength of this study is that the participating practices encompass nearly two thirds of primary care physicians practicing in Michigan, span 82 of the 83 counties in Michigan, and include both small and large practices, urban and rural practices, practices within integrated systems, and practices loosely affiliated in independent physician associations, all approaching PCMH implementation from a broad array of perspectives suited to the individual practices. External validity is also strengthened by evaluating the impact on the total population rather than on smaller chronic disease subpopulations, which may overstate the total population benefits of a PCMH. Although our data are limited to a single payer, BCBSM members represent a substantial portion of practice populations statewide; even so, it remains an open question whether the PCMH effects we observed extend to other commercial or noncommercial payer populations. Recent literature, however, suggests a need to evaluate PCMH in a broad array of practice settings, with greater emphasis on the diversity of practices than on the number of patients (Peikes et al. 2012).

Our PCMH evaluation was limited to a 1-year period, so we cannot determine whether these results reflect true causal relationships or preexisting practice patterns. For example, physicians motivated to improve quality and reduce costs may be more likely to implement PCMH capabilities, and such motivation may explain observed associations between PCMH and quality and cost measures. The PCMH change variable may capture elements of physician motivation, as more motivated practices might expand their capabilities more rapidly initially. However, such expansion may wane as fewer opportunities to expand PCMH capabilities are available and only relatively difficult to implement capabilities remain. Additional limitations include the use of county-level socioeconomic characteristics instead of individual socioeconomic characteristics and the lack of information on benefit design.

This study demonstrated relationships between the PCMH model, higher quality of care, and reduced cost of care. Based on this initial assessment, the study supports continued investment in PCMH implementation. However, further evaluation is needed to determine the sources of utilization (e.g., emergency department, inpatient) affected by the PCMH that contribute to lower costs, whether the association between PCMH and medical costs extends to pharmacy costs, whether these effects span multiple payers, and whether these effects persist under a longitudinal design. Additional efforts should aim to understand which specific PCMH elements contribute to higher quality care and lower cost of care, and the potential for synergism between PCMH elements with regard to these outcomes. Importantly, the associations with quality we observed span a broad array of primary and secondary preventive measures rather than being narrowly focused on any one individual measure. Further research is needed to determine whether these relationships span other quality measures, are limited to specific subsets of quality measures, or extend into additional health areas such as preconception care, where chronic condition management may yield additional health benefits (Johnson et al. 2006). The diverse set of practices in this study will allow for subsequent examination of these areas and of whether PCMH effects are universal across practices or dependent on the practice setting.

Finally, although multiple statistically significant relationships between PCMH and quality measures were observed in this study, our results do not directly address the clinical significance of these relationships, an important distinction given the potentially high costs associated with implementing and maintaining a transformational change such as PCMH. Three points are relevant to this issue. First, many of our quality measures are based on established clinical guidelines strongly linked to improved health outcomes in other studies, such as monitoring HbA1c levels in persons with diabetes (Larsen, Horder, and Mogensen 1990). Second, it may be premature to assess the cost effectiveness of PCMH as many study practices are still in the early stages of PCMH implementation and have not yet fully realized its clinical benefits. Third, even relatively small changes at the physician practice level may translate into important differences at the population level. The fact that our findings apply to a large study sample of over 2,000 practices in Michigan partially supports this claim and may mitigate to some extent the costs of implementation.

If the cost savings and quality improvement relationships observed in this study are reinforced by additional evaluations of the PCMH model, further support for PCMH may be warranted. Implementing PCMH capabilities presents a considerable challenge for many primary care practices, with significant investment of time and expense (Nutting et al. 2011). Requiring primary care practices to shoulder this investment alone may severely limit PCMH implementation. Payers, purchasers, and providers should consider methods for sharing cost savings derived from PCMH implementation to provide further incentives to support ongoing efforts to implement the PCMH model. If cost savings and improved quality can indeed be derived at intermediate stages of PCMH implementation, and sustained during more advanced stages of implementation, the potential for a permanent program of shared savings that support continuous improvements may well be viable.

Acknowledgments

Joint Acknowledgment/Disclosure Statement: This study was funded by a grant from the Agency for Healthcare Research and Quality: R18 RFA-HS-10-002. Data were supplied by Blue Cross Blue Shield of Michigan. Michael Paustian and Darline El Reda are employees of Blue Cross Blue Shield of Michigan.

Disclosures

None.

Disclaimers

None.

Supporting Information

Additional Supporting Information may be found in the online version of this article:

Appendix A: Example Showing Capability and Domain Scoring for the Patient—Provider Agreement PCMH Functional Domain, June 2010.

Appendix B: Individual Component Measures of Adult and Pediatric Quality and Preventive Composite Measures and Source Definition for Each Individual Component Measure.

Appendix C: Multivariable Generalized Estimating Equation Model for the Association between Medical Home Implementation and Adult and Pediatric Composite Quality of Care Measures in BCBSM PGIP Practices, July 2009 to June 2010.

Appendix D: Multivariable Generalized Estimating Equation Model for the Association between Medical Home Implementation and Adult and Pediatric Composite Preventive Measures in BCBSM PGIP Practices, July 2009 to June 2010.

Appendix E: Multivariable Generalized Estimating Equation Model for the Association between Medical Home Implementation and Adult and Pediatric Medical and Surgical per Member per Month Costs in BCBSM PGIP Practices, July 2009 to June 2010.

Appendix SA1: Author Matrix.

References

1. Alexander JA, Bae D. Does the Patient-Centred Medical Home Work? A Critical Synthesis of Research on Patient-Centred Medical Homes and Patient-Related Outcomes. Health Services Management Research. 2012;25(2):51–9. doi: 10.1258/hsmr.2012.012001.
2. Alexander JA, Wise CG, Green LA, Fetters MD, Mason MH, El Reda DK. Assessment and Measurement of Patient-Centered Medical Home Implementation: The BCBSM Experience. Annals of Family Medicine. 2013;11(Suppl 1):S74–81. doi: 10.1370/afm.1472.
3. Area Resource File (ARF), 2009–2010. Rockville, MD: US Department of Health and Human Services, Health Resources and Services Administration, Bureau of Health Professions; 2010.
4. Audet AM, Davis K, Schoenbaum SC. Adoption of Patient-Centered Care Practices by Physicians: Results from a National Survey. Archives of Internal Medicine. 2006;166(7):754–9. doi: 10.1001/archinte.166.7.754.
5. Blue Cross Blue Shield of Michigan. BCBSM Physician Group Incentive Program Patient-Centered Medical Home Domains of Function: Interpretive Guidelines. Detroit, MI: Blue Cross Blue Shield of Michigan, Value Partnerships; September 2010a.
6. Blue Cross Blue Shield of Michigan. BCBSM Physician Group Incentive Program Evidence Based Care Report (EBCR) Measure Specifications for PGIP 2010 Program Year. Detroit, MI: Blue Cross Blue Shield of Michigan, Value Partnerships; November 2010b.
7. Cooley WC, McAllister JW, Sherrieb K, Kuhlthau K. Improved Outcomes Associated with Medical Home Implementation in Pediatric Primary Care. Pediatrics. 2009;124(1):358–64. doi: 10.1542/peds.2008-2600.
8. The Commonwealth Fund. 2011. Patient-Centered Coordinated Care Programs [accessed on March 18, 2011]. Available at http://www.commonwealthfund.org/Content/Program-Areas/Delivery-System-Innovation-and-Improvement/Patient-Centered-Coordinated-Care.aspx.
9. The Commonwealth Fund. 2012. Recommended Core Measures for Evaluating the Patient-Centered Medical Home: Cost, Utilization, and Clinical Quality [accessed on June 19, 2012]. Available at http://www.commonwealthfund.org/Publications/Data-Briefs/2012/May/Measures-Medical-Home.aspx.
10. Damiano PC, Momany ET, Tyler MC, Penziner AJ, Lobas JC. Cost of Outpatient Medical Care for Children and Youth with Special Health Care Needs: Investigating the Impact of the Medical Home. Pediatrics. 2006;118(4):e1187–94. doi: 10.1542/peds.2005-3018.
11. DeVoe JE, Wallace LS, Pandhi N, Solotaroff R, Fryer GE Jr. Comprehending Care in a Medical Home: A Usual Source of Care and Patients' Perceptions about Healthcare Communication. Journal of the American Board of Family Medicine. 2008;21(5):441–50. doi: 10.3122/jabfm.2008.05.080054.
12. Fields D, Leshen E, Patel K. Driving Quality Gains and Cost Savings through Adoption of Medical Homes. Health Affairs. 2010;29(5):819–26. doi: 10.1377/hlthaff.2010.0009.
13. Fox J. Regression Diagnostics: An Introduction. Sage University Paper Series on Quantitative Applications in the Social Sciences, 07-079. Newbury Park, CA: Sage; 1991.
14. Friedberg M, Safran D, Coltin K, Dresser M, Schneider E. Readiness for the Patient-Centered Medical Home: Structural Capabilities of Massachusetts Primary Care Practices. Journal of General Internal Medicine. 2009;24(2):162–9. doi: 10.1007/s11606-008-0856-x.
15. Gill JM, Bittner Fagan H, Townsend B, Mainous AG III. Impact of Providing a Medical Home to the Uninsured: Evaluation of a Statewide Program. Journal of Health Care for the Poor and Underserved. 2005;16(3):515–35. doi: 10.1353/hpu.2005.0050.
16. Goldberg DG, Kuzel AJ. Elements of the Patient-Centered Medical Home in Family Practices in Virginia. Annals of Family Medicine. 2009;7(4):301–8. doi: 10.1370/afm.1021.
17. Higgins A, Stewart K, Dawson K, Bocchino C. Early Lessons from Accountable Care Models in the Private Sector: Partnerships between Health Plans and Providers. Health Affairs. 2011;30(9):1718–27. doi: 10.1377/hlthaff.2011.0561.
18. Hoff T, Weller W, DePuccio M. The Patient-Centered Medical Home: A Review of Recent Research. Medical Care Research and Review. 2012;69(6):619–44. doi: 10.1177/1077558712447688.
19. Hubbard AE, Ahern J, Fleischer NL, Lippman M, Van der Laan SA, Jewell N, Bruckner T, Satariano WA. To GEE or Not to GEE: Comparing Population Average and Mixed Models for Estimating the Associations between Neighborhood Risk Factors and Health. Epidemiology. 2010;21(4):467–74. doi: 10.1097/EDE.0b013e3181caeb90.
20. Ingenix®. Symmetry Episode Risk Groups®: A Successful Approach to Health Risk Assessment. Eden Prairie, MN: Ingenix; 2008.
21. Jaen CR, Ferrer RL, Miller WL, Palmer RF, Wood R, Davila M, Stewart EE, Crabtree BF, Nutting PA, Stange KC. Patient Outcomes at 26 Months in the Patient-Centered Medical Home National Demonstration Project. Annals of Family Medicine. 2010;8(Suppl 1):S57–67. doi: 10.1370/afm.1121.
22. Johnson K, Posner SF, Biermann J, Cordero JF, Atrash HK, Parker CS, Boulet S, Curtis MG; CDC/ATSDR Preconception Care Work Group; Select Panel on Preconception Care. Recommendations to Improve Preconception Health and Health Care—United States. A Report of the CDC/ATSDR Preconception Care Work Group and the Select Panel on Preconception Care. MMWR Recommendations and Reports. 2006;55(RR-6):1–23.
23. Larsen ML, Horder M, Mogensen EF. Effect of Long-Term Monitoring of Glycosylated Hemoglobin Levels in Insulin-Dependent Diabetes Mellitus. New England Journal of Medicine. 1990;323(15):1021–5. doi: 10.1056/NEJM199010113231503.
24. Martin AB, Crawford S, Probst JC, Smith G, Saunders RP, Watkins KW, Luchok K. Medical Homes for Children with Special Health Care Needs. Journal of Health Care for the Poor and Underserved. 2007;18(4):916–30. doi: 10.1353/hpu.2007.0099.
25. McBurney PG, Simpson KN, Darden PM. Potential Cost Savings of Decreased Emergency Department Visits through Increased Continuity in a Pediatric Medical Home. Ambulatory Pediatrics. 2004;4(3):204–8. doi: 10.1367/A03-069R.1.
26. Michigan Department of Community Health. MDCH Survey of Physicians, Survey Findings 2010. Lansing, MI: Public Sector Consultants; January 2011.
27. Nutting PA, Crabtree BF, Miller WL, Stange KC, Stewart E, Jaen C. Transforming Physician Practices to Patient-Centered Medical Homes: Lessons from the National Demonstration Project. Health Affairs. 2011;30(3):439–45. doi: 10.1377/hlthaff.2010.0159.
28. Palfrey JS, Sofis LA, Davidson EJ, Liu J, Freeman L, Ganz ML. The Pediatric Alliance for Coordinated Care: Evaluation of a Medical Home Model. Pediatrics. 2004;113(5 Suppl):1507–16.
29. Parkerton PH, Smith DG, Belin TR, Feldbau GA. Physician Performance Assessment: Nonequivalence of Primary Care Measures. Medical Care. 2003;41(9):1034–47. doi: 10.1097/01.MLR.0000083745.83803.D6.
30. Patient-Centered Primary Care Collaborative. 2007. Joint Principles of the Patient-Centered Medical Home [accessed on May 7, 2013]. Available at http://www.pcpcc.net/joint-principles.
31. Patient-Centered Primary Care Collaborative. 2009. Proof in Practice: A Compilation of Patient Centered Medical Home Pilot and Demonstration Projects [accessed on June 7, 2013]. Available at http://www.pcpcc.net/sites/default/files/medial/PilotGuidePip.pdf.
32. Peikes D, Zutshi A, Genevro JL, Parchman ML, Meyers DS. Early Evaluations of the Medical Home: Building on a Promising Start. American Journal of Managed Care. 2012;18(2):105–16.
33. Rankin KM, Cooper A, Sanabria K, Binns HJ, Onufer C. Illinois Medical Home Project: Pilot Intervention and Evaluation. American Journal of Medical Quality. 2009;24(4):302–9. doi: 10.1177/1062860609335759.
34. Reeves D, Campbell SM, Adams J, Shekelle PG, Kontopantelis E, Roland MO. Combining Multiple Indicators of Clinical Quality: An Evaluation of Different Analytic Approaches. Medical Care. 2007;45(6):489–96. doi: 10.1097/MLR.0b013e31803bb479.
35. Reid RJ, Fishman PA, Yu O, Ross TR, Tufano JT, Soman MP, Larson EB. Patient-Centered Medical Home Demonstration: A Prospective, Quasi-Experimental, before and after Evaluation. American Journal of Managed Care. 2009;15(9):71–87.
36. Rittenhouse DR, Casalino LP, Gillies RR, Shortell SM, Lau B. Measuring the Medical Home Infrastructure in Large Medical Groups. Health Affairs. 2008;27(5):1246–58. doi: 10.1377/hlthaff.27.5.1246.
37. Rittenhouse D, Casalino L, Shortell S, Alexander JA. Small and Medium-Size Physician Practices Use Few Patient-Centered Medical Home Processes. Health Affairs. 2011;30(8):1575–85. doi: 10.1377/hlthaff.2010.1210.
38. SAS Institute Inc. SAS® 9.2 for Windows. Cary, NC: SAS Institute Inc; 2011.
39. Scholle SH, Roski J, Dunn DL, Adams JL, Dugan DP, Pawlson LG, Kerr EA. Availability of Data for Measuring Physician Quality Performance. American Journal of Managed Care. 2009;15(1):67–72.
40. Share DA, Mason MH. Michigan's Physician Group Incentive Program Offers a Regional Model for Incremental 'Fee for Value' Payment Reform. Health Affairs. 2012;31(9):1993–2001. doi: 10.1377/hlthaff.2012.0328.
41. Singer JD. Using SAS Proc Mixed to Fit Multilevel Models, Hierarchical Models, and Individual Growth Models. Journal of Educational and Behavioral Statistics. 1998;24(4):323–55.

