Abstract
PURPOSE The objective of this study was to elucidate the effect of facilitation on practice outcomes in the 2-year patient-centered medical home (PCMH) National Demonstration Project (NDP) intervention, and to describe practices’ experience in implementing different components of the NDP model of the PCMH.
METHODS Thirty-six family practices were randomized to a facilitated intervention group or a self-directed intervention group. We measured 3 practice-level outcomes: (1) the proportion of 39 components of the NDP model that practices implemented, (2) the aggregate patient rating of the practices’ PCMH attributes, and (3) the practices’ ability to make and sustain change, which we term adaptive reserve. We used a repeated-measures analysis of variance to test the intervention effects.
RESULTS By the end of the 2 years of the NDP, practices in both facilitated and self-directed groups had at least 70% of the NDP model components in place. Implementation was relatively harder if the model component affected multiple roles and processes, required coordination across work units, necessitated additional resources and expertise, or challenged the traditional model of primary care. Electronic visits, group visits, team-based care, wellness promotion, and proactive population management presented the greatest challenges. Controlling for baseline differences and practice size, facilitated practices had greater increases in adaptive reserve (group difference by time, P = .005) and the proportion of NDP model components implemented (group difference by time, P=.02); the latter increased from 42% to 72% in the facilitated group and from 54% to 70% in the self-directed group. Patient ratings of the practices’ PCMH attributes did not differ between groups and, in fact, diminished in both of them.
CONCLUSIONS Highly motivated practices can implement many components of the PCMH in 2 years, but apparently at a cost of diminishing the patient’s experience of care. Intense facilitation increases the number of components implemented and improves practices’ adaptive reserve. Longer follow-up is needed to assess the sustained and evolving effects of moving independent practices toward PCMHs.
Keywords: Primary care; family practice; National Demonstration Project; organizational change; patient-centered medical home; patient-centered care; adaptive reserve; outcomes assessment; practice-based research
INTRODUCTION
The Future of Family Medicine (FFM) report of the American Academy of Family Physicians (AAFP) recognized the dire need for family practice to improve its practice model in an uncertain health care environment.1 A report in 2004 by 1 of the 6 FFM task forces provided the first outline of its “New Model” of primary care practice and recommended a large-scale national demonstration project.2 The AAFP launched the National Demonstration Project (NDP) in 2006 to explore the feasibility of implementing the new model in existing family practices. Based initially on the FFM report, the NDP model was widely examined and considered, and further refined with the publication of the Joint Principles of the Patient-Centered Medical Home (PCMH).3 The model continues to evolve as experience with it grows.
Despite widespread enthusiasm for such change,4–8 there is little systematic evidence of what it takes to transform a traditional family practice into a PCMH, nor of the relative difficulty for practices attempting the specific changes required. Although many demonstration projects are planned and already in the field,9 the NDP is the first national, large-scale demonstration project, with a detailed multimethod evaluation of what it takes to implement the PCMH.
The NDP compared 2 approaches to implementation: facilitated and self-directed. The facilitated approach used an intense combination of on-site assistance from practice change facilitators, learning sessions, national consultants, and preselected vendors of a range of health information technology. The self-directed approach entailed access to Web-based practice improvement tools and services. Articles in this supplement describe our observation of the NDP intervention process,10 patient-level outcomes,11 and a qualitative analysis of the practices’ experience in integrating the NDP model components into their operations.12 This article specifically focuses on the effect of facilitation vs self-direction on practice-level outcomes. We tested hypotheses that, compared with self-directed practices, facilitated practices would be able to put more NDP model components in place, would receive higher ratings as PCMHs from their patients, and would be better able to improve their adaptive reserve (capability to make and sustain change).12–14 We also present a secondary analysis examining the effect of adaptive reserve at baseline on the ability to implement NDP model components in all practices. Finally, we present qualitative data on practices’ experience in implementing the different components of the NDP model.
METHODS
We obtained approval for the evaluation protocol from the appropriate institutional review boards (IRBs), including those of the AAFP and the academic institutions of each evaluation team member.
Participants and Settings
The NDP was launched in June 2006 by TransforMED, a division of the AAFP,15 to test an evolving model of the PCMH.10 Thirty-six family practices were selected by an NDP Technical Advisory Committee from a national pool of 337 practices that completed a detailed online application. The Committee chose practices that appeared ready to take on the NDP model and that, as a group, were maximally diverse in terms of geographic location, size, age, physician and staff structure, ownership arrangements, and scope of practice.
Overall, the practices were located in 25 states, with 11 situated in rural communities, 16 in suburban areas, and 9 in urban areas. Ten practices were solo physicians (some having a midlevel clinician), 8 practices were small (2–3 physicians), 10 practices were medium sized (4–6 physicians), and 8 practices were large (≥7 physicians). Twenty-two practices were owned by physicians, 1 was owned by a governing board, and 13 were owned by larger hospital or medical systems.
Table 1 shows a comparison of the practices in the 2 groups on baseline characteristics: basic practice demographics, number of NDP model components in place, and patient ratings as a PCMH. Of the 27 characteristics compared, only 3 differed significantly (P <.05) between groups. Since the self-directed practices generally started with more model components in place at baseline, we adjusted subsequent analyses for baseline status.
Table 1. Baseline Characteristics, NDP Model Components in Place, and Patient-Rated PCMH Attributes of Facilitated and Self-Directed Practices
Characteristic | Facilitated Practices | Self-Directed Practices | P Value |
---|---|---|---|
Demographicsᵃ | n=17 | n=18 | |
Age of practice, y | | | .22 |
≤5 | 35 | 33 | |
6–10 | 24 | 5 | |
11–20 | 6 | 29 | |
>20 | 35 | 33 | |
Size | | | .35 |
Solo (± midlevel clinicians) | 35 | 22 | |
Small (2–3 physicians) | 24 | 12 | |
Medium (4–6 physicians) | 17 | 44 | |
Large (≥7 physicians) | 24 | 22 | |
Setting | | | .85 |
Rural | 29 | 33 | |
Suburban | 53 | 55 | |
Urban | 18 | 11 | |
Ownership structure | | | .89 |
Physician owned | 59 | 61 | |
Health or hospital system owned | 41 | 39 | |
NDP model components in placeᵃ | n=16 | n=15 | |
Access to care and information (overall, 6 items) | 30 | 30 | .92 |
Same-day appointments | 44 | 53 | .59 |
Group visits | 6 | 7 | .96 |
e-Visits | 6 | 0 | .32 |
Care management (overall, 4 items) | 38 | 48 | .22 |
Practice services (overall, 5 items) | 95 | 91 | .29 |
Continuity of care (overall, 5 items) | 56 | 65 | .25 |
Maternity care | 81 | 87 | .68 |
Hospital care | 88 | 100 | .16 |
Practice management (overall, 5 items) | 42 | 59 | .04 |
Quality and safety (overall, 5 items) | 32 | 43 | .19 |
Medication management | 62 | 93 | .04 |
Patient satisfaction feedback | 44 | 53 | .59 |
Health information technology (overall, 5 items) | 28 | 31 | .66 |
Electronic medical record | 69 | 73 | .78 |
Electronic prescribing | 44 | 40 | .83 |
Practice Web site | 25 | 33 | .61 |
Interactive patient portal | 0 | 0 | – |
Practice-based care teams (overall, 4 items) | 20 | 48 | .001 |
Patient-rated PCMH attributesᵇ | n=17 | n=16 | |
Comprehensive care | .81 | .84 | .08 |
Coordination of care | .74 | .76 | .53 |
Access to care | .88 | .88 | .89 |
Personal relationship over time | .76 | .76 | .58 |
Global practice experience | .27 | .32 | .28 |
NDP = National Demonstration Project; PCMH = patient-centered medical home.
Notes: Data are based on 35 practices that started the NDP; 1 of the original 36 practices was not able to obtain approval for the project from its institutional review board and withdrew its baseline data.
ᵃ Values are percentages.
ᵇ Values are ratings on a scale from 0 to 1, where higher values indicate a higher level of the attribute.
During the NDP, 5 practices withdrew from the project. One facilitated practice dropped out because of local IRB issues and another closed because of financial difficulty. Among the self-directed practices, 2 dropped out because their larger system closed or restructured their office, and 1 dropped out because of local competing demands for time and attention. Analyses are based on complete data for 16 facilitated and 15 self-directed practices.
The NDP Intervention
The evaluation team randomly assigned practices to facilitated and self-directed intervention groups. Details of the 2-year intervention are described elsewhere in this supplement.10 In brief, facilitated practices received ongoing assistance from a change facilitator; ongoing consultation from experts in practice economics, health information technology, and quality improvement; and discounted software technology, training, and support. They also participated in four 2-day learning sessions and regular group conference calls. Self-directed practices were given access to Web-based practice improvement tools and services, but did not receive any on-site assistance. The self-directed practices organized their own retreat halfway through the 2-year project and shared their interim experiences. They also participated in the final learning session with the facilitated practices.
Measure Development and Data Collection
Assessing Implementation of Model Components
The NDP model originated with the FFM report1 and went through several iterations from its inception in June 2006, including modification consistent with the joint principles of the PCMH.16 The version of the model we used for the evaluation is described elsewhere in this supplement10 and consisted of 55 components within 8 domains, including access to care and information, care management, practice services, continuity of care services, practice management, quality and safety, health information technology, and practice-based care teams.15
Of the 55 NDP model components, we judged 16 to be unmeasurable by our observational methods. For example, assessment of some components would require further observation of patient visits (eg, evidence-based practices), careful observation of staff activities beyond self-report (eg, patient participation), and judgment calls on our part that could not be made consistently (eg, culturally sensitive care). In some cases, we simply did not have reliable data on the status of the component at baseline (eg, optimized coding and billing). Removing these unmeasurable components left 39 NDP model components that we assessed for each practice at several points in time. Supplemental Appendix 1 (available at: http://www.annfammed.org/cgi/content/full/8/suppl_1/s33/DC1) shows all 55 components and their operational definition, and indicates which were not assessed.
For this analysis, we developed a template to guide data collection and to assess the status of each practice for the 39 model components. Initial data were collected during a 2- to 3-day site visit by one of the authors (E.E.S.) to each self-directed practice (summer through fall of 2007) and each facilitated practice (summer through fall 2008). On-site interviews with multiple practice staff were used to establish the model components that were in place, as well as when and how they were implemented. We recognize that memory of when events occurred may create error in judging when some components were implemented. After the end of the NDP, we followed up extensively by telephone interviews with 1 or more informants in each practice (always the physician champion and often the office manager) to ensure accuracy of the final assessment of model components in place at the end of the NDP and to assess components implemented in the 9 months after the project ended. We also gathered additional qualitative data on the processes, barriers, challenges, and special notes of accomplishment that fleshed out the practice-specific experience with the implementation process.
Before the telephone interviews, we reviewed previous practice data to customize standard questions used for each practice. We asked specific open-ended follow-up questions on model components. For example, for population management, we asked: “Describe to me how you are tracking your patients requiring care for chronic conditions, such as diabetes, now. When did you start this process, and how is that different from before? Who is chiefly responsible for the tasks?” During the interviews, we made notes on the template and then expanded and edited these notes immediately after the interview. When possible, we collected direct quotes.
We constructed a categorical variable for each of the 39 NDP model components for each practice. Components fell into 4 categories: not implemented at all, in place at baseline, implemented during the NDP, and implemented in the 9 months after the NDP ended. Where ambiguity about status of a model component remained, 2 members of the team (E.E.S. and P.A.N.) discussed the data and made a consensus judgment. In cases where clear consensus was not achieved, we recontacted the practice by phone or e-mail for additional data. We repeated this process until we were confident of the accuracy of data. In some cases, the practice implemented a component, tried it, and decided not to continue to use it. In these instances, we considered the components to be implemented, although we recognize that they were not successfully sustained. Finally, we tabulated the categorical data by practice and examined patterns across practices and practice groups.
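To illustrate the tabulation step, the sketch below rolls a long table of practice-by-component status judgments up into marginal counts of the kind reported later in Table 5 and into each practice’s proportion of components in place. It is a schematic example only: the practice names, component labels, and pandas-based workflow are illustrative assumptions, not the evaluation team’s actual tooling.

```python
import pandas as pd

# Each row is one consensus judgment: a practice, one of the 39 NDP model
# components, and its implementation status (hypothetical example records).
records = pd.DataFrame(
    [
        ("Practice A", "Same-day appointments", "In place at baseline"),
        ("Practice A", "Group visits", "Implemented during NDP"),
        ("Practice A", "e-Visits", "Not implemented"),
        ("Practice B", "Same-day appointments", "Implemented during NDP"),
        ("Practice B", "Group visits", "Implemented in 9 mo after NDP"),
        ("Practice B", "e-Visits", "Not implemented"),
    ],
    columns=["practice", "component", "status"],
)

# Marginal counts of each status for every component, analogous to the rows of Table 5.
print(pd.crosstab(records["component"], records["status"]))

# Proportion of assessed components each practice had in place by the end of the
# project (baseline plus during-NDP), one of the 3 practice-level outcomes.
in_place = records["status"].isin(["In place at baseline", "Implemented during NDP"])
print(in_place.groupby(records["practice"]).mean())
```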
Assessing Patient-Rated PCMH Attributes
Development and administration of the patient outcomes survey (POS) is described in detail elsewhere in this supplement.17 NDP staff mailed the POS to a cross-sectional sample of 120 consecutive patients of any age seen in the practice on 3 target dates: baseline (July 3, 2006), 9 months (April 1, 2007), and 26 months (August 1, 2008). The POS included more than 100 items, most of which used a 5-point Likert-type scale. Response rates across all 31 practices for the POS were 27% (wave 1), 22% (wave 2), and 21% (wave 3).
For this analysis, we constructed a practice-level measure of the patient’s assessment of the PCMH attributes of the practice (the patient-rated PCMH) that consisted of 23 items in 5 scales (Table 2). Analyses of these data as patient-level outcomes are reported elsewhere in this supplement.11 As a group, the patient-rated PCMH measure addressed the 4 pillars of primary care (easy access to first-contact care, comprehensive care, coordination of care, and personal relationship over time) that have been shown to be associated with improved outcomes and reduced cost.21–23 For the 4 pillars of primary care, we used well-validated measures: the Ambulatory Care Experience Survey (ACES)19 for organizational access and the Components of Primary Care Index (CPCI)18 for measures of comprehensive care, coordination of care, and accumulated knowledge as a proxy for personal relationship over time. The patient-rated PCMH also used 2 new items in a fifth scale to assess the global practice experience, as rated on Likert-type scales regarding statements of “I am delighted with this practice” and “I receive the care I want and need when and how I want and need it.”11,20 We conducted a reliability analysis using PASW Statistics version 17 (SPSS Inc, Chicago, Illinois) by loading all 23 items, resulting in a measure with a Cronbach α of .92. Cronbach α is a measure of the internal consistency of a scale. High values (eg, >.7) indicate that all variables in the set correlate well with one another.
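As an illustration of the reliability analysis described above, the sketch below computes Cronbach α for a respondent-by-item matrix using the standard formula, α = [k/(k-1)] × (1 - Σ(item variances)/variance of the summed score). The data are simulated; the published analysis was run in PASW/SPSS, so this Python version is only an assumed equivalent.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach α for a respondents-by-items matrix of scale responses."""
    k = items.shape[1]                          # number of items (23 in the patient-rated PCMH scale)
    item_variances = items.var(axis=0, ddof=1)  # variance of each item across respondents
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Simulated responses: 200 respondents answering 23 Likert-type items (1-5),
# all driven partly by a shared underlying attitude so the items intercorrelate.
rng = np.random.default_rng(0)
shared = rng.normal(size=(200, 1))
noise = rng.normal(scale=0.8, size=(200, 23))
responses = np.clip(np.round(3 + shared + noise), 1, 5)

print(f"Cronbach alpha: {cronbach_alpha(responses):.2f}")
```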
Table 2. Components and Items of the Patient-Rated PCMH Scale
Component | Items |
---|---|
Comprehensive care (from CPCI18) | Handles emergencies |
 | Care of almost any medical problem I may have |
 | Go for help with a personal or medical problem |
 | Go for care for an ongoing medical problem such as high blood pressure |
 | Go for a checkup to prevent illness |
Coordination of care (from CPCI18) | Keeps track of all my health care |
 | Follows up on a problem I’ve had, either at the next visit or by mail, e-mail, or phone |
 | Follows up on my visit to other health care professionals |
 | Helps me interpret my laboratory tests, x-rays, or visits to other doctors |
 | Communicates with other health professionals I see |
Access to care (from ACES19) | Help as soon as needed for an illness or injury |
 | Appointment for a checkup or routine care as soon as needed |
 | Answer to medical question the same day when calling during regular office hours |
 | Help or advice needed when calling after regular office hours |
Personal relationship over time (from CPCI18) | Knows a lot about my family medical history |
 | Have been through a lot together |
 | Understands what is important to me regarding my health |
 | Knows my medical history very well |
 | Takes my beliefs and wishes into account in caring for me |
 | Knows whether or not I exercise, eat right, smoke, or drink alcohol |
 | Knows me well as a person (such as hobbies, job, etc) |
Global practice experience (new scale20) | I receive the care I want and need when and how I want and need it |
 | I am delighted with this practice |
ACES = Ambulatory Care Experience Survey; CPCI = Components of Primary Care Index; PCMH = patient-centered medical home.
Notes: Scores on this scale consisted of the average summed responses of the 23 items in 5 subscales. Cronbach α for the 23-item scale was .92.
Assessing Practice Adaptive Reserve
Reports have identified the importance of a practice’s ability to make and sustain change.24–26 We have termed this characteristic the adaptive reserve and have observed how it becomes important in times of stress and rapid change.12,14,27 Adaptive reserve includes the practice relationship infrastructure; alignment of management functions in which clinical care, practice operations, and financial functions share and reflect a consistent vision; facilitative leadership; teamwork; sensemaking; a positive work environment; and a culture of learning.17 The relationship infrastructure in turn consists of trust, mindfulness, heedful interaction, respectful interaction, cognitive diversity, a balance of social and task relatedness, and a balance of rich and lean communication venues.28
We created the adaptive reserve scale from the clinician staff questionnaire (CSQ) described elsewhere in this supplement.17 The purpose of the CSQ was to measure and track changes over the course of the NDP in how clinicians and office staff perceived key practice attributes, such as modes of communication, leadership styles, culture of learning, psychological safety, and approach to cultural diversity. The CSQ was distributed to all clinical and nonclinical practice staff at each practice in person and collected by mail in 3 waves (baseline, 9 months, and 26 months). Staff who agreed to participate returned the CSQ by mail directly to the study center. To comply with the IRB protocol, the questionnaires did not include an individual identifier, so the 3 waves of the CSQ represent repeated cross-sections of the staff at each practice. Response rates for the CSQ were 60% (wave 1), 48% (wave 2), and 52% (wave 3).
We submitted 82 items from the CSQ to a principal components factor analysis separately for each of the 3 waves, as described in detail elsewhere in this supplement.17 The analysis identified a 23-item scale that addressed the relationship infrastructure, facilitative leadership, culture of learning, and work environment (Table 3). Items for alignment of management functions were not included because the importance of this characteristic emerged from our analysis12 and it was not part of the original CSQ. The adaptive reserve measure had a Cronbach α of .97.
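The following sketch illustrates, on simulated data, how a principal components analysis of questionnaire items can be used to identify a set of items that load on a common component, analogous to the item-selection step described above. The scikit-learn workflow, item counts, and loading threshold are illustrative assumptions, not the authors’ PASW/SPSS procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Simulated staff questionnaire: 300 respondents, 82 Likert-type items, of which
# the first 23 share a common factor (a stand-in for adaptive reserve).
rng = np.random.default_rng(1)
factor = rng.normal(size=(300, 1))
items = rng.normal(size=(300, 82))
items[:, :23] += 1.5 * factor  # the first 23 items load on the shared factor

# Standardize the items and extract principal components.
scaled = StandardScaler().fit_transform(items)
pca = PCA(n_components=5).fit(scaled)

# Loadings of each item on the first component; items with strong loadings are
# candidates for a single, internally consistent scale.
loadings = pca.components_[0] * np.sqrt(pca.explained_variance_[0])
selected_items = np.where(np.abs(loadings) > 0.4)[0]
print("Items loading on the first component:", selected_items)
```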
Table 3. Components and Items of the Adaptive Reserve Scale
Component | Items (Attributes Measured) |
---|---|
Relationship infrastructure | People in our practice actively seek new ways to improve how we do things (mindfulness) |
 | People at all levels of this office openly talk about what is and isn’t working (mindfulness) |
 | We regularly take time to consider ways to improve how we do things (mindfulness) |
 | People are aware of how their actions affect others in this practice (heedful interactions) |
 | Most people in this practice are willing to change how they do things in response to feedback from others (respectful interaction) |
 | After trying something new, we take time to think about how it worked (reflection) |
 | We regularly take time to reflect on how we do things (reflection) |
 | This practice encourages everyone (front office staff, clinical staff, nurses, and clinicians) to share ideas (cognitive diversity) |
 | I can rely on the other people in this practice to do their jobs well (trust) |
 | Difficult problems are solved through face-to-face discussions in this practice (communication) |
Facilitative leadership | Practice leadership promotes an environment that is an enjoyable place to work |
 | Leadership in this practice creates an environment where things can be accomplished |
 | Leadership strongly supports practice change efforts |
 | The practice leadership makes sure that we have the time and space necessary to discuss changes to improve care |
Sensemaking | When we experience a problem in the practice, we make a serious effort to figure out what’s really going on |
 | People in this practice have the information that they need to do their jobs well |
Teamwork | I have many opportunities to grow in my work |
 | People in this practice operate as a real team |
Work environment | Most of the people who work in our practice seem to enjoy their work |
 | This practice is a place of joy and hope |
Culture of learning | Mistakes have led to positive changes here |
 | It is hard to get things to change in our practice |
 | This practice learns from its mistakes |
Notes: Scores on this scale were computed as the summed average of the individual responses for each practice. Cronbach α for the 23-item scale was .97.
Data Analysis
We analyzed the effect of facilitation on the 3 main outcomes, namely, the proportion of model components implemented by the practice during the NDP, the patient-rated PCMH, and the practice’s adaptive reserve. We used a full factorial repeated-measures analysis of variance (ANOVA) to assess the main effects (eg, mean differences between groups) and the within-group change over time. This approach also allowed us to determine whether one group changed more rapidly over time (group-by-time interaction).29 We weighted the analysis by the number of respondents in each practice because of varying response rates. Although the patient-rated PCMH and the practice adaptive reserve measures are based on individual responses, variables in this analysis were aggregated practice-level scores. This approach precluded the necessity of using multilevel methods because the practice, rather than the individual, was the unit of analysis.
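The sketch below approximates this analysis with a weighted linear model: one row per practice per wave, terms for group, time, and their interaction, weights equal to the number of respondents in each practice, and standard errors clustered on practice to acknowledge the repeated measurements. The column names and values are hypothetical and simulated, and this statsmodels formulation is an assumed analogue of, not a replica of, the repeated-measures ANOVA reported here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per practice per wave (hypothetical column names, simulated values):
# group assignment, time (0 = baseline, 1 = 26 months), the practice-level outcome
# score, and the number of survey respondents used as the analysis weight.
rng = np.random.default_rng(2)
n_practices = 31
groups = rng.choice(["facilitated", "self_directed"], size=n_practices)
data = pd.DataFrame({
    "practice": np.repeat(np.arange(n_practices), 2),
    "group": np.repeat(groups, 2),
    "time": np.tile([0, 1], n_practices),
    "respondents": np.repeat(rng.integers(5, 40, size=n_practices), 2),
})
data["outcome"] = (
    0.5
    + 0.15 * data["time"] * (data["group"] == "facilitated")
    + rng.normal(scale=0.1, size=len(data))
)

# Weighted least squares with a group-by-time interaction; the interaction term tests
# whether facilitated practices changed more over time than self-directed practices.
model = smf.wls("outcome ~ C(group) * time", data=data, weights=data["respondents"])
result = model.fit(cov_type="cluster", cov_kwds={"groups": data["practice"]})
print(result.summary())
```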
In a secondary analysis, we assessed whether practice adaptive reserve at baseline was associated with number of model components implemented. We used an ordinary least squares regression model and adjusted for the number of model components in place at baseline.
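A corresponding sketch of this secondary analysis is shown below, again with hypothetical variable names and simulated data: the number of components implemented is regressed on baseline adaptive reserve while adjusting for the number of components in place at baseline, with all variables z-scored so that the fitted coefficients can be read as standardized β values.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical practice-level data: baseline adaptive reserve, number of model
# components already in place at baseline, and number implemented during the project.
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "adaptive_reserve_baseline": rng.uniform(0.4, 0.9, size=31),
    "components_at_baseline": rng.integers(10, 25, size=31).astype(float),
})
df["components_implemented"] = (
    5
    + 8 * df["adaptive_reserve_baseline"]
    - 0.2 * df["components_at_baseline"]
    + rng.normal(scale=2, size=31)
)

# z-score every variable so the fitted coefficients are standardized betas.
standardized = (df - df.mean()) / df.std(ddof=0)

result = smf.ols(
    "components_implemented ~ adaptive_reserve_baseline + components_at_baseline",
    data=standardized,
).fit()
print(result.params)
print(result.pvalues)
```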
We also examined qualitatively the patterns of practice adoption of the NDP model components. We produced marginal counts for each of the 39 measurable components. For each practice and each model component, we assessed whether and when the component was implemented and then examined patterns of implementation across practices.
Finally, we analyzed qualitative field notes made during and immediately after telephone interviews with practices to enrich understanding of the challenges faced in implementing the components and some of the patterns of adoption that emerged.
RESULTS
Effect of the NDP Intervention
Results of the analyses of the effect of the NDP intervention on the 3 main outcomes are shown in Table 4. Practices in both groups significantly increased the proportion of NDP model components in place from baseline to the 26-month follow-up; however, the facilitated practices had fewer components in place at baseline, and the significant interaction term (between group and time) indicates that facilitation significantly increased component implementation. The patient-rated PCMH measure significantly decreased in both facilitated and self-directed groups, with no significant difference between them. Finally, practice adaptive reserve increased during the NDP in the facilitated practices but remained essentially the same in self-directed practices, with a significant difference between groups.
Table 4. Effect of the NDP Intervention on the 3 Main Practice-Level Outcomes
Outcome | Facilitated Practices, Mean (SD) (n=16) | Self-Directed Practices, Mean (SD) (n=15) | ANOVA P Values |
---|---|---|---|
NDP model components in placeᵃ | | | |
Baseline | .42 (.40) | .54 (.40) | Between group: .19 |
26 months | .72 (.45) | .70 (.47) | Within group: <.001 |
 | | | Group differences by time: .005 |
Patient-rated PCMHᵇ | | | |
Baseline | 3.42 (0.66) | 3.51 (0.75) | Between group: .41 |
26 months | 3.38 (0.68) | 3.41 (0.93) | Within group: .03 |
 | | | Group differences by time: .34 |
Practice adaptive reserveᶜ | | | |
Baseline | .69 (.35) | .69 (.38) | Between group: .51 |
26 months | .74 (.38) | .68 (.46) | Within group: .09 |
 | | | Group differences by time: .02 |
ANOVA = analysis of variance; NDP = National Demonstration Project; PCMH = patient-centered medical home.
Notes: Patient-rated PCMH and practice adaptive reserve are scale scores described in the text. ANOVA analyses were weighted by the number of respondents as a proxy for practice size.
ᵃ The proportion of 39 measurable model components in place.
ᵇ Scores represent the average summed responses for 23 items (shown in Table 2), with a range of 1 (strongly disagree) to 5 (strongly agree). Items were reverse-scored when appropriate so that higher numbers reflect more positive ratings.
ᶜ Scores represent the average summed responses for 23 items (shown in Table 3). Items were reverse-scored when appropriate and rescaled to reflect a range from 0 to 1, where higher scores reflect more adaptive reserve.
Adaptive Reserve and Component Implementation
Baseline adaptive reserve appeared to influence the number of NDP model components implemented. After adjusting for the difference between groups in components in place at baseline (P = .04), there was a nonsignificant trend toward implementation of more model components in practices with greater adaptive reserve (standardized β = .23, SE = .21, P = .08). We should note that our analysis had only 60% power for detecting a significant (P <.05) group, time, or group-by-time effect for these variables.
Patterns of Practice Implementation of Components
Data on implementation of individual NDP model components for all 31 practices completing the NDP are summarized in Table 5 and shown in detail by practice in Supplemental Appendix 2 (available at http://www.annfammed.org/cgi/content/full/8/suppl_1/s33/DC1). Practices in both groups already had many model components in place at baseline (Table 5). The facilitated practices had on average 17.0 (43.6%) and the self-directed practices had on average 20.1 (51.5%) of the 39 model components when the NDP began. In general, most practices entered the NDP with most of the components of practice services and many of the components that had to do with the scope of services (after-hours coverage, hospital care, maternity care, and disease prevention) in place. The few missing components in these areas were completed during the NDP.
Table 5. Implementation of NDP Model Components in Facilitated, Self-Directed, and All Practices
Domain and Component | Facilitated (n=16): In Place at Baseline | Facilitated: Implemented During NDP | Facilitated: Implemented in the 9 mo After NDP | Facilitated: Not Implemented | Self-Directed (n=15): In Place at Baseline | Self-Directed: Implemented During NDP | Self-Directed: Implemented in the 9 mo After NDP | Self-Directed: Not Implemented | All (N=31): In Place at Baseline | All: Implemented During NDP | All: Implemented in the 9 mo After NDP | All: Not Implemented |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Access to care and information | ||||||||||||
Same-day appointments | 7 | 8 | 1 | 0 | 8 | 6 | 1 | 0 | 15 | 14 | 2 | 0 |
Laboratory results highly accessible | 8 | 7 | 0 | 1 | 4 | 7 | 4 | 0 | 12 | 14 | 4 | 1 |
Online patient services | 1 | 9 | 0 | 6 | 0 | 4 | 5 | 6 | 1 | 13 | 5 | 12 |
e-Visits | 1 | 5 | 0 | 10 | 0 | 3 | 0 | 12 | 1 | 8 | 0 | 22 |
Group visits | 1 | 8 | 2 | 5 | 1 | 5 | 0 | 9 | 2 | 13 | 2 | 14 |
After-hours access coverage | 14 | 2 | 0 | 0 | 15 | 0 | 0 | 0 | 29 | 2 | 0 | 0 |
Care management | ||||||||||||
Population management | 2 | 6 | 0 | 8 | 2 | 4 | 3 | 6 | 4 | 10 | 3 | 14 |
Wellness promotion | 4 | 3 | 0 | 9 | 5 | 1 | 0 | 9 | 9 | 4 | 0 | 18 |
Disease prevention | 11 | 5 | 0 | 0 | 13 | 2 | 0 | 0 | 24 | 7 | 0 | 0 |
Patient engagement/education | 7 | 5 | 0 | 4 | 9 | 3 | 1 | 2 | 16 | 8 | 1 | 6 |
Practice services | ||||||||||||
Comprehensive acute and chronic care | 16 | 0 | 0 | 0 | 15 | 0 | 0 | 0 | 31 | 0 | 0 | 0 |
Prevention screening | 12 | 4 | 0 | 0 | 12 | 3 | 0 | 0 | 24 | 7 | 0 | 0 |
Surgical procedures | 16 | 0 | 0 | 0 | 15 | 0 | 0 | 0 | 31 | 0 | 0 | 0 |
Ancillary therapeutic/support | 16 | 0 | 0 | 0 | 12 | 3 | 0 | 0 | 28 | 3 | 0 | 0 |
Ancillary diagnostic services | 16 | 0 | 0 | 0 | 14 | 1 | 0 | 0 | 30 | 1 | 0 | 0 |
Continuity of care | ||||||||||||
Community-based services | 7 | 3 | 0 | 6 | 8 | 1 | 0 | 6 | 15 | 4 | 0 | 12 |
Hospital care | 14 | 0 | 0 | 2 | 15 | 0 | 0 | 0 | 29 | 0 | 0 | 2 |
Behavioral health care | 8 | 1 | 0 | 7 | 8 | 2 | 0 | 5 | 16 | 3 | 0 | 12 |
Maternity care | 13 | 0 | 0 | 3 | 13 | 0 | 0 | 2 | 26 | 0 | 0 | 5 |
Case management | 3 | 3 | 0 | 10 | 5 | 1 | 1 | 8 | 8 | 4 | 1 | 18 |
Practice management | ||||||||||||
Disciplined financial management | 9 | 6 | 0 | 1 | 10 | 3 | 2 | 0 | 19 | 9 | 2 | 1 |
Cost-benefit decision making | 9 | 2 | 2 | 3 | 10 | 5 | 0 | 0 | 19 | 7 | 2 | 3 |
Revenue enhancement | 5 | 7 | 2 | 2 | 4 | 4 | 2 | 5 | 9 | 11 | 4 | 7 |
Personnel/HR management | 8 | 6 | 1 | 1 | 12 | 2 | 1 | 0 | 20 | 8 | 2 | 1 |
Optimized office design | 3 | 3 | 7 | 3 | 8 | 3 | 0 | 4 | 11 | 6 | 7 | 7 |
Quality and safety | ||||||||||||
Medication management | 10 | 3 | 0 | 3 | 14 | 1 | 0 | 0 | 24 | 4 | 0 | 3 |
Patient satisfaction feedback | 7 | 3 | 0 | 6 | 8 | 3 | 0 | 4 | 15 | 6 | 0 | 10 |
Clinical outcomes analysis | 4 | 5 | 0 | 7 | 4 | 4 | 1 | 6 | 8 | 9 | 1 | 13 |
Quality improvement | 4 | 6 | 0 | 6 | 5 | 2 | 2 | 6 | 9 | 8 | 2 | 12 |
Practice-based team care | 1 | 4 | 1 | 10 | 1 | 6 | 0 | 8 | 2 | 10 | 1 | 18 |
Health information technology | ||||||||||||
Electronic medical record | 11 | 3 | 0 | 2 | 11 | 3 | 0 | 1 | 22 | 6 | 0 | 3 |
Electronic prescribing | 7 | 7 | 0 | 2 | 6 | 9 | 0 | 0 | 13 | 16 | 0 | 2 |
Population management/registry | 0 | 7 | 0 | 9 | 1 | 3 | 3 | 8 | 1 | 10 | 3 | 17 |
Practice Web site | 4 | 9 | 0 | 3 | 5 | 4 | 2 | 4 | 9 | 13 | 2 | 7 |
Interactive patient portal | 0 | 5 | 2 | 9 | 0 | 4 | 5 | 6 | 0 | 9 | 7 | 15 |
Practice-based care teams | ||||||||||||
Provider leadership | 5 | 6 | 2 | 3 | 9 | 3 | 1 | 2 | 14 | 9 | 3 | 5 |
Shared mission and vision | 3 | 7 | 1 | 5 | 9 | 2 | 1 | 3 | 12 | 9 | 2 | 8 |
Effective communication | 3 | 7 | 2 | 4 | 9 | 2 | 1 | 3 | 12 | 9 | 3 | 7 |
Task designation by skill set | 2 | 6 | 1 | 7 | 2 | 7 | 1 | 5 | 4 | 13 | 2 | 12 |
HR = human resources; NDP = National Demonstration Project.
Notes: Values shown are numbers of practices. Within each group (facilitated, self-directed, or all), the 4 values in each row sum to the number of practices in that group.
Practices in both groups were successful in implementing many additional components: facilitated practices added an average of 10.7 new components and self-directed practices added an average of 7.7. Additionally, in the 9 months after the end of the NDP, practices continued to implement model components, an additional 1.5 in facilitated and 2.5 in self-directed practices. Consequently, by the end of this 9-month period, practices in both groups had in place at least 70% of the components: an average of 27.7 and 27.9 of the 39 NDP model components in the facilitated and self-directed practices, respectively.
Model components were not all equally likely to be implemented. All practices were able to implement same-day appointments, and nearly all were able to implement electronic prescribing and make laboratory results highly accessible to patients. Many were able to improve practice management processes, such as implementing more disciplined financial management, cost-benefit decision making, revenue enhancement, improved personnel management, and more efficient office design. Many practices also had or developed a practice Web site, although providing a fully functioning patient portal proved more difficult for most.
The NDP practices appeared to be early adopters of electronic medical records (EMRs). Twenty-two (71%) had EMRs in place at baseline (well above the national average30–32), another 6 implemented them during the NDP, and only 3 (2 facilitated and 1 self-directed) did not have EMRs by 26 months. The 2 facilitated practices without EMRs were both part of larger systems for whom EMR implementation was an evolving priority.
At the same time, some model components presented greater challenges. These components included e-visits, group visits, team-based care, wellness promotion, and population management (involving 3 model components). We describe these more challenging components in greater detail below.
e-Visits
Practices in both groups struggled with e-visits. Only 1 practice had implemented e-visits before the NDP, and only 8 more put e-visits into place during the NDP. Of these, 4 used them for a time but subsequently stopped. There were several reasons why e-visits were not popular among practices. Several practices felt they were useful “when they worked”; however, they had great difficulty using the templates from their commercial vendor and obtaining commercial products to enable patients to pay with credit cards online. E-visits were also seen as inefficient and as requiring considerable effort to market to patients. For example, one physician noted he spent more time on the computer for an e-visit than he did in the examination room for a conventional visit. At least 5 practices indicated that a lack of coverage by health plans was a major impediment, and they resisted asking patients to pay for a service they saw as providing only marginal value. Interestingly, several practices noted that once they had implemented same-day visits, adding e-visits seemed contradictory to both them and their patients. In fact, the facilitated practice that came into the NDP with e-visits in place reported a rapid decline in patient interest once it implemented same-day visits.
On the other hand, the majority of practices enthusiastically used e-mail to communicate with patients in some fashion. A few found e-mail to be a useful way for patients to ask questions that could be triaged within the practice for an appropriate response. Several physicians felt that using e-mail provided a useful adjunct to office visits, but did not serve to replace them. Some practices reported that they believed providing e-mail access to their patients reduced the number of telephone calls. Among both groups, only 7 practices did not communicate with patients in some manner using e-mail.
Group Visits
Group visits also presented a quandary for many NDP practices. Only 1 practice in each group had experimented with group visits before the NDP. Fifteen practices implemented 1 or more group visits during the NDP; however, 9 of these practices subsequently discontinued group visits, citing lack of time and support for planning and a general sense that they did not have enough value to justify the financial investment. Two practices independently estimated that they would need 7 to 8 patients per session to break even financially, a goal they were unable to reach. On the other hand, several practices planned to continue to explore group visits, and a number were modifying their format to shift emphasis onto wellness, support groups, and sessions that emphasized education over providing visit-type services. Group visits were a particular challenge for small practices, which had difficulty in finding time, space, and a critical mass of patients. One self-directed practice pointed out that the time and energy spent planning and organizing the group visit directly competed with that available for nurse visits to educate patients with chronic illness. All practices struggled with how to code for group visits, because of the lack of clear guidelines and the variation they observed in expert opinion.
Team-Based Care
Practices often had trouble implementing team-based care. Many took initial steps by creating stable physician–medical assistant teams and locating physicians and medical assistants in the same work area; however, these actions were generally viewed only as important intermediate steps and did not constitute team care. Creating care teams required bridging the traditional gap in front-back office communication by developing shared visions of how care teams affect the patient experience, having frequent front-back office meetings and retreats, and reconfiguring office work flow and patient flow across front-back office functions. Developing team-based care also required substantial effort in cross-training and systematically creating agendas for ongoing training in expanded tasks. One physician observed, “Taking the time to train my staff to take part in the history and physical exam was the smartest thing I ever did.” In addition, establishing standing orders and protocols for ordering laboratory tests and refilling prescriptions was an important element of team-based care. A number of practices reported that daily huddles33 were an important way to model team behavior. Two practices strengthened both their care teams and their community connections by providing a lunch allowance for staff to meet with other service providers in the community and bring back relevant information for the practice. In the words of one facilitated practice physician, “We have always been in touch with community services, the difference is now we are using the practice care team to help build a knowledge base—it’s not just [physician name] and me anymore. The MAs [medical assistants] help to coordinate this stuff.”
Nevertheless, practices cited a number of barriers to care teams, including reliance on part-time staff and physicians, which created challenges to continuity. Additionally, part-time staff had less incentive to expend effort on a larger shared practice vision. A barrier that did not surface without probing was the perception of many physicians about their role and a reluctance to share that role with others. As one physician noted, “Doctors should be doing the doctor things.” Another physician pointed out that he had gained an appreciation of care teams during the NDP, but that other physicians in the practice were “stuck in the old way of doctoring.”
Wellness Promotion
Four facilitated practices and 5 self-directed practices had an emphasis on wellness coming into the NDP. Another 3 practices in the facilitated group and 1 in the self-directed group made substantial progress on this component during the NDP. Nine practices in each group did not report progress in emphasizing wellness, however. Although virtually all practices valued wellness as an integral part of the scope of their work, they largely cited time and energy as barriers. Several physicians saw an important association between an emphasis on wellness and expanding team care in the practice, both of which were seen as challenges to be faced as the practice developed further. Many of the practices reporting an emphasis on wellness were able to offer wellness services through their larger hospital or medical system. Several practices pointed out an association between an emphasis on wellness and strong connections with the larger community. Most of the practices who strengthened their community connections during the NDP did so through participation in health fairs or sponsoring community health or fitness events.
Population Management
Three of the NDP model components focused on building practices’ ability to monitor and proactively address the health care of subpopulations. Two of the components—population management (in the care management domain) and population management/registries (in the health information technology domain)—had overlapping properties for identifying groups of patients with selected characteristics such as diabetes. The third component, case management, addressed processes for identifying, tracking, and taking action for patients with complex comorbidities and preventing those patients from falling through the cracks. Nevertheless, the technologic solutions for information support and population management were often less than ideal. Private practices were typically at the mercy of EMR vendors’ time lines, whereas system-owned practices had to wait until a feature became a priority for their system’s information technology department. Many practices found their EMR could not provide information support for population-based care, although some systems could print out lists of patients with certain conditions so practices could catch patients as they came in for visits. As the physician in one facilitated practice said, “we’re ready and willing—the software isn’t willing yet!”
Some practices therefore used billing data to identify target populations by age and sex, and sent an e-mail or postal “blast” for special purposes, such as encouraging influenza shots. In several practices, particularly motivated individual physicians created their own work-around for a topic-specific population management issue. For example, one self-directed practice jury-rigged its EMR to produce population reports and point-of-service reminders, and to place the reminders on its patient portal. Although facilitated practices had access to an innovative and sophisticated proprietary disease management tool, most that tried to implement the tool discovered it did not integrate easily with their EMR. Many practices decided to wait until their EMR offered an upgrade with population management features. In addition to limitations in available technology and the added time required for activities not traditionally included in primary care practice, there was resistance to change in roles for existing personnel and to the required shift in paradigm from care of 1 patient at a time to population-based, proactive care of groups of patients. Even in many practices embedded within a larger system that was capable of producing population reports, it was still up to the practice to request and use the reports. In many cases, this capacity was not used.
DISCUSSION
This analysis of the NDP revealed important findings on the 3 hypothesized practice-level effects of facilitation. There are, however, several important limitations of the analysis. First, the generalizability of the findings is limited: the practices were highly selected and the change facilitators extremely capable, and both worked under intense national scrutiny. Similarly, the NDP facilitated intervention was very intense and involved a learning evaluation (described elsewhere in this supplement10) that interacted in important ways with the unusual capability and motivation of the practices. Although these characteristics are helpful in examining the feasibility of implementing many of the features of a PCMH, they limit our ability to understand how more typical practices will be able to adopt these features. Future efforts to adopt a PCMH model may find that less can be accomplished in more typical settings or that adoption requires even more time and resources.
A second limitation is possible bias in the 3 outcome measures. The number of NDP model components implemented was derived from self-report by practice informants, although we were able to triangulate the assessment with multiple practice informants, facilitator reports, and e-mail streams. The response rates for the 2 surveys (in the range of 20% for the POS and 50% for the CSQ) may have produced selection bias, and perhaps a rosier result; these findings need to be replicated in other studies. A third limitation is that the NDP focused so intensely on specific model components; further work is required to understand the strategies for ensuring that implementing PCMH model components leads eventually to strong patient-centered characteristics. Finally, the NDP did not incorporate new reimbursement strategies into the intervention, and the effect of various types of reimbursement reform must be studied in current and future demonstration projects.
The NDP facilitated intervention increased the practices’ adaptive reserve,12–14 a characteristic shown in our qualitative analysis to be important for success in adopting model components.12 In the analysis reported here, we also observed a nonsignificant positive association between adaptive reserve at baseline and implementation of NDP model components. This is an important finding and, taken together with our qualitative findings,12 suggests that strengthening adaptive reserve will serve the practices well over the next decade as they continue the transformation to PCMHs and adapt to the rapidly changing demands of the health care environment.
Most practices in both groups were able to implement many of the NDP model components over the 2 years of the initiative. Facilitation appeared to significantly increase the number of adopted model components, with an average of 10.7 added in facilitated practices, compared with 7.7 in self-directed practices. The cost and effort required in the NDP intervention to achieve a modest difference in model components implemented call into question the feasibility of such an intense intervention as a national strategy for adopting a PCMH model. Importantly, the self-directed practices were also successful in adopting model components, and practices in both groups ended up with at least 70% of model components in place. The ability of many self-directed practices to make substantial progress suggests that not all practices need intense assistance. We believe from this and other work that a practice’s baseline adaptive reserve can be an important indicator of the magnitude and kind of assistance a practice may need.
We also observed that facilitation did not directly increase patient ratings of their practice as a PCMH. In fact, the patient-rated PCMH significantly decreased in both groups. Whether the intense effort to adopt model components or the nature of the components themselves (eg, an EMR in the examination room) has a deleterious effect on the patient experience is not clear from our data, but deserves further study. The differing effect of facilitation on implementing model components and on patient-rated PCMH attributes suggests that from a patient’s perspective, a PCMH is more than the sum of the NDP model components.11 Changing a practice in a way that improves the patient’s experience requires either a different set of strategies or more time for existing strategies to take effect. Adopting NDP model components is a very proximal step in a complex chain of events that also includes effective and consistent application of model components to the patient population before improved patient-level outcomes will be realized.
Finally, not all changes included in the NDP model required the same level of effort. In looking across all practices, it became apparent that changes were relatively harder if they had an impact on multiple roles and processes, required coordination across work units, necessitated additional resources and expertise, and challenged the traditional model of primary care. Some model components were necessarily implemented in sequence. For example, practices often postponed addressing case management until they had a functioning registry in place, while such a registry, in turn, was rarely available as a routine function of the EMR. Components were also more difficult to implement when they required shifts in the ways people thought about and understood their roles. For example, adopting a team care approach required that multiple roles in the practice be redefined, representing a more difficult task than implementing same-day appointments. Although the latter was very difficult, it did not generally create a ripple effect through the practice that disrupted the practice’s working relationships and style. Changes were also more challenging when they required that individuals or groups adopt a different mental model of their work. Adopting team care was seen to conflict with some physicians’ vision of their work as a doctor, whereas adopting a population-based approach to care required the entire practice to shift from a model of (in the words of one physician) “get ‘em in, get ‘em out” to one that viewed population-based proactive care of defined populations as legitimate work of the practice.
Acknowledgments
The NDP was designed and implemented by TransforMED, LLC, a wholly-owned subsidiary of the AAFP. We are indebted to the participants in the NDP and to TransforMED for their tireless work.
Conflicts of interest: The authors’ funding partially supports their time devoted to the evaluation, but they have no financial stake in the outcome. The authors’ agreement with the funders gives them complete independence in conducting the evaluation and allows them to publish the findings without prior review by the funders. The authors have full access to and control of study data. The funders had no role in writing or submitting the manuscript.
Disclaimer: Drs Stange and Nutting, who are editors of the Annals, were not involved in the editorial evaluation of or decision to publish this article.
Funding support: The independent evaluation of the National Demonstration Project (NDP) practices was supported by the American Academy of Family Physicians (AAFP) and The Commonwealth Fund. The Commonwealth Fund is a national, private foundation based in New York City that supports independent research on health care issues and makes grants to improve health care practice and policy.
Publication of the journal supplement is supported by the American Academy of Family Physicians Foundation, the Society of Teachers of Family Medicine Foundation, the American Board of Family Medicine Foundation, and The Commonwealth Fund.
Dr Stange’s time was supported in part by a Clinical Research Professorship from the American Cancer Society.
Disclaimer: The views presented here are those of the authors and not necessarily those of The Commonwealth Fund, its directors, officers, or staff.
REFERENCES
- 1. Martin JC, Avant RF, Bowman MA, et al; Future of Family Medicine Project Leadership Committee. The future of family medicine: a collaborative project of the family medicine community. Ann Fam Med. 2004;2(Suppl 1):S3–S32.
- 2. Spann SJ; Task Force 6 and the Executive Editorial Team. Task Force Report 6. Report on financing the new model of family medicine. Ann Fam Med. 2004;2(Suppl 3):S1–S21.
- 3. American Academy of Family Physicians. Joint principles of the Patient-Centered Medical Home. Del Med J. 2008;80(1):21–22.
- 4. Davis K, Schoenbaum S, Audet A. A 2020 vision of patient-centered primary care. J Gen Intern Med. 2005;20(10):953–957.
- 5. Fisher ES. Building a medical neighborhood for the medical home. N Engl J Med. 2008;359(12):1202–1205.
- 6. Iglehart JK. No place like home—testing a new model of care delivery. N Engl J Med. 2008;359(12):1200–1202.
- 7. Lewis M. Medical home model improves efficiency, docs say. Med Econ. August 15, 2008.
- 8. Berenson RA, Hammons T, Gans DN, et al. A house is not a home: keeping patients at the center of practice redesign. Health Aff (Millwood). 2008;27(5):1219–1230.
- 9. Patient-Centered Primary Care Collaborative. Proof in Practice: A Compilation of Patient Centered Medical Home Pilot and Demonstration Projects. 2009. http://www.pcpcc.net/pilot-guide. Accessed Jan 18, 2010.
- 10. Stewart EE, Nutting PA, Crabtree BF, Stange KC, Miller WL, Jaén CR. Implementing the patient-centered medical home: observation and description of the National Demonstration Project. Ann Fam Med. 2010;8(Suppl 1):s21–s32.
- 11. Jaén CR, Ferrer RL, Miller WL, et al. Patient outcomes at 26 months in the patient-centered medical home National Demonstration Project. Ann Fam Med. 2010;8(Suppl 1):s57–s67.
- 12. Nutting PA, Crabtree BF, Miller WL, Stewart EE, Stange KC, Jaén CR. Journey to the patient-centered medical home: a qualitative analysis of the experiences of practices in the National Demonstration Project. Ann Fam Med. 2010;8(Suppl 1):s45–s56.
- 13. Miller WL, Crabtree BF, Nutting PA, Stange KC, Jaén CR. Primary care practice development: a relationship-centered approach. Ann Fam Med. 2010;8(Suppl 1):s68–s79.
- 14. Nutting PA, Miller WL, Crabtree BF, Jaén CR, Stewart EE, Stange KC. Initial lessons from the first National Demonstration Project on practice transformation to a patient-centered medical home. Ann Fam Med. 2009;7(3):254–260.
- 15. TransforMED. National Demonstration Project. http://www.transformed.com/ndp.cfm. Accessed Feb 5, 2010.
- 16. American Academy of Family Physicians (AAFP), American Academy of Pediatrics (AAP), American College of Physicians (ACP), American Osteopathic Association (AOA). Joint Principles of the Patient-Centered Medical Home. February 2007. http://www.aafp.org/pcmh/principles.pdf. Accessed Feb 5, 2010.
- 17. Jaén CR, Crabtree BF, Palmer R, et al. Methods for evaluating practice change toward a patient-centered medical home. Ann Fam Med. 2010;8(Suppl 1):s9–s20.
- 18. Flocke SA. Measuring attributes of primary care: development of a new instrument. J Fam Pract. 1997;45(1):64–74.
- 19. Safran DG, Karp M, Coltin K, Chang H, Li A, Ogren J. Measuring patients’ experience with individual primary care physicians: results of a statewide demonstration project. J Gen Intern Med. 2006;21(1):13–21.
- 20. Institute of Medicine, Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
- 21. Rosenthal TC. The medical home: growing evidence to support a new approach to primary care. J Am Board Fam Med. 2008;21(5):427–440.
- 22. Starfield B. Primary Care: Concept, Evaluation, and Policy. New York, NY: Oxford University Press; 1992.
- 23. Starfield B, Shi L. The medical home, access to care, and insurance: a review of evidence. Pediatrics. 2004;113(5 Suppl):1493–1498.
- 24. Cohen D, McDaniel RR Jr, Crabtree BF, et al. A practice change model for quality improvement in primary care practice. J Healthc Manag. 2004;49(3):155–168; discussion 169–170.
- 25. Edmondson AC. Learning from failure in health care: frequent opportunities, pervasive barriers. Qual Saf Health Care. 2004;13(Suppl 2):ii3–ii9.
- 26. Shortell SM. Increasing value: a research agenda for addressing the managerial and organizational challenges facing health care delivery in the United States. Med Care Res Rev. 2004;61(3 Suppl):12S–30S.
- 27. Crabtree BF, Miller WL, McDaniel RR Jr, Stange KC, Nutting PA, Jaén CR. A survivor’s guide for primary care physicians. J Fam Pract. 2009;58(8):E1–E7.
- 28. Lanham HJ, McDaniel RR Jr, Crabtree BF, et al. How improving practice relationships among clinicians and nonclinicians can improve quality in primary care. Jt Comm J Qual Saf. 2009;35(9):457–466.
- 29. Stevens JP. Applied Multivariate Statistics for the Social Sciences. Mahwah, NJ: Lawrence Erlbaum Associates, Inc; 1996.
- 30. Bates DW. Physicians and ambulatory electronic health records. Health Aff (Millwood). 2005;24(5):1180–1189.
- 31. DesRoches CM, Campbell EG, Rao SR, et al. Electronic health records in ambulatory care—a national survey of physicians. N Engl J Med. 2008;359(1):50–60.
- 32. Hing ES, Burt CW, Woodwell DA. Electronic medical record use by office-based physicians and their practices: United States, 2006. Adv Data. 2007;(393):1–7.
- 33. Stewart EE, Johnson BC. Huddles: improve office efficiency in mere minutes. Fam Pract Manag. 2007;14(6):27–29.