Health Services Research. 2013 Nov 1;48(6 Pt 2):2181–2207. doi: 10.1111/1475-6773.12114

Connecting the Dots and Merging Meaning: Using Mixed Methods to Study Primary Care Delivery Transformation

Debra L Scammon 1,2,*, Andrada Tomoaia-Cotisel 2,3,*, Rachel L Day 2, Julie Day 4, Jaewhan Kim 2, Norman J Waitzman 5, Timothy W Farrell 2,6,7, Michael K Magill 2,8
PMCID: PMC4097840  PMID: 24279836

Abstract

Objective. To demonstrate the value of mixed methods in the study of practice transformation and illustrate procedures for connecting methods and for merging findings to enhance the meaning derived.

Data Source/Study Setting. An integrated network of university-owned, primary care practices at the University of Utah (Community Clinics or CCs). CC has adopted Care by Design, its version of the Patient Centered Medical Home.

Study Design. Convergent case study mixed methods design.

Data Collection/Extraction Methods. Analysis of archival documents, internal operational reports, in-clinic observations, chart audits, surveys, semistructured interviews, focus groups, Centers for Medicare and Medicaid Services database, and the Utah All Payer Claims Database.

Principal Findings. Each data source enriched our understanding of the change process and of the reasons that certain changes were more difficult than others, both in general and for particular clinics. Mixed methods enabled the generation and testing of hypotheses about change and led to a comprehensive understanding of practice change.

Conclusions. Mixed methods are useful in studying practice transformation. Challenges exist but can be overcome with careful planning and persistence.

Keywords: Practice transformation, patient centered medical home, mixed methods, primary care


Primary care redesign is described as change within a complex adaptive system (Nutting et al. 2009). Studying transformation within this dynamic system requires a mixed methods design because neither qualitative nor quantitative approaches alone are sufficient to understand complex phenomena (Creswell et al. 2011; Jaen 2010). We conducted a retrospective mixed methods study of practice redesign in an integrated network of university-owned, primary care practices (Community Clinics or CCs). In this context, we are interested in documenting practice change, exploring the process of and experience with transformation, and investigating the outcomes related to changes in practice.

Our research team consists of two members of CCs leadership (Clinical Quality Director and Executive Medical Director) as well as University of Utah researchers in various disciplines (family medicine, marketing, health care administration, economics, and public health). Over several years, the team has provided both research and strategic input to CCs leadership in an effort to improve clinic operations. During weekly meetings we discuss research methods, emerging findings, and implications.

In this article, we describe the specific methods and data sources used, and demonstrate how we connected our methods and merged qualitative and quantitative data to generate novel learning. We conclude with comments about the ways in which our mixed methods study can help health care professionals and researchers understand practice transformation, as well as its implications for management in fostering further changes to the care model.

Research Context

The CC network includes 10 practices that vary in size, composition of providers, services offered, and community demographics. Each practice is part of the CC network, which in turn is part of University Health Care. CC providers are employed and practice at one of the 10 clinics located in and around Salt Lake City, UT.

The CCs began implementing Care by Design (CBD), our version of the Patient Centered Medical Home model, in 2003. CBD is built on three key principles: appropriate access, care teams, and planned care. The CBD model is described elsewhere (Magill et al. 2006, 2009; Bodenheimer 2007; Blash, Dower, and Chapman 2011; Egger et al. 2012) and the context within which it has been implemented is detailed in the appendix to Day et al. (2013). Although it is a comprehensive model, the operational constraints of ongoing practices required incremental change. Components of CBD were introduced sequentially and were modified as the transformation proceeded. The CC's motivation for implementation was to simultaneously improve patient care, provider satisfaction, and financial performance.

A framework consisting of three levels and two cross-cutting factors depicts contextual factors important to understanding health care interventions (Tomoaia-Cotisel et al. 2013). We summarize the context for our research within this framework in Table 1.

Table 1.

2011 Contextual Factors of the University of Utah Community Clinics

Contextual Level 2: Larger Organization. Ownership: network of community-based clinics owned by University Health Care, an academic medical center. Degree of integration: University Health Care includes four hospitals, specialty and primary care clinics, and University of Utah Health Plans. Contractual arrangements: primarily fee-for-service; NCI grant funding (EMR reminders, training, and workflow redesign).

Level 1: Practice Setting (first six columns); Level 3: External Environment (final column)

| Clinic | Year Opened | No. of Primary Care/Total No. of Providers* | Primary Care Patients’ Race and Insurance Status CY 2011 | Visits FY 2011 | Additional Characteristics | Level of Urbanization; Transport Available |
| 1 | 1985 | 12/22 | 55% Caucasian, 26% Other; 39% Commercial, 29% Medicaid, 22% Medicare | 48,244 | Multispecialty, multilingual, evening/weekend urgent care | Metropolitan area; bus |
| 2 | 1999 | 5/7 | 85% Caucasian, 8% Other; 65% Commercial, 14% Medicaid, 13% Medicare | 20,155 | Family practice with pediatrics | Rural area; bus |
| 3 | 2001 | 5/6 | 89% Caucasian, 4% Other; 58% Commercial, 23% Self-pay, 16% Medicare | 14,449 | Multispecialty and primary care | Affluent rural area; bus |
| 4 | 1988 | 4/8 | 71% Caucasian, 16% Other; 41% Commercial, 42% Medicaid, 11% Medicare | 27,247 | Family practice and pediatrics | Suburban; bus |
| 5 | 1976 | 8/16 | 72% Caucasian, 17% Other; 41% Commercial, 36% Medicaid, 18% Medicare | 41,128 | Multispecialty, primary care, OB/GYN and pediatrics | Suburban; bus, metro |
| 6 | 1989 | 5/6** | 80% Caucasian, 12% Other; 41% Commercial, 22% Medicaid, 24% Medicare | 13,133 | Exclusively primary care | Bedroom community; bus |
| 7 | 2003 | 6/10 | 81% Caucasian, 11% Other; 66% Commercial, 15% Medicaid, 17% Medicare | 11,574 | Family oriented community practice | Suburban; bus, metro |
| 8 | 1996 | 14/14 | 73% Caucasian, 14% Other; 51% Commercial, 25% Medicaid, 20% Medicare | 17,502 | High volume mental illness, residency training site | Urban; bus |
| 9 | 1989 | 5/5 | 73% Caucasian, 14% Other; 66% Commercial, 9% Medicaid, 16% Medicare | 16,763 | Residency training site | Urban; bus, metro |
| 10 | 2007 | 4/4 | 85% Caucasian, 6% Other; 58% Commercial, 14% Medicaid, 23% Medicare | 9,288 | Primary care | Suburban; bus |
Implementation pathway: stepwise implementation process (appropriate access, care teams, then planned care), with existing EMR
Motivation for implementation: senior management's desire to improve care delivery; demonstrate business case for new delivery model
* In the present study, primary care providers include Family Medicine, Internal Medicine, and Internal Medicine/Pediatrics providers because measures of chronic and preventive care services used to assess clinical quality applied only to adults.

** One provider is a pediatrician.

We report Caucasian and Other. Remaining categories are American Indian and Alaska Native, Asian, Black or African American, Native Hawaiian and Other Pacific Islander, Patient Refused, and Unknown.

Several contextual factors made this mixed methods study possible. First, the CCs performed real-time assessments throughout the 10-year period that is the focus of this project. These assessments provided longitudinal data with which we could examine the transformation. Quarterly, biannual, and annual reports about the transformation and its outcomes incorporated a variety of data, including observations, data queries, surveys, and financial and performance metrics. Second, the infrastructure required to capture these data was already in place. Our research team was granted access to this full set of data.

Consideration of the data's strengths and weaknesses guided decisions about what additional information was required to study the transformation. Prospective data from a broad set of stakeholders were desired and grant funding made this possible. However, data collection had to fit into the workflow of the operational clinics. It was also critical that new data be relevant to CCs’ leadership and managers, while capturing their experience of the transformation along with that of the providers and staff.

Mixed Methods Design

Our study aims included a multifaceted, retrospective investigation of practice transformation. They required multiple types of data, including data from different levels within the CCs (senior leadership, clinic leaders, clinic providers, and staff) and the perspectives of multiple stakeholders (providers and patients) (Jaen et al. 2010). Our analysis involved the integration of quantitative and qualitative data collected specifically for this project, as well as data drawn from existing sources. While each data source was included for specific purposes, the process of merging pieces of data to triangulate and contextualize meaning evolved organically (Creswell et al. 2011).

Figure 1 provides a schematic of our study aims and presents the specific sources of data we assembled to address each aim. A description of each data source, the types of data collected, and the timing of data collection are presented in Table 2.

Figure 1.

Figure 1

Research Aims and Data Sources

Table 2.

Overview of Research Methods and Data Collection

Aim 1: Document transformation and process of change

 Archival search
  Description: We used historical documents to gain perspective on the evolution of the care model over time.
  Type of data: Community Clinics Council (CCC) meeting minutes (senior leadership team, clinic medical directors, clinic managers from each CC); agendas and meeting planning committee minutes from Staff Development Institutes (SDI), day-long education and strategic planning sessions attended by all CC providers and staff.
  Information gathered: When/how Appropriate Access (AA), Care Team (CT), and Planned Care (PC) were rolled out.
  Administration: Monthly CCC meeting minutes, 2003–2011; SDI focus: 2003 (AA), 2004 (CT), 2006 (PC).

 Care by Design (CBD) implementation assessment
  Description: We administered an internally developed implementation assessment tool multiple times, based on detailed operational descriptors for AA, CT, and PC, and scales to measure performance. Assessments incorporate multiple types of data, including observations, chart audits, and EMR data.
  Type of data: Assessments of implementation of the principles of CBD (AA, CT, and PC) and the individual elements of each: direct observations of patient care and staff interactions, medical record audits of five patient charts for each provider in all clinics, and reports from our data warehouse.
  Information gathered: 28 measures: AA, 6; CT, 14; PC, 8.
  Administration: Fall 2008; spring 2009; summer 2011; spring 2012.

 Clinic characterization audit
  Description: We gathered background information about each clinic from clinic managers. Data were compared to objective data when such data were available (e.g., human resource department records of staff and FTE status).
  Type of data: Web-based survey completed by clinic managers.
  Information gathered: Size of clinic, patient volume, care team composition, presence of specialists, services offered, ancillaries on site, area competition.
  Administration: Summer 2011.

 In-clinic observations
  Description: We conducted observations at each clinic to gather information about the “feeling” in the clinic. Data collection included touring the clinic and observing patient flow and care team interactions for 1–3 days.
  Type of data: On-site observations systematically noted and observer journaling.
  Information gathered: Facilities design, work flow, patient flow, clinic processes.
  Administration: Summer 2011.

Aim 2: Experience with transformation

 Leadership interviews
  Description: Personal interviews with clinic leadership were used to gain perspective on their personal experience with leading and managing the evolution of CBD. Their perspectives were essential to understanding the strategic goals for implementation of the new model of care and the management practices that were employed throughout the transformation.
  Type of data: Personal semistructured interviews with the senior leadership team, clinic medical directors, and clinic managers; N = 40.
  Information gathered: Personal experience with leading the care team rollout; experience managing the evolution of CBD, what and why; success metrics; incentives. On-the-ground reality faced by clinic-level leaders implementing senior leadership's vision.
  Administration: Summer–fall 2011.

 Employee interviews
  Description: Personal interviews with providers and staff were used to gain insights about their experiences with the transformation process. Our goal was to obtain perspectives from different members of the care teams and staff with different roles within the clinics.
  Type of data: Individual semistructured interviews with providers and staff. Both the provider and medical assistants (MAs) from the same care teams were interviewed; for clinics with registered nurses on staff, nurses were also interviewed. n = 46.
  Information gathered: Personal experience with implementing the care team; experience with local adaptations and innovations; culture and values. Personal characteristics: position and years with CC.
  Administration: Summer 2011.

 Patient focus groups
  Description: Patients, key stakeholders in this project, were interviewed in focus groups to assess their perceptions of the changes implemented and how those changes impacted them. We invited patients who had experienced the changes made in their clinics.
  Type of data: Patients with chronic conditions and multiple visits to CC over multiple years. n = 63.
  Information gathered: Personal experience with change; relationships with provider and care team; communication; continuity.
  Administration: Summer–fall 2011.

Aim 3: Outcomes of change

 Clinical data
  Description: The CCs collect performance data on a set of clinical quality metrics for chronic and preventive care, including both process (frequency of testing of HbA1c) and outcome (HbA1c in control) measures.
  Type of data: Electronic medical records: process and outcome measures of chronic and preventive care.
  Information gathered: Quality data elements based on measures included in the Medicare Care Management performance demonstration project; percentages of eligible patients who received recommended screenings.
  Administration: Annually.

 Operations data
  Description: A variety of data on provider productivity and financial performance are generated by CC operations staff. These data enabled us to examine the financial impacts of transformation for the CC organization.
  Type of data: Operations reports.
  Information gathered: Provider productivity, financial performance, characteristics of patients seen.
  Administration: Quarterly.

 Provider/staff satisfaction survey
  Description: The CCs conduct provider and staff satisfaction surveys annually.
  Type of data: Standardized satisfaction surveys conducted as part of operations.
  Information gathered: AMGA provider satisfaction survey; Moorhead staff satisfaction survey.
  Administration: Annually.

 Patient satisfaction survey
  Description: The CCs regularly conduct patient satisfaction surveys. Over the time period under study, the CCs changed both the survey used and the method of administration.
  Type of data: Standardized patient satisfaction surveys conducted as part of operations.
  Information gathered: AMGA patient satisfaction survey; Press Ganey patient satisfaction survey; Press Ganey patient satisfaction e-survey.
  Administration: Semiannually through 7/2011; per visit 8/2011–2012.

 Employee survey
  Description: To get a sense of the working environment in each clinic, we designed an employee survey using standard measures of employee beliefs and attitudes. Surveys were distributed during staff meetings at each clinic to all primary care providers and staff involved in direct patient care.
  Type of data: Standardized survey including the Team Development Instrument (TDI), Maslach Burnout Inventory (MBI), Organizational Culture Assessment Instrument (OCAI), and Clinical Support for Patient Activation Measure (CS-PAM). n = 144 in 2011; n = 127 in 2012.
  Information gathered: TDI, 31 items; MBI, 16 items; OCAI, 24 items; CS-PAM, 13 items. Personal characteristics: position and years with CC.
  Administration: Annually, 2011 and 2012.

 Cost and utilization of care
  Description: To assess cost and utilization of care, we acquired data from CMS and Utah's All Payer Claims Database (APCD). APCD is a new compilation of health care claims records in Utah, publicly administered through the Utah Department of Health, which incorporates a comprehensive profile of health care utilization, regardless of source of payment.
  Type of data: CMS; APCD; CC operations data.
  Information gathered: CMS: patients 65+; APCD: patients <65 covered by Medicaid or commercial insurance. Enrollment, inpatient, outpatient, and pharmaceutical costs.
  Administration: 2007–2009.

Aim 1: Documenting Transformation and Exploring Process of Change

Our first aim was to document practice change. We used existing data from an internally developed tool that assessed CBD's level of implementation. In addition, archived documents provided a sense of the sequence and management of change. Because context varied among our clinics, we collected new quantitative and qualitative data to provide contextualization.

Aim 2: Experience with Transformation

A second aim was to explore stakeholders’ experience with change. New data were collected from individuals involved in planning and implementing the transformation, employees adjusting to the changes, and patients receiving care within the new model. To this end, we used semistructured interviews and focus group discussions.

Aim 3: Assessing the Transformation's Outcomes

A third aim was to examine several of the transformation's outcomes. We used existing internal data and identified additional data sources with the potential to illuminate significant effects of the model. Operational data allowed us to assess the transformation's impact on quality measures; patient, provider, and staff satisfaction; and clinic operations. We obtained data from the Centers for Medicare and Medicaid Services (CMS) and from Utah's All Payer Claims Database (APCD) to assess information about cost and utilization of care. To determine the transformation's impacts on employees, we designed a survey incorporating standardized measures of aspects of work-life impacted by change.

This study was approved by the University of Utah Institutional Review Board.

Connecting the Dots and Merging Meaning

Two key steps underpinned the success of the study: connecting the elements of our mixed methods design throughout data collection, and merging findings from our mixed methods during analysis and interpretation (Crabtree et al. 2010; Jaen et al. 2010; Creswell et al. 2011, pp. 5–6). In the following sections, we describe both successes and challenges in making connections and merging meanings.

Connecting the Dots

Connecting multiple sources of data revealed a more complete understanding of the transformation process. Important connections were those across time, across contextual levels, and across research team members and methods. These connections are depicted in Figure 2a–c and discussed below.

Figure 2.

Figure 2

(a) Connecting the Dots—Connections across Time; (b) Connecting the Dots—Connections across Contextual Levels; (c) Connecting the Dots—Connections across Research Team Members and Methods

Connections across Time

Documenting practice change was the first step of our analysis. We combined qualitative data with documentary evidence. In the first year of the project, a document review produced archival data used to construct a timeline of important actions and milestones for CBD implementation. However, formal documents did not tell the whole story as senior leadership made many decisions “off-line.” Recognizing gaps in the archival data, we turned to personnel interviews, an example of integrating through building (see Fetters et al. 2013).

The timeline reflected when each CBD component was introduced, allowing us to identify appropriate time periods for trend analyses. Trend analyses used internal longitudinal data about the level of implementation of CBD components and individual elements within each component, as well as outcomes.

Connecting historical “real time” data with employees’ subjective recollections was challenging. Informants tended to use today's lens to reflect on the past, a bias called “presentism” (Fischer 1970). In addition, employees’ experiences of changes were of different “realities.” For example, some leaders had been with CCs since before the transformation began. Others had joined the organization at various points during the transformation. Furthermore, some personnel had been part of the change process at more than one clinic and had experienced differences in implementation. In both the interviews and the analysis, presentism and experiences were considered in light of these contextual factors.

The acquisition of new data was carefully sequenced, allowing insights from one source to inform components of the larger study. For example, the contextualization of local clinic environments allowed us to explore specific environmental factors during employee interviews. Research team members reviewed information acquired through the Clinic Characterization Audit (CCA) before conducting employee interviews and tailored the questions to each individual site, another example of integrating through building. Conducting in-clinic observations and employee interviews simultaneously allowed us to ask about observed activities and processes.

Issues and Solutions

In retrospective analyses, there is a potential for recall bias. To overcome this bias, we used two types of historical anchors (Martyn and Belli 2002; Happ et al. 2004): (1) the informant's role and clinic location at that time, and (2) the transformation timeline. Informants who had worked at more than one CC clinic were asked to recall in which clinic they were working and the role they played during each phase of the transformation. They were also provided temporal cues during the interview (e.g., the year in which components of CBD were rolled out).

This information not only helped anchor the informant but also helped us interpret their comments. There are also other methods that could be used to calibrate these differences. For example, purposive recruitment of employees with experience at specific points during the redesign process, with specific clinics, and/or in specific roles could be used. This would be an example of integrating through connecting (see Fetters et al. 2013).

Despite careful planning, unanticipated disruptions emerged during in-clinic observations. During one scheduled site visit, a group of providers called an impromptu team huddle. This disrupted the observation of “normal” clinic activity. Because the clinic was one of the larger sites among the CCs, the researcher adjusted her schedule to observe another group of providers. She returned to the first group later that day during time originally scheduled for reflection. Formal reflection was postponed until after the clinic closed, thereby allowing the researcher to adapt to the changed circumstances. In conducting observations, it was important to incorporate flexibility.

Connections across Contextual Levels

Connections across organizational boundaries were critical to the success of this project. Using the contextual framework noted earlier (see Table 1) (Tomoaia-Cotisel et al. 2013), we describe connections between the individual practices (Level 1), between the practices and the clinic network (Level 2), and between the network and the larger external environment (Level 3).

Connecting the Practices to Each Other and to the CC Network

Researching the scalability of CBD has been an important part of our project. We began by connecting clinic-level data for CBD implementation analyses across the network. Our clinics vary with regard to characteristics, such as patient mix and provider mix (Table 1). For example, family medicine faculty and residents staff two practices, while nonfaculty clinicians and only a few residents staff the other eight. Team structure and dynamics are more fluid in the faculty/resident practices because these clinicians see patients on a more part-time basis. In addition, new residents arrive annually and clinical skills change rapidly over the course of their 3-year training. These team-related factors affected implementation. Recognizing the differences, exploring their impact in qualitative analyses, and controlling for them in quantitative analyses provided a better understanding of the factors influencing CBD's implementation and outcomes as well as factors that affect external validity.

Connecting the CC Network to the External Environment

One motivation for implementation of CBD was to lower cost of care, both within CC and for care irrespective of where and how it was received. To examine the total cost of health care services associated with levels of CBD implementation, it was necessary to connect data across organizational boundaries and cultivate relationships between research team members, CC, and University Health Care analysts. The CC's implementation data were linked to Enterprise Data Warehouse files available from University Health Care. We also established relationships between the research team and data managers from the Research Data Assistance Center, Buccaneer (CMS data distributors), and the Utah Department of Health (for the APCD data). Specifically, to have access to the APCD database, our team assisted in building the APCD infrastructure by contributing to the coverage of APCD staffing costs.

Issues and Solutions

Many states are now in the process of creating an APCD-like database to inform research—which is in itself a huge undertaking. Even once created, the availability and quality of data may be outside the control of researchers. Data quality may be a particular issue as such databases are built by linking to multiple, previously independent sources. Teams that wish to take advantage of new external data sources should have contingency plans in place in case problems are encountered. Incorporating flexibility into timelines for access to and use of these data is critical to overcoming unanticipated challenges. Mixed methods designs allow for flexibility when such contingencies arise, since other aspects of the research can proceed when delays arise in accessing specific data sources such as the APCD.

Connections across Research Team Members and Methods

While collecting new data, connections across individual researchers and methods increased the team's efficiency and enhanced insights. For example, the researchers who designed the employee interview and the employee survey worked closely together determining what data were to be acquired through each method. This strengthened our ability to triangulate among disparate data sources, reduced unnecessary redundancy, facilitated interpretation of quantitative data, and optimized potential insights. Also, as employee interviews and patient focus groups progressed over a 2-month period, the researchers leading these two efforts held reflection sessions once or twice a week, in which they not only discussed emerging themes, thus facilitating ongoing analysis, but also identified issues for follow-up and issues that could be explored further, thus facilitating the refinement of ongoing data collection. This is an example of integrating through embedding (see Fetters et al. 2013).

Furthermore, each researcher was in charge of several methods, thus connecting them and facilitating their on-the-spot integration. For example, the same team member who conducted the semistructured interviews developed the CCA and led the in-clinic observations. This enhanced continuity and facilitated integration of data from the CCA and on-site observations with the employee interviews.

Issues and Solutions

However, concentrating data collection in one researcher may introduce bias. To mitigate this bias, we included three safeguards (Tashakkori and Teddlie 1998): (1) one of three research assistants accompanied the researcher on each site visit and participated in reflection throughout and after the visit; (2) periodically, this researcher and the three RAs met as a group; and (3) regular meetings with all research team members and periodic researcher subgroup meetings were also held. These activities allowed for reflection, formal recognition of emerging hypotheses, the development of consensus about what was being observed, and identification of things to follow up on when returning to the clinic.

Merging Data to Make Meaning

Merging data from our multiple sources in our convergent design was the next step.

We reported analyses exploring the impact of practice redesign at the network level elsewhere (Day et al. 2013). Using correlation analysis, Day and colleagues found associations between the extent of implementation of CBD and several outcome measures. These analyses relied upon multiple components of our data: the CBD implementation assessments, quality measures, patient and provider satisfaction surveys, and financial and administrative data. Findings revealed some unanticipated relationships, including some potential trade-offs between different types of outcomes. Importantly, they revealed some relationships that might not otherwise have been discovered. For example, continuity with the primary care provider was correlated with quality, patient and provider satisfaction, and financial performance—relationships we would have missed had we examined correlations of implementation within a single domain of outcomes.

In this section, we further illustrate the potential for triangulation across multiple components of our project by exploring the Planned Care component of CBD at 1 of our 10 clinics. Planned care promotes a comprehensive perspective on the patient visit. Two key elements of planned care are (1) reports of newly obtained laboratory results available for use during the visit and (2) the provision of an after visit summary (AVS), which reviews what was said and done during the visit and includes the provider's follow-up instructions. Drawing from a subset of data sources, we illustrate the iterative process by which we integrated findings from our quantitative and qualitative data. By moving from one data source to another, we developed emergent hypotheses and subsequently tested them. Table 3 presents the quantitative data consulted as we tested and revised our hypotheses.

Table 3.

Care Team Quantitative Data

| Data Source* | Data Element | Team 1, Led by Provider A | Team 5, Led by Provider B |
| CBD implementation assessment† | Labs done prior to visit | 0% | 80% |
|  | AVS given to patient | 36% | 75% |
| Productivity‡ | Work RVUs | Provider A has a 1.94 times higher RVU count than Provider B |
|  | Appointment count | Provider A has a 1.47 times higher appointment count than Provider B |
| Quality scores§ | Coronary artery disease | 77% | 91% |
|  | Diabetes | 65% | 78% |
|  | Heart failure | 40% | 38% |
|  | Preventive care | 49% | 80% |
|  | Total | 54% | 80% |
| Employee surveys** | Teamness†† | 71 (70, 71) | 56 (45, 65) |
|  | Professional efficacy‡‡ | 26 (25, 27) | 27 (7, 35) |
|  | Exhaustion‡‡ | 9 (6, 12) | 20 (13, 29) |
|  | Cynicism‡‡ | 11 (10, 12) | 15 (1, 28) |

* More detailed information on the methods, instruments, and administration of each data source can be found in Table 2.

† July 2011 Care by Design implementation assessment.

‡ April to June 2011 average productivity data; relative value units (RVUs) computed according to industry standards.

§ Thirteen months ending June 2011 average quality scores; percentage of eligible patients receiving recommended screenings.

** All surveys were administered during regular employee meetings in spring 2011; values are team averages (range).

†† Based upon the TDI (0–100); higher scores indicate stronger team identity.

‡‡ Based on the MBI: professional efficacy (0–36), exhaustion (0–30), and cynicism (0–30); a higher score indicates a higher level of that component.

A review of CBD implementation data across the clinics (data not shown) revealed that one clinic frequently referenced as an exemplar of the model actually demonstrated only average implementation scores. In our data, the clinics’ CBD implementation scores are the mean implementation scores across all of the care teams in a particular clinic. We speculated that this aggregation could be disguising some differences in implementation among teams in this clinic.
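The masking effect of aggregation can be shown with a minimal sketch. The scores below are hypothetical, chosen only to illustrate the point, not drawn from the study's data: a clinic whose care teams implement very unevenly can still report an unremarkable clinic-level mean.

```python
# Hypothetical team-level CBD implementation scores (0-100) for one clinic.
# Values are illustrative only; the study's actual scores are not shown here.
team_scores = {"Team 1": 18, "Team 2": 55, "Team 3": 60, "Team 4": 62, "Team 5": 78}

# The clinic-level score reported in aggregate data is the mean across teams.
clinic_mean = sum(team_scores.values()) / len(team_scores)

# The team-level spread reveals the variation that the mean conceals.
spread = max(team_scores.values()) - min(team_scores.values())

print(f"Clinic mean: {clinic_mean:.1f}")  # a middling score of 54.6
print(f"Team spread: {spread}")           # yet teams differ by 60 points
```

Examining the distribution behind an aggregate, as this sketch does, is the quantitative analogue of the drill-down from clinic-level to team-level implementation scores described above.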

To test the hypothesis that performance varied among teams, we examined this clinic's team CBD implementation scores. Focusing on planned care, we looked specifically at labs done prior to visit and whether patients were given AVSs and found that Team 1 was the lowest implementing team and Team 5 was the highest implementing team. (See “CBD Implementation,” Table 3).

This observation prompted us to ask why two teams in the same environment were practicing so differently. We hypothesized that differences might be related to providers’ commitment to the vision for CBD. To test this hypothesis, we consulted provider and staff interview data, paying particular attention to providers’ overall approach to implementation and to discussions involving planned care.

Provider A (Team 1) describes himself as one of the busiest providers and says he is interested in new ways of doing things that increase efficiency and that work for him.

I've always had an open mind to everything that's been presented in terms of “will it help me provide better care to my patients, more efficient care to my patients, meet their needs…” but I'm also one of the busiest providers in the University system and the busiest provider here in terms of volumes… [so] it's more a matter of you know, what works for me on the day-to-day basis.—Provider A (Team 1)

In contrast, throughout his interview, Provider B (Team 5) describes the tension that he sees between visit productivity and the time needed to perform preventive and health maintenance services:

Again, if you are at a 20 minute visit and somebody comes in for bronchitis and (if) your patient hasn't had a mammogram; hasn't had a colonoscopy; hasn't had a flu shot; hasn't had, you know, we can't do those things if you're trying to crank things out. … they [senior leadership] like to emphasize quality care and all the preventative medicine stuff, but the practicalities are there's not a lot of time to do that stuff, … there needs to be more time allocated to that. I mean sort of like the push for the clinics is to see lots and lots of people, and generate lots of revenue, but you have to slow down to provide all of those quality issues that you need. So, there's no reimbursement for quality.—Provider B (Team 5)

Our qualitative data suggested that the time needed to provide comprehensive care influenced how providers implemented CBD. With this in mind, we proposed a new hypothesis: that commitment to CBD implementation was higher among providers who emphasize quality over productivity. To test this hypothesis, we compared the same two providers’ productivity and quality scores (see “Productivity” and “Quality,” Table 3). Provider A (Team 1) is almost twice as productive as Provider B (Team 5) in terms of work relative value units (RVUs) and appointment count, whereas Provider B has higher quality scores for chronic conditions. This comparison was consistent with our hypothesis.

Productivity is important to the financial solvency of the clinic. Quality is important to patient health. Given the tension observed, we hypothesized that attempting to increase either (or both) puts a burden on the care teams. We consulted employee surveys (see “Employee Surveys,” Table 3) to explore evidence of this burden in team functioning and team morale; in particular, responses to the Team Development Measure (PeachHealth 2012) and the dimensions of the Maslach Burnout Inventory (Maslach, Leiter, and Schaufeli 2009). Provider A's team (Team 1) reported a more unified team identity than did Provider B's team (Team 5). Team 1 rated their level of burnout as generally lower than did Team 5. To better understand this tension, we went back to the qualitative interviews. A medical assistant (MA) from Team 1 explained:

[Use of AVSs] with other doctors, yes. [On Team 1], no, just because [we have] so many regulars that we see so frequently that they are like “I don't want any more paperwork” for the AVSs. We do give them but the patients leave them in the rooms… They just like to see [Provider A] and get their medications.—MA (Team 1)

Interview data revealed that Provider A's MAs work specifically for him and focus on “rooming” patients.

An MA from Team 5 shared the following thoughts about the previsit planning portion of planned care:

What we do, our team, with our doctors, anytime we have a physical and annual exam…we know what the doctor's going to do, so… we drop in their history… we drop in the health maintenance… [we drop in] all the history from any labs done; any radiology done… and, then we put that in the chart so that it's ready for the visit. And then, if we have a patient coming in for a follow-up from some kind of out-sourced specialty exam, like with a cat scan or an MRI, we get those results and we have them sitting on the desk in the room for the visit. A lot of times the doc's already done that. That's primarily the only prep we do … it's usually done that morning when we get to work.—MA (Team 5)

Team 5 MAs appear to be more involved in previsit planning.

These descriptions strengthened our understanding of the tension between productivity and quality to which Provider B referred. They further suggest explanations for differences among teams in terms of team identity and burnout. A practice style in which MAs have clearly defined roles, are focused on less complex tasks, and have a team orientation that prioritizes visit efficiency may contribute to higher team scores and lower burnout. A provider who prioritizes CBD implementation may work with different MAs and expect them to take on extra responsibilities. Following this practice style in a system in which incentives are misaligned and employee resources are limited may contribute to lower team identity and higher burnout.

The iterative process used to explore our multiple data sources allowed us to generate working hypotheses about observed outcomes and to test them by consulting additional data sources. We were able to simultaneously generate and test hypotheses about specific changes, and gain a more comprehensive and nuanced understanding of practice transformation more generally. Specifically, analysis of our personnel interviews revealed that a provider's commitment to implementation of the new care model had implications for care team members. The interview data suggested that the way in which team members operate shapes their perceptions of their team and their work; as demonstrated through responses to our employee survey, team identity and burnout were related to team roles. The tension between productivity and quality, the ways in which these goals are approached by providers, and the impacts staffing models have on team members all deserve careful consideration as care delivery transformation is pursued.

Value of Mixed Methods Research to Fostering Practice Transformation

Implementation of mixed methods research is tedious and time-consuming. The complexity of the processes and logistics involved with identifying and collecting data from diverse sources can be daunting. The sheer volume of assembled data can be difficult to distill without losing some of the nuanced implications. These nuances can subtly shape and change the direction of the research and consequently what is learned. Thus, finding ways to manage the complexities of mixed methods research is invaluable.

At the beginning of our project there was a clear separation between operations and research objectives. Clinic leadership wanted to understand how the new care delivery model impacted quality of care and financial viability. Researchers wanted insights about the change process. The two seemingly disparate goals became interdependent. Currently, the new CC executive director has expressed a keen interest in our research and the best way to integrate findings into clinic operations. The organization is receptive to changes it previously resisted, and the data amassed through our mixed methods project provides an evidence base to support the CC's ongoing practice redesign.

Findings from the mixed methods study facilitated more effective engagement between the research team and CC leadership. Our qualitative findings give us narrative—stories and on-the-ground experiences—while our quantitative findings give us numbers—data that illuminate relationships between components of our CBD model and important outcomes. Together they provide insights that will help shape the CC's future planning and strategies.

Research findings have already influenced specific changes in both the CC's strategic direction and operations management. Quantitative data analyses revealed that CBD implementation was incomplete and varied across clinics. Clinic and care team level analyses revealed substantial implementation variation. Qualitative interviews helped us understand contextual factors contributing to this variability. CC leadership is applying these insights to redesign clinic functions. Our data suggest that team-based care and continuity of care are linked to enhanced performance across outcomes (e.g., clinical quality, satisfaction, financial performance; Day et al. 2013). The CC is now prioritizing continuity as a core principle of transformed care. Our project also compelled CC leadership to reevaluate its approach to implementation, focus on the most essential parts of the model, and plan for investments to ensure the successful transformation of care delivery. CC is pursuing National Committee for Quality Assurance recognition, and insights from our mixed methods analyses are helping CC structure requests for resources and demonstrate return on investment of the CBD model under value-based payment as health care reform is implemented.

Discussion

In this article, we describe the mixed methods design employed in our investigation of the transformation of CC to the Care by Design model. We illustrate the connections between our data sources, including connections across time, across contextual levels, and across research team members and methods. We describe the processes by which we merged findings from the various components of our project to enhance our understanding of the transformation process and its impacts. Through the use of mixed methods we have been able to gain a more nuanced perspective on implementation of our new care model on the ground. Different data sources helped us appreciate multiple perspectives and revealed different aspects of the change process and its outcomes.

The use of mixed methods designs ideally involves the inclusion of multidisciplinary research teams with members from inside and outside the organization, and it requires significant time for both data collection and analysis. Future work should explicitly focus on the cost-effectiveness of such research. Documentation of the costs of each component of the research and careful assessment of the insights gained through the use of mixed methods are essential to determining its worth.

Conclusion

Using a mixed methods design, we were able to gain new insights about not just the “what” of the change process but also the clinic-specific “why” behind the experiences with changes. Our multiple data sources suggest that the effectiveness of transformation is highly dependent upon the response of clinic leadership, providers, and staff to the changes put into action by senior leadership. Our mixed methods project resulted in a rich description of multimodal efforts to drive change and a sense of the magnitude of effort required to effect transformation of a complex system. Through integrative analysis of our data, new hypotheses emerged regarding the inherent interaction of components of a complex redesign effort. Exploring the tensions created through these interdependencies should be a focus of future research.

Acknowledgments

Joint Acknowledgment/Disclosure Statement: The authors acknowledge the technical assistance of Annie Sheets-Mervis, M.S.W., for data collection and analysis, supervision, administrative support, and technical writing; Ken Gondor and Lisa Simpson for data analysis; Kimberly Brunisholz, M.S.T., for data entry and analysis; Jennifer Tabler, B.S., for data analysis and manuscript preparation; and Lisa Gren, Ph.D., for manuscript preparation. Also, we are grateful for the participation of Community Clinics leadership, providers, and staff in personal interviews, and patients in focus group discussions. This project was supported by grant R18HS019136 from the Agency for Health Care Research and Quality as part of its program to study transformation of primary care, Michael K. Magill, M.D., Principal Investigator. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Health Care Research and Quality.

Disclosures: The authors report no conflicts of interest and no financial disclosures.

Supporting Information

Additional supporting information may be found in the online version of this article:

Appendix SA1: Author Matrix.

hesr0048-2181-SD1.pdf (102KB, pdf)
hesr0048-2181-SD2.txt (16.9KB, txt)

References

  1. Blash L, Dower C, Chapman S. 2011. “University of Utah Community Clinics—Medical Assistant Teams Enhance Patient-Centered, Physician-Efficient Care” [accessed on September 19, 2013]. Available at http://futurehealth.ucsf.edu/Content/11660/2011_04_University_of_Utah_Community_Clinics--Medical_Assistant_Teams_Enhance_Patient-Centered_Physician-Efficient%20Care.pdf.
  2. Bodenheimer T. 2007. “Building Teams in Primary Care: 15 Case Studies” [accessed on September 19, 2013]. Available at http://www.chcf.org/resources/download.aspx?id=%7bAE45ED83-61A2-4402-A9E0-0E362526A89F%7d.
  3. Crabtree BF, Nutting PA, Miller WL, Stange KC, Stewart EE, Jaén CR. “Summary of the National Demonstration Project and Recommendations for the Patient-Centered Medical Home.” Annals of Family Medicine. 2010;8(Suppl 1):S80–90. doi: 10.1370/afm.1107.
  4. Creswell JW, Klassen AC, Plano Clark VL, Clegg Smith K. 2011. “Best Practices for Mixed Methods Research in the Health Sciences.” Office of Behavioral and Social Sciences Research (OBSSR) [accessed on September 19, 2013]. Available at http://obssr.od.nih.gov/mixed_methods_research/pdf/Best_Practices_for_Mixed_Methods_Research.pdf.
  5. Day J, Scammon DL, Kim J, Sheets-Mervis A, Day R, Tomoaia-Cotisel A, Waitzman NJ, Magill MK. “Quality, Satisfaction, and Financial Efficiency Associated with Elements of Primary Care Practice Transformation: Preliminary Findings.” Annals of Family Medicine. 2013;11(Suppl 1):S50–9. doi: 10.1370/afm.1475.
  6. Egger MJ, Day J, Scammon DL, Li Y, Wilson A, Magill MK. “Correlation of the Care by Design Primary Care Practice Redesign Model and the Principles of the Patient-Centered Medical Home.” Journal of the American Board of Family Medicine. 2012;25(2):216–23. doi: 10.3122/jabfm.2012.02.110159.
  7. Fetters MD, Curry LA, Creswell JW. “Achieving Integration in Mixed Methods Designs - Principles and Practices.” Health Services Research. 2013;48(S2):2134–56. doi: 10.1111/1475-6773.12117.
  8. Fischer DH. Historians’ Fallacies: Toward a Logic of Historical Thought. New York: Harper Torchbook; 1970.
  9. Happ MB, Swigart V, Tate J, Crighton MH. “Event Analysis Techniques.” Annals of Advanced Nursing Science. 2004;27(3):239–48. doi: 10.1097/00012272-200407000-00008.
  10. Jaen CR, Crabtree BF, Palmer RF, Ferrer RL, Nutting PA, Miller WL, Stewart EE, Wood R, Davila M, Stange KC. “Methods for Evaluating Practice Change toward a Patient-Centered Medical Home.” Annals of Family Medicine. 2010;8(Suppl 1):S9–20. doi: 10.1370/afm.1108.
  11. Magill MK, Lloyd RI, Palmer D, Terry S. “Successful Turnaround of a University-owned, Community-Based Multidisciplinary Practice Network.” Annals of Family Medicine. 2006;4(Suppl 1):S12–8. doi: 10.1370/afm.540.
  12. Magill MK, Day J, Mervis A, Donnelly SM, Parsons M, Baker AN, Johnson L, Egger MJ, Nunu J, Prunuske J, James BC, Burt R. “Improving Colonoscopy Referral Rates through Computer-Supported Primary Care Practice Redesign.” Journal of Health Care Quality. 2009;31(4):43–53. doi: 10.1111/j.1945-1474.2009.00037.x.
  13. Martyn KK, Belli RF. “Retrospective Data Collection Using Event History Calendars.” Nursing Research. 2002;51(4):270–4. doi: 10.1097/00006199-200207000-00008.
  14. Maslach C, Leiter MP, Schaufeli WB. “Measuring Burnout.” In: Cooper CL, Cartwright S, editors. The Oxford Handbook of Organizational Well-Being. Oxford, UK: Oxford University Press; 2009. pp. 89–108.
  15. Nutting PA, Miller WL, Crabtree BF, Jaen CR, Stewart EE, Stange KC. “Initial Lessons from the First National Demonstration Project on Practice Transformation to a Patient-Centered Medical Home.” Annals of Family Medicine. 2009;7(3):254–60. doi: 10.1370/afm.1002.
  16. PeachHealth. 2012. Team Development Measure [accessed on October 20, 2012]. Available at http://www.PeachHealth.org.
  17. Tashakkori A, Teddlie C. Mixed Methodology: Combining Qualitative and Quantitative Approaches. Thousand Oaks, CA: Sage Publications; 1998. Applied Social Research Methods Series. Vol. 46.
  18. Tomoaia-Cotisel A, Scammon DL, Waitzman NJ, Cronholm PF, Halladay JR, Driscoll DL, Solberg LI, Hsu C, Tai-Seale M, Hiratsuka VY, Shih SC, Fetters MD, Wise CG, Alexander JA, Hauser D, McMullen C, Scholle SH, Tirodkar MA, Schmidt L, Donahue K, Parchman ML, Stange KC. “Context Matters: The Experience of 14 Research Teams in Systematically Reporting Contextual Factors Important for Practice Change.” Annals of Family Medicine. 2013;11(Suppl 1):S115–23. doi: 10.1370/afm.1549.
