Abstract
This article discusses development and testing of the Provider and Staff Perceptions of Integrated Care Survey, a 21-item questionnaire, informed by Singer and colleagues’ seven-construct framework. Questionnaires were sent to 2,936 providers and staff at 100 federally qualified health centers and other safety net clinics in 10 Midwestern U.S. states; 332 were ineligible, leaving 2,604 potential participants. Following 4 mailings, 781 (30%) responded from 97 health centers. Item analyses, exploratory factor analysis, and confirmatory factor analysis were undertaken. Exploratory factor analysis suggests four latent factors: Teams and Care Continuity, Patient Centeredness, Coordination with External Providers, and Coordination with Community Resources. Confirmatory factor analysis confirmed these factor groupings. For the total sample, Cronbach’s alpha exceeded 0.7 for each latent factor. Descriptive responses to each of the 21 Provider and Staff Perceptions of Integrated Care questions appear to have potential in identifying areas that providers and staff recognize as care integration strengths, and areas that may warrant improvement.
Keywords: integration, coordination, care management, survey, psychometric analysis
Introduction
Integrated care is increasingly recognized as important for patients diagnosed with complex and chronic conditions such as diabetes (Armitage, Suter, Oelke, & Adair, 2009). Effective integration may also promote prevention, or timely diagnosis, of chronic and acute conditions. In 2011, Singer and colleagues proposed a Framework for Measuring Integrated Patient Care emphasizing the importance of both care coordination and patient-centered care (Singer et al., 2011). The seven dimensions of the framework include five constructs related to coordination and two related to patient centeredness. The framework regards integration as a multidimensional construct, with both coordination and patient-centeredness elements; integrated patient care is defined as “patient care that is coordinated across professionals, facilities, and support systems; continuous over time and between visits; tailored to the patients’ needs and preferences; and based on shared responsibility between patient and caregivers for optimizing health” (Singer et al., 2011, p. 113).
Existing tools, such as the Patient Perceptions of Integrated Care (PPIC) survey, offer opportunities to assess patient perceptions of integrated care (Singer, Friedberg, Kiang, Dunn, & Kuhn, 2013). The PPIC survey was designed to be completed by patients with chronic conditions. To our knowledge, the Medical Home Care Coordination Survey is the only available measure that assesses non–condition-specific care coordination in primary care from both health care team and patient perspectives (Zlateva et al., 2015). However, the Medical Home Care Coordination Survey was not developed with reference to the comprehensive Framework for Measuring Integrated Patient Care, which includes elements of patient centeredness in recognition of the role that patients and families play in caregiving for patients with complex chronic needs. Patients are ideally placed to rate their perceptions of care integration at the individual level. However, providers and staff are also likely to have valuable insights into aspects of care that should aid integration, and they may be aware of ongoing barriers to effective integration that are unrecognized, or unknowable, by many patients.
New Contribution
Recent efforts to improve the quality, delivery, and organization of care through models such as the patient-centered medical home have also highlighted new ways to improve care integration for patients across a variety of care settings (Derrett et al., 2014; Quinn et al., 2013; Rosenthal et al., 2013). However, few tools exist for providers and staff to evaluate care integration. Informed by Singer et al.’s (2011) Framework for Measuring Integrated Patient Care, we developed a new questionnaire, Provider and Staff Perceptions of Integrated Care (PSPIC). Future improvements in care delivery may benefit from the valuable insights of providers and staff, who may identify aspects of care integration that are working well in health care services, as well as those that need improvement. Providers and staff may be aware of opportunities for greater integration that individual patients moving through a health care service may not see. Therefore, our team sought to develop a questionnaire for providers and staff that may serve as a complementary perspective alongside current questionnaires, such as Singer et al.’s PPIC survey.
This article discusses the development and testing of the PSPIC questionnaire, which was administered as part of a survey of providers and staff at federally qualified health centers and other safety net clinics. We present exploratory factor analyses (EFA) undertaken to identify latent factors underpinning the PSPIC, confirmatory factor analyses (CFA), and the internal consistency and discriminant validity characteristics of the PSPIC.
Conceptual Framework
In the Framework for Measuring Integrated Patient Care, Singer and colleagues have conceptualized seven constructs of care: coordinated within care team; coordinated across teams (e.g., within and between organizations); coordinated between care teams and community resources; continuous familiarity with the patient over time; continuous proactive and responsive action between visits; patient centered; and shared responsibility (Singer et al., 2011). Uniquely, the Framework for Measuring Integrated Patient Care conceptualizes integration in a manner that recognizes the central role of the patient: “Achieving integrated care requires delivering care that is not only coordinated but also patient-centered (i.e., accounts for patients’ needs, preferences, and the important role that patients and family members play as active participants in care)” (Singer et al., 2013, p. 145). The constructs and their definitions are displayed in Table 1, as conceptualized and defined by Singer and colleagues in their original work (Singer et al., 2011).
Table 1. Constructs of the Framework for Measuring Integrated Patient Care (Singer et al., 2011).
Construct | Description |
---|---|
1. Coordinated within care team | The individual providers (which may include physicians, nurses, other clinicians, support staff, and administrative personnel who routinely work together to provide medical care for a specified group of patients; hereafter the “care team”) deliver consistent and informed patient care and administrative services for individual patients, regardless of the care team member providing them. |
2. Coordinated across care teams | All care teams that interact with patients, including specialists, hospital personnel, and pharmacies, deliver consistent and informed patient care and administrative services, regardless of the care team providing them. |
3. Coordinated between care teams and community resources | Care teams consider and coordinate support for patients by other teams offered in the community (e.g., Meals on Wheels). |
4. Continuous familiarity with patient over time | Clinical care team members are familiar with the patient’s past medical condition and treatments; administrative care team members are familiar with patient’s payment history and needs. |
5. Continuous proactive and responsive action between visits | Care team members reach out and respond to patients between visits; patients can access care and information 24/7. |
6. Patient centered | Care team members design care to meet patients’ (also family members and other informal caregivers’) needs and preferences; processes enhance patients’ engagement in self-management. |
7. Shared responsibility | Both the patient and his or her family and care team members are responsible for the provision of care, maintenance of good health, and management of financial resources. |
This table, displaying Singer et al.’s Framework for Measuring Integrated Patient Care, first appeared as Table 2 in the original article: Singer, S. J., Burgers, J., Friedberg, M., Rosenthal, M. B., Leape, L., & Schneider, E. (2011). Defining and measuring integrated patient care: Promoting the next frontier in health care delivery. Medical Care Research and Review, 68(1), 112–127.
We used the Singer framework to consider dimensions of coordination and patient-centeredness from the perspectives of providers and staff. While patients’ perceptions of integrated care are essential, provider and staff perceptions of integrated care may also shed light on the ways in which patients are supported in their care, or the ways in which the delivery system is failing to meet patients’ needs. The PSPIC was developed to elicit provider and staff perceptions of integrated care.
Method
PSPIC Questionnaire Development
To develop questions for the PSPIC, we drew on the seven-construct Framework for Measuring Integrated Patient Care (Singer et al., 2011), reviewed the care integration literature, sought input from hospital-based clinicians, and obtained feedback from research partners and clinicians via group meetings focused on the face validity and potential utility of the PSPIC. We identified potential items for the PSPIC through review of existing surveys designed to measure similar constructs from the perspective of providers and staff: integration, care coordination, and patient-centeredness. Working from the seven-construct Framework for Measuring Integrated Patient Care (Singer et al., 2011), the two lead authors identified questions from external surveys with components related to the constructs in the conceptual framework and, based on face validity, generated a priority list of items that they believed best preserved the content of each domain of Singer’s framework. For domains where we could not locate items in the literature, we adapted existing items that could represent each construct or developed new items as needed. The priority items for the initial PSPIC were pooled, and through iterative discussion and refinement as a team we selected a final set of questions that would map to the seven domains.

An eight-member research team initially reviewed the candidate questions for the PSPIC alongside the seven-construct Framework for Measuring Integrated Patient Care (Singer et al., 2011). The research team included individuals with expertise in primary care, quality improvement in health centers, health disparities, and survey design and analysis. Assessment of potential PSPIC items was guided by three main criteria: (1) face validity, so that items represented key aspects of the seven constructs; (2) clarity, so that questions were easy to understand; and (3) brevity, so that a low-burden survey could be completed by busy providers and staff working in health centers.

Ultimately, a 21-item questionnaire was developed for inclusion and testing in a survey of health center providers and staff. The objective of the questionnaire was to assess provider and staff perceptions of integration according to Singer’s seven constructs in the Framework for Measuring Integrated Patient Care: (1) care coordination within the clinic, (2) coordination with external providers, (3) coordination with community resources, (4) familiarity with patients, (5) contacting patients between office visits, (6) patient-centered care, and (7) shared responsibility. Each construct was represented within the PSPIC by a domain containing three questions, and each question was rated on a 5-point Likert-type response scale (Strongly Disagree = 1, Disagree = 2, Neither Disagree nor Agree = 3, Agree = 4, and Strongly Agree = 5).
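For readers who want to work with PSPIC-style data, the short Python sketch below illustrates how the 21 item responses could be organized into the seven three-item domains and averaged into domain scores. It is an illustration only, not the authors’ instrument or scoring code; the item codes (1a–7c) follow the question numbering used in Tables 3 to 5, and the function and variable names are hypothetical.

```python
# Hypothetical sketch: organizing PSPIC responses (1-5 Likert scores) into the
# seven three-item domains of the Framework for Measuring Integrated Patient Care.
# Item codes (1a-7c) follow the question numbering used in this article.
PSPIC_DOMAINS = {
    "care_coordination_within_clinic":   ["1a", "1b", "1c"],
    "coordination_external_providers":   ["2a", "2b", "2c"],
    "coordination_community_resources":  ["3a", "3b", "3c"],
    "familiarity_with_patient":          ["4a", "4b", "4c"],
    "contact_between_visits":            ["5a", "5b", "5c"],
    "patient_centered_care":             ["6a", "6b", "6c"],
    "shared_responsibility":             ["7a", "7b", "7c"],
}

LIKERT = {"Strongly Disagree": 1, "Disagree": 2,
          "Neither Disagree nor Agree": 3, "Agree": 4, "Strongly Agree": 5}

def domain_means(responses: dict) -> dict:
    """Average the three item scores (1-5) within each domain for one respondent."""
    return {
        domain: sum(responses[item] for item in items) / len(items)
        for domain, items in PSPIC_DOMAINS.items()
        if all(item in responses for item in items)  # skip domains with missing items
    }

# Example: one respondent who answered every item with "Agree" (4)
example = {code: LIKERT["Agree"] for items in PSPIC_DOMAINS.values() for code in items}
print(domain_means(example))  # each domain mean is 4.0
```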
Study Design, Setting, and Survey Administration
Setting
The cross-sectional study was conducted by investigators from the University of Chicago, University of Otago, Harvard TH Chan School of Public Health, and the MidWest Clinicians’ Network (MWCN; a network of more than 100 U.S. federally qualified health centers and other safety net clinics).
We included the 21-question PSPIC as the first set of questions within a larger self-administered written questionnaire. The larger questionnaire, not the focus of this article, also included questions about activities specific to the care of patients with diabetes, work satisfaction, and work environment. MWCN provided investigators with a list of all health providers and staff at 100 health centers affiliated with the MWCN in 10 Midwestern states: Illinois, Iowa, Indiana, Kansas, Minnesota, Michigan, Missouri, Nebraska, Ohio, and Wisconsin. Between August 2012 and February 2013, we mailed questionnaires to all health center personnel.
Participants and Data Collection
Participants provided informed consent by completing the self-administered questionnaire and returning it to the University of Chicago in a prepaid reply envelope. A one-time incentive of $2 was included with the invitation to complete the questionnaire. Follow-up questionnaires were mailed to nonrespondents three times. The study was approved by the University of Chicago Institutional Review Board.
Respondent Characteristics
In addition to completing the PSPIC, respondents reported their role at the health center, years since completing formal training, gender, race/ethnicity, and location (urban or rural) of their clinic.
Statistical Analyses
Item analyses were undertaken to examine ceiling and floor effects and the convergent and divergent properties of the questions within each of the seven domains of the Framework for Measuring Integrated Patient Care. Response characteristics were described in terms of the proportion answering each of the 21 PSPIC questions, the proportions selecting the lowest and highest response options, and mean question scores. The extent to which each individual question related to the other two questions in the same construct (item–rest) was examined using Pearson’s correlation, as were the relationships between each individual question and the average of the three items in each of the other six domains (item–item). We hypothesized that within-domain correlations would be stronger than correlations with the other six domains. Internal consistency of each of the seven domains was assessed using Cronbach’s alpha.
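The item analyses described above can be expressed compactly in code. The NumPy sketch below shows the standard formulas for Cronbach’s alpha, an item–rest correlation, and an item-to-other-domain correlation; the authors conducted their analyses in Stata, so this is a hedged re-implementation of the described statistics rather than their code, and the simulated data and function names are illustrative assumptions.

```python
# Illustrative re-implementation (NumPy) of the item analyses described above;
# the authors used Stata, so this is a sketch of the statistics, not their code.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of 1-5 scores (no missing values)."""
    k = items.shape[1]
    sum_item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - sum_item_vars / total_var)

def item_rest_correlation(items: np.ndarray, i: int) -> float:
    """Pearson correlation of item i with the mean of the other items in its domain."""
    rest = np.delete(items, i, axis=1).mean(axis=1)
    return np.corrcoef(items[:, i], rest)[0, 1]

def item_other_domain_correlation(item: np.ndarray, other_domain: np.ndarray) -> float:
    """Pearson correlation of one item with the average of another domain's three items."""
    return np.corrcoef(item, other_domain.mean(axis=1))[0, 1]

# Toy example with simulated 1-5 responses for one three-item domain
rng = np.random.default_rng(0)
domain = rng.integers(1, 6, size=(200, 3)).astype(float)
print(cronbach_alpha(domain), item_rest_correlation(domain, 0))
```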
We also sought to determine, by undertaking EFA, whether there was an underlying latent structure that may be useful in understanding care integration. Respondents were first stratified by clinic and then randomly assigned to one of two groups. Data from the first group were used for EFA model-building with the principal factor method and Promax oblique rotation in Stata (StataCorp, 2013). Three criteria were used to determine latent factors: (1) factors with eigenvalues greater than the mean eigenvalue, (2) inspection of scree plots to identify the point beyond which additional factors explained little further variance, and (3) retention of only those factors with two or more questions loading (>0.3) onto that factor. The eigenvalue criterion is a variation of the Kaiser–Guttman rule in which the more appropriate threshold for common factor analysis is the mean eigenvalue rather than unity (Yeomans & Golder, 1982). Sampling adequacy was assessed using the Kaiser–Meyer–Olkin (KMO) measure, where KMO values of less than 0.8 indicate that, overall, questions have too little in common to merit a factor analysis (Kaiser, 1974).
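As a concrete illustration of two of these decision aids, the sketch below computes the overall Kaiser–Meyer–Olkin measure and counts eigenvalues above the mean eigenvalue. It is an approximation for illustration only: the authors ran the actual EFA (principal factor method with Promax rotation) in Stata, and this sketch applies the mean-eigenvalue rule to the ordinary correlation matrix rather than the reduced correlation matrix used in common factor analysis.

```python
# Sketch (NumPy) of two EFA decision aids described above: the Kaiser-Meyer-Olkin
# (KMO) measure of sampling adequacy and the mean-eigenvalue retention rule.
import numpy as np

def kmo(data: np.ndarray) -> float:
    """Overall KMO: squared correlations divided by squared correlations plus
    squared partial (anti-image) correlations, over off-diagonal elements."""
    corr = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(corr)
    scale = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / scale                      # anti-image (partial) correlations
    off_diag = ~np.eye(corr.shape[0], dtype=bool)
    r2 = (corr[off_diag] ** 2).sum()
    p2 = (partial[off_diag] ** 2).sum()
    return r2 / (r2 + p2)

def n_factors_above_mean_eigenvalue(data: np.ndarray) -> int:
    """Count eigenvalues of the correlation matrix above the mean eigenvalue
    (a simplification; the authors applied the rule within a principal factor analysis)."""
    eigenvalues = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))
    return int((eigenvalues > eigenvalues.mean()).sum())

# Toy example: 380 simulated respondents, 21 correlated "items" driven by 4 latent traits
rng = np.random.default_rng(1)
latent = rng.normal(size=(380, 4))
data = latent @ rng.normal(size=(4, 21)) + rng.normal(scale=1.5, size=(380, 21))
print(round(kmo(data), 2), n_factors_above_mean_eigenvalue(data))
```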
Following EFA, CFA was undertaken with data collected from the second group. CFA was undertaken using the user-written command “confa” and its postestimation commands (“estat fitindices” and “estat correlate”) in Stata (Kolenikov, 2009). Goodness of fit for the confirmatory model was assessed using the Bentler–Bonett nonnormed fit index and the Tucker–Lewis index (Tucker & Lewis, 1973), where values >0.90 to 0.95 are considered indicators of good fit (Hu & Bentler, 1995). We also assessed the standardized root mean square residual and the root mean square error of approximation, where values <0.05 to 0.08 are recommended (Browne & Cudeck, 1993; Steiger, 1990, 2000). Last, internal consistency of the underlying factors was assessed using Cronbach’s alpha.
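For readers unfamiliar with these fit indices, the sketch below gives the conventional chi-square-based definitions of the Tucker–Lewis index (often referred to as the nonnormed fit index) and the root mean square error of approximation. The chi-square values in the example are hypothetical; the study’s reported indices came from Stata’s “confa” postestimation commands, not from this code.

```python
# Sketch of the conventional chi-square-based definitions of two reported fit indices;
# the authors obtained their values from Stata's "confa" command, not from this code.
import math

def tucker_lewis_index(chi2_model: float, df_model: int,
                       chi2_null: float, df_null: int) -> float:
    """TLI: compares the model's chi-square/df ratio with that of the
    independence (null) model."""
    null_ratio = chi2_null / df_null
    model_ratio = chi2_model / df_model
    return (null_ratio - model_ratio) / (null_ratio - 1)

def rmsea(chi2_model: float, df_model: int, n_obs: int) -> float:
    """Root mean square error of approximation."""
    return math.sqrt(max(chi2_model - df_model, 0.0) / (df_model * (n_obs - 1)))

# Hypothetical chi-square values, chosen only to illustrate the formulas
print(round(tucker_lewis_index(540.0, 183, 4200.0, 210), 2),
      round(rmsea(540.0, 183, 401), 3))
```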
Results
Respondents
Study questionnaires were sent to 2,936 potential participants. Subsequently, 332 questionnaires were returned with information that the named staff member was no longer working at the health center, leaving 2,604 eligible providers and staff from 100 health centers. Following repeated mail-outs, 788 people from 97 health centers responded. Of these, seven had not answered any of the 21 PSPIC questions, providing 781 respondents (30%) with data available for analysis. Among respondents, 75% were female, 79% were White, 30% were physicians, and 70% reported practicing in urban areas (Table 2). The median time since completing training was 13 years (interquartile range = 5–22 years).
Table 2. Respondent Characteristics (N = 781).
Respondent characteristics | n | % |
---|---|---|
Gender | ||
Male | 165 | 21 |
Female | 587 | 75 |
Unreported | 29 | 4 |
Race/ethnicity | ||
White | 619 | 79 |
Black/African American | 61 | 8 |
Asian | 37 | 5 |
American Indian or Alaska Native | 4 | <1 |
Native Hawaiian or Other Pacific Islander | 1 | <1 |
Multiple races or race/ethnicity not listed above | 18 | 2 |
Unreported | 41 | 5 |
Role in health center | ||
Licensed practical nurse/medical assistant | 122 | 16 |
Registered nurse | 79 | 10 |
Advanced practice nurse/nurse practitioner/midwife/physician assistant | 186 | 24 |
Physician or physician in training | 238 | 30 |
Other/role unreported | 156 | 20 |
Clinic location | ||
Urban (city/suburb) | 546 | 70 |
Rural (rural/frontier) | 218 | 28 |
Unreported | 17 | 2 |
PSPIC Questions and Responses
Table 3 presents the 21 PSPIC questions and response characteristics grouped according to the seven domains of the Framework for Measuring Integrated Patient Care. Of the 781 respondents, few (<1%) were missing responses to individual items. Overall, respondents tended to agree with each of the 21 statements; of the 744 participants who answered all 21 items, the mean response score was 3.75 (SD = 0.96). The proportion of strongly agree responses ranged from 8.4% to 46.7%. The proportion of strongly disagree responses ranged from 0.5% to 8.1%.
Table 3. PSPIC Questions and Response Characteristics (N = 781).
Item | PSPIC questions | n (%) | Mean | SD | Strongly disagree % | Strongly agree % |
---|---|---|---|---|---|---|
1. Care coordination | ||||||
1a | Patient care is well-coordinated among providers, nurses, and clinic staff | 779 (99.7) | 3.9 | 0.9 | 1.4 | 21.1 |
1b | Providers and staff meet frequently (e.g., team huddles) to plan for patient visits | 774 (99.1) | 3.2 | 1.2 | 8.1 | 13.4 |
1c | Candid and open communication exists between providers and other staff | 774 (99.1) | 4.0 | 0.9 | 2.5 | 27.3 |
2. Coordination with external providers | ||||||
2a | Patient care is well-coordinated with external health care providers (e.g., specialists, hospitals) | 778 (99.6) | 3.6 | 0.9 | 1.4 | 12.2 |
2b | We have good systems in place to track referrals to external providers | 776 (99.4) | 3.7 | 1.0 | 1.7 | 20.6 |
2c | We routinely receive discharge summaries after our patients are hospitalized | 770 (98.6) | 3.5 | 1.1 | 3.6 | 16.9 |
3. Coordination with community resources | ||||||
3a | Patient care is well-coordinated with community resources (e.g., support groups, food pantry, shelters) | 779 (99.7) | 3.6 | 1.0 | 2.2 | 16.9 |
3b | Providers and staff are well-informed about available community resources for patients | 777 (99.5) | 3.6 | 1.0 | 1.8 | 15.6 |
3c | We have established relationships with community agencies to facilitate our referrals to them | 778 (99.6) | 3.8 | 0.9 | 1.0 | 19.3 |
4. Familiarity with the patient | ||||||
4a | Providers and staff are well-informed at the time of each patient visit about patients’ medical history and current treatments | 779 (99.7) | 3.9 | 0.8 | 1.0 | 19.9 |
4b | Providers and staff are well-informed about patients’ current social needs (e.g., housing, transportation) | 778 (99.6) | 3.4 | 0.9 | 1.3 | 8.4 |
4c | Patients see the same care team or provider for routine clinic visits | 776 (99.4) | 3.8 | 0.9 | 2.5 | 19.9 |
5. Contact between office visits | ||||||
5a | We routinely contact patients with chronic conditions to help them manage their conditions | 775 (99.2) | 3.2 | 1.0 | 4.4 | 9.2 |
5b | We routinely contact patients to remind them of regular preventive or follow-up visits (e.g., flu vaccine or routine lab tests) | 776 (99.4) | 3.5 | 1.1 | 3.7 | 16.5 |
5c | We routinely contact patients to inform them of abnormal laboratory results | 775 (99.2) | 4.4 | 0.7 | 0.5 | 46.7 |
6. Patient-centered care | ||||||
6a | Care is designed to meet the preferences of patients and their families | 778 (99.6) | 3.8 | 0.8 | 1.0 | 15.9 |
6b | We regularly use feedback from patients and families to improve services | 778 (99.6) | 3.8 | 0.9 | 1.3 | 18.9 |
6c | We communicate with patients in a way that they understand (e.g., appropriate language and literacy) | 776 (99.4) | 4.2 | 0.6 | 0.3 | 31.4 |
7. Patients and providers | ||||||
7a | Providers and staff view patients as equal partners in their care | 776 (99.4) | 3.9 | 0.8 | 0.5 | 20.5 |
7b | When developing a treatment plan, providers and staff routinely encourage patients to actively participate in setting goals | 775 (99.2) | 4.0 | 0.8 | 0.5 | 22.6 |
7c | Providers and staff routinely work with patients to develop self-management skills for managing their health conditions | 774 (99.1) | 4.0 | 0.8 | 0.5 | 20.7 |
Items were rated on a 5-point Likert-type response scale (Strongly Disagree = 1, Disagree = 2, Neither Disagree nor Agree = 3, Agree = 4, and Strongly Agree = 5).
PSPIC Item Analyses
Correlations between each of the 21 items and their theorized domain based on the Framework for Measuring Integrated Patient Care were all stronger than correlations between each individual item and the other six domains (Table 4). Furthermore, 20 items were strongly correlated with their theorized domains based on the Framework for Measuring Integrated Patient Care (correlations ranging between r = 0.71 and r = 0.89); Question 5c was moderately correlated (r = 0.53). In relation to internal consistency, Cronbach’s alpha exceeded 0.7 for five of the domains; the remaining two approached 0.7 (α = 0.66 for “Familiarity with the patient” and α = 0.67 for “Contact between office visits”).
Table 4. Correlations of PSPIC Items With the Seven Framework Domains and Internal Consistency (Cronbach’s α).
Item | Questions by integrated care framework construct | Domain 1 | Domain 2 | Domain 3 | Domain 4 | Domain 5 | Domain 6 | Domain 7 | α |
---|---|---|---|---|---|---|---|---|---|
1. Care coordination | | | | | | | | | |
1a | Patient care is well-coordinated among providers, nurses, and clinic staff | 0.80 | 0.47 | 0.44 | 0.44 | 0.48 | 0.48 | 0.44 | |
1b | Providers and staff meet frequently (e.g., team huddles) to plan for patient visits | 0.81 | 0.35 | 0.34 | 0.36 | 0.41 | 0.40 | 0.37 | |
1c | Candid and open communication exists between providers and other staff | 0.77 | 0.39 | 0.37 | 0.45 | 0.40 | 0.46 | 0.40 | 0.74 |
2. Coordination with external providers | |||||||||
2a | Patient care is well-coordinated with external health care providers (e.g., specialists, hospitals) | 0.50 | 0.82 | 0.49 | 0.40 | 0.45 | 0.43 | 0.40 | |
2b | We have good systems in place to track referrals to external providers | 0.41 | 0.80 | 0.42 | 0.39 | 0.42 | 0.37 | 0.34 | |
2c | We routinely receive discharge summaries after our patients are hospitalized | 0.30 | 0.79 | 0.34 | 0.27 | 0.31 | 0.24 | 0.22 | 0.76 |
3. Coordination with community resources | |||||||||
3a | Patient care is well-coordinated with community resources (e.g., support groups, food pantry, shelters) | 0.41 | 0.47 | 0.89 | 0.37 | 0.37 | 0.36 | 0.38 | |
3b | Providers and staff are well-informed about available community resources for patients | 0.43 | 0.43 | 0.88 | 0.41 | 0.41 | 0.43 | 0.38 | |
3c | We have established relationships with community agencies to facilitate our referrals to them | 0.40 | 0.44 | 0.87 | 0.39 | 0.40 | 0.38 | 0.37 | 0.87 |
4. Familiarity with the patient | |||||||||
4a | Providers and staff are well-informed at the time of each patient visit about patients’ medical history and current treatments | 0.43 | 0.37 | 0.34 | 0.79 | 0.46 | 0.42 | 0.38 | |
4b | Providers and staff are well-informed about patients’ current social needs (e.g., housing, transportation) | 0.39 | 0.29 | 0.47 | 0.75 | 0.38 | 0.39 | 0.40 | |
4c | Patients see the same care team or provider for routine clinic visits | 0.35 | 0.32 | 0.20 | 0.71 | 0.35 | 0.31 | 0.32 | 0.66 |
5. Contact between office visits | |||||||||
5a | We routinely contact patients with chronic conditions to help them manage their conditions | 0.45 | 0.38 | 0.41 | 0.42 | 0.84 | 0.42 | 0.44 | |
5b | We routinely contact patients to remind them of regular preventive or follow-up visits (e.g., flu vaccine or routine lab tests) | 0.43 | 0.38 | 0.36 | 0.40 | 0.86 | 0.42 | 0.40 | |
5c | We routinely contact patients to inform them of abnormal laboratory results | 0.36 | 0.36 | 0.23 | 0.38 | 0.53 | 0.42 | 0.39 | 0.67 |
6. Patient-centered care | |||||||||
6a | Care is designed to meet the preferences of patients and their families | 0.45 | 0.35 | 0.37 | 0.44 | 0.47 | 0.83 | 0.63 | |
6b | We regularly use feedback from patients and families to improve services | 0.49 | 0.35 | 0.38 | 0.38 | 0.44 | 0.85 | 0.53 | |
6c | We communicate with patients in a way that they understand (e.g., appropriate language and literacy) | 0.40 | 0.34 | 0.32 | 0.37 | 0.41 | 0.73 | 0.55 | 0.76 |
7. Patients and providers | |||||||||
7a | Providers and staff view patients as equal partners in their care | 0.45 | 0.34 | 0.32 | 0.41 | 0.42 | 0.61 | 0.8 | |
7b | When developing a treatment plan, providers and staff routinely encourage patients to actively participate in setting goals | 0.43 | 0.33 | 0.39 | 0.44 | 0.46 | 0.60 | 0.88 | |
7c | Providers and staff routinely work with patients to develop self-management skills for managing their health conditions | 0.40 | 0.33 | 0.39 | 0.40 | 0.48 | 0.59 | 0.87 | 0.85 |
The correlation of each of the 21 items with its own construct (shown in bold in the original table) appears in the column for that construct’s domain; the remaining results are the correlations of the individual items with the average of the three questions within each of the six other domains.
Exploratory Factor Analysis
Of 781 respondents, 380 were randomly allocated to the EFA group. The EFA, based on their data, suggested four latent factors. Table 5 presents the factor loadings; loadings smaller than 0.3 are suppressed for clarity. All 21 PSPIC questions loaded onto one of the four factors; no questions loaded onto multiple factors according to the 0.3 threshold (Field, 2013; Yong & Pearce, 2013). We labeled Factor 1 as “Teams and Care Continuity” (Questions 1a, 1b, 1c, 4a, 4b, 4c, 5a, 5b); Factor 2 as “Patient Centeredness” (5c, 6a, 6b, 6c, 7a, 7b, 7c); Factor 3 as “Coordination with External Providers” (2a, 2b, 2c); and Factor 4 as “Coordination with Community Resources” (3a, 3b, 3c). The KMO was 0.92, above the acceptable 0.8 threshold for factor analyses (Dziuban & Shirkey, 1974). The percentage of variance in each individual item explained by the model ranged from 31% to 81%, with a mean of 52%; 15 of the 21 items exceeded 40%, and a further 3 exceeded 35%.
Table 5. EFA and CFA Factor Loadings for the 21 PSPIC Questions.
Item | PSPIC questions | EFA factor loading (n = 380) | CFA factor loading (n = 401) |
---|---|---|---|
1. Care coordination | | | |
1a | Patient care is well-coordinated among providers, nurses, and clinic staff | 0.52 | 0.62 |
1b | Providers and staff meet frequently (e.g., team huddles) to plan for patient visits | 0.60 | 0.59 |
1c | Candid and open communication exists between providers and other staff | 0.43 | 0.60 |
2. Coordination with external providers | | | |
2a | Patient care is well-coordinated with external health care providers (e.g., specialists, hospitals) | 0.70 | 0.72 |
2b | We have good systems in place to track referrals to external providers | 0.66 | 0.75 |
2c | We routinely receive discharge summaries after our patients are hospitalized | 0.63 | 0.61 |
3. Coordination with community resources | | | |
3a | Patient care is well-coordinated with community resources (e.g., support groups, food pantry, shelters) | 0.69 | 0.89 |
3b | Providers and staff are well-informed about available community resources for patients | 0.65 | 0.90 |
3c | We have established relationships with community agencies to facilitate our referrals to them | 0.71 | 0.80 |
4. Familiarity with the patient | | | |
4a | Providers and staff are well-informed at the time of each patient visit about patients’ medical history and current treatments | 0.55 | 0.52 |
4b | Providers and staff are well-informed about patients’ current social needs (e.g., housing, transportation) | 0.50 | 0.48 |
4c | Patients see the same care team or provider for routine clinic visits | 0.31 | 0.46 |
5. Contact between office visits | | | |
5a | We routinely contact patients with chronic conditions to help them manage their conditions | 0.69 | 0.67 |
5b | We routinely contact patients to remind them of regular preventive or follow-up visits (e.g., flu vaccine or routine lab tests) | 0.63 | 0.66 |
5c | We routinely contact patients to inform them of abnormal laboratory results | 0.34 | 0.35 |
6. Patient-centered care | | | |
6a | Care is designed to meet the preferences of patients and their families | 0.65 | 0.66 |
6b | We regularly use feedback from patients and families to improve services | 0.52 | 0.64 |
6c | We communicate with patients in a way that they understand (e.g., appropriate language and literacy) | 0.66 | 0.45 |
7. Patients and providers | | | |
7a | Providers and staff view patients as equal partners in their care | 0.70 | 0.64 |
7b | When developing a treatment plan, providers and staff routinely encourage patients to actively participate in setting goals | 0.77 | 0.63 |
7c | Providers and staff routinely work with patients to develop self-management skills for managing their health conditions | 0.78 | 0.60 |
Note. EFA = exploratory factor analysis; CFA = confirmatory factor analysis. Loadings shown are for the factor to which each question was assigned (see factor composition below); loadings below 0.3 were suppressed.
Factor 1: Teams and Care Continuity. Includes 8 questions: 1a, 1b, 1c, 4a, 4b, 4c, 5a, 5b. Factor 2: Patient Centeredness. Includes 7 questions: 5c, 6a, 6b, 6c, 7a, 7b, 7c. Factor 3: Coordination with External Providers. Includes 3 questions: 2a, 2b, 2c. Factor 4: Coordination with Community Resources. Includes 3 questions: 3a, 3b, 3c.
Confirmatory Factor Analysis
Of 781 respondents, 401 were randomly allocated to the CFA group. The CFA model confirmed the factor groupings from the EFA (Table 5). The goodness of fit (nonnormed fit index = 0.98, Tucker–Lewis index = 0.98, standardized root mean square residual = 0.07, root mean square error of approximation = 0.07) was within the acceptable range, indicating that the model is plausible. For the total sample, Cronbach’s alpha exceeded 0.7 for each of the four latent factors (Factor 1, Teams and Care Continuity, α = 0.82; Factor 2, Patient Centeredness, α = 0.87; Factor 3, Coordination with External Providers, α = 0.75; Factor 4, Coordination with Community Resources, α = 0.88). We also compared the seven-factor conceptual model with the four-factor model (unreported analyses). The comparison did not suggest that one model fit better than the other; there were no differences in the Akaike information criterion or Bayesian information criterion between the two models (to the nearest 100).
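As a reference for the model comparison mentioned above, the minimal sketch below shows the standard definitions of the Akaike and Bayesian information criteria; the log-likelihoods and parameter counts are placeholders, not values from this study.

```python
# Minimal sketch of the information criteria used to compare the seven-factor and
# four-factor models; the log-likelihoods and parameter counts below are
# placeholders, not the study's results.
import math

def aic(log_likelihood: float, n_params: int) -> float:
    return -2 * log_likelihood + 2 * n_params

def bic(log_likelihood: float, n_params: int, n_obs: int) -> float:
    return -2 * log_likelihood + n_params * math.log(n_obs)

# Hypothetical comparison for n = 401 CFA respondents
for label, ll, k in [("four-factor", -9000.0, 46), ("seven-factor", -8990.0, 52)]:
    print(label, round(aic(ll, k)), round(bic(ll, k, 401)))
```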
Discussion
The 21-question PSPIC appears to be an internally consistent measure of integrated care for providers and staff working at federally qualified health centers and safety net clinics. Item analyses indicate that all questions were more strongly correlated with their theoretical domain based on the Framework for Measuring Integrated Patient Care than with the other six domains. The EFA, with data collected from half the respondents, identified four latent factors underpinning the PSPIC, which we labeled Teams and Care Continuity, Patient Centeredness, Coordination with External Providers, and Coordination with Community Resources. The four discrete factors were confirmed in the CFA with data from the remaining half of respondents, and all four factors had acceptable levels of internal consistency.
The 781 health center providers and staff who completed the PSPIC had a range of roles and came from both urban and rural health centers. There were few missing responses (<1%) to each of the 21 PSPIC questions, suggesting that the questions were acceptable to respondents.
Implications for Research and Practice
We have developed and tested a 21-question measure of provider and staff perceptions of integrated care. The four factors within the PSPIC offer a potential simplified framework to think about integrated care and yet they also provide support for the Framework for Measuring Integrated Patient Care, which we used to inform the development of the original seven domains (Singer et al., 2011). For example, two of the theoretical domains remained as distinct factors—“Coordination with External Providers” (2a, 2b, 2c); “Coordination with Community Resources” (3a, 3b, 3c), indicating that providers and staff perceive these as discrete components. Four of the remaining five theoretical domains remained intact, albeit grouped with other questions, within larger factor groupings. The three “Coordination within the health center” questions were grouped together with the three “Familiarity with the patient” questions in Factor 1 (“Teams and Care Continuity”; Questions 1a, 1b, 1c, 4a, 4b, 4c, 5a, 5b). Similarly, the three “Patient-centered care” questions were grouped together with the three “Shared responsibility for care between clinic, patients and family members” questions in Factor 2 (“Patient Centeredness”; Questions 5c, 6a, 6b, 6c, 7a, 7b, 7c).
One theoretical domain became separated during the factor analysis—Domain 5 (“Contact between office visits”) where Question 5c was grouped within the “Patient Centeredness” factor, and Questions 5a and 5b within the “Teams and Care Continuity” factor. Informing patients of their laboratory results (Question 5c) does seem to align with notions of patient-centered care and shared decision-making; if patients are genuine partners in care then knowledge of their laboratory tests is important. Contacting patients to support them with chronic conditions (Question 5a) and to remind them about preventive or follow-up visits (Question 5b) could conceivably have been similarly aligned. Instead, perhaps the finding that these two questions grouped into the “Teams and Care Continuity” factor (along with questions from the two theoretical domains of “Coordination within the health center” and “Familiarity with the patient”) suggests that providers and staff perceive relationships between team work and being able to provide outreach services such as appointment reminders, and that staff familiarity with the patient leads to effective follow-up and preventive health check reminders. Last, it is notable that from the perspective of providers and staff, the Theoretical Domains 6 (Patient-centered care) and 7 (Shared responsibility for care between clinic, patients, and family members) are grouped together within Factor 2 (Patient Centeredness). For example, it is difficult to envisage care that is patient-centered that does not regard patients as equal partners in their care (Question 7a) or encourage patients to actively participate in setting goals (Question 7b).
We invite others to use and evaluate the PSPIC measure. In addition to testing the PSPIC in different provider and staff groups, future research could investigate whether or not the PSPIC is sensitive to changes in health service delivery over time, in settings with patient-centered medical home implementation or among organizations participating in new accountable care organization models (Derrett et al., 2014; Sugarman, Phillips, Wagner, Coleman, & Abrams, 2014). Given the comparable fit of the models, users may choose to use either the seven-factor or four-factor models, depending on their particular need (e.g., for some, alignment with the domains of the PPIC may be important). We are now undertaking analyses examining relationships between the PSPIC and measures of staff burnout, job satisfaction, and morale, which were asked as part of a larger survey of providers and clinical staff in safety net clinics. Several studies have documented the demands and challenges that providers and staff encounter while working in new models of primary care, which pursue ambitious goals to deliver timely, high-quality, well-coordinated, patient-centered care. However, these transitions in primary care have also illustrated the importance of addressing provider and staff needs given that care delivery changes can result in burnout, low job satisfaction, and low morale (Knapp et al., 2014; Lewis et al., 2012; Quinn et al., 2013). Interventions to change workflow and to implement targeted quality improvement projects that address clinicians’ concerns have shown promise for decreasing burnout, while interventions to improve communication among clinicians and staff have shown promise in improving clinician satisfaction (Linzer et al., 2015). Therefore, studies to examine associations between provider and staff ratings of integrated care and morale, job satisfaction, and burnout may add to our knowledge of positive and negative provider and staff experiences associated with integrated care delivery. Care coordination and continuity between multiple providers and settings are key to helping patients manage their conditions over time (Bodenheimer, 2008). It would also be useful to determine whether the PSPIC is associated with health service utilization and clinical quality. Potential benefits of care coordination include reduced hospital admissions and improved quality of chronic disease management (Berry, Rock, Houskamp, Brueggeman, & Tucker, 2013; White, Carney, Flynn, Marino, & Fields, 2014); however, despite the recent focus on methods to evaluate and improve care coordination there is limited evidence regarding the specific models, activities, tasks, and measures associated with well-coordinated care that may lead to optimal patient outcomes.
Future research could also consider associations between the PPIC measure completed by patients and the PSPIC completed by providers and staff (Singer et al., 2013). Past research has demonstrated how patient, provider, staff, and leadership perspectives on health care delivery can vary (Noël et al., 2014; Smith et al., 2006). A measure such as the PSPIC may make routine evaluation of care integration easier for health system administrators as it may complement more resource-intensive patient surveys. If findings differ between the PPIC and PSPIC measures, we would also have the opportunity to consider what lies behind the different perceptions held by patients and providers of integrated care delivery.
Limitations
First, this survey was developed to assess perceptions of integrated care among providers and staff working in federally qualified health centers and other safety net clinics, limiting generalizability to other primary care settings. However, given the challenges of providing care in this context, federally qualified health centers and safety net clinics constitute an important setting to investigate integrated care. Second, analysis of nonrespondents was not performed; we had only limited characteristics (e.g., name of clinic, state in which the clinic is located) with which to compare respondents and nonrespondents. Third, the response rate for this study was lower than intended, despite our use of small incentives and repeat follow-up mail-outs of our questionnaires. Internationally, the decline in survey response rates among health providers, and particularly among other staff, has been acknowledged (Cook, Dickinson, & Eccles, 2009). Nevertheless, for the purpose of assessing the PSPIC questionnaire, respondents had a range of personal and health center characteristics and reported varied responses to the PSPIC. Our sample was also large enough to allow us to conduct both exploratory and confirmatory factor analyses. Fourth, the factor solution explained 52% of the total variance in the 21 items. However, in the social sciences, a model that accounts for less than 60% of the total variance can be regarded as satisfactory (Hair, Black, Babin, & Anderson, 2010).
Conclusion
Integrated care is a critical concept, yet we currently have limited measures of it. Recently, new measures of care coordination have also been developed (Zlateva et al., 2015). The Framework for Measuring Integrated Patient Care, which conceptualizes integrated care as comprising elements of both care coordination and patient-centeredness, was the foundation for the domains and items of the PSPIC (Singer et al., 2011). In the PSPIC, providers and staff appear to regard the care coordination and patient-centeredness elements as linked. Considerable effort has rightly been concentrated on improving the coordination of health services, as health care organizations have implemented accountable care organizations and the patient-centered medical home model in an effort to find better ways to coordinate care across care settings and to improve care for complex patients (Alidina, Rosenthal, Schneider, & Singer, 2016; Blewett & Owen, 2015; Vogus & Singer, 2016). However, despite recent innovations in health information technology and improvements to the organizational structure of health care delivery, key aspects of primary care such as accessibility, continuity, comprehensiveness, coordination, and patient-centeredness require ongoing attention and improvement (Berenson et al., 2008). The Framework for Measuring Integrated Patient Care and our emerging empirical findings from the PSPIC measure support the current emphasis on improving the patient-centeredness of health services which, together with past and ongoing care coordination efforts, may result in services that are truly integrated for patients.
Acknowledgments
We are grateful to the respondents who participated in the survey.
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This project was supported by the Commonwealth Fund (Grant 20080366) and the Chicago Center for Diabetes Translation Research (NIDDK P30 DK092949). Dr. Derrett was funded by a Commonwealth Fund Harkness Fellowship. Mr. Nocon was supported by an Agency for Healthcare Research and Quality training grant (AHRQ T32 HS000084). Dr. Chin was supported by a NIDDK Midcareer Investigator Award in Patient-Oriented Research (NIDDK K24 DK071933).
Footnotes
The views expressed in this article are those of the authors and not necessarily those of the individuals or organizations that contributed to its development. Dr. Sarah Derrett presented the results of this study as an oral presentation, “Development and Testing of the Provider and Staff Perceptions of Integrated Care (PSPIC) Survey: Provisional Results,” at the 4th World Congress on Integrated Care, Wellington, New Zealand; November 23–25, 2016.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
References
- Alidina S, Rosenthal M, Schneider E, Singer S. Coordination within medical neighborhoods: Insights from the early experiences of Colorado patient-centered medical homes. Health Care Management Review. 2016;41:101–112. doi: 10.1097/HMR.0000000000000063.
- Armitage GD, Suter E, Oelke ND, Adair CE. Health systems integration: State of the evidence. International Journal of Integrated Care. 2009;9(2). doi: 10.5334/ijic.316.
- Berenson RA, Hammons T, Gans DN, Zuckerman S, Merrell K, Underwood WS, Williams AF. A house is not a home: Keeping patients at the center of practice redesign. Health Affairs. 2008;27:1219–1230. doi: 10.1377/hlthaff.27.5.1219.
- Berry LL, Rock BL, Houskamp BS, Brueggeman J, Tucker L. Care coordination for patients with complex health profiles in inpatient and outpatient settings. Mayo Clinic Proceedings. 2013;88:184–194. doi: 10.1016/j.mayocp.2012.10.016.
- Blewett LA, Owen RA. Accountable care for the poor and underserved: Minnesota’s Hennepin Health model. American Journal of Public Health. 2015;105:622–624. doi: 10.2105/AJPH.2014.302432.
- Bodenheimer T. Coordinating care—A perilous journey through the health care system. New England Journal of Medicine. 2008;358:1064–1071. doi: 10.1056/NEJMhpr0706165.
- Browne MW, Cudeck R. Alternative ways of assessing model fit. In: Bollen KA, Long JS, editors. Testing structural equation models. Newbury Park, CA: Sage; 1993. pp. 136–162.
- Cook JV, Dickinson HO, Eccles MP. Response rates in postal surveys of healthcare professionals between 1996 and 2005: An observational study. BMC Health Services Research. 2009;9(1):160. doi: 10.1186/1472-6963-9-160.
- Derrett S, Gunter KE, Nocon RS, Quinn MT, Coleman K, Daniel DM, Chin MH. How 3 rural safety net clinics integrate care for patients: A qualitative case study. Medical Care. 2014;52:S39–S47. doi: 10.1097/MLR.0000000000000191.
- Dziuban CD, Shirkey EC. When is a correlation matrix appropriate for factor analysis? Some decision rules. Psychological Bulletin. 1974;81:358–361.
- Field A. Discovering statistics using IBM SPSS statistics. Thousand Oaks, CA: Sage; 2013.
- Hair JF Jr, Black WC, Babin BJ, Anderson RE. Multivariate data analysis. 7th ed. Upper Saddle River, NJ: Pearson Education International; 2010.
- Hu L-T, Bentler PM. Evaluating model fit. In: Hoyle RH, editor. Structural equation modelling: Concepts, issues, and applications. Thousand Oaks, CA: Sage; 1995. pp. 76–99.
- Kaiser HF. An index of factorial simplicity. Psychometrika. 1974;39:31–36.
- Knapp C, Chakravorty S, Madden V, Baron-Lee J, Gubernick R, Kairys S, … Thompson L. Association between medical home characteristics and staff professional experiences in pediatric practices. Archives of Public Health. 2014;72(1):36. doi: 10.1186/2049-3258-72-36.
- Kolenikov S. Confirmatory factor analysis using confa. Stata Journal. 2009;9:329–373.
- Lewis SE, Nocon RS, Tang H, Park SY, Vable AM, Casalino LP, … Birnberg JM. Patient-centered medical home characteristics and staff morale in safety net clinics. Archives of Internal Medicine. 2012;172:23–31. doi: 10.1001/archinternmed.2011.580.
- Linzer M, Poplau S, Grossman E, Varkey A, Yale S, Williams E, … Barbouche M. A cluster randomized trial of interventions to improve work conditions and clinician burnout in primary care: Results from the Healthy Work Place (HWP) study. Journal of General Internal Medicine. 2015;30:1105–1111. doi: 10.1007/s11606-015-3235-4.
- Noël PH, Parchman ML, Palmer RF, Romero RL, Leykum LK, Lanham HJ, … Bowers KW. Alignment of patient and primary care practice member perspectives of chronic illness care: A cross-sectional analysis. BMC Family Practice. 2014;15(1):57. doi: 10.1186/1471-2296-15-57.
- Quinn MT, Gunter KE, Nocon RS, Lewis SE, Vable AM, Tang H, … Chin MH. Undergoing transformation to the patient centered medical home in safety net health centers: Perspectives from the front lines. Ethnicity and Disease. 2013;23:356–362.
- Rosenthal MB, Friedberg MW, Singer SJ, Eastman D, Li Z, Schneider EC. Effect of a multipayer patient-centered medical home on health care utilization and quality: The Rhode Island Chronic Care Sustainability Initiative Pilot Program. JAMA Internal Medicine. 2013;173:1907–1913. doi: 10.1001/jamainternmed.2013.10063.
- Singer SJ, Burgers J, Friedberg M, Rosenthal MB, Leape L, Schneider E. Defining and measuring integrated patient care: Promoting the next frontier in health care delivery. Medical Care Research and Review. 2011;68:112–127. doi: 10.1177/1077558710371485.
- Singer SJ, Friedberg MW, Kiang MV, Dunn T, Kuhn DM. Development and preliminary validation of the Patient Perceptions of Integrated Care survey. Medical Care Research and Review. 2013;70:143–164. doi: 10.1177/1077558712465654.
- Smith CS, Morris M, Hill W, Francovich C, McMullin J, Christiano J, … Milne C. Testing the exportability of a tool for detecting operational problems in VA teaching clinics. Journal of General Internal Medicine. 2006;21:152–157. doi: 10.1111/j.1525-1497.2006.00313.x.
- StataCorp. Stata Statistical Software: Release 13 [Computer software]. College Station, TX: StataCorp LP; 2013.
- Steiger JH. Structural model evaluation and modification: An interval estimation approach. Multivariate Behavioral Research. 1990;25:173–180. doi: 10.1207/s15327906mbr2502_4.
- Steiger JH. Point estimation, hypothesis testing and interval estimation using the RMSEA: Some comments and a reply to Hayduk and Glaser. Structural Equation Modeling. 2000;7:149–162.
- Sugarman JR, Phillips KE, Wagner EH, Coleman K, Abrams MK. The safety net medical home initiative: Transforming care for vulnerable populations. Medical Care. 2014;52:S1–S10. doi: 10.1097/MLR.0000000000000207.
- Tucker LR, Lewis C. A reliability coefficient for maximum likelihood factor analysis. Psychometrika. 1973;38(1):1–10.
- Vogus TJ, Singer SJ. Creating highly reliable accountable care organizations. Medical Care Research and Review. 2016;73:660–672. doi: 10.1177/1077558716640413.
- White B, Carney PA, Flynn J, Marino M, Fields S. Reducing hospital readmissions through primary care practice transformation: This study found that a “culture of continuity” using processes that strengthen outpatient-inpatient caregiver communication improves patient care. Journal of Family Practice. 2014;63(2):67–75.
- Yeomans KA, Golder PA. The Guttman-Kaiser criterion as a predictor of the number of common factors. Statistician. 1982;31:221–229.
- Yong AG, Pearce S. A beginner’s guide to factor analysis: Focusing on exploratory factor analysis. Tutorials in Quantitative Methods for Psychology. 2013;9:79–94.
- Zlateva I, Anderson D, Coman E, Khatri K, Tian T, Fifield J. Development and validation of the Medical Home Care Coordination Survey for assessing care coordination in the primary care setting from the patient and provider perspectives. BMC Health Services Research. 2015;15(1):226. doi: 10.1186/s12913-015-0893-1.