Journal of General Internal Medicine. 2013 Jul 30;29(1):98–103. doi: 10.1007/s11606-013-2566-2

Measuring Primary Care Organizational Capacity for Diabetes Care Coordination: The Diabetes Care Coordination Readiness Assessment

Douglas L Weeks,1 Jennifer M Polello,1 Daniel T Hansen,1 Benjamin J Keeney,2 Douglas A Conrad3
PMCID: PMC3889951  PMID: 23897130

ABSTRACT

BACKGROUND

Not all primary care clinics are prepared to implement care coordination services for chronic conditions such as diabetes. Understanding true capacity to coordinate care is an important first step toward establishing effective and efficient care coordination. Yet we could identify no diabetes-specific instruments to systematically assess the readiness or status of primary care clinics to engage in diabetes care coordination.

OBJECTIVE

This report describes the development and initial validation of the Diabetes Care Coordination Readiness Assessment (DCCRA), which is intended to measure primary care clinic readiness to coordinate care for adult patients with diabetes.

DESIGN

The instrument was developed through iterative item generation within a framework of five domains of care coordination: Organizational Capacity, Care Coordination, Clinical Management, Quality Improvement, and Technical Infrastructure.

PARTICIPANTS

Validation data were collected on 39 primary care clinics.

MAIN MEASURES

Content validity, inter-rater reliability, internal consistency, and construct validity of the 49-item instrument were assessed.

KEY RESULTS

Inter-rater agreement indices per item ranged from 0.50 to 1.0. Cronbach’s alpha of the entire instrument was 0.964, and for the five domain scales ranged from 0.688 to 0.961. Clinics with existing care coordinators were rated as more ready to support care coordination than clinics without care coordinators for the entire DCCRA and for each domain, supporting construct validity.

CONCLUSIONS

As providers increasingly attempt to adopt patient-centered approaches, introduction of the DCCRA is timely and appropriate for assisting clinics with identifying gaps in provision of care coordination services. The DCCRA’s strengths include promising psychometric properties. A valid measure of diabetes care coordination readiness should be useful in diabetes program evaluation, assistance with quality improvement initiatives, and measurement of patient-centered care in research.

Electronic supplementary material

The online version of this article (doi:10.1007/s11606-013-2566-2) contains supplementary material, which is available to authorized users.

KEY WORDS: care coordination, diabetes, primary care, internal medicine, readiness assessment

INTRODUCTION

Care coordination is an essential component of the broad effort in the United States to improve healthcare quality and efficiency.1 Although consensus on a single definition for care coordination has not emerged, a definition accepted by the Agency for Healthcare Research and Quality states that, “Care coordination is the deliberate organization of patient care activities between two or more participants (including the patient) involved in a patient’s care to facilitate the appropriate delivery of health care services. Organizing care involves the marshalling of personnel and other resources needed to carry out all required patient care activities and is often managed by the exchange of information among participants responsible for different aspects of care.”2 Failures in care coordination—especially at points of care transition—are responsible for medication errors, hospital readmissions, avoidable emergency department visits, duplicate testing, and progression of disease from inadequate delivery of preventive services.3–6 By contrast, the documented benefits of well-coordinated care include effective provider-to-provider and provider-to-patient communication, and better disease management.7–10

The burden of poorly coordinated care is particularly evident for people with chronic conditions, such as diabetes, who are often expected to navigate a complex healthcare system with little support from multiple providers struggling to communicate even among themselves. While care coordination may assist patients with managing diabetes, the burden of coordinating care in the ambulatory setting falls heavily on primary care providers (PCPs), who typically serve as the hub for communication among multiple providers and the patient.11 Diabetes care coordination, like care coordination for other chronic conditions, succeeds when it is a patient-centered, evidence-based team activity that focuses on sequencing of care among specialists and is supported by timely and accurate health information exchange.7,12

In a common model for providing diabetes care coordination in the primary care setting, coordination activities are the responsibility of specific office staff, such as nurses or medical assistants, who are designated as care coordinators. Care coordinators provide services to assist the patient with disease management, including: communicating frequently with the patient about health indicators such as day-to-day blood glucose levels; teaming with the PCP to communicate with the patient about medications to achieve self-management goals; tracking and supporting patients when they obtain services outside the practice; contacting patients soon after a hospital encounter; scheduling and tracking preventive health screenings; providing appropriate self-management education; assuring information exchange among team members (including the patient); linking the patient with community resources; and contributing to care planning.

Not all primary care clinics are prepared to implement diabetes care coordination services, and clinics that nominally provide care coordination often lack the specialty services that optimize coordinated care for diabetes patients.13 Understanding true capacity to coordinate care is an important first step toward establishing effective and efficient care coordination. Yet we could identify no diabetes-specific instruments to systematically assess the readiness of primary care clinics to engage in diabetes care coordination. To meet this need, we developed and implemented an instrument to assess primary care clinic readiness for diabetes care coordination.

In this article, we report on development and initial psychometric evaluation of the Diabetes Care Coordination Readiness Assessment (DCCRA). The DCCRA aims to identify actionable barriers to and facilitators of implementation of diabetes care coordination for adults in primary care practices. A parallel aim of the DCCRA is to focus discussion within a clinic on action planning to improve care coordination capacity. The DCCRA was designed to be re-administered across time to assess progress toward full capacity. Although designed for self-administration by a clinic, DCCRA data for this article were obtained by care coordination consultants through onsite DCCRA administration sessions in primary care clinics.

METHODS

Development of the DCCRA occurred in two phases. Phase I involved generation of content domains, development of an item pool per domain, determination of a scoring methodology, and implementation of a consensus process to develop the initial instrument. Phase II involved pilot testing of the initial instrument to develop the final DCCRA, and assessment of reliability and construct validity of the final DCCRA. The study was reviewed by the local Institutional Review Board (IRB) and determined to be exempt from IRB oversight.

Phase I: Development of Representative Content

Development of the DCCRA used a content-validity approach. First, a comprehensive search was conducted in Medline and the Cumulative Index for Nursing and Allied Health Literature for existing instruments that assessed diabetes care coordination readiness in the primary care setting; none were identified. Second, we reviewed care coordination-oriented instruments referenced in the Care Coordination Measures Atlas,14 which yielded no instruments specific to diabetes care coordination readiness. Finally, clinical practice guidelines, measurement frameworks, and peer-reviewed literature on successful care coordination programs for any chronic condition were examined for tools measuring constructs related to readiness for care coordination. Although no validated readiness assessment instruments were identified, we evaluated this literature in team meetings to develop primary care coordination best-practice themes, from which five content domains for the DCCRA were established: (1) organizational capacity, (2) diabetes care coordination best practices, (3) clinical management capability, (4) quality improvement capability, and (5) technical infrastructure.14–22

Forty-three candidate items covering the five domains were drafted. Items corresponded to achievable objectives that optimized care coordination. The initial domains and pool of items were cognitively tested with an advisory group consisting of three primary care physicians (one of whom was part of a team implementing a medical home model in a clinic system), an endocrinologist, two primary care clinic nurses, two certified diabetes educators who engaged in care coordination, two experts in survey item and measurement scale design, a health economist, a health informatics researcher, and a health services researcher. Although we considered including patients in cognitive testing, we decided against it because the unit of observation for the assessment is the clinic and the processes within it. The advisory group was asked to identify defects in existing items, rate comprehensiveness of existing items, and suggest additional items covering relevant content not represented in the initial item pool.

Based on feedback from the advisory group, six items were added and some items were reworded, yielding a pool of 49 items. Each item used a 4-point response scale ranging from “Not Prepared,” to “Moderately Prepared,” to “Highly Prepared,” to “Actively Performing.” Two additional constructs were assessed per item: (1) level of importance of the objective to the practice, and (2) variation among providers in the practice in achieving the objective. These ratings assisted in establishing priorities for action to improve care coordination readiness. Each was rated on a “High,” “Medium,” or “Low” response scale.

The pool of 49 items and measurement scale were pre-tested by providers in five primary care clinics. Based on the pre-test, a few items were refined to arrive at the final 49-item DCCRA. The final version of the DCCRA is available in the Online Appendix.

Phase II: Pilot Testing and Psychometric Evaluation of the DCCRA

Phase II involved data collection with the DCCRA on a sample of 39 primary care clinics that had signed agreements to participate in the Beacon Community of the Inland Northwest (BCIN) regional diabetes care coordination intervention project. The BCIN supports the information needs of diabetes care coordination by propagating health information exchange among electronic medical records of ambulatory and inpatient facilities, laboratories, imaging facilities, and pharmacies. A primary objective of the BCIN project is to use health information exchange-enabled care coordination to improve receipt of preventive health services and patient outcomes in adult patients with type 2 diabetes. Clinics were administered the DCCRA as preparation for participation in the BCIN project. Clinics were eligible to participate in the BCIN if they were current users of a certified electronic health record; were located in the BCIN catchment area (Spokane Hospital Referral Region); and provided primary care to patients with type 2 diabetes who were at least 18 years of age.

The DCCRA was administered in a single face-to-face meeting at the clinic. Attendees included at least one provider, the individual responsible for managing health information technology, and the individual responsible for managing clinic policies and procedures. In clinics currently engaging in care coordination, the individuals responsible for care coordination duties also attended. Choice of specific attendees was at the discretion of the clinic. Two BCIN care coordination consultants rated the readiness of the clinic on each of the 49 DCCRA objectives based on clinic responses at the meeting. One consultant conducted the meeting while the other listened and rated each item; neither was exposed to the other’s ratings, so two sets of independent scores were obtained per meeting. The same two consultants performed all DCCRA administrations for this article.

Reliability Analyses

The first 39 clinics to be administered the DCCRA constituted the psychometric evaluation sample for this report. Data were accrued from May 2011 to October 2012. Per-item inter-rater agreement was analyzed with Gwet’s first-order agreement coefficient, AC1.23,24 AC1 uses a chance-agreement probability calibrated to the propensity of random ratings estimated from the observed ratings. Unlike other measures of chance-adjusted agreement, such as Cohen’s kappa,25 AC1 is not distorted by high or low prevalence of the trait being rated; thus, it presents an unbiased estimate of inter-rater agreement.26 Standard errors for AC1 were derived with unconditional variance estimates to characterize the precision of agreement coefficients with respect to the general population of potential raters. AC1 was calculated for ratings of readiness to perform diabetes care coordination, ratings of importance to the practice, and ratings of variation among providers in the practice. Although arbitrary guidelines for judging the magnitude of inter-rater reliability are commonly criticized, such standards do provide a benchmark against which to qualitatively judge reliability. To that end, we judged the strength of agreement coefficients against guidelines proposed by Landis and Koch: 0.21–0.40 = fair agreement, 0.41–0.60 = moderate agreement, 0.61–0.80 = substantial agreement, 0.81–1.0 = almost perfect agreement.27 AC1 analyses were conducted with AgreeStat (Advanced Analytics, LLC, Gaithersburg, MD).
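The AC1 computation described above is straightforward to reproduce. The following is a minimal two-rater sketch in Python, offered purely as an illustration (the study's analyses used AgreeStat, and the unconditional variance estimates for the standard errors in Table 2 are not reproduced here), together with the Landis and Koch benchmarks used to interpret the coefficients:

```python
from collections import Counter

def gwet_ac1(ratings_a, ratings_b, categories):
    """Gwet's first-order agreement coefficient (AC1) for two raters.

    ratings_a, ratings_b: parallel lists of categorical ratings, one pair
    per subject. categories: the full set of possible rating categories.
    """
    n = len(ratings_a)
    q = len(categories)
    # Observed agreement: proportion of subjects both raters classified identically.
    p_a = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Average propensity to use each category, pooled over both raters.
    counts = Counter(ratings_a) + Counter(ratings_b)
    pi = {k: counts.get(k, 0) / (2 * n) for k in categories}
    # Chance agreement calibrated to the propensity of random ratings;
    # unlike kappa's chance term, this stays small when prevalence is skewed.
    p_e = sum(p * (1 - p) for p in pi.values()) / (q - 1)
    return (p_a - p_e) / (1 - p_e)

def landis_koch(coef):
    """Qualitative benchmark for an agreement coefficient (Landis and Koch)."""
    if coef > 0.80:
        return "almost perfect"
    if coef > 0.60:
        return "substantial"
    if coef > 0.40:
        return "moderate"
    if coef > 0.20:
        return "fair"
    return "slight"
```

Two raters who agree on every subject yield AC1 = 1.0; as agreement drops, the coefficient falls toward (and can cross) zero.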

Cronbach’s alpha was derived as a measure of internal consistency reliability of the readiness-to-perform ratings for the entire 49-item scale, and separately for each domain. Cronbach’s alphas were obtained with two-way mixed-effects repeated-measures models, with participant effects considered random and item effects considered fixed. We considered the overall DCCRA internally consistent at an alpha value > 0.80, and individual domains internally consistent at an alpha value > 0.70, because reliability coefficients attenuate as the number of items decreases.28,29 Cronbach’s alpha analyses were conducted with SPSS v 19.0 (SPSS Inc, Chicago, IL).
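For a fixed set of ratings, Cronbach's alpha reduces to the familiar item-variance formula. The sketch below is an illustration only (the study used SPSS, whose mixed-model estimate is algebraically equivalent for this design); it also includes the alpha-if-item-deleted check reported in the Results:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. `items` is one list of scores per item; each inner
    list holds that item's score for every clinic, in the same clinic order."""
    k = len(items)
    n_subjects = len(items[0])

    def sample_var(xs):
        # Sample variance with n - 1 denominator.
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per clinic across all items.
    totals = [sum(col[i] for col in items) for i in range(n_subjects)]
    item_var_sum = sum(sample_var(col) for col in items)
    return (k / (k - 1)) * (1 - item_var_sum / sample_var(totals))

def alpha_if_item_deleted(items, drop_index):
    """Alpha recomputed with one item removed from the scale."""
    return cronbach_alpha([col for i, col in enumerate(items) if i != drop_index])
```

When every item ranks the clinics identically, alpha reaches its maximum of 1.0; uncorrelated items pull it toward zero.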

Construct Validity Analyses

Fifteen of the 39 clinics (38 %) had pre-existing care coordination programs. The known-groups method was used as an indicator of construct validity: DCCRA total scores and scores per domain from clinics known to have a care coordination program were compared to those from clinics without such a program to determine whether scores could discriminate between clinic types.30 Item ratings of 0 for “Not Prepared,” 1 for “Moderately Prepared,” 2 for “Highly Prepared,” and 3 for “Actively Performing” were averaged across the two raters per item, and then summed for the entire instrument and per domain. Group comparisons were made with independent t-tests in SPSS. Significant differences at P < 0.05 in summed DCCRA scores would substantiate construct validity through the ability to discriminate readiness between clinics with and without care coordinators.
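The scoring and group comparison can be sketched as follows. This is an illustrative reconstruction rather than the authors' SPSS analysis; the t statistic shown assumes equal group variances, matching a standard pooled-variance independent-samples t-test:

```python
import math

def clinic_score(rater1, rater2):
    """Sum of the two raters' 0-3 readiness ratings, averaged per item.
    With 49 items the maximum possible total is 49 * 3 = 147."""
    return sum((a + b) / 2 for a, b in zip(rater1, rater2))

def independent_t(group1, group2):
    """Pooled-variance independent-samples t statistic (equal variances assumed)."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    ss1 = sum((x - m1) ** 2 for x in group1)
    ss2 = sum((x - m2) ** 2 for x in group2)
    pooled_var = (ss1 + ss2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var * (1 / n1 + 1 / n2))
```

Here the totals for the 15 clinics with and 24 clinics without care coordination programs would form the two groups (df = 37), with the P value read from the t distribution.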

RESULTS

As seen in Table 1, the percentage of patients with diabetes averaged almost 12 % across clinics, and the clinics were heterogeneous with regard to numbers of patients with diabetes and types and numbers of providers in the practice. Administration time for the DCCRA ranged from 20 to 60 min, with an average of 37 min.

Table 1.

Characteristics of the Validation Sample of 39 Primary Care Clinics

Characteristic Minimum Maximum Mean Standard deviation
Number of physicians in practice 1 27 6.3 5.5
Number of advanced registered nurse practitioners in practice 0 13 2.8 2.9
Number of physician assistants in practice 1 14 3.6 3.2
Number of nurses and medical assistants in practice 0 37 4.9 9.8
Total number of patients with type 2 diabetes 42 2,700 699.3 694.9
Percent of patients with type 2 diabetes 2 38 11.8 9.4

Reliability of the Ratings

AC1 values and the associated standard error per item are located in Table 2. AC1 values for ratings of readiness to perform showed at least substantial agreement for all items, with the exception of item 2.9d, which showed moderate agreement. For ratings of importance to the practice, AC1 values indicated at least substantial agreement for all items. For ratings of variation among providers in the practice, AC1 values indicated at least substantial agreement for 35 of 49 items, moderate agreement for 13 of 49 items, and fair agreement for the remaining item (item 2.5). In general, the DCCRA had acceptable inter-rater reliability across all three ratings performed for each of the 49 items.

Table 2.

AC1 Inter-Rater Reliability Coefficients and Associated Standard Errors (SE) Per Item for the Readiness to Perform, Importance to the Practice, and Variation Among Provider Ratings

Item Readiness to perform Importance to practice Variation among providers
AC1 value SE AC1 value SE AC1 value SE
1.1 0.83 0.071 1.00 0.000 0.67 0.090
1.2a 0.86 0.061 1.00 0.000 0.75 0.082
1.2b 0.82 0.071 0.83 0.068 0.78 0.077
1.3 0.91 0.053 0.92 0.047 0.84 0.067
1.4a 0.77 0.079 0.76 0.094 0.78 0.076
1.4b 0.83 0.070 0.85 0.065 0.73 0.085
1.5 0.71 0.087 0.97 0.030 0.64 0.093
1.6a 0.95 0.037 1.00 0.000 0.97 0.027
1.6b 1.00 0.000 0.94 0.039 0.94 0.041
1.7 0.72 0.083 0.75 0.080 0.62 0.092
2.1 0.84 0.067 0.85 0.064 0.74 0.084
2.2 0.84 0.070 0.74 0.087 0.50 0.101
2.3 0.79 0.074 0.83 0.066 0.68 0.089
2.4 0.84 0.067 0.79 0.086 0.74 0.083
2.5 0.50 0.095 0.92 0.047 0.36 0.103
2.6 0.62 0.090 0.88 0.062 0.53 0.101
2.7 0.70 0.087 0.63 0.099 0.48 0.101
2.8 0.70 0.084 0.60 0.102 0.61 0.095
2.9a 0.92 0.047 0.92 0.047 0.77 0.077
2.9b 0.78 0.076 0.77 0.076 0.74 0.083
2.9c 0.73 0.083 0.88 0.062 0.53 0.099
2.9d 0.48 0.105 0.73 0.086 0.53 0.100
2.9e 0.68 0.087 0.64 0.092 0.56 0.099
2.10 0.72 0.083 0.69 0.092 0.58 0.097
3.1 0.84 0.070 0.77 0.077 0.63 0.093
3.2 0.73 0.085 0.78 0.080 0.54 0.103
3.3 0.82 0.070 0.83 0.068 0.64 0.094
3.4 0.75 0.080 0.68 0.088 0.56 0.098
3.5 0.75 0.081 0.60 0.102 0.56 0.100
4.1 0.91 0.052 0.91 0.051 0.61 0.095
4.2a 0.75 0.080 0.77 0.076 0.63 0.094
4.2b 0.87 0.063 0.86 0.060 0.67 0.092
4.3a 0.79 0.073 0.86 0.063 0.72 0.087
4.3b 0.81 0.071 0.85 0.072 0.61 0.096
4.4 0.70 0.085 0.86 0.062 0.68 0.088
4.5a 0.97 0.029 0.97 0.029 0.91 0.051
4.5b 0.97 0.029 0.97 0.029 0.91 0.051
5.1 0.91 0.048 0.92 0.046 0.79 0.076
5.2 0.85 0.065 0.80 0.070 0.57 0.101
5.3a 0.88 0.058 0.83 0.065 0.70 0.088
5.3b 0.71 0.086 0.82 0.070 0.68 0.092
5.3c 0.82 0.070 0.88 0.058 0.67 0.091
5.4 0.89 0.055 1.00 0.000 0.91 0.051
5.5 0.76 0.083 0.92 0.048 0.90 0.053
5.6 0.72 0.083 0.71 0.084 0.50 0.101
5.7 0.84 0.066 0.67 0.091 0.49 0.101
5.8 0.72 0.083 0.80 0.071 0.54 0.097
5.9 0.94 0.038 0.89 0.056 0.91 0.051
5.10 0.77 0.076 0.86 0.060 0.63 0.092

Cronbach’s alpha for ratings of readiness to perform for the whole instrument was 0.964, indicating very good whole-scale internal consistency. We recalculated Cronbach’s alphas with each item deleted, in turn, from the scale. The value of alpha was negligibly affected by item deletion, with Cronbach’s alphas ranging from 0.962 to 0.967 for separate deletion. Cronbach’s alpha values per domain, displayed in Table 3, exceeded our criterion for internal consistency of 0.70 for the Organizational Capacity, Care Coordination, Quality Improvement, and Technical Infrastructure domains. The failure of alpha for the Clinical Management domain to exceed our criterion of 0.70 for internal consistency was most likely due to the small number of items in this domain (five items). However, removal of any of the Clinical Management domain items from the entire scale reduced the magnitude of Cronbach’s alpha for the whole scale. Thus, we did not consider the value of alpha for this domain to be a detractor for overall internal consistency of the DCCRA. Given the magnitudes of Cronbach’s alphas, we concluded that the DCCRA was internally consistent as a whole and per domain.

Table 3.

Cronbach’s Alpha Values for the Whole DCCRA and Per Domain

Value of Cronbach’s alpha
Whole DCCRA 0.964
Organizational Capacity Domain 0.751
Care Coordination Domain 0.837
Clinical Management Domain 0.688
Quality Improvement Domain 0.961
Technical Infrastructure Domain 0.841

Construct Validity

Clinics with existing care coordination programs were rated as significantly more prepared for care coordination on the entire DCCRA and in each domain. Means, standard deviations and associated P values for group comparisons are displayed in Table 4. Based on ability to discriminate among known groups, we concluded that the DCCRA exhibited construct validity.

Table 4.

Means and Standard Deviations for Summed Ratings of Readiness to Engage in Diabetes Care Coordination for the Whole DCCRA and Per Domain for Clinics With and Without an Existing Care Coordinator

Scale Care coordinator working in practice Mean Standard deviation P value for comparison
Whole DCCRA Yes 109.4 18.8 0.001
No 86.9 19.7
Organizational Capacity Domain Yes 23.3 3.7 0.014
No 19.4 5.0
Care Coordination Domain Yes 31.0 6.5 0.038
No 26.0 7.2
Clinical Management Domain Yes 11.3 2.5 0.001
No 7.3 3.8
Quality Improvement Domain Yes 14.3 5.8 0.040
No 10.5 5.1
Technical Infrastructure Domain Yes 28.9 5.8 0.027
No 23.4 7.9

P values for comparisons among clinics are displayed

The total number of points that can be achieved in the readiness ratings of the DCCRA is 147. These initial data indicate that, even among clinics with care coordinators, no clinic exhibited full engagement in all aspects of care coordination as assessed by the DCCRA. Thus, the instrument does not appear to suffer from a ceiling effect.

DISCUSSION

Primary care providers seeking to improve their readiness to coordinate care for patients with diabetes must first understand their current capacity for care coordination. The DCCRA was designed to assist primary care practices with determining that capacity, while identifying gaps in care coordination capacity to address. Coordinating care in the ambulatory environment requires more than establishing the capability to share electronic health information among organizations. Primary care clinics will often need to implement organizational strategies, make difficult changes in clinic culture, and redesign workflow to promote care coordination skills among staff. Many of these elements are assessed in the DCCRA under five key domains: organizational capacity, diabetes care coordination best practices, clinical management capability, quality improvement capability, and technical infrastructure. Psychometric analyses indicated that the DCCRA exhibits inter-rater reliability, internal consistency, and construct validity through its ability to differentiate between known groups.

As providers increasingly attempt to adopt patient-centered approaches, the DCCRA is appropriate not only for assessing current readiness for care coordination services, but also as a planning tool for addressing gaps. Some gaps in capacity can be addressed through practice workflow changes specifically designed to assign accountability roles within a clinic for care coordination processes. Understanding who is responsible for what is critical for effective team coordination in patient-centered approaches. Some clinics will identify the need to hire, train, and delegate staff to coordinate referrals and transitions of care, monitor patient clinical data, and assist with diabetes medication management. Other gaps may be addressed with targeted education on development of the patient-centered communication skills fundamental to coordinating care.8 Still other gaps may be resolved by the clinic looking outward to establish or solidify relationships with medical specialists and social services. Underlying a clinic’s efforts to enhance readiness for care coordination, though, is its capability to manage the timely and effective sharing of patient information among providers. This aspect of care coordination is prominently addressed in the Technical Infrastructure domain of the DCCRA.

Development of the DCCRA was an activity of the federally funded Beacon Community of the Inland Northwest, which is promoting diabetes care coordination supported by health information exchange among clinics and hospitals in the vast area encompassed by the Spokane Hospital Referral Region. As an element of our Beacon project intervention, we use DCCRA results to inform clinics about where they are doing well and where capacity development, internal action planning, and/or external coaching and education (offered by our Beacon care coordination consultants) could be beneficial. Our coaching and education has included: helping clinics prioritize quality improvement efforts, providing diabetes-specific education to clinic-based diabetes care coordinators, implementing clinical transformation education, assisting clinics to develop patient-centered policies and procedures, and supporting technical staff to measure clinical outcomes. The DCCRA results are integral for matching the focus of coaching and education with the needs of the clinics. Although developed to assess clinic progress within the Beacon project, we believe the DCCRA is transferable to any primary care clinic wishing to assess diabetes care coordination readiness.

While the DCCRA was specifically developed and initially validated for diabetes care, our future work intends to determine whether the DCCRA framework can be extended to assess readiness to coordinate care for other ambulatory care-sensitive chronic conditions seen in primary care. In addition, our forthcoming exploration of the psychometric properties of the DCCRA will establish dimensionality of the instrument through factor analysis.

Conclusions

The DCCRA is aimed at enhancing delivery of diabetes care coordination by informing the clinic of its capacity for best-practice care. It is designed to be self-administered by a clinic, and can be re-administered over time to assess progress with initiatives to enhance care coordination capacity. A valid measure of care coordination readiness can be used in program evaluation and quality improvement initiatives, and would enhance research on measurement of patient-centered care.

Electronic supplementary material

ESM 1 (104.6KB, pdf)

(PDF 104 kb)

Acknowledgements

This work was supported by Grant Number 90BC001101 from the Office of the National Coordinator for Health Information Technology (ONC), Department of Health and Human Services (DHHS), awarded to Inland Northwest Health Services. Article contents are solely the responsibility of the authors and do not necessarily represent the official views of ONC or DHHS.

Portions of the data in this manuscript were presented at American Public Health Association 139th Annual Meeting and Exposition, October 31 to November 2, 2011, Washington, DC, and the AcademyHealth Annual Research Meeting, June 12–14, 2011, Seattle, WA.

Conflict of Interest

The authors declare that they do not have a conflict of interest.

REFERENCES

1. National Priorities Partnership. National Priorities and Goals: Aligning Our Efforts to Transform America’s Healthcare. Washington, DC: National Quality Forum; 2008.
2. McDonald KM, Sundaram V, Bravata DM, et al. Care coordination. In: Shojania KG, McDonald KM, Wachter RM, Owens DK, eds. Closing the Quality Gap: A Critical Analysis of Quality Improvement Strategies. Technical Review 9 (Prepared by Stanford-UCSF Evidence-Based Practice Center under contract No. 290020017). Vol. 7. Rockville, MD: Agency for Healthcare Research and Quality; June 2007. AHRQ Publication No. 04(07)00517.
3. Cornu P, Steurbaut S, Leysen T, et al. Discrepancies in medication information for the primary care physician and the geriatric patient at discharge. Ann Pharmacother. 2012;46:983–990. doi:10.1345/aph.1R022.
4. Jencks SF, Williams MV, Coleman EA. Rehospitalizations among patients in the Medicare fee-for-service program. N Engl J Med. 2009;360:1418–1428. doi:10.1056/NEJMsa0803563.
5. Tang N, Stein J, Hsia RY, Maselli JH, Gonzales R. Trends and characteristics of US emergency department visits, 1997–2007. JAMA. 2010;304:664–670. doi:10.1001/jama.2010.1112.
6. Stewart BA, Fernandes S, Rodriguez-Huertas E, Landzberg M. A preliminary look at duplicate testing associated with lack of electronic health record interoperability for transferred patients. J Am Med Inform Assoc. 2010;17:341–344. doi:10.1136/jamia.2009.001750.
7. Antonelli RC, McAllister JW, Popp J. Making Care Coordination a Critical Component of the Pediatric Healthcare System: A Multidisciplinary Framework. New York: The Commonwealth Fund; 2009.
8. Hess BJ, Lynn LA, Holmboe ES, Lipner RS. Toward better care coordination through improved communication with referring physicians. Acad Med. 2009;84:S109–S112. doi:10.1097/ACM.0b013e3181b37ac7.
9. O’Malley AS, Tynan A, Cohen GR, Kemper N, Davis MM. Coordination of care by primary care practices: strategies, lessons and implications. Res Briefs. 2009;12:1–16.
10. Shetty G, Brownson CA. Characteristics of organizational resources and supports for self management in primary care. Diabetes Educ. 2007;33:185S–192S. doi:10.1177/0145721707304171.
11. Pham HH, O’Malley AS, Bach PB, Saiontz-Martinez C, Schrag D. Primary care physicians’ links to other physicians through Medicare patients: the scope of care coordination. Ann Intern Med. 2009;150:236–242. doi:10.7326/0003-4819-150-4-200902170-00004.
12. MacPhail LH, Neuwirth EB, Bellows J. Coordination of diabetes care in four delivery models using an electronic health record. Med Care. 2009;47:993–999. doi:10.1097/MLR.0b013e31819e1ffe.
13. Stellefson M, Dipnarine K, Stopka C. The Chronic Care Model and diabetes management in US primary care settings: a systematic review. Prev Chronic Dis. 2013;10:120180. doi:10.5888/pcd10.120180.
14. McDonald KM, Schultz E, Albin L, et al. Care Coordination Atlas Version 3 (Prepared by Stanford University under subcontract to Battelle on Contract No. 290040020). AHRQ Publication No. 110023EF. Rockville, MD: Agency for Healthcare Research and Quality; 2010.
15. Preferred Practices and Performance Measures for Measuring and Reporting Care Coordination: A Consensus Report. Washington, DC: NQF; 2010.
16. Cooley WC, McAllister JW, Sherrieb K, Clark RE. The Medical Home Index: development and validation of a new practice-level measure of implementation of the Medical Home model. Ambul Pediatr. 2003;3:173–180. doi:10.1367/1539-4409(2003)003<0173:TMHIDA>2.0.CO;2.
17. Bonomi AE, Wagner EH, Glasgow RE, VonKorff M. Assessment of chronic illness care (ACIC): a practical tool to measure quality improvement. Health Serv Res. 2002;37:791–820. doi:10.1111/1475-6773.00049.
18. Weiner BJ, Amick H, Lee SY. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev. 2008;65:379–436. doi:10.1177/1077558708317802.
19. California Healthcare Foundation and Community Clinics Initiative of Tides. Community Clinic EHR Readiness Assessment. http://www.careinnovations.org/knowledge-center/ehr-readiness-assessment/ (accessed July 12, 2013).
20. Safety Net Medical Home Initiative, Horner K, Schaefer S, Wagner E. Care Coordination: Reducing Care Fragmentation in Primary Care. 1st ed. Phillips KE, ed. Seattle, WA: The MacColl Institute for Healthcare Innovation at the Group Health Research Institute and Qualis Health; 2011.
21. Birnberg JM, Drum ML, Huang ES, et al. Development of a safety net medical home scale for clinics. J Gen Intern Med. 2011;26:1418–1425. doi:10.1007/s11606-011-1767-9.
22. Brownson CA, Miller D, Crespo R, et al. Development and use of a quality improvement tool to assess self-management support in primary care. Jt Comm J Qual Saf. 2007;33:408–416. doi:10.1016/s1553-7250(07)33047-x.
23. Gwet K. Handbook of Inter-Rater Reliability: How to Measure the Level of Agreement Between Two or Multiple Raters. Gaithersburg, MD: Stataxis Publishing Company; 2001.
24. Gwet KL. Computing inter-rater reliability and its variance in the presence of high agreement. Br J Math Stat Psychol. 2008;61:29–48. doi:10.1348/000711006X126600.
25. Cohen J. A coefficient of agreement for nominal scales. Educ Psychol Meas. 1960;20:37–46. doi:10.1177/001316446002000104.
26. Cicchetti DV, Feinstein AR. High agreement but low kappa: II. Resolving the paradoxes. J Clin Epidemiol. 1990;43:543–549. doi:10.1016/0895-4356(90)90159-M.
27. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–174. doi:10.2307/2529310.
28. Bland JM, Altman DG. Cronbach’s alpha. BMJ. 1997;314:572. doi:10.1136/bmj.314.7080.572.
29. Kline P. The Handbook of Psychological Testing. 2nd ed. London: Routledge; 1999.
30. Hattie J, Cooksey RW. Procedures for assessing the validities of tests using the “known-groups” method. Appl Psychol Meas. 1984;8:295–305. doi:10.1177/014662168400800306.
