Journal of General Internal Medicine. 2010 Aug 25;25(Suppl 4):586–592. doi: 10.1007/s11606-010-1358-1

Developing Measures of Educational Change for Academic Health Care Teams Implementing the Chronic Care Model in Teaching Practices

Judith L Bowen 1, David P Stevens 2, Connie S Sixta 3, Lloyd Provost 4, Julie K Johnson 5, Donna M Woods 6, Edward H Wagner 7
PMCID: PMC2940445  PMID: 20737234

Abstract

BACKGROUND

The Chronic Care Model (CCM) is a multidimensional framework designed to improve care for patients with chronic health conditions. The model strives for productive interactions between informed, activated patients and proactive practice teams, resulting in better clinical outcomes and greater satisfaction. While measures for improving care may be clear, measures of residents’ competency to provide chronic care do not exist. This report describes the process used to develop educational measures and results from CCM settings that used them to monitor curricular innovations.

SUBJECTS

Twenty-six academic health care teams participating in the national and California Academic Chronic Care Collaboratives.

METHOD

Using successive discussion groups and surveys, participants engaged in an iterative process to identify desirable and feasible educational measures for curricula that addressed educational objectives linked to the CCM. The measures were designed to facilitate residency programs’ abilities to address new accreditation requirements and tested with teams actively engaged in redesigning educational programs.

ANALYSIS

Field notes from each discussion and lists from work groups were synthesized using the CCM framework. Descriptive statistics were used to report survey results and measurement performance.

RESULTS

Work groups generated educational objectives and 17 associated measures. Seventeen of 26 teams (65%) provided feasibility and desirability ratings for the 17 measures. Two process measures were selected for use by all teams. Teams reported variable success using the measures. Several teams reported using additional measures, suggesting more extensive curricular change.

CONCLUSION

Using an iterative process in collaboration with program participants, we successfully defined a set of feasible and desirable education measures for academic health care teams using the CCM. These were used variably to measure the results of curricular changes, while simultaneously addressing requirements for residency accreditation.

Electronic supplementary material

The online version of this article (doi:10.1007/s11606-010-1358-1) contains supplementary material, which is available to authorized users.

KEY WORDS: Chronic Care Model, quality improvement, graduate medical education, ambulatory care, practice-based learning and improvement, systems-based practice

BACKGROUND

As the number of individuals living with a chronic illness increases, caring for patients with chronic health conditions has become a common endeavor for physicians-in-training in all settings.1 Optimal management of chronic illnesses should be a central goal for any medical education curriculum. Yet, the existing care models and their associated learning environments fall short of achieving expected outcomes for many chronic conditions.2 In an effort to address this gap in care, new models of care delivery have been tested in clinical practices,3,4 but few report efforts to make similar changes in academic training environments.5,6 In 2005, the Institute for Improving Clinical Care of the Association of American Medical Colleges (AAMC) launched a national collaborative designed to train health care teams that included residents-in-training to implement a new model for delivering systematic chronic care, the Chronic Care Model (CCM), in academic primary care practices. Based on the early successes of this national collaborative, the California Healthcare Foundation funded a similar effort in the state of California in 2007.

The Chronic Care Model (CCM) is a multidimensional framework designed to assist practices with improving care for patients with chronic health conditions.7 It differs from the traditional ambulatory visit where an individual physician evaluates an uninformed, passive patient. The redesigned practice strives for productive interactions between informed, activated patients and proactive practice teams, and results in better clinical outcomes and greater patient and clinician satisfaction. The CCM is summarized in Table 1. Practices that employ the CCM use evidence to provide quality care (decision support); redesign the way team care is delivered to improve quality (delivery system design); implement information systems to improve communication between team members, provide evidence-based reminders for care decisions, monitor the effectiveness of care for individual patients and populations, and identify patients in need of planned care interventions (clinical information systems); and support patients in their self-care through education, skill building, and self-belief (self-management support). These attributes are situated in the larger context of the health system that values and provides incentives for quality chronic care delivery (health care organization) and the community that supports patients in their self-management (community resources and policies). The CCM uses well-defined, evidence-based process and outcome measures to assess progress toward optimal management of individuals and populations with a specified chronic condition. For example, a practice monitoring improvement of care for patients with diabetes mellitus will track the number of patients reaching the target for glycemic control (outcome measure) as well as up-to-date monofilament foot examinations (process measure).

Table 1. Chronic Care Model Components

GOAL: Productive interactions between activated patients and a prepared practice team lead to improved care that is safe, effective, timely, patient-centered, efficient, and equitable2,7
 
Elements related to the larger health system
COMPONENT EXAMPLES
Health system (HS) Visible support from organization’s leadership
Align incentives with improved quality of chronic care delivery
Community resources (C) Linking patients and their families/caregivers to effective community programs
Elements related to the clinical practice level
COMPONENT EXAMPLES
Delivery system design (DSD) Team members have defined roles and tasks
Planned visits are used to deliver evidence-based care
Teams provide follow-up and care management
Decision support (DS) Evidence-based guidelines are embedded in daily practice
Specialist expertise is integrated into team care
Guidelines and goals of care are shared with patients
Clinical Information Systems (CIS) Timely clinical reminders are built into the system
Population data is used to monitor delivery system re-design
Tracking of individual patients supports care planning
Self-management support (SMS) The patient’s role is central to managing the health condition
Strategies to support patients’ self-efficacy are integrated into care
Patients are linked to community resources to provide ongoing support in meeting individual goals

Implementing the CCM in academic primary care practices where residents provide some or all of the clinical care requires making changes in the educational program simultaneously with making changes in the practice. While the outcome measures for improving care may be clear, no similar measures of educational progress existed at the time the national collaborative was launched. Because a shift in residency program accreditation was underway,8 we saw this as an opportunity to assist the programs involved in the collaboratives with addressing new competency-based accreditation requirements.

In 1999, the Accreditation Council for Graduate Medical Education (ACGME) endorsed six new general competencies that all graduates must achieve and linked these competencies to residency program accreditation.8 Two of these competencies in particular, systems-based practice (SBP) and practice-based learning and improvement (PBLI), can readily be met through resident participation in efforts to implement the CCM in their continuity clinic practices or other ambulatory settings.

Our aim was to develop a discrete set of educational measures at the interface between the CCM and the SBP and PBLI competencies that would facilitate curricular change in residency programs committed to redesigning their clinical practices using the CCM. We report here the iterative process undertaken to develop measures of educational change and the resulting measures that teams participating in the national and California collaboratives used to monitor curricular innovations.

METHODS

Setting

In January 2005, the AAMC, in partnership with the MacColl Institute for Healthcare Innovation (Group Health Research Institute, Seattle, WA) and supported by the Robert Wood Johnson Foundation, called for proposals from teaching hospitals to participate in a national academic chronic care collaborative for the purpose of implementing the CCM in academic practice settings. Participants were required to form quality improvement teams that included both practice (e.g., physician and nurse champions) and educational (e.g., faculty and resident) leaders. Thirty-six self-selected quality improvement teams from 22 institutions across the US participated in this national collaborative. For most teams, the residency program director or an associate program director with the ability to facilitate change in the curriculum served as the educational leader. Teams participated in a series of five training sessions (four face-to-face and one virtual Web-based) between June 2005 and November 2006 to learn about the CCM and the model for improvement, a method that uses rapid-cycle tests of change to improve the quality of care.9 To monitor their local improvement efforts, teams reported monthly on various process and outcome measures related to their work. For example, if a team was working to improve the quality of diabetes care, reported measures included the percent of the practice’s diabetic population with a hemoglobin A1c measured in the previous 6 months (process measure) and the percent of the population with A1c measurements less than 7.0% (outcome measure). In addition, teams were expected to implement educational changes that integrated the CCM into the residency program curriculum.
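For concreteness, the diabetes example above can be sketched as a small computation over a patient registry. This is an illustrative sketch with hypothetical data, not code or data from the collaborative; the registry fields, dates, and 183-day window are assumptions.

```python
# Illustrative sketch (hypothetical data): computing the collaborative's
# example process and outcome measures from a simple patient registry.
from datetime import date, timedelta

# Hypothetical registry rows: (patient_id, last_a1c_date, last_a1c_value)
registry = [
    ("p1", date(2006, 3, 1), 6.8),
    ("p2", date(2005, 6, 15), 7.9),   # A1c older than 6 months
    ("p3", date(2006, 2, 10), 7.4),
    ("p4", date(2006, 1, 20), 6.5),
]

today = date(2006, 4, 1)
six_months = timedelta(days=183)  # assumed definition of "previous 6 months"

# Process measure: percent of the diabetic population with an A1c
# measured in the previous 6 months.
measured = [r for r in registry if today - r[1] <= six_months]
process_pct = 100 * len(measured) / len(registry)

# Outcome measure: percent of the population with A1c < 7.0%.
controlled = [r for r in registry if r[2] < 7.0]
outcome_pct = 100 * len(controlled) / len(registry)

print(f"A1c checked in last 6 months: {process_pct:.0f}%")  # 75%
print(f"A1c < 7.0%: {outcome_pct:.0f}%")                    # 50%
```

In a real practice these counts would come from the clinical information system’s disease registry rather than a hand-built list.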

Based on the early successes of this national collaborative, the California Healthcare Foundation funded a similar effort for teaching hospitals in California in 2007. Teams participating in the national collaborative served as the development cohort, and those participating in the California collaborative served as the implementation cohort for the educational measures.

At the first training session of the national collaborative, we engaged volunteer participants in an iterative development process to (1) define potential educational objectives at the interface between the SBP and PBLI competencies and the Chronic Care Model, and propose specific measurement strategies, (2) determine which educational measures were both desirable and feasible at the local residency program and practice level, (3) select two required measures for use across all teams involved in the collaborative, and (4) report on educational outcomes using these measures and additional optional measures as desired. We chose this iterative method because it mirrored the quality improvement model whereby the end users are engaged in the development process as the most knowledgeable contributors and most invested in using the product.

Educational Measures

Using small group process, education representatives from each team and all other interested participants worked to define potential educational objectives and measures of educational change. The ACGME competencies were reviewed with attention to the specific detailed objectives for SBP and PBLI competencies. Participants were subdivided into six smaller groups based on the six components of the CCM and charged with generating learning objectives that would address their assigned CCM component and some part of the SBP or PBLI competencies. For example, the small group assigned to the Clinical Information Systems (CIS) component of the CCM suggested: “Residents routinely receive reports about their practice outcomes for the disease of interest (e.g., diabetes mellitus)” as an educational objective linked to the PBLI competency.

Ideas from each group were recorded and used to generate a list of learning objectives tied to each component of the CCM and linked to either the SBP or PBLI competency. This summary report was presented to the entire group of participants. Through discussion, the ideas were validated as representative and further revised for clarity. For example, the suggestion above was revised to “Residents will improve their understanding of the practice’s performance for the condition of interest.”

The authors then developed a survey with 17 proposed educational measures linked to these educational objectives (see online Appendix). For the example objective above, the measurement was: “Number of residents receiving, reviewing, and discussing at least one registry report for the practice population.” The disease registry is the primary way practices track the relevant clinical quality measures as a reflection of the quality of care patients are receiving. All participating teams were invited to rate each potential educational measure on the value it would provide to the team in facilitating educational change at the training site (“desirability”) and whether it would be feasible for the team to implement the measure at the training site (“feasibility”). Likert scales were used to rate each item, where 1 = not at all desirable/feasible, 3 = neutral, and 5 = highly desirable/feasible. Surveys and reminders were distributed via e-mail. Mean ratings were calculated for all survey items. A minimum threshold mean score of “4 = somewhat desirable/feasible” or higher was used to retain educational measures for further consideration.
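The retention logic just described can be sketched as follows. The measure names and ratings below are hypothetical, not the study’s data; only the 1–5 Likert scale and the ≥4.0 threshold come from the text.

```python
# Illustrative sketch (hypothetical ratings): mean Likert ratings and the
# >= 4.0 retention threshold used to shortlist educational measures.
ratings = {
    # measure name: (feasibility ratings, desirability ratings), one per team
    "registry report review": ([5, 4, 5, 4], [5, 5, 4, 5]),
    "community resource documentation": ([3, 2, 4, 3], [4, 3, 3, 4]),
}

THRESHOLD = 4.0  # "somewhat desirable/feasible" or higher

def mean(xs):
    return sum(xs) / len(xs)

# A measure is retained only if BOTH mean ratings clear the threshold.
retained = [
    name
    for name, (feas, desir) in ratings.items()
    if mean(feas) >= THRESHOLD and mean(desir) >= THRESHOLD
]
print(retained)  # ['registry report review']
```

Requiring both dimensions to clear the threshold mirrors how the study kept only measures that teams found both worthwhile and practical.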

During subsequent workshops at the next collaborative training session, survey results were presented, and the six potential educational measures meeting the minimal threshold for consideration were discussed. Based on these discussions and the goals of the collaborative, the authors selected two ‘required’ measures for application across the collaborative. They were chosen for high desirability and high feasibility and the potential to promote educational change in training programs in new curricular areas. The remaining 15 measures were optional for teams.

Teams identified the population of ‘learners’ they considered subjects of the educational changes. In most cases, the targeted learners were the residents expected to participate in the chronic care curriculum, but some teams also included others (e.g., teaching faculty) who were learning about the CCM. Teams were required to send monthly reports to the leadership of the collaborative including counts of learners who had met the expectations of the required measures and counts of learners for any optional educational measures they chose to use. We used descriptive statistics to report results.

RESULTS

Successive workshop discussions generated several learning objectives. For the PBLI competency, participants generated learning objectives related to two components of the CCM: clinical information systems and decision support. For the SBP competency, learning objectives were generated for four CCM components: health system, delivery system design, self-management support, and community resources. Table 2 shows the reformulation of these learning objectives, linking CCM components, ACGME competencies, and potential educational measures for each objective. Most of the educational measures were classified as “process” measures or ways of tracking resident “exposure” to new concepts. Six of the 17 measures were considered “outcome” measures, requiring residents to demonstrate observable behaviors.

Table 2. Relationship Between Two ACGME Competencies,8 the Chronic Care Model Components, Educational Objectives, and Measures of Educational Change

PRACTICE-BASED LEARNING AND IMPROVEMENT
Residents must demonstrate the ability to investigate and evaluate their care of patients, to appraise and assimilate scientific evidence, and to continuously improve patient care based on constant self-evaluation and life-long learning. Residents are expected to develop skills and habits to be able to meet the following goals:
-Identify strengths, deficiencies, and limits in one’s knowledge and expertise;
-Set learning and improvement goals;
-Identify and perform appropriate learning activities;
-Systematically analyze practice using quality improvement methods and implement changes with the goal of practice improvement;
-Incorporate formative evaluation feedback into daily practice;
-Locate, appraise, and assimilate evidence from scientific studies related to their patients’ health problems;
-Use information technology to optimize learning;
-Participate in the education of patients, families, students, residents, and other health professionals
 
CCM component Educational Objectives “Residents will…” Measures of educational change. Number of residents…
Decision support -Improve their knowledge and understanding of guideline development and implementation, including specific guidelines used to guide practice for the population of interest -Reviewing scientific evidence and development process behind guidelines
-Appraising the literature for clinical care guidelines
-Locate and appraise scientific evidence related to questions arising in the care of patients with the chronic condition of interest using electronic literature databases and EBM strategies -Identifying and answering a clinical question that arises in the delivery of care
-Teaching their colleagues how to use evidence to improve chronic disease management
Clinical information systems -Improve their understanding of how patient registries are created, including steps for data validation prior to interpretation -Participating in teaching sessions that address registry creation, validation, and interpretation
-Improve their understanding of the practice’s performance for the condition of interest -Receiving, reviewing, and discussing at least one registry report; analyzing the practice report and outlining an evidence-based improvement recommendation
Delivery system design -Learn and practice the rapid-cycle change method (PDSA) of improvement9 in response to practice performance reports -Participating in a PDSA cycle to test a change
 
SYSTEMS-BASED PRACTICE
Residents must demonstrate an awareness of and responsiveness to the larger context and system of health care, as well as the ability to call effectively on other resources in the system to provide optimal health care. Residents are expected to:
-Work effectively in various health care delivery settings and systems relevant to their clinical specialty;
-Coordinate patient care within the health care system relevant to their clinical specialty;
-Incorporate considerations of cost awareness and risk-benefit analysis in patient and/or population-based care as appropriate;
-Advocate for quality patient care and optimal patient care systems;
-Work in inter-professional teams to enhance patient safety and improve patient care quality;
-Participate in identifying system errors and implementing potential systems solutions
 
CCM component Educational objectives “Residents will…” Measures of educational change. Number of residents…
Health system -Improve their understanding of their own health system’s support for chronic care and describe institutional policies that support the care of patients with chronic conditions -Participating in health system teaching sessions
Delivery system design -Improve their understanding of and skills in functioning as members of health care teams focusing on systematic practice improvement to achieve desired practice outcomes -Actively participating on a practice improvement (QI) team
-Periodically and systematically assess patients’ needs within, and satisfaction with, the practice -Receiving the results of the P-ACIC10 completed by patients in the residents’ practice
Self-management support -Learn and practice patient-centered self-management strategies -Learning and demonstrating self-management support strategies
Community resources -Routinely assess patients’ community support and community-based activities -Documenting assessment of patients’ use of community-based support programs
-Improve their own knowledge of community-based programs, improve the system’s tracking of community support programs, and facilitate support for patients and the patient population -Identifying relevant community resources

Seventeen of 26 (65%) participating teams completed the Education Measures desirability/feasibility survey (7 from Family Medicine programs, 7 from Internal Medicine programs, 2 from Medicine-Pediatrics programs, and 1 from a Nurse Practitioner program). Results are shown in Table 3. The six measures with mean ratings of “somewhat to highly” feasible and desirable were retained for further discussion (shown in bold). Two of the retained measures were outcome measures related to critical appraisal of the literature and answering clinical questions, activities familiar to residency training curricula. The remaining four measures were process measures not routinely a part of residency training: analyzing population reports on the quality of care provided and improving patient-centeredness attributes of the practice. The two measures that were selected for reporting across all participating teams were: (1) number of residents receiving, reviewing, and discussing at least one registry report for the practice population (PBLI competency objective under the clinical information systems component of the CCM); (2) number of residents learning and demonstrating self-management support strategies (SBP competency objective under the self-management support component of the CCM) (Table 3).

Table 3. Results of Education Measures Survey, Required and Optional Measures

Educational measurea ACGMEb CCMc Type of measure Feasibility (mean) Desirability (mean)
REQUIRED
 Number of residents receiving, reviewing, and discussing at least one registry report for the practice population (PBLI) PBLI CIS Process 4.6 4.7
 Number of residents learning (and demonstrating) self-management support strategies (SBP) SBP SMS Process 4.2 4.7
OPTIONAL
 Number of residents participating in health system teaching sessions SBP HS Process 4.3 3.7
 Number of residents participating in teaching sessions that address registry creation, validation, and interpretation PBLI CIS Process 3.9 3.2
 Number of residents analyzing the practice report and outlining an evidence-based improvement recommendation (e.g., portfolio entry) PBLI CIS Outcome 3.9 4.6
 Number of residents participating in a PDSAd9 cycle to test a change for the condition of interest PBLI DSD Process 3.9 4.2
 Number of residents actively participating on a practice improvement (QI) team over total residents in target learning population. SBP DSD Process 3.6 4.8
 Number of patients completing a P-ACICe10 in the target patient population SBP SMS Process 4.1 4.4
 Number of residents receiving the results summary of the P-ACICe SBP SMS Process 4.0 4.1
 Number of residents with satisfactory completion of a Self-Management mini-CEXf SBP SMS Outcome 3.4 4.0
 Number of residents setting self-directed learning action plans PBLI DS Process 3.5 3.9
 Number of residents completing their action plans, including teaching their colleagues how to use evidence to improve chronic disease management PBLI DS Outcome 3.6 4.1
 Number of residents reviewing scientific evidence and development process behind guidelines used in the practice PBLI DS Process 3.8 4.0
 Number of residents appraising the literature for clinical care guidelines and sharing findings with team members PBLI DS Outcome 4.0 4.0
 Number of residents identifying and answering a clinical question that arises in the delivery of care to the population of interest (and teaching others what was learned) PBLI DS Outcome 4.4 4.6
 Number of residents documenting assessment of patients’ use of community-based support programs SBP C Process 3.0 3.4
 Number of residents identifying relevant community resources that meet an important need arising in the delivery of care to the population of interest, adding resources to repository, and teaching others what was learned SBP C Outcome 2.9 3.6

N = 17, 65% response rate. Likert scale: 1 = not at all desirable/feasible, 3 = neutral, and 5 = highly desirable/feasible. Items in bold met threshold criteria for feasibility (≥4.0) and desirability (≥4.0) and were retained for further discussion. Required measures are shown first. Optional measures follow.

aFor each measure, the numerator is the count of residents completing the objective, and the denominator is the total number of residents in the target learning population (except for patients completing the P-ACIC where the denominator is the patient population)

bAccreditation Council for Graduate Medical Education competencies (PBLI is practice-based learning and improvement; SBP is systems-based practice)

cChronic Care Model components (CIS is clinical information systems, SMS is self-management support, HS is health system, DSD is delivery system design, DS is decision support, C is community resources)

dPDSA is ‘Plan-Do-Study-Act’ rapid cycle method used in quality improvement

eP-ACIC is ‘Patient assessment of chronic illness care,’ a patient survey tool

fMini-CEX is a focused clinical evaluation exercise12

National Collaborative Results

Teams participating in the national collaborative developed the measures as described above. In the fourth month of the program, these teams were asked to begin reporting on the required education measures. Only 4 of 26 and 3 of 26 participating teams reported results for the first and second required educational measures between October 2005 and October 2006, with 383 learners exposed to at least one of the curricular innovations assessed by the required education measures. The target learner population, initially defined as the residents in the clinical practice sites, evolved to include all team members in the practice sites since the concepts were new to everyone. Because both of the required measures represented new concepts in the curriculum, all teams reported a performance baseline of zero. Data are shown in Table 4.

Table 4. Team Performance on Education Measures

Definition Required measure 1 Required measure 2
Number of residents receiving, reviewing, and discussing at least one registry report for the practice population Number of learners demonstrating self-management (SM) support strategies
National California National California
Number of teams reporting 4 (18.2%) 15 (78.9%) 3 (13.6%) 15 (78.9%)
Months of curricular exposure reported (mean, range) 13.5 (12–15) 9.5 (7–11) 11 (6–15) 9.6 (6–12)
Combined total learner population 347 383 311 352
Mean % of learners exposed (SD) 80.6% (SD 5.5) 83.9% (SD 24.0) 58.6% (SD 27.2) 75.5% (SD 25.9)
Median 81.7% 96.4% 64.9% 78.9%
Range of exposure across teams 73.0–86.1% 21.4–100% 28.8–82.2% 21.4–100%

California Collaborative

The California collaborative employed the educational measures developed by the national collaborative. The majority of teams reported these measures after the first team training session, and their extended use provided more opportunity to realize their effect. Results are shown in Table 4.

Figures 1 and 2 illustrate cumulative improvement over the reporting period for 15 California collaborative teams reporting results for at least 6 months. In addition to the required measures, some teams used additional education measures. Examples include: percent of learners conducting a planned visit (12 teams, 178 learners, 60% of learners achieved), percent of learners identifying, learning, and teaching others about a clinical question (5 teams, 47 learners, 88% achieved), and percent of learners completing a Plan-Do-Study-Act cycle (9 teams, 149 learners, 95% achieved).

Figure 1. Required measure 1: percent of learners reviewing a registry report. Cumulative results for all reporting teams in the California collaborative.

Figure 2. Required measure 2: percent of learners demonstrating self-management support strategies. Cumulative results for all reporting teams in the California collaborative.

DISCUSSION

Using an iterative process in collaboration with those most impacted by the outcome, we successfully defined a set of feasible and desirable education measures at the intersection of the CCM and the ACGME Competencies that residency programs addressing curricular changes in chronic care could use to measure results. To facilitate early success in exposing learners to the new care delivery model, process rather than outcome measures were chosen for the required measures. We based this decision on our desire to minimize the additional burden to programs already undertaking extensive practice re-design at their local training sites and the relative resistance to more robust measures we detected during the development process.

As demonstrated by the low reporting rate, teams participating in the national collaborative had difficulty using educational measures on top of the other requirements of participation in the collaborative. This was less of a problem for California collaborative participants. Two observations may explain part of this difference. First, many of the California teams had previously used disease registries in their practices and did not need to overcome this significant obstacle. Second, a leading self-management program had previously been developed and championed in California,11 and many team members were familiar with the concepts. Although the development teams from the national collaborative lagged behind the implementation teams in the California collaborative, together they engaged more than 700 learners in the new curriculum. In addition, 80% of the implementation teams that reported measures used additional measures to drive the curricular change they desired.

Initially, we anticipated that residents would be the defined learner population. Teams recognized early that all members of the practice re-design team were learning new skills and should be counted as learners participating in the curriculum, including faculty. As a result, faculty development was recognized to be as important a focus as resident engagement in the new curriculum.

Our study has several limitations. Not all teams participated in the development process, although 65% of teams completed the feasibility/desirability survey. We did not track individual participation and cannot comment on differences between participants and non-participants. Nor can we detect differences between training disciplines; different measures may matter more in different learning environments or to different disciplines. Not all teams reported results of educational measures on a monthly basis, and some did not report at all. Thus, these results reflect what is possible for teams that were able to overcome barriers to curricular change and successfully implement the CCM in their academic practices. Finally, these measures are for educational use, and their impact on patient care quality has not been evaluated.

This report offers a new approach to educational assessment for the development of ACGME competencies in the setting of chronic care. This work presents a framework for measuring educational outcomes linked to use of the CCM in residency training programs. Although process measures do not assure that new skills have been mastered, the stage is set for testing and refining educational outcome measures to improve residency training in chronic illness care.

Electronic supplementary material

Below is the link to the electronic supplementary material.

ESM 1 (DOC, 32.5 kb)

Acknowledgement

The Robert Wood Johnson Foundation and the California Healthcare Foundation generously supported the academic chronic care collaboratives that served as the basis for this work.

Conflict of interest None disclosed.

References

  • 1. Bodenheimer T, Chen E, Bennett H. Confronting the growing burden of chronic disease: can the US health care workforce do the job? Health Affairs. 2009;28:64–74. doi:10.1377/hlthaff.28.1.64.
  • 2. Institute of Medicine. Crossing the quality chasm: a new health system for the twenty-first century. Washington, DC: National Academy Press; 2001.
  • 3. Bodenheimer T, Wagner EH, Grumbach K. Improving primary care for patients with chronic illness. JAMA. 2002;288:1775–1779. doi:10.1001/jama.288.14.1775.
  • 4. Bodenheimer T, Wagner EH, Grumbach K. Improving primary care for patients with chronic illness: the chronic care model, part 2. JAMA. 2002;288:1909–1914. doi:10.1001/jama.288.15.1909.
  • 5. Warm EJ, Schauer DP, Diers T, et al. The ambulatory long-block: an Accreditation Council for Graduate Medical Education (ACGME) Educational Innovations Project (EIP). J Gen Intern Med. 2008;23:921–926. doi:10.1007/s11606-008-0588-y.
  • 6. Dipiero A, Dorr DA, Kelso C, Bowen JL. Integrating systematic chronic care for diabetes into an academic general internal medicine resident-faculty practice. J Gen Intern Med. 2008;23:1749–1756. doi:10.1007/s11606-008-0751-5.
  • 7. Wagner EH, Austin BT, Davis C, Hindmarsh M, Schaefer J, Bonomi A. Improving chronic illness care: translating evidence into action. Health Affairs. 2001;20:64–78. doi:10.1377/hlthaff.20.6.64.
  • 8. ACGME Outcome Project. Available at: http://www.acgme.org/outcome/comp/compMin.asp. Accessed March 23, 2010.
  • 9. Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP. The improvement guide: a practical approach to enhancing organizational performance. 2nd ed. San Francisco: Jossey-Bass; 2009.
  • 10. Bonomi AE, Wagner EH, Glasgow RE, Von Korff M. Assessment of Chronic Illness Care (ACIC): a practical tool to measure quality improvement. Health Serv Res. 2002;37:791–820. doi:10.1111/1475-6773.00049.
  • 11. Lorig KR, Ritter P, Stewart AL, et al. Chronic disease self-management program: two-year health status and health care utilization outcomes. Med Care. 2001;39:1217–1223. doi:10.1097/00005650-200111000-00008.
  • 12. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138:476–481. doi:10.7326/0003-4819-138-6-200303180-00012.



Articles from Journal of General Internal Medicine are provided here courtesy of Society of General Internal Medicine
