Abstract
Objective
We documented the organizational costs of a depression care quality improvement (QI) effort undertaken to develop an evidence-based, Veterans Health Administration (VA)-adapted depression care model for primary care practices that performed well for patients, was sustained over time, and could be spread nationally in the VA.
Data Sources and Study Setting
Project records and surveys from three multistate VA administrative regions and seven of their primary care practices.
Study Design
Descriptive analysis.
Data Collection
We documented project time commitments and expenses for 86 clinical QI and 42 technical expert support team participants for 4 years from initial contact through care model design, Plan–Do–Study–Act cycles, and achievement of stable workloads in which models functioned as routine care. We assessed time, salary costs, and costs for conference calls, meetings, e-mails, and other activities.
Principal Findings
Over an average of 27 months, all clinics began referring patients to care managers. Clinical participants spent 1,086 hours at a cost of $84,438. Technical experts spent 2,147 hours costing $197,787. Eighty-five percent of costs derived from initial regional engagement activities and care model design.
Conclusions
Organizational costs of the QI process for depression care in a large health care system were substantial and should be accounted for when planning implementation of evidence-based depression care.
Keywords: Quality improvement, depression, primary care
Research shows major deficits in the quality of care for depression (Kerr et al. 2000; Charbonneau et al. 2003; Liu et al. 2006). Substantial evidence indicates that improved depression care and outcomes can be achieved by implementing evidence-based, cost-effective care models (Donohue and Pincus 2006; Gilbody et al. 2006; Williams et al. 2007). These models restructure how mental health care is delivered to primary care patients and require substantial change in the daily practice behavior of a range of provider types. Achieving this level of change requires the investment of organizational and clinical resources in a quality improvement (QI) process (Kilbourne et al. 2004). Most cost and cost-effectiveness analyses of QI efforts, however, focus on the patient care costs associated with the care model resulting from QI, rather than the organizational costs associated with the QI process itself (Wensing, van der Weijden, and Grol 1998; Greenhalgh et al. 2004; Grimshaw et al. 2004). The upfront organizational cost of the QI process is a key factor for decision makers to take into account when adopting a cost-effective evidence-based practice. In this paper, we report on organizational costs associated with QI for depression in the Veterans Health Administration (VA). The project's QI goals were to achieve a VA-adapted, evidence-based depression care model that performed well, producing the good patient outcomes reported in the literature; was sustained by the initial practices; and was spread to additional similar practices.
While there are few data on QI costs, researchers have documented the levels of effort required to implement evidence-based practices in health care organizations (Sharp et al. 2004; Stetler et al. 2006). Challenges in quantifying the costs associated with these efforts include the need to account for evolving implementation efforts over time, to link costs to specific activities associated with implementation, and to obtain complete and accurate data on these diverse activities. To meet these challenges, we developed novel applications of standard data collection and analytic methods. Using these methods, we report on the timelines, specific implementation activities, and associated costs for introducing evidence-based depression care models into routine care.
We carried out the study in three VA administrative regions as part of the initial stage of a project to encourage national implementation of improved depression care. The project used an evidence-based quality improvement (EBQI) approach (Wells et al. 2000; Rubenstein et al. 2002; Rubenstein et al. 2006) based on Continuous Quality Improvement principles (Berwick 1989; Shortell et al. 1995; Goldberg et al. 1998; Shortell, Bennett, and Byck 1998; Wagner, Glasgow et al. 2001) but focused specifically on adaptation and implementation of evidence-based care models from research literature rather than de novo practice redesign.
Previous EBQI efforts (Rubenstein et al. 2006) indicated that introducing evidence-based depression care models into staff model managed care organizations such as VA required adaptations of existing tools for each component of the chronic illness care model (Wagner et al. 1999; Wagner, Austin et al. 2001). These organizations have, for example, highly systematized information systems, well-developed provider educational approaches and cultures, and detailed clinical care policies. Developing adapted tools and refining them through Plan–Do–Study–Act (PDSA) cycles required levels of work time and technical expertise not easily available in typical clinical practices in these complex organizations. We therefore introduced a technical expert support team drawn from the research staffs of three VA health services research centers (Parker et al. 2008). No technical expert team members were from the practices implementing the new model. The clinical team, including clinical managers and regional and QI leaders, made the project's decisions, while the technical expert team supported the decision-making process and tool development as needed.
We know of no other study that attempts to document, in detail and over time, the costs of both technical expert support and clinical and QI management in implementing evidence-based organizational change for depression QI. Information on these costs and how they are distributed will enable better planning by clinical managers and policy makers. Our research questions focused on how QI costs changed over time, how costs were distributed between the clinical and technical expert support teams, and how much different types of activities cost overall and within each QI phase.
METHODS
Setting
The Translating Initiatives in Depression into Effective Solutions (TIDES) project was funded by the VA's Quality Enhancement Research Initiative (QUERI), which supports implementation research in the VA (Feussner, Kizer, and Demakis 2000; Rubenstein et al. 2000). The TIDES project received Institutional Review Board (IRB) approval from all participating sites to carry out and evaluate depression QI.
The national VA health care system is divided into 22 regions, or Veterans Integrated Service Networks (VISNs), that have budgetary and administrative authority over all medical facilities in the region. Within each region, most day-to-day administrative and clinical functions are carried out by a set of health care systems (usually anchored by a hospital) that include a variety of local primary care practices, the majority of which are community-based. The TIDES project targeted three multistate regional networks. Each region identified one or two practices to participate in TIDES.
The TIDES practices were in South Dakota (2), Wisconsin (1), Ohio (2), Florida (1), and Texas (1). Each practice saw between 4,600 and 14,000 primary care patients per year, was located in a rural area or small city, and had four to ten primary care clinicians.
QI Intervention
Implementation based on EBQI methods proceeded in phases marked by achievement of specified milestones by each participating region, medical center, and practice. The following stages and activities reflect the organization of this QI implementation.
1. Preparation Phase: This phase began with the first contact between the technical expert team and a region's leadership and included all activities before the design meeting. The purpose of this phase was to obtain leadership buy-in for participation in the TIDES QI program. Researchers offered participating regions an honorarium equivalent to half of a full time salary for a nurse care manager for 18 months and access to technical support for designing and implementing evidence-based depression care models. Leaders and researchers identified mental health, primary care, and administrative leadership. The first region began planning in April 2000, the second region in September 2000, and the third region in June 2001.
2. Design Phase: This phase began with each region's participation in a depression intervention design panel and included all project activities before the enrollment of the first patient under the newly designed depression care model at each practice. The goal of this phase was to adapt evidence-based care models to the VA setting. The design panel decided upon the care model's basic features (e.g., care management by telephone or in person). Regional and local practice leaders then designed the new model based on design panel specifications. The design phase was further classified into two subphases.
2a. Basic Design Subphase: This subphase began with the design panel meeting and ended with a local planning meeting at the practice level. Basic design focused on the intervention design panel itself, identification of TIDES practices, and establishment of the technical expert/clinical partnership organizational structures. Key decisions were to use Registered Nurse depression care managers (DCMs), to systematically monitor adherence to treatment and follow up by telephone, and to meet information technology needs through adaptation of the VA's computerized patient record system (CPRS). Technical expert and clinical partners together organized six workgroups, the first five of which were modeled after elements of the Chronic Illness Care Model (Wagner, Austin, and Von Korff 1996). These included (1) senior leadership, (2) provider education, (3) clinical informatics, (4) care management and patient self-management support, and (5) collaboration (Felker et al. 2006); a sixth workgroup, the faculty workgroup, focused on engaging clinical partners in papers, posters, and presentations. Workgroup membership consisted of clinical partner volunteers, research administrative staff, and two to six technical expert support team members.
2b. Practice Engagement Subphase: This subphase began with a local practice-planning meeting and ended with the enrollment of the first patient under the depression care model at each practice. The purpose of this subphase was to initiate provider education activities, informatics tools, and policy development (such as threatened suicide policies). Each region identified one nurse DCM during this subphase (Liu et al. 2007).
3. Implementation Phase: This phase began with the enrollment of the first patient in each practice in the depression care model, i.e., the beginning of the model implementation, and ended 12 months following the enrollment of the first patient. This phase represented a gradual transition to routine care with stable workloads. We divided the implementation phase into two 6-month subphases, although in reality the exact duration of each subphase differed somewhat across practices.
3a. Plan–Do–Study–Act (PDSA) Cycle Subphase: During the first 6 months after the beginning of model implementation, each participating practice tried out the depression care model, improved it, and further tailored it to local needs using iterative PDSA cycles (Langley et al. 1996).
3b. Routine Care Subphase: By the conclusion of this subphase, the second 6 months after the beginning of model implementation, all TIDES practices had the basic elements of the TIDES care model in place with a stable workload. All three DCMs reached their maximum patient panel size by Quarter 1, 2004, with 165 active patients per care manager (Liu et al. 2007).
Funding Sources for EBQI Activities
The VA's QUERI program supported technical expert support team activities. Clinical dollars and in-kind resources allocated by regional and local leadership supported the QI activities of clinical participants.
Cost Data Collection
Overview
Costs calculated for research/clinical partner implementation of evidence-based depression care models were based on data collected between April 26, 2000, and June 4, 2004. We collected data on as complete a set of participant hours and expenses as possible, including only activities directly related to QI. Expenses tracked included those related to travel, clinical informatics, leadership meetings, conference calls, development of training materials, training sessions, and e-mail communication. We included the costs of licenses for care management software. We excluded research-related costs such as those generated by research data collection and analysis or human subjects review. We also excluded costs associated with direct patient care.
Defining QI Participants
A QI participant is any individual who directly participated in the design or implementation of the depression care model, including giving or participating in educational or training sessions. Simply referring patients to the depression care model, for example, did not constitute QI participation. Technical expert support team participants were from three VA Medical Centers (VA Greater LA, VA Puget Sound, and VA Central Arkansas Healthcare Systems). Clinical QI participants included senior leaders from all three participating regions and medical centers, and local leaders from all participating practices.
Assigning Costs to Types of Intervention Activities
We assigned QI participant hours and costs to one of five activity classes (Table 1). Classes mirror EBQI workgroups other than the faculty workgroup (because faculty workgroup activities focused on research). We combined senior leadership and collaboration activities into a single class because these activities could not be reliably separated, and added a category for project coordination activities, which cut across workgroups and served to sustain them. An individual participant could carry out activities pertaining to more than one class. Only QI costs were included (e.g., DCM costs related to the EBQI implementation process were included while DCM costs associated with direct patient care were excluded). Data sources for estimates of costs and/or hours included surveys, e-mail, project schedules, meeting minutes, project cost records, and government salary information.
Table 1.
Implementation Activities
| Activity Class | Example Activities Included |
|---|---|
| Leadership and collaboration | Regional leadership design process; individual and group communication through calls and e-mails; in-person design panel and panel preparation; Senior Leader and Collaboration Workgroup activities; coordination and policy development |
| Provider education | Education Workgroup activities; preparation and production of educational materials; local practice in-person educational activities; introductory educational conferences by technical expert support team, in person or by video conference; ongoing seminars and academic detailing by clinical team |
| Clinical informatics | Clinical reminder subgroup activities; software development; software programming, licensing, and protocol development; server costs; pilot testing/PDSA cycles of informatics products |
| Care management and patient self-management support | Care Management and Patient Self-Management Support Workgroup activities; training for care manager role; marketing activities targeting practices and clinicians; involvement in PDSA cycles of care management |
| Project coordination | Organizing meetings, tracking timelines, problem-solving implementation barriers |
PDSA, Plan–Do–Study–Act.
Email Data Collection and Review
We collected and archived all TIDES e-mail from the TIDES listserv and from individual workgroups. We excluded short e-mails (fewer than 10 words), duplicates, e-mails unrelated to the project, and e-mails not generated by a research/clinical participant. One of three researchers then reviewed each e-mail (excluding attachments), using an explicit review form and an electronic word search of subject lines, to assign it to one of the five activity classes, a timeline phase, and either the technical expert support team or the clinical team. A second reviewer re-reviewed an initial set of 20 e-mails in each activity class. For the few disagreements (less than 5 percent), the first author (C. F. L.) made the final classification decision, and the other reviewers applied the decision rule in their subsequent reviews.
Data Collection from Project Records
We collected meeting schedules and minutes to measure time spent by each participant. For conference calls and in-person conferences, we assigned 90 percent of the scheduled time per attendee, rather than 100 percent, to account for attendees not being present for the full scheduled time. For in-person meetings, we allocated an additional 8 hours of travel time above the recorded meeting time for individuals who traveled by plane to a conference or meeting site, and 4 hours for individuals who traveled by car.
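As a concrete illustration, the time-allocation rules above can be sketched in a few lines of code (a hypothetical sketch, not code used in the study; the function name and structure are ours):

```python
def meeting_hours(scheduled_hours, travel=None):
    """Hours assigned to one attendee for one meeting or conference call.

    Applies the rules described above: 90 percent of scheduled time,
    plus a fixed travel allowance of 8 hours by plane or 4 hours by car.
    """
    hours = 0.9 * scheduled_hours
    if travel == "plane":
        hours += 8.0
    elif travel == "car":
        hours += 4.0
    return hours

# Under these rules, a 2-hour conference call is credited as 1.8 hours;
# an 8-hour in-person meeting reached by plane as 0.9 * 8 + 8 = 15.2 hours.
```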
Survey Data Collection
We surveyed relevant participants on time spent setting up and maintaining clinical informatics support for the program and time spent developing educational materials and sessions. Surveyed participants included five members of the informatics workgroup (three clinical partner informatics specialists and two technical experts) and three members of the provider education workgroup (two clinical partner participants and one technical expert). One clinical informatics specialist did not respond.
We surveyed 13 members of the technical expert support team to estimate the average time required per word for reading and composing e-mails. These participants recorded time for all project e-mails read or composed during 1 week. Based on the 387 e-mails captured by the survey, we estimated median rates for reading (62 words/minute) and composing (21 words/minute). To calculate the time spent on e-mail, we counted the number of words in each e-mail and assigned a reading and a composing time based on these medians.
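The word-count-to-time conversion can be illustrated as follows (a hypothetical sketch under the stated survey medians; the function names are ours, not the study's):

```python
READ_WPM = 62      # median reading rate (words/minute) from the survey
COMPOSE_WPM = 21   # median composing rate (words/minute) from the survey

def reading_minutes(word_count):
    """Minutes assigned for reading an e-mail of the given length."""
    return word_count / READ_WPM

def composing_minutes(word_count):
    """Minutes assigned for composing an e-mail of the given length."""
    return word_count / COMPOSE_WPM

# A 200-word e-mail is assigned 200/62 (about 3.2) minutes of reading
# time and 200/21 (about 9.5) minutes of composing time.
```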
Salary Data Collection
All participants were VA employees. We determined each participant's government service level, job category (e.g., physician, nurse, etc.), and average hourly salary (Smith and Velez 2004), including fringe benefits.
Cost Estimation
We estimated all costs in 2003 dollars. We assigned costs to e-mail activities by multiplying time spent by hourly salary. We assigned costs for meetings similarly, but used 90 percent of meeting/conference call hours as the basis and accounted for transportation and accommodation based on budgeted expenses. We bounded our estimates by calculating ranges. Cost ranges for conference calls and in-person meetings span 80 percent to 100 percent of measured participant hours. Ranges for travel costs other than time span 90 percent to 110 percent of trip budgets. Ranges for e-mail costs span the time-per-word estimates of the 25th percentile (quicker e-mail reading/composing) and 75th percentile (slower) survey respondents. Costs for software licenses were actual costs documented by purchase orders.
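These sensitivity bounds can be made concrete with a short sketch (hypothetical code, ours rather than the study's; `hourly_salary` and `trip_budget` are illustrative inputs):

```python
def meeting_cost_range(measured_hours, hourly_salary):
    """(low, point, high) cost for meeting time, valued at the hourly
    salary: 80%, 90%, and 100% of measured participant hours."""
    return (0.8 * measured_hours * hourly_salary,
            0.9 * measured_hours * hourly_salary,
            1.0 * measured_hours * hourly_salary)

def travel_cost_range(trip_budget):
    """(low, high) bound for non-time travel costs: 90% to 110% of budget."""
    return 0.9 * trip_budget, 1.1 * trip_budget

# Example: 10 measured hours at $50/hour gives a point estimate of $450
# bounded by $400 and $500; a $1,000 trip budget is bounded by $900-$1,100.
```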
Analysis
Analyses are descriptive. We calculated the number of persons involved in each implementation phase (person counts) and the number of hours spent (person hours), with separate estimates for clinical QI and technical expert support team activities. Separate reporting for these two types of activities reflects the separate funding sources used for each (i.e., regional and local clinical resources versus external implementation research resources).
RESULTS
Time Required for Preparation and Design Phases
Timelines for preparation, design (basic design and practice engagement subphases), and implementation (PDSA cycle and routine care subphases) across regions and practices are shown in Figure 1. The design panel meeting was held jointly for Regions A and B. Overall, preparation and design phases took an average of 27 months (range 23–34 months), with 9 months (range 6–11 months) for preparation and 17 months (range 12–30 months) for design. Within the design phase, the basic design subphase averaged 3 months (range 1–7 months) and the practice engagement subphase averaged 14 months (range 6–21 months). The design phase shortened sequentially, from 20 months in the first region to enter preparation (Region A), to 19 months in the second (Region B), and 12 months in the third (Region C).
Figure 1.
Phases and Timeline of Implementation Process by Region and Practice
QI Activities Completed by Participants
The QI participants carried out 197 regular conference calls, two leadership and design panel meetings (one for Regions A and B and one for Region C), one DCM training session, and 10 provider education sessions during our data collection window. The project generated 31,508 e-mails; of these, 18,253 e-mails generated by 125 individuals met inclusion criteria. Twenty-nine QI participants made 59 trips to 11 events that met our inclusion criteria.
QI Costs
Table 2 summarizes the total person counts, person hours, and cost estimates for the clinical partners and the technical expert support team. Over the 4-year period, 128 persons contributed an estimated 3,233 hours (range 2,738–3,813) for a total estimated cost of $282,225 (range $257,454–310,852). The technical expert support team accounted for 66 percent of the total hours spent and 70 percent of the total costs. Among clinical participants, senior leaders from regions and medical centers accounted for 59 percent of person counts, 84 percent of person hours, and 79 percent of total costs, with the balance contributed by the local practices.
Table 2.
Person Counts, Estimated Person Hours and Costs*
| Participants | % of Participants | Person Counts | % of Hours | Person Hours | % of Costs | Costs ($) |
|---|---|---|---|---|---|---|
| Technical expert team (M.D. and Ph.D. investigators; staff) | 33 | 42 | 66 | 2,147 (1,786–2,583) | 70 | 197,787 (181,778–216,951) |
| Clinical partners (senior leaders from regions and medical centers; local leaders and clinical staff; care managers) | 67 | 86 | 34 | 1,086 (952–1,231) | 30 | 84,438 (75,676–93,901) |
| All participants | 100 | 128 | 100 | 3,233 (2,738–3,813) | 100 | 282,225 (257,454–310,852) |
*Ranges in parentheses are based on sensitivity analyses.
TIDES, Translating Initiatives in Depression into Effective Solutions.
Table 3 shows the proportion of total personnel hours and costs accounted for by each activity class and each sequential implementation phase. As expected, both technical expert support team and clinical participants contributed most of their hours (76 percent [1,639 hours/2,147 hours] for technical experts and 68 percent [742 hours/1,086 hours] for clinical participants) during the preparation and design phases, with decreases as the project moved toward routine care. During preparation and design, leadership and collaboration activities accounted for the greatest number of hours (49 percent of clinical participant hours and 29 percent of technical expert participant hours) followed by provider education (16 percent clinical, 29 percent expert) and clinical informatics (19 percent clinical, 19 percent expert). Clinical informatics (including software costs) accounted for the largest single share (41 percent) of technical expert team costs, while leadership and collaboration accounted for the largest share (61 percent) of clinical participant costs.
Table 3.
Proportion of Person Hours and Costs by Implementation Phase and Activity Class for Technical Expert and Clinical Teams
Person hours (percent of each phase's total):

| Activity | Clinical Team (n=86): Preparation and Design (742 h) | Clinical: PDSA Cycle (179 h) | Clinical: Routine Care (164 h) | Technical Expert Team (n=42): Preparation and Design (1,639 h) | Expert: PDSA Cycle (278 h) | Expert: Routine Care (230 h) |
|---|---|---|---|---|---|---|
| Leadership and collaboration | 49 | 18 | 25 | 29 | 10 | 7 |
| Provider education | 16 | 6 | 0 | 25 | 1 | 3 |
| Clinical informatics | 19 | 6 | 3 | 19 | 16 | 10 |
| Care management and patient self-management support | 15 | 70 | 71 | 18 | 56 | 59 |
| Project coordination | 2 | 1 | 1 | 9 | 18 | 21 |
| Total percent | 100 | 100 | 100 | 100 | 100 | 100 |

Costs (percent of each phase's total):

| Activity | Clinical Team: Preparation and Design ($61,795) | Clinical: PDSA Cycle ($11,860) | Clinical: Routine Care ($10,784) | Technical Expert Team: Preparation and Design ($177,193) | Expert: PDSA Cycle ($11,287) | Expert: Routine Care ($9,307) |
|---|---|---|---|---|---|---|
| Leadership and collaboration | 61 | 21 | 29 | 24 | 11 | 10 |
| Provider education | 15 | 8 | 0 | 25 | 2 | 5 |
| Clinical informatics | 9 | 4 | 2 | 41 | 14 | 8 |
| Care management and patient self-management support | 14 | 66 | 68 | 7 | 55 | 57 |
| Project coordination | 2 | 1 | 1 | 3 | 18 | 20 |
| Total percent | 100 | 100 | 100 | 100 | 100 | 100 |
During the implementation phase, activities supporting care management accounted for the most person hours for both technical experts (56 percent for the PDSA cycle subphase and 59 percent for the routine care subphase) and clinical (70 percent for the PDSA cycle subphase and 71 percent for the routine care subphase) participants. While the final, routine care subphase consumed the fewest clinical and technical expert hours, with hours for provider education decreasing to zero for the clinical team, this subphase still required 38 hours per month from the technical experts and 27 hours per month from the clinical team. The trends for costs closely paralleled those reported above for personnel hours.
Among all types of activities, in-person meetings and conference calls consumed the most resources in terms of both person hours and costs for clinical participants (905 hours and $75,861) and technical experts (1,413 hours and $103,100). Most in-person meetings occurred during the preparation and design phases, while conference calls accounted for the majority of personnel hours and costs during implementation. This was true for both clinical (302 hours and $19,799) and technical expert (332 hours and $13,496) participants. Finally, e-mail communication consumed 481 hours and $20,441 for the technical expert team and 73 hours and $4,512 for the clinical team.
DISCUSSION
We documented organizational costs associated with the initial stages of a QI process aimed at paving the way for national implementation of evidence-based depression care models in the VA. Because we incorporated empirical data on activities such as e-mailing and conference calls that are not typically assessed, and identified costs beginning with the earliest QI phases, our estimates are likely to be more comprehensive than those reported in previous QI cost studies. We also developed and applied an innovative framework for reporting QI costs over time and across activities. The costs we report and our framework for reporting them provide important insights for future QI planning and research.
Costs are most meaningful when considered in terms of what the expenditures have achieved. The EBQI process upon which this implementation was based was intended to adapt research-tested depression care models and tools to the VA context. The intended outcomes of this process included the development of an adapted model that performed well for patients and could be sustained and spread.
In terms of model performance goals, the outcomes of patients receiving TIDES care management were similar to the outcomes of VA patients randomized to intervention arms of other collaborative care effectiveness studies (Gilbody et al. 2006; Donohue and Pincus 2007; Williams et al. 2007). For example, among patients referred to DCMs, 82 percent were treated for depression in primary care and 74 percent stayed on medication; 90 percent of primary care patients and 50 percent of mental health patients had clinically significant reductions in depressive symptomatology (PHQ-9 scores <10) at 6 months (Liu et al. 2007).
The EBQI process, and the costs it entailed, also achieved intended results in terms of sustainability and spread. TIDES care models endured (i.e., continuously cared for patients) at all original sites for over 2 years and remained active in five of the seven sites in 2008, a span of over 5 years. In terms of spread goals, by 2006 the TIDES model was active in more than 50 VA practices. TIDES became one of the core models for VA's national mental health/primary care integration initiative, which began funding local sites in 2006 and was translated into policy in 2008. The TIDES model continues to spread based on national priorities.
Total costs for reaching the routine use of TIDES depression care models in two to three test practices in each of the three administrative regions amounted to almost $100,000 per region. The vast majority (85 percent) of QI costs were incurred during preparation and design phases. The meaning of these costs, however, depends significantly on context. QI in a large, highly systematized organization like the VA may require more upfront leadership and design than in a small independent practice. In addition, project goals of preparing for regional and national implementation may have placed especially heavy demands on care model design.
For example, while the VA's electronic medical record extensively supports QI (Asch et al. 2004), it is also a barrier. All basic practice activities in VA connect to the medical record; for a care model to become routine, it must be integrated into VA's medical record system. Yet, internal VA software development is constrained by ongoing heavy programming and administrative demands. Informatics development accounted for a third (33 percent) of all preparation and design costs. Integration of the program into standard VA policies and culture was likewise costly, reflected in leadership and collaboration costs accounting for another third (34 percent) of these costs. Once in place, the array of standardized features built into the new care model may have supported its sustainability and spread.
The decreasing time required for care model implementation in sequentially involved regions and continued spread of the model provide cautious optimism that the early design investment may have had downstream advantages. Future research should determine which QI organizational costs are one-time costs, and which are still incurred as new practices implement the adapted model.
Literature on depression care improvement has focused extensively on cost-effectiveness analysis, which compares treatment costs and outcomes at the patient level (Gilbody et al. 2006; Donohue and Pincus 2007). Cost effectiveness is a critical measure of whether depression care models warrant widespread implementation. The organizational costs of QI, however, are seldom reported. In a previous, similar depression QI project (Rubenstein et al. 1995, 2002), Kaiser Permanente allocated $166,503 for QI design and implementation in three practices. The Institute for Healthcare Improvement sponsored Breakthrough Series QI collaboratives on various conditions (Wagner, Glasgow et al. 2001). Collaboratives used expert faculty, meetings (learning sessions), and e-mail to support QI teams from multiple organizations. Preliminary estimates of costs for these programs (Keeler 2004) from an external evaluation of three collaboratives on congestive heart failure or diabetes (Cretin, Shortell, and Keeler 2004) ranged from $81,000 to $148,000 per organization. In a similar depression collaborative, six privately funded organizations paid $12,500 each to participate along with 11 publicly funded organizations (payments not reported) (Meredith et al. 2006). These payments can be thought of as representing (though not measuring) the costs of technical support in the collaborative approach. While differences in design and reporting methods preclude direct comparisons, these studies and our results document the significant level of organizational costs for QI, and the need for better methods for understanding and planning for them.
We documented continued demands on both the clinical and technical expert teams through the end of data collection. Although costs incurred once the models reached routine care were a small proportion of total costs (7 percent), the activities they reflected may have been crucial. These activities included support for quarterly quality reports on care management, retraining clinical staff due to turnover and spread, and support for ongoing DCM education. Clearly, responsibility for these activities must be transitioned away from QI participants, especially external technical experts, as a successful QI project moves into routine care. Future efforts to move research evidence into practice should explicitly plan for such hand-offs.
The QI model used in this project emphasized the technical expert support team as a method of leveraging clinical participant expertise and decision making. Our findings show that, consistent with our efforts to minimize clinical participants’ burden, the technical expert team accounted for 70 percent of the total labor and costs, although they represented only 33 percent of total QI participants.
TIDES used research team members as technical experts to support implementation. Implementing research-based care models for depression and other conditions (Institute of Medicine Committee on Quality of Health Care in America 2001) often requires in-depth knowledge of both the scientific literature and the clinical organization structure. An increasing number of large health care organizations, including Group Health, British health trusts, Kaiser Permanente, Harvard/Pilgrim, the VA, and practice networks (Unutzer et al. 2002; Stange 2005; Nutting et al. 2007), embed health services researchers as technical experts within their organizations (Lomas 2005). The economics of embedded researchers supporting QI efforts should be further investigated.
This study has limitations. First, with no direct access to telephone records or to e-mails not stored in e-mail archives, we may have underestimated participant time costs. Second, costs reported here are most applicable to early VA adopter practices, and may differ from costs experienced by later adopters or in other systems (Rogers 2003). Third, although the costs of evaluation and research-related activities were excluded, delays due to IRB processes could have affected the length of the practice engagement subphase. Fourth, costs reported here reflect the QI goals, methods, and settings assessed, and are likely to differ in other contexts.
In summary, this study is one of the few to assess the organizational costs of a QI process, and we know of no other study of the organizational costs of QI for depression care. We found that QI consumed significant resources, but resulted in an acceptable care model that performed well for patients and demonstrated the ability to sustain and spread. The framework we developed for reporting organizational costs comprehensively identified where and when these costs accrued, providing insights into the QI process. This approach can provide useful information on QI costs to organizations, researchers, and policy makers. A focus on QI costs will ultimately support better planning for QI and the development of increasingly economical but effective QI approaches.
Acknowledgments
Joint Acknowledgement/Disclosure Statement: Funding provided by Health Services Research and Development, Department of Veterans Affairs (grant numbers MNT-02-209 and MNT-01-027).
We would like to acknowledge the support and assistance from TIDES/WAVES/COVES project research staff in submitting detailed project records and contributing their comments to this paper. In addition, we would like to acknowledge the participation and support from our three VISN partners (VISNs 10, 16, and 23).
There are no contractual rights to review the manuscript before submission, but there is a requirement that Health Services Research and Development, Department of Veterans Affairs, be given a copy of the accepted manuscript before publication. The views expressed herein are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs and other affiliated Institutions.
Disclosures: None.
Supporting Information
Additional supporting information may be found in the online version of this article:
Appendix SA1: Author Matrix.
Appendix SA2: Other Contributions.
Please note: Wiley-Blackwell is not responsible for the content or functionality of any supporting materials supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.
REFERENCES
- Asch S M, McGlynn E A, Hogan M M, Hayward R A, Shekelle P, Rubenstein L, Keesey J, Adams J, Kerr E A. Comparison of Quality of Care for Patients in the Veterans Health Administration and Patients in a National Sample. Annals of Internal Medicine. 2004;141(12):938–45. doi: 10.7326/0003-4819-141-12-200412210-00010.
- Berwick D M. Continuous Improvement as an Ideal in Health Care. New England Journal of Medicine. 1989;320(1):53–6. doi: 10.1056/NEJM198901053200110.
- Charbonneau A, Rosen A K, Ash A S, Owen R R, Kader B, Spiro A, Hankin C, Herz L, Pugh M J, Kazis L, Miller D R, Berlowitz D R. Measuring the Quality of Depression Care in a Large Integrated Health System. Medical Care. 2003;41(5):669–80. doi: 10.1097/01.MLR.0000062920.51692.B4.
- Cretin S, Shortell S M, Keeler E B. An Evaluation of Collaborative Interventions to Improve Chronic Illness Care: Framework and Study Design. Evaluation Review. 2004;28(1):28–51. doi: 10.1177/0193841X03256298.
- Donohue J M, Pincus H A. Reducing the Societal Burden of Depression: A Review of Economic Costs, Quality of Care and Effects of Treatment. Pharmacoeconomics. 2007;25(1):7–24. doi: 10.2165/00019053-200725010-00003.
- Felker B, Rubenstein L, Bonner L, Yano E, Parker L, Worley L, Sherman S, Ober S, Chaney E. Developing Effective Collaboration between Primary Care and Mental Health Providers. Primary Care Companion to The Journal of Clinical Psychiatry. 2006;8(1):12–6. doi: 10.4088/pcc.v08n0102.
- Feussner J R, Kizer K W, Demakis J G. The Quality Enhancement Research Initiative (QUERI): From Evidence to Action. Medical Care. 2000;38(6 suppl 1):I1–I6. doi: 10.1097/00005650-200006001-00001.
- Gilbody S, Bower P, Fletcher J, Richards D, Sutton A J. Collaborative Care for Depression: A Cumulative Meta-Analysis and Review of Longer-Term Outcomes. Archives of Internal Medicine. 2006;166(21):2314–21. doi: 10.1001/archinte.166.21.2314.
- Goldberg H I, Wagner E H, Fihn S D, Martin D P, Horowitz C R, Christensen D B, Cheadle A D, Diehr P, Simon G. A Randomized Controlled Trial of CQI Teams and Academic Detailing: Can They Alter Compliance with Guidelines? Joint Commission Journal of Quality Improvement. 1998;24(3):130–42. doi: 10.1016/s1070-3241(16)30367-4.
- Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of Innovations in Service Organizations: Systematic Review and Recommendations. Milbank Quarterly. 2004;82(4):581–629. doi: 10.1111/j.0887-378X.2004.00325.x.
- Grimshaw J M, Thomas R E, MacLennan G, Fraser C, Ramsay C R, Vale L, Whitty P, Eccles M P, Matowe L, Shirran L, Wensing M, Dijkstra R, Donaldson C. Effectiveness and Efficiency of Guideline Dissemination and Implementation Strategies. Health Technology Assessment. 2004;8(6):1–72. doi: 10.3310/hta8060.
- Institute of Medicine Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.
- Keeler E. Improving Chronic Care Evaluation. Seattle, WA: Congress of Improving Chronic Care: Innovations in Research and Practice; 2004.
- Kerr E A, McGlynn E A, Van Vorst K A, Wickstrom S L. Measuring Antidepressant Prescribing Practice in a Health Care System Using Administrative Data: Implications for Quality Measurement and Improvement. Joint Commission Journal of Quality Improvement. 2000;26(4):203–16. doi: 10.1016/s1070-3241(00)26015-x.
- Kilbourne A M, Schulberg H C, Post E P, Rollman B L, Belnap B H, Pincus H A. Translating Evidence-Based Depression Management Services to Community-Based Primary Care Practices. Milbank Quarterly. 2004;82(4):631–59. doi: 10.1111/j.0887-378X.2004.00326.x.
- Langley G, Nolan K, Nolan T, Norman C, Provost L. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. New York: Jossey-Bass Inc; 1996.
- Liu C F, Campbell D G, Chaney E F, Li Y F, McDonell M, Fihn S D. Depression Diagnosis and Antidepressant Treatment among Depressed VA Primary Care Patients. Administration and Policy in Mental Health and Mental Health Services Research. 2006;33:331–41. doi: 10.1007/s10488-006-0043-5.
- Liu C F, Fortney J C, Vivell S, Vollen K, Raney W N, Revay B, Garcia-Maldonado M, Pyne J, Rubenstein L V, Chaney E. Time Allocation and Caseload Capacity in Telephone Depression Care Management. American Journal of Managed Care. 2007;13(12):652–60.
- Lomas J. Using Research to Inform Healthcare Managers' and Policy Makers' Questions: From Summative to Interpretive Synthesis. Healthcare Policy. 2005;1(1):55–71.
- Meredith L S, Mendel P, Pearson M, Wu S Y, Joyce G, Straus J B, Ryan G, Keeler E, Unutzer J. Implementation and Maintenance of Quality Improvement for Treating Depression in Primary Care. Psychiatric Services. 2006;57(1):48–55. doi: 10.1176/appi.ps.57.1.48.
- Nutting P A, Gallagher K M, Riley K, White S, Dietrich A J, Dickinson W P. Implementing a Depression Improvement Intervention in Five Health Care Organizations: Experience from the RESPECT-Depression Trial. Administration and Policy in Mental Health. 2007;34(2):127–37. doi: 10.1007/s10488-006-0090-y.
- Parker L E, Kirchner J E, Bonner L M, Fickel J J, Ritchie M J, Simon C E, Yano E M. Creating a Quality Improvement Dialogue: Utilizing Knowledge from Frontline Staff, Managers, and Experts to Foster Health Care Quality Improvement. Qualitative Health Research. 2008, in press. doi: 10.1177/1049732308329481.
- Rogers E M. Diffusion of Innovations. New York: The Free Press; 2003.
- Rubenstein L, McCoy J, Cope D, Barrett P, Hirsch S, Messer K, Young R. Improving Patient Quality of Life with Feedback to Physicians about Functional Status. Journal of General Internal Medicine. 1995;10(11):607–14. doi: 10.1007/BF02602744.
- Rubenstein L V, Meredith L S, Parker L E, Gordon N P, Hickey S C, Oken C, Lee M L. Impacts of Evidence-Based Quality Improvement on Depression in Primary Care: A Randomized Experiment. Journal of General Internal Medicine. 2006;21(10):1027–35. doi: 10.1111/j.1525-1497.2006.00549.x.
- Rubenstein L V, Mittman B S, Yano E M, Mulrow C D. From Understanding Health Care Provider Behavior to Improving Health Care: The QUERI Framework for Quality Improvement. Medical Care. 2000;38(6 suppl 1):I129–41.
- Rubenstein L V, Parker L E, Meredith L S, Altschuler A, dePillis E, Hernandez J, Gordon N P. Understanding Team-Based Quality Improvement for Depression in Primary Care. Health Services Research. 2002;37(4):1009–29. doi: 10.1034/j.1600-0560.2002.63.x.
- Sharp N D, Pineros S L, Hsu C, Starks H, Sales A E. A Qualitative Study to Identify Barriers and Facilitators to Implementation of Pilot Interventions in the Veterans Health Administration (VHA) Northwest Network. Worldviews on Evidence Based Nursing. 2004;1(2):129–39. doi: 10.1111/j.1741-6787.2004.04023.x.
- Shortell S M, Bennett C L, Byck G R. Assessing the Impact of Continuous Quality Improvement on Clinical Practice: What It Will Take to Accelerate Progress. Milbank Quarterly. 1998;76(4):593–624. doi: 10.1111/1468-0009.00107.
- Shortell S M, O'Brien J L, Carman J M, Foster R W, Hughes E F, Boerstler H, O'Connor E J. Assessing the Impact of Continuous Quality Improvement/Total Quality Management: Concept versus Implementation. Health Services Research. 1995;30(2):377–401.
- Smith M W, Velez J P. A Guide to Estimate Wages of VA Employees. HERC Technical Report #12. Department of Veterans Affairs, Health Services Research & Development Service, Health Economics Resource Center.
- Stange K C. In This Issue: Trade-Offs, Time Use, Depression Care. Annals of Family Medicine. 2005;3:482–3.
- Stetler C B, Legro M W, Rycroft-Malone J, Bowman C, Curran G, Guihan M, Hagedorn H, Pineros S, Wallace C M. Role of 'External Facilitation' in Implementation of Research Findings: A Qualitative Evaluation of Facilitation Experiences in the Veterans Health Administration. Implementation Science. 2006;1:23. doi: 10.1186/1748-5908-1-23.
- Unutzer J, Katon W, Callahan C M, Williams J W, Jr., Hunkeler E, Harpole L, Hoffing M, Della Penna R D, Noel P H, Lin E H, Arean P A, Hegel M T, Tang L, Belin T R, Oishi S, Langston C. Collaborative Care Management of Late-Life Depression in the Primary Care Setting: A Randomized Controlled Trial. Journal of the American Medical Association. 2002;288(22):2836–45. doi: 10.1001/jama.288.22.2836.
- Wagner E H, Austin B T, Davis C, Hindmarsh M, Schaeffer J, Bonomi A. Improving Chronic Illness Care: Translating Evidence into Action. Health Affairs (Millwood). 2001;20(6):64–78. doi: 10.1377/hlthaff.20.6.64.
- Wagner E H, Austin B T, Von Korff M. Improving Outcomes in Chronic Illness. Managed Care Quarterly. 1996;4(2):12–25.
- Wagner E H, Davis C, Schaefer C J, Von Korff M, Austin B. A Survey of Leading Chronic Disease Management Programs: Are They Consistent with the Literature? Managed Care Quarterly. 1999;7(3):56–66.
- Wagner E H, Glasgow R E, Davis C, Bonomi A E, Provost L, McCulloch D, Carver P, Sixta C. Quality Improvement in Chronic Illness Care: A Collaborative Approach. Joint Commission Journal of Quality and Safety. 2001;27(2):63–80. doi: 10.1016/s1070-3241(01)27007-2.
- Wells K B, Sherbourne C, Schoenbaum M, Duan N, Meredith L, Unutzer J, Miranda J, Carney M F, Rubenstein L V. Impact of Disseminating Quality Improvement Programs for Depression in Managed Primary Care: A Randomized Controlled Trial. Journal of the American Medical Association. 2000;283(2):212–20. doi: 10.1001/jama.283.2.212.
- Wensing M, van der Weijden T, Grol R. Implementing Guidelines and Innovations in General Practice: Which Interventions Are Effective? British Journal of General Practice. 1998;48(427):991–7.
- Williams J W, Jr., Gerrity M, Holsinger T, Dobscha S, Gaynes B, Dietrich A. Systematic Review of Multifaceted Interventions to Improve Depression Care. General Hospital Psychiatry. 2007;29(2):91–116. doi: 10.1016/j.genhosppsych.2006.12.003.