Abstract
Objective
Illustrate the value of a strategy for measuring the costs and resources consumed during the implementation process, over and above the costs of the intervention itself, in the context of a two-arm randomized controlled trial.
Methods
Counties in California and Ohio (sites) were invited to implement Multidimensional Treatment Foster Care (MTFC), an alternative to congregate care for youth. Participating sites (n=53) were randomized to one of two implementation conditions: (1) Community Development Teams (CDT), in which sites share information and move through the implementation process as a cohort facilitated by an MTFC purveyor, or (2) Individual Implementation (IND; “as usual”), in which sites work individually with the MTFC purveyor. Implementation was monitored using the Stages of Implementation Completion (SIC), a measure of observable implementation activities developed as part of the trial to segment the implementation process into 8 stages. Resource data gathered from the implementation purveyors and site participants were used to map costs onto each of the 8 stages, generating total cost measures stratified by type of resource and stage of implementation for each study arm.
Results
The SIC provided a feasible costing template for mapping costs onto observable activities and for examining important differences in implementation strategies for an evidence-based practice. The average total implementation cost prior to program start-up was $133,106 for CDT and $118,699 for IND. While CDT cost more in a number of stages, it resulted in fewer county staff hours and shorter mean times to implementation than IND. In cases where rapidity of implementation or reduced staff time required for implementation is valued, CDT would be the preferable implementation approach.
Conclusions
The SIC is a useful tool for determining implementation resources needed for new evidence-based practice programs for youth and particularly for comparing different implementation strategies that might be tried in pilot programs.
Keywords: Implementation, Cost, SIC, Multidimensional Treatment Foster Care (MTFC), sunk cost
1. Introduction
Over the last decade, there has been an increased effort to implement evidence-based practices (EBPs) in real-world community settings (Horwitz & Landsverk, 2010). Although many interventions are developed successfully in randomized controlled trials, there remains a gap in the routine dissemination of these models, in part because the interventions developed might not fit community needs or capabilities (Insel, 2011). When agencies, states, or other entities decide to implement a new practice, they must consider the costs of delivering that practice along with the costs associated with the implementation process itself. Implementation costs depend on both the program being implemented and the implementation process being employed (Proctor, Landsverk, Aarons, Chambers, Glisson, & Mittman, 2009). Important to communities considering an EBP is an understanding of which aspects of the implementation process are necessary for program success, and what resources are necessary to complete them. Although leading theories and frameworks conceptualize implementation process costs as an important factor (Proctor, Silmere, Raghavan, Hovmand, Aarons, Bunger, Griffey, & Hensley, 2011; Damschroder, Aron, Keith, Kirsh, Alexander, & Lowery, 2009), such costs remain an understudied aspect of implementation science (Ginexi & Hilton, 2006).
One reason that implementation costs might not be routinely examined in relation to implementation procedures is the lack of standard measurement. Liu, Rubenstein, Kirchner, Fortney, Perkins, Ober, Pyne, and Chaney (2009) examined the organizational cost of a depression care quality improvement intervention in the VA system and demonstrated the substantial time and costs that went into implementation procedures and that were not accounted for by the expense of the intervention itself. Although a thorough analysis was conducted, the authors reported that the outcomes were specific to that one organization and context, and the results were not meant to be generalized to other settings. This study highlights the importance of determining implementation costs and the potential for agencies to underestimate the resources necessary for intervention start-up.
There could be great value in using standardized methods for estimating implementation costs; most directly, decision makers who are responsible for determining the viability and feasibility of adopting new practices would benefit from the ability to generalize across settings. While some decision makers might underestimate the amount of resources needed for implementation, others might overestimate resource needs and thereby forgo adopting practices that could benefit their communities. Ritzwoller and colleagues (2009) argued for the need for standardized methods of analyzing cost data for behavioral medicine implementation. They suggested that this gap in knowledge might play a large role in why new behavioral medicine interventions fail to translate from research to practice. They outlined a prospective strategy for data collection and suggested the need for sensitivity analyses to help with the generalizability of cost outcomes to multiple settings and contexts.
The current study extends the limited literature on implementation costs by providing a method for estimating costs for one child and adolescent mental health EBP, Multidimensional Treatment Foster Care (MTFC; Chamberlain, 1998). MTFC is an alternative to congregate care for youth referred for out-of-home placement. Backed by multiple randomized trials (Leve, Chamberlain, & Reid, 2005; Chamberlain, Leve, & DeGarmo, 2007; Eddy, Smith, Brown, & Reid, 2005; Chamberlain & Reid, 1998) and demonstrated cost-effectiveness (Aos, Phipps, Barnoski, & Leib, 2001), MTFC has been recognized as a Blueprint for Violence Prevention program (Elliott, 1998), as a model program by the Surgeon General (US Dept. of Health, 2000a; US Dept. of Health, 2000b), and as a top-tier program by the Coalition for Evidence-Based Policy (2009). As part of a large randomized implementation trial of MTFC, counties in California and Ohio were randomized between two implementation conditions to evaluate the effectiveness of each strategy in helping non-early adopting counties successfully implement and sustain a new MTFC program. To evaluate outcomes, a method for measuring implementation success or failure was developed: the Stages of Implementation Completion (SIC; Saldana, Chamberlain, Wang, & Brown, 2011). The SIC is a tool for examining implementation activities and provides a method of tracking the amount of time taken to complete activities (i.e., duration), as well as which of the recommended activities are completed (i.e., proportion). The SIC was developed as part of the trial and defines 8 stages that reside within three phases of implementation that are well accepted in the implementation science literature: Pre-implementation, Implementation, and Sustainability.
The current study examined the potential of the SIC to serve as a template for mapping implementation costs. While the ongoing nature of the randomized trial does not yet allow for a cost-effectiveness analysis of the implementation strategies, this paper demonstrates the ability to assess different levels of cost at different points in the implementation process depending on the implementation strategy used, despite the intervention itself being the same. The primary aims of this paper are to (1) present a structured approach to informing decision making about implementing MTFC; (2) describe the implementation phase-related costs associated with MTFC that are likely to be encountered under usual conditions of implementation; and (3) compare marginal cost increases, if any, for using the Community Development Team (CDT) implementation strategy.
2. Method
The current cost study is part of a larger randomized implementation trial. Although the design and procedure for this study have been described in full detail elsewhere (Chamberlain, Brown, Saldana, Reid, Wang, Marsenich, Sosna, Padgett, & Bouwman, 2008; Wang, Saldana, Brown, & Chamberlain, 2010; Chamberlain, Saldana, Brown, & Leve, 2010), a brief background of the study is described, followed by those processes that were included in the costing procedures. All study procedures were approved by the Research Center’s IRB.
2.1. Study Population, Sampling, and Randomization
The primary intention of the study from which the current cost data were gathered was to evaluate two implementation strategies, Community Development Teams (CDT) and individual implementation as usual (IND), to determine which was more beneficial in assisting non-early-adopting counties in implementing and sustaining MTFC. Prior to this study, the California Institute of Mental Health extended a general invitation to all California counties to receive training in MTFC. At that time, 9 of the 58 counties elected to participate; these early adopting counties were excluded from the current study. In addition, 8 other counties were excluded because they had a “low need” for MTFC, defined as having fewer than 6 new youth in group care at any one time (i.e., the target population for the MTFC model), as measured on two snapshot days from the 2004 calendar year (the most current data at study commencement). The remaining California counties were targeted for recruitment into the study, as were multiple sites in LA County.
Randomization occurred at two levels: study condition (CDT or IND) and time frame (cohorts 1, 2, and 3). LA sites were excluded from randomization because a class-action lawsuit placed them automatically into the CDT condition (LA sites were not included in primary study outcomes comparing conditions). Eligible counties were matched on background variables (e.g., size, number of children in poverty, use of Medicaid, and per capita group home placement rate) to form three equivalent groupings. Next, these three matched groups were randomly assigned to three sequential cohorts with start-up timelines staggered at yearly intervals to address grant resource issues. Finally, within each of the yearly cohorts, counties were randomly assigned to the CDT or IND condition. These random assignments generated six replicate groups of counties.
Three years into the study, the project was extended to Ohio counties using a similar randomization strategy in an attempt to increase the sample size of implementing counties. The Ohio Center for Innovative Practice, in collaboration with the California Institute of Mental Health, performed CDT recruitment activities. Of the 88 Ohio counties, 38 were eligible for the study; one county had a previously operating MTFC program and the other 49 were excluded due to “low need.” Although all eligible Ohio counties underwent randomization procedures, only the first 12 counties (i.e., those randomized to the first cohort) were recruited, to accommodate study resource limitations. Of these, 11 consented to participate. A total of 53 sites participated across both states.
2.2. Study Conditions
The CDT model is a manualized implementation process (Sosna & Marsenich, 2006) developed to help counties adopt EBPs with the support of CDT consultants who are well versed in the practices and who have relationships with practice developers, facilitating communication between developers and adopters. A key feature of the CDT is peer-to-peer networking, with CDT consultants forming groups of up to 6 counties that work through the implementation process together through a series of meetings and calls. Doing so allows the counties to problem-solve together, develop resource-sharing strategies, and learn about successful methods for working through the stages of implementation. The counties develop relationships and are encouraged to interact with each other and offer ongoing support to one another throughout the implementation process.
The IND condition is the typical individual process that adopting communities undergo when implementing MTFC. The MTFC purveyor guides adopters through the implementation process individually through a series of calls, followed by a stakeholder meeting. Sites in both conditions receive the standard consultation and quality assurance package (training, weekly consultation calls, video reviews, site visits) that is normally part of the implementation process for the IND condition. Participation in this package is required to achieve MTFC program certification.
2.3. The Stages of Implementation Completion (SIC)
The SIC (Chamberlain, Brown, & Saldana, 2011) is an assessment tool developed to fill the gap in the availability of measures of implementation. The SIC comprises 8 stages developed to measure a community’s progress toward successful implementation of the MTFC intervention. Each of these 8 stages falls within one of the three phases of implementation. Although the activities included in each stage were defined specifically for MTFC, the SIC was designed such that the stages themselves are thought to be applicable across EBPs. Table 1 provides a detailed list of the activities within the 8 stages included on the SIC for MTFC.
Table 1.
| Phase | Stage / Activity | Description | IND: Cost/Fees | IND: County Hours | IND: Site Hours | CDT: Cost/Fees | CDT: County Hours | CDT: Site Hours |
|---|---|---|---|---|---|---|---|---|
PRE-IMPLEMENTATION | Stage 1 | Engagement | |||||||
1_1 | Interest Indicated | ||||||||
1_2 | Agreed to Consider Implementation | 2 hrs | 2 hrs | ||||||
Stage 2 | Consideration of Feasibility | ||||||||
2_1 | 1st County Response to 1st Planning Contact | ||||||||
2_2 | Feasibility Assessment/CDT Meeting #1 Held | $1,500a | 3 hrs | $1,200b per county | 6 hrs | ||||
2_3 | MTFC Feasibility Questionnaire Completed | ||||||||
Stage 3 | Readiness Planning | ||||||||
3_1 | Cost Calculator / Funding Plan Review | 5 hrs | |||||||
3_2 | Staff Sequence, Timeline, Hire Plan Review | N/A | 5 hrs | ||||||
3_3 | FP Recruitment Review | N/A | 100 hrs | ||||||
3_4 | Referral Criteria & Liaison Review | 12 hrs | 20 hrs | ||||||
3_5 | Communication Plan Review | 2 hrs | 2 hrs | ||||||
3_6 | Stakeholder's Meeting / CDT Meeting # 2 Held | $2,500a | 4 hrs | 4 hrs | $1,200b per county | 6 hrs | 6 hrs | ||
3_7 | Written Implementation Plan Completed | N/A | 42 hrs | 25 hrs | 25–240 hrs* | ||||
3_8 | MTFC Service Provider Selected | 5 hrs | 5 hrs | $7,500c | 5 hrs | 5 hrs | |||
IMPLEMENTATION | Stage 4 | Staff Hired and Trained | |||||||
4_1 | Agency Checklist / Questionnaire Completed | ||||||||
4_2 | 1st MTFC Staff Hired | $18,870d a month | 15 hrs | $21,171d a month | 15 hrs | ||||
4_3 | Program Supervisor Trained | ||||||||
4_4 | Clinical Training Held (CDT Meeting #3) | $6,950a & $5,250b | 1 week | $6,950a & $5,250b | 1 week | | | |
4_5 | Foster Parent Training held (CDT meeting #4) | $5,630a | 1 day | $5,630a | 1 day | ||||
4_6 | Site Consultant Assigned to Site | ||||||||
Stage 5 | Fidelity monitoring processes in place | ||||||||
5_1 | Parent Daily Report Training Held | $3,600a | 2 days | $3,600a | 2 days | ||||
5_2 | 1st "Developer"/Program Admin Call | 1 hr/month | 2 hrs/month | ||||||
Stage 6 | Services and Consultation Begin | ||||||||
6_1 | 1st Placement | $2,200d a month | $2,200d a month | ||||||
6_2 | 1st Consult Call | $1,500a a month | 1 hr/week | $1,500a a month | 1 hr/week | ||||
6_3 | 1st Clinical Meeting Video Review | 2 hrs/week | 2 hrs/week | ||||||
6_4 | 1st Foster Parent Meeting Video Review | 2 hrs/week | 2 hrs/week | ||||||
SUSTAINABILITY | Stage 7 | Ongoing Services, consultation, fidelity monitoring, and feedback | |||||||
7_1 | Site Visit #1 | $2,640a | 2–3 days | $2,640a | 2–3 days | ||||
7_2 | Site Visit #2 | $2,640a | 2–3 days | $2,640a | 2–3 days | ||||
7_3 | Site Visit #3 | $2,640a | 2–3 days | $2,640a | 2–3 days | ||||
7_4 | Implementation Review #1 | $1,020a | $1,020a | ||||||
7_5 | Implementation Review #2 | $1,020a | $1,020a | ||||||
7_6 | Implementation Review #3 | $1,020a | $1,020a | ||||||
7_7 | Program Assessment #1 | $1,840a | 3–4 hrs | $1,840a | 3–4 hrs | ||||
Stage 8 | Competency | ||||||||
8_1 | Certification Application | 40–80 hrs | 40–80 hrs | ||||||
8_2 | Certified | $2,050a | $2,050a |
Abbreviations: MTFC, Multidimensional Treatment Foster Care; IND, Individual (Control); CDT, Community Development Team (Treatment).
* Median point (107.5) in the range was used for calculations of hours for this activity.
a IND fee.
b Travel and lodging.
c CDT fee.
d Cost represents average CA site costs.
The SIC is a date-driven measure intended to be completed by the purveyor or developer of a model while monitoring the implementation process of new adopters. Scores for both the speed of implementation (i.e., Duration) and the proportion of implementation activities completed are calculated to determine whether such factors influence the successful adoption of the intervention. Prior work suggested that the SIC predicts successful program start-up for MTFC programs in non-early adopting communities (Saldana, Chamberlain, Wang, & Brown, 2011).
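To illustrate how these two scores can be operationalized, the minimal sketch below (in Python, with hypothetical dates) computes a stage’s Duration as the elapsed days between the first and last completed activities and the proportion as the share of activities completed. The exact scoring rules used in the trial are those described by Saldana, Chamberlain, Wang, and Brown (2011); the data structure and dates here are illustrative assumptions only.

```python
from datetime import date

# Hypothetical SIC completion dates for one site (stage -> activity -> date or None).
# Activity labels and dates are illustrative only, not study data.
sic_dates = {
    "Stage 2": {"2_1": date(2008, 1, 10), "2_2": date(2008, 2, 20), "2_3": date(2008, 3, 5)},
    "Stage 3": {"3_1": date(2008, 3, 20), "3_2": None, "3_3": date(2008, 6, 1)},
}

def stage_scores(activity_dates):
    """Return (days between first and last completed activity, proportion of activities completed)."""
    completed = [d for d in activity_dates.values() if d is not None]
    proportion = len(completed) / len(activity_dates)
    duration = (max(completed) - min(completed)).days if len(completed) >= 2 else 0
    return duration, proportion

for stage, activity_dates in sic_dates.items():
    duration, proportion = stage_scores(activity_dates)
    print(f"{stage}: duration = {duration} days, proportion = {proportion:.2f}")
```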
2.4. Costing Procedures
Both implementation conditions have costs associated with their efforts. Some of these costs reflect person hours expended to proceed through the implementation process, and others are fees typically charged by the implementation consultant. For the current study, grant resources paid the typical fees, so that sites in neither condition were responsible for paying them; sites were, however, responsible for the costs associated with person hours. For example, in Stage 2 the standard fee for conducting a Feasibility Assessment is $1,500 for IND, whereas CDT counties pay $1,200 for travel and lodging associated with attending the CDT meeting. For the purpose of the costing procedures, however, fees were included as if sites were operating under “real world” conditions. Information related to costs associated with implementation activities was obtained through interviews with CDT and IND purveyors, through calls with implementing counties, and from data obtained as part of the assessments conducted with clinical staff.
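To make the costing template concrete, the sketch below (Python) shows one way the SIC stage-by-activity structure could be used to organize fees and person hours by condition and to aggregate them by stage. The two entries are transcribed from Table 1, but the data structure and function are an illustrative assumption rather than software used in the study.

```python
# Illustrative cost map keyed by (SIC stage, activity); each entry stores the purveyor
# fee and county/site person-hours for each condition. The two entries shown are
# transcribed from Table 1; the structure itself is our own sketch.
cost_map = {
    ("Stage 2", "2_2 Feasibility Assessment / CDT Meeting #1"): {
        "IND": {"fee": 1500, "county_hours": 3, "site_hours": 0},
        "CDT": {"fee": 1200, "county_hours": 6, "site_hours": 0},  # CDT fee covers travel/lodging
    },
    ("Stage 3", "3_6 Stakeholder Meeting / CDT Meeting #2"): {
        "IND": {"fee": 2500, "county_hours": 4, "site_hours": 4},
        "CDT": {"fee": 1200, "county_hours": 6, "site_hours": 6},
    },
}

def totals_by_stage(cost_map, condition):
    """Aggregate fees and person-hours by stage for one implementation condition."""
    totals = {}
    for (stage, _activity), entry in cost_map.items():
        cell = entry[condition]
        stage_total = totals.setdefault(stage, {"fees": 0, "hours": 0})
        stage_total["fees"] += cell["fee"]
        stage_total["hours"] += cell["county_hours"] + cell["site_hours"]
    return totals

print(totals_by_stage(cost_map, "IND"))
print(totals_by_stage(cost_map, "CDT"))
```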
2.4.1. Fees
The fee structure differs between the IND and CDT conditions. In the IND condition, Stage 2 (Consideration of Feasibility) is completed primarily by the site with minimal contact with the MTFC purveyor. However, during Stage 3 (Readiness Planning), the MTFC purveyor interacts with county stakeholders (county-level supervisors, case managers, referring agents, policy makers, and other relevant individuals) and works directly with the implementing agency. To move from Stage 2 to Stage 3, adopters pay the purveyor a fee of $1,500. This fee covers implementation consultation by the purveyor through the middle of Stage 3, when a stakeholder meeting is held just prior to hiring the clinical team. At this point, an additional $2,500 fee is paid for the stakeholder meeting and the beginning of staff training (Stage 4).
In the CDT condition, sites receive the support of the CDT purveyor without a fee until they move to Stage 4 (Staff Hired and Trained). Prior to Stage 4, sites incur costs ($1,200) related to travel to the peer-to-peer CDT meetings, but do not pay a fee to receive consultation and guidance through the implementation process. If, through this consultation, they determine that they want to proceed to Stage 4, a fee of $7,500 is paid to the CDT purveyor.
2.4.2. Hours
Each of the implementation activities assessed by the SIC involves time and effort put forth by the site. Some of these hours are committed at the county system leader or “County Hours” level (e.g., Feasibility Assessment), others at the program site or “Site Hours” level (e.g., Staff Hiring), and others involve both county and site effort (e.g., Readiness Planning). For the data in this study, the number of hours spent on activities was determined from interviews with the purveyor organizations for both IND and CDT and from direct contact with implementing sites. For comparison of results, the number of hours required of both county system leaders and site staff was summed across each stage for each condition.
2.4.3. Salary and FTE of Staff
The costing of the clinical salary and FTE (i.e., full-time equivalent) required for program implementation was based on data provided directly by staff participating in the parent grant trial. As part of their baseline assessment, participants were asked to report their FTE on the MTFC program, their date of hire, and their salary. However, due to variations in costs between states (reported Ohio salaries were 19% lower than those reported in California) and the small number of Ohio counties that proceeded through the implementation process, only salaries from California were used in salary calculations. These data were averaged across each of the implementation conditions to determine the average salary and FTE of the clinical staff.
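As an illustration of how pre-start-up salary costs of the kind reported in Table 2 can be derived, the sketch below assumes that each position’s cost accrues from the hire date until the first placement at FTE multiplied by a daily salary (annual salary divided by 365). The accrual convention and the annual salary figures are assumptions for illustration; they are not the study’s California averages.

```python
# A minimal sketch of a pre-start-up salary cost calculation, assuming costs accrue
# from each position's hire date until the first child placement at
# FTE x (annual salary / 365). Annual salaries below are illustrative placeholders.
positions = [
    # (position, FTE, days from hire to first placement, assumed annual salary in $)
    ("Administrator", 0.23, 269, 97_000),
    ("Program Supervisor", 0.99, 190, 77_000),
    ("Family Therapist", 0.48, 133, 67_000),
]

def pre_startup_salary_cost(fte, days_to_first_placement, annual_salary):
    """Salary cost incurred for one position before the first child is served."""
    return fte * days_to_first_placement * (annual_salary / 365.0)

total_cost = 0.0
for name, fte, days, salary in positions:
    cost = pre_startup_salary_cost(fte, days, salary)
    total_cost += cost
    print(f"{name}: ${cost:,.0f}")
print(f"Total (illustrative subset): ${total_cost:,.0f}")
```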
3. Results
Using the SIC as a method for mapping implementation costs revealed that the two implementation methods for the MTFC model used resources differently for some of the implementation procedures, even though many of the intervention costs themselves remained the same across methods. As shown in Table 1, differences in the costs of the implementation methods occurred primarily during the Pre-implementation phase (Stages 2–3), with an additional minor difference in activity 2 of Stage 4 (Implementation phase). As shown, Stage 2 cost more out of pocket in the IND condition ($1,500) than in the CDT condition ($1,200), but required half as much county stakeholder time to complete in the IND condition (3 hours) as in the CDT condition (6 hours). Although the cost of travel is included in Stage 2, this calculation does not take into account the travel time required of CDT stakeholders to attend the CDT meeting. On the other hand, Stage 3 required more time for sites in the IND condition (206 hours) than in the CDT condition (154.5 hours), but cost less in terms of the implementation fees that are typically charged (IND = $2,500; CDT = $8,700). The difference in the number of hours needed to complete Stage 3 activities is a direct product of the CDT model: the majority of activities during this stage were completed during the group CDT meeting (Stage 3, activity 6). Therefore, the absence of hours listed for the CDT condition for the first five Stage 3 activities does not indicate that those activities were not completed, but rather that the hours for completing them fell within the sixth Stage 3 activity (the CDT meeting). In fact, additional analyses indicated no significant difference in the number of Stage 3 activities completed between the IND and CDT conditions [mean IND = 5.9, CDT = 5.2; t(23) = −0.6869, p < .50], indicating that both conditions completed a relatively equal number of Stage 3 activities.
An additional difference between conditions was the involvement of county system leader stakeholders. For the completion of Readiness Planning (Stage 3), county system leaders in the IND condition contributed 23 hours of time, whereas those in the CDT condition contributed 36 hours. Moreover, the time to complete the written implementation plan (Stage 3, activity 7) was relatively consistent across IND sites (42 hours), whereas the amount of time reported by CDT sites varied greatly (range of 23–240 hours); for summation figures, the midpoint of this range was used. Given the high value of county system leaders’ time, this variation could have multiple implications for hidden costs.
Table 2 provides further information to consider when assessing the cost of implementation. Drawing from Table 1, the total fixed fees (i.e., standard fees charged by each of the purveyor organizations for implementation activities such as stakeholder meetings and trainings), variable fees (e.g., fees tied to the number of consulting calls), and travel and lodging fees are considered in addition to salary. Following the hiring suggestions of each purveyor, it appears that sites in the IND condition hire staff sooner, relative to serving the first child, than those in the CDT condition. For example, on average, the Program Supervisor is hired 190 days prior to first placement in the IND condition and 153 days prior in the CDT condition. However, the percent effort of nearly every position is lower in the IND condition than in the CDT condition. For example, the Administrator is hired at 23% effort in the IND condition, but at 39% effort in the CDT condition. These salary costs (total IND = $88,769; CDT = $95,776) are incurred prior to program start-up (i.e., the first child served) and should be added to the costs of Stages 4 and 5, which fall between the hiring of the first staff member (second activity in Stage 4) and the placement of the first child (first activity in Stage 6).
Table 2.
| | Individual (IND) | Community Development Team (CDT) |
|---|---|---|
| TOTAL FIXED FEES a | $20,180 | $23,680 |
| TOTAL VARIABLE FEES b | $4,500 | $6,000 |
| TOTAL TRAVEL & LODGING FEES c | $5,250 | $7,650 |

| Position | IND FTE | IND Days to 1st Placement | IND Std. Dev. | IND Salary Cost d | CDT FTE | CDT Days to 1st Placement | CDT Std. Dev. | CDT Salary Cost d |
|---|---|---|---|---|---|---|---|---|
| Administrator | 23% | 269 | 224 | $16,482 | 39% | 228 | 218 | $23,648 |
| Program Supervisor | 99% | 190 | 135 | $39,907 | 82% | 153 | 128 | $26,597 |
| Foster Parent Recruiter/PDR Caller | 75% | 119 | 154 | $14,402 | 85% | 173 | 127 | $23,797 |
| Family Therapist | 48% | 133 | 114 | $11,700 | 56% | 113 | 74 | $11,597 |
| Individual Therapist | 42% | 24 | 149 | $1,543 | 66% | 100 | 124 | $10,137 |
| Skills Trainer | 69% | 57 | 197 | $4,736 | 71% | −9 | 182 | $0 |
| TOTAL STAFF COST | | | | $88,769 | | | | $95,776 |
| AVERAGE SITE COST | | | | $118,699 | | | | $133,106 |
Abbreviations: PDR, Parent Daily Report; FTE, Full-Time Equivalent.
a Includes IND and CDT standard flat fees.
b Includes the consult call cost, based on the condition average amount prior to 1st placement.
c Includes the travel expenses for CDT meetings and training.
d Salary costs are based on California sites’ average by position.
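As a check on how the Table 2 components combine, the brief sketch below sums the fixed fees, variable fees, travel and lodging fees, and pre-start-up staff costs transcribed from the table and reproduces the reported average site costs; the additive composition is our reading of the table rather than code supplied by the authors.

```python
# Components transcribed from Table 2; summing them reproduces the reported average
# site costs ($118,699 for IND and $133,106 for CDT).
components = {
    "IND": {"fixed_fees": 20_180, "variable_fees": 4_500, "travel_lodging": 5_250, "staff_cost": 88_769},
    "CDT": {"fixed_fees": 23_680, "variable_fees": 6_000, "travel_lodging": 7_650, "staff_cost": 95_776},
}

for condition, parts in components.items():
    print(f"{condition} average site cost: ${sum(parts.values()):,}")
```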
As sites moved into Stage 4, aside from the costs related to hiring, both the IND and CDT conditions maintained similar costs and hours through Stage 8 (Competency). During these Implementation and Sustainability phases, both conditions received the consultation and quality assurance services required for a successful MTFC program, but the CDT condition received one additional hour per month of interaction with the purveyor compared with the IND condition (Stage 5, activity 2). It should be noted that although Table 2 shows total average site costs to be greater for the CDT condition than for the IND condition, this difference is not statistically significant [t(20) = 0.5505, p < .60], owing to the limited number of sites that implemented to child placement (n = 22) and thus low statistical power.
4. Discussion
Although the cost-effectiveness of the MTFC intervention for youth with behavior problems in foster care has been demonstrated relative to typical out-of-home services (Aos, Phipps, Barnoski, & Leib, 2001), this is the first study to examine the costs of implementing this EBP. The implementation cost outcomes presented for the IND condition provide estimates of the cost of implementing MTFC using the typical implementation strategies employed by the purveyor. That is, these estimates are what a “real world” site considering adopting MTFC can expect, over and above the cost of the intervention. Similarly, these outcomes demonstrate the amount of resources that program managers can expect to spend during the different phases of the MTFC implementation process.
However, the comparison made through the costing procedures described in this paper suggests that different implementation strategies have different cost structures and potentially different benefits for guiding sites through the implementation process. These costs include both fees for service and the person hours needed to complete the activities of implementation. Although the CDT condition cost nearly $6,000 more (not counting salary costs) than the IND condition to complete Stages 1 through 3, the number of person hours needed to complete these stages was lower (IND total hours = 226; CDT = 177.5). Without knowing whether the chances of developing a successful, model-adherent program are greater in one condition or the other (due to the ongoing nature of the study), it is not yet possible to determine whether money in either condition was used in a cost-effective way. However, the differences do speak to variations in incentive structures and how different implementation approaches are used to pursue the same goal. In the IND condition, fees are paid earlier in the implementation process than in the CDT condition and, therefore, sites might feel more invested in the program. On the other hand, county system leaders spend more time early in the process in the CDT condition than in the IND condition and, therefore, might feel more invested in the program. Regardless, the fee structure (deferred versus prepaid fee schedule) might need to be considered when deciding which implementation procedure to use, based on fiscal cycles and budget limitations. These “sunk costs,” or costs incurred before a program is established, are largely unrecoverable; outcomes from this study therefore demonstrate the importance of understanding not only the costs of the practice, but also the costs of the implementation process, when determining resource allocation.
Further, despite the current inability to conduct formal cost analyses, these results suggest that the SIC can serve as the foundation for costing the implementation process. In turn, outcomes from these costing procedures can be used in future examinations of the costs of the IND versus CDT implementation strategies. The ability of the SIC to capture variations in costs between the two conditions for the same intervention suggests that this measure might be a step toward closing the gap in standardized methods of costing implementation processes. Although the SIC itself was designed to assess the implementation of MTFC specifically, using the measure as a structure to define implementation behavior and associated costs for other EBPs might prove successful. To do so, the SIC would need to be adapted for other practices. If future research pursues this opportunity, it might be possible to examine implementation costs not only across implementation strategies, but also across practices within the same strategy.
There may be a number of reasons that empirical estimates of implementation costs have heretofore been neglected in the literature evaluating new mental health intervention practices. One barrier might be conceptual: economists often note that “sunk costs don’t matter”; that is, once a program has been implemented, the non-recoverable implementation costs should not be part of the decision about how to operate the program. While this is true, it is of only limited relevance. Sunk costs do matter when decisions are made under uncertainty. Clearly, until a new practice has been sustained for some time, there is uncertainty as to whether it will be successful. Thus, even from a pure economic theory perspective, sunk implementation costs may matter in decision making moving forward. More importantly, implementation costs are, of course, not sunk until after the new practice is implemented. Policy makers must decide ex ante whether to invest in a new practice, and at that point implementation costs, especially those that will differ across practice options, are very much part of the opportunity costs that must be weighed against the expected (i.e., uncertain) future benefits. Knowing when in the implementation process different types of costs can be expected is critical in helping decision makers lay out a clear fiscal plan and ensure that proper resource allocation has been determined before advancing the practice toward implementation.
A second, perhaps more important, barrier might be that researchers lack a coherent and systematic framework for estimating implementation costs that is applicable across multiple contexts. This research has demonstrated the utility of such a framework with the SIC. The 8-stage measure is quite general: the numbered stages listed in Table 1 are not specific to a particular practice; rather, they should apply to a broad range of EBPs. At the same time, the SIC framework is highly customizable, as evidenced by the sub-activities under each numbered stage, which are specific to the MTFC model. Future research can focus on applying the SIC to other practices by generating a different set of sub-activities specific to each practice. Thus, the SIC provides a framework that is general in nature and simultaneously customizable. It is hoped that the SIC will provide a basis for including implementation costs in future research and that these costs will become a routine part of comparative (cost-)effectiveness analysis.
5. Conclusions and Limitations
Although the intention of the current study was to determine whether the SIC could be used as a method for mapping implementation costs, several limitations need to be considered. Most significant is the use of California salary data only. As noted, Ohio salaries were on average 19% lower than California salaries. Given the limited number of Ohio counties participating, including state as a covariate was not possible, and thus the decision was made to conduct all salary computations with California data. On the other hand, the number of hours, days until first placement, and FTE figures did not differ between states, and thus these calculations utilize combined Ohio and California data. Had Ohio been included in the salary calculations, the resulting blended salary figures would not have been comparable to either state and, therefore, would have been of little use to programs considering how their own costs might relate to those shown. Although including only California salaries is a noteworthy limitation, organizations from other states might still be able to estimate their costs in relation to California salaries.
A second limitation of this study is the lack of data on overhead costs; that is, the SIC mapping does not include estimates for costs such as rent, office supplies, and utilities. Such costs would be necessary to conduct cost analyses comparing different EBPs. On the other hand, assuming no difference in overhead between implementation conditions, the SIC can still be helpful in determining costs to include in cost-effectiveness analyses comparing implementation methods for the same EBP. Similarly, the travel cost calculations did not include travel time. This applies to CDT sites that traveled to CDT meetings for peer-to-peer networking, but not to IND sites, whose stakeholder meetings were held at their own locations. This limitation would be important if individuals had to spend substantial additional time traveling; however, as reported by the CDT purveyors, part of the CDT program is to intentionally select a centralized meeting location for sites. The intention is to decrease travel time and to link sites that are close in proximity to one another, in order to increase the potential for sustaining peer-to-peer networking.
Finally, this study is limited in that only total average site costs through program start-up (i.e., reaching Stage 6) are available, due to the ongoing nature of the project. Therefore, it is unknown whether salary-driven costs will become more disparate between conditions over time. As noted in the Results, given the limited power of this study, statistically significant differences between conditions were not found. It is possible, however, that if cost differences grow over time, they might reach statistical significance. Similarly, as new outcomes become available from the study, such as the number of youth placed per condition, it will be possible to better determine whether programs in each implementation condition differ in how well they have used their resources.
Acknowledgement
This work was supported by the National Institute of Mental Health R01MH076158, National Institute on Drug Abuse R01MH076158-05S1 and KDA021603. The authors would like to thank all of the participating counties for their time and contribution to this study. Chamberlain is a partner in Treatment Foster Care Consultants, Inc., a company that provides consultation to systems and agencies wishing to implement MTFC.
Footnotes
Other authors report no biomedical financial interests or any potential conflicts of interest.
References
- Aos S, Phipps P, Barnoski R, Leib R. The comparative costs and benefits of programs to reduce crime (No. 01-05-1201). Olympia, WA: Washington State Institute for Public Policy; 2001.
- Chamberlain P, Reid JB. Comparison of two community alternatives to incarceration for chronic juvenile offenders. Journal of Consulting and Clinical Psychology. 1998;66:624–633. doi: 10.1037//0022-006x.66.4.624.
- Chamberlain P. Treatment Foster Care. Family Strengthening Series (OJJDP Bulletin NCJ 1734211). Washington, DC: U.S. Department of Justice; 1998.
- Chamberlain P, Leve LD, DeGarmo DS. Multidimensional treatment foster care for girls in the juvenile justice system: 2-year follow-up of a randomized clinical trial. Journal of Consulting and Clinical Psychology. 2007;75:187–193. doi: 10.1037/0022-006X.75.1.187.
- Chamberlain P, Brown CH, Saldana L, Reid J, Wang W, Marsenich L, Sosna T, Padgett C, Bouwman G. Engaging and recruiting counties in an experiment on implementing evidence-based practice in California. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35:250–260. doi: 10.1007/s10488-008-0167-x.
- Chamberlain P, Saldana L, Brown CH, Leve L. Implementation of Multidimensional Treatment Foster Care in California: a randomized control trial of an evidence-based practice. In: Roberts-DeGennaro M, Fogel S, editors. Using Evidence to Inform Practice for Community and Organizational Change. Chicago, IL: Lyceum Books; 2010. pp. 218–234.
- Chamberlain P, Brown CH, Saldana L. Observational measure of implementation progress: the Stages of Implementation Completion (SIC). Implementation Science. 2011;6. doi: 10.1186/1748-5908-6-116.
- Coalition for Evidence-Based Policy. Social programs that work. http://evidencebasedprograms.org/wordpress/?page_id=122. Accessed September 2009.
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science. 2009;4:50. doi: 10.1186/1748-5908-4-50.
- Eddy JM, Smith P, Brown CH, Reid JB. A survey of prevention science training: implications for educating the next generation. Prevention Science. 2005;6:59–71. doi: 10.1007/s11121-005-1253-x.
- Elliott DS. Blueprint for violence prevention. Boulder, CO: University of Colorado, Institute of Behavioral Science; 1998.
- Ginexi EM, Hilton TF. What’s next for translation research? Evaluation & the Health Professions. 2006;29:334–347. doi: 10.1177/0163278706290409.
- Horwitz SM, Landsverk J. Methodological issues in child welfare and children’s mental health implementation research. Administration and Policy in Mental Health and Mental Health Services Research. 2010;38:1–3. doi: 10.1007/s10488-010-0316-x.
- Insel T. Making the most of our interventions research. NIMH Director’s Blog. 2011 May 20. http://www.nimh.nih.gov/about/director/index.shtml. Accessed September 6, 2011.
- Leve LD, Chamberlain P, Reid JB. Intervention outcomes for girls referred from juvenile justice: effects on delinquency. Journal of Consulting and Clinical Psychology. 2005;73:1181–1185. doi: 10.1037/0022-006X.73.6.1181.
- Liu C, Rubenstein LV, Kirchner JE, Fortney JC, Perkins MW, Ober SK, Pyne JM, Chaney EF. Organizational costs of quality improvement for depression care. Health Services Research. 2009;44:225–244. doi: 10.1111/j.1475-6773.2008.00911.x.
- Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36:24–34. doi: 10.1007/s10488-008-0197-4.
- Proctor EK, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research. 2011;35:65–76. doi: 10.1007/s10488-010-0319-7.
- Ritzwoller DP, Sukhanova A, Gaglio B, Glasgow R. Costing behavioral interventions: a practical guide to enhance translation. Annals of Behavioral Medicine. 2009;37:218–227. doi: 10.1007/s12160-009-9088-5.
- Saldana L, Chamberlain P, Wang W, Brown CH. Predicting program start-up using the stages of implementation measure. Administration and Policy in Mental Health and Mental Health Services Research. 2011. doi: 10.1007/s10488-011-0363-y.
- Sosna T, Marsenich L. Community Development Team Model: supporting the model adherent implementation of programs and practices. Sacramento, CA: California Institute of Mental Health Publications; 2006.
- US Department of Health and Human Services. Child Maltreatment. Washington, DC: US Department of Health and Human Services, Administration on Children and Families; 2000a.
- US Department of Health and Human Services. Children and mental health. In: Mental health: a report of the Surgeon General. Washington, DC: US Government Printing Office; 2000b. pp. 123–220.
- Wang W, Saldana L, Brown CH, Chamberlain P. Factors that influenced county system leaders to implement an evidence-based program: a baseline survey within a randomized controlled trial. Implementation Science. 2010;5. doi: 10.1186/1748-5908-5-72.