Abstract
Introduction:
Implementation of integrated strategies for improving access to behavioral health services for youth in the legal system requires evidence of the costs of changing existing practices, and stakeholders need to be aware of what types of investments (e.g., personnel, data systems) lead to more efficient implementation and better outcomes.
Methods:
A cost analysis was conducted alongside the Juvenile Justice Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS), a research cooperative comprising 34 community supervision agencies in seven states that were randomly assigned to Core or Core+Enhanced implementation interventions. Both were focused on improving screening, referral, and linkage to behavioral health services for youth with substance use disorders (SUD). Cost data were collected prospectively across all implementation phases.
Results:
During Baseline, the average cost was $11,083 per site (range: $1104 to $19,399). Enhanced sites had relatively higher baseline costs ($13,176 vs. $9222 in the Core sites). During the Experiment phase, Enhanced sites continued to incur higher implementation costs relative to the Core sites, but these costs steadily declined and ultimately converged with Core sites as they entered the sustainment phase.
Conclusions:
Enhanced sites had higher implementation costs, but both Enhanced and Core sites showed similar trends of decreasing costs across the Experiment period. These decreasing costs reflected both fewer meetings and lower participation over time. In a funding climate where available resources are already scarce, access to cost data can help agencies prepare to implement and sustain new practices.
Keywords: Implementation costs, Youth in the legal system, Substance use disorders, Evidence based strategies, Implementation facilitation
1. Introduction
High rates of substance use and mental health disorders among youth in the legal system (YLS) complicate the process of rehabilitation and perpetuate high rates of unmet need among these youth for treatment and other services (Elkington et al., 2010; Scott et al., 2019; Teplin et al., 2005; Tolou-Shams et al., 2007). The evidence shows that >70 % of YLS report prior drug use (Belenko & Logan, 2003) and more than a third screen positive for substance use disorders (SUD) at the time of arrest (Aarons et al., 2001; Wasserman et al., 2010). Across the juvenile justice system, including community supervision, which oversees a majority of YLS, nearly 70 % of youth have mental health disorders (White et al., 2019); yet only about 20 % of YLS determined to be “in need” are referred to treatment (Wasserman et al., 2021). Utilization of services is lower still with only about 33 % of those “in need” actually receiving treatment (White et al., 2019). YLS also have a higher prevalence of HIV risk behaviors and sexually transmitted diseases relative to non-justice involved youth (Belenko et al., 2009; Teplin et al., 2003). Addressing these issues requires an integrated approach in which juvenile justice (JJ) and behavioral health (BH) agencies coordinate efforts to identify youth in need of SUD treatment and link them to services (Binard & Prichard, 2008; Grisso & Underwood, 2004).
In an ideal system, all YLS would be screened for SUD. A negative screen would route youth to substance use prevention services appropriate to their risk level; a positive screen would trigger a more in-depth assessment of behavioral health service needs, followed by referral and linkage to appropriate treatment in the community. Data show that service delivery often breaks down, however, even at the initial step of screening in juvenile justice settings (Belenko et al., 2004; Belenko & Dembo, 2003; D. Bowser et al., 2018; Jainchill, 2012; Marsden & Straw, 2000; Nissen et al., 2006; Young et al., 2007). Consequently, high proportions of youth have unmet SUD and mental health treatment needs. Part of this disconnect may be the result of financial concerns regarding the additional resources required by juvenile justice and behavioral health agencies to ensure youth receive the recommended continuum of evidence-based care. A second part of this disconnect may relate to the broader context of how effectively and efficiently the juvenile justice and behavioral health systems are able to coordinate efforts on behalf of YLS with SUD. A national survey of community supervision (CS) agencies, behavioral health (BH) agencies, and judges found that while both CS and BH agencies agreed that there was cross-agency collaboration for youth with substance use and/or mental health issues, less than half of these agencies endorsed the existence of consistent, ongoing collaborative activities such as cross-training of staff, pooling funding to pay for services, or sharing operational oversight (Scott et al., 2019).
Understanding the costs of implementing strategies to support better delivery of behavioral health services has become an important consideration within the field of Dissemination and Implementation Science. Several studies have been published including general methodologic guidelines for implementation cost analysis (Gold et al., 2022; Salloum et al., 2022; T. Wagner et al., 2020; T. H. Wagner, 2020), systematic literature reviews (D. M. Bowser et al., 2021; Michaud et al., 2022; Vale et al., 2007), conceptual frameworks featuring costs (Corso et al., 2014; Garner, 2022; Saldana et al., 2014), and individual cost and cost effectiveness studies alongside hybrid implementation-effectiveness trials (Garner et al., 2012, 2018; Hinde et al., 2022). While this work encompasses a variety of settings and stakeholders, less is known about the costs of implementing new strategies targeting SUD within JJ agencies. To our knowledge, this is the first study of intervention implementation costs featuring JJ and BH partnerships.
Integrating services and coordinating processes to improve the delivery of SUD services for YLS was the focus of the Juvenile Justice-Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS), a multisite cooperative research initiative funded by the National Institute on Drug Abuse (NIDA). The JJ-TRIALS cooperative encompassed NIDA, a coordinating center (CC), and six academic Research Centers (RCs), with 36 county JJ agencies from seven states. This was the first large-scale evaluation of structured implementation interventions targeting substance use. While costing was not originally part of the overall study design of JJ-TRIALS, a cost analysis was funded through a separate grant, allowing us to prospectively collect cost data across all phases of the implementation intervention. This study reports the main results of the cost analysis.
Details of the JJ-TRIALS protocol are published in JJ-TRIALS Cooperative et al. (2015). In brief, JJ-TRIALS was a multisite cluster randomized controlled trial comparing two implementation strategies, Core and Core+Enhanced, for promoting the adoption of evidence-based screening, assessment, referral, and treatment within juvenile justice agencies (mainly juvenile probation agencies). JJ-TRIALS adopted the Exploration, Preparation, Implementation, Sustainment (EPIS) framework developed by Aarons and colleagues (Aarons et al., 2011). The Core Implementation Intervention (Core) featured a comprehensive needs assessment; staff training; support for selecting a goal to improve outcomes along the Behavioral Health Services Cascade (Belenko et al., 2017; Cidav et al., 2020); data-driven decision making (DDDM); and teaching stakeholders how to collect, analyze, and interpret data to improve outcomes/practices. The Enhanced Implementation Intervention (Enhanced) added Implementation Facilitators to assist local change teams (comprised of members from participating JJ and BH agencies) at each study site in applying DDDM principles and process improvement strategies to achieve Cascade-related goals.
The Core Implementation strategy, delivered to all participating sites, occurred in the 6 months leading up to randomization, corresponding to the Preparation phase of the EPIS model. The Experimental phase was approximately 12 months (corresponding to the Implementation phase of the EPIS model), followed by 6 months in the Maintenance phase (corresponding to the Sustainment phase of the EPIS model). There was some variability in the number of months sites spent in phases, which is adjusted for in the cost analysis. These phases were disaggregated in the main trial analyses as Baseline, Pre-randomization, Early Experiment, Late Experiment, and Maintenance. Fig. A.1 in the appendix provides an overview of the study design with additional details on the various activities that occurred during the Baseline and Experiment phases. The primary study outcomes focused on reducing unmet need for treatment services through retention in the Behavioral Health Services Cascade (Belenko et al., 2017), an optimal care model with seven process metrics of behavioral health services delivery (screening, assessment, identification of substance use need, treatment referral, treatment initiation, treatment engagement, and continuing care). Additionally, measures of recidivism for youth cohorts defined around the trial were evaluated using JJ agency records. Comparative effectiveness of Core and Enhanced conditions was assessed using a multisite clustered randomized trial design with a randomized three-wave rollout; each site was first randomized to a wave pair and then to condition within that pair. Thirty-six sites (i.e., counties) were originally enrolled and 34 completed the study.
The main findings of JJ-TRIALS showed that although screening rates in the participating sites were relatively high (~70 %), referral rates were low (~22 % of youth determined to be “in need” of services were referred to treatment) (Wasserman et al., 2021). Treatment initiation among youth referred to treatment was nearly 68 %, but overall treatment initiation among youth in need of services was only about 15 % (Wasserman et al., 2021). The Enhanced sites did demonstrate faster time to treatment initiation and were found to be more effective in helping youth progress further along the behavioral health care cascade (Knight et al., 2022). Main study findings are discussed further in the Discussion section below.
2. Methods
The cost analysis is framed from the perspective of the JJ agencies and their BH partners, representing the value of time and other resources invested by these agencies to participate in all aspects of the implementation interventions. We employ a micro-costing approach using cost tracking worksheets and interviewer-led staff surveys. Resource utilization and expenditures are categorized by the three JJ-TRIALS study phases: Baseline (e.g., Core support and training activities), Experiment (following randomization to Core or Enhanced implementation interventions), and Maintenance (following the end of the intervention period). We also compare the first six months of the Experiment phase (i.e., Early Experiment) to the last six months (Late Experiment). Research center costs associated with implementation intervention activities are included (and designated as RC costs in the results tables), but costs specific to the study (purely research-related) are excluded. To account for variability in the length of time each site spent in each study phase, we adjust costs in each site by the length of time (number of months) spent in a phase. The University of Miami institutional review board (IRB) reviewed and approved study procedures. Cost data were analyzed in Microsoft Excel (Microsoft Corp, Redmond, WA) and Stata 15.1 (Stata Corp, College Station, Texas).
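The phase-length adjustment described above can be sketched as a simple per-month normalization. This is a minimal illustration with hypothetical figures, not study data; the function name and inputs are our own.

```python
# Illustrative sketch (hypothetical numbers): normalizing each site's phase
# cost by the number of months the site spent in that phase, so sites with
# longer phases are comparable to sites with shorter ones.

def cost_per_month(total_phase_cost, months_in_phase):
    """Average monthly cost for a site within one study phase."""
    return total_phase_cost / months_in_phase

# Example: a site that spent $4,470 over a 6-month Early Experiment phase
print(cost_per_month(4470, 6))  # 745.0
```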
2.1. Cost data collection
Baseline activities include in-person meetings as well as conference calls/virtual meetings, thus some cost categories (i.e., travel expenses) are not relevant for all activities within this phase. Thirteen different core support training activities were identified to determine the costs of the Baseline phase. Costs are organized across three entities: Research Center (RC), Juvenile Justice agency (JJ), and Behavioral Health agency (BH). The Pre-Implementation Cost Tracking Worksheet (PICTW) was developed for this study to collect information from key staff at the RCs (worksheet is provided in the appendix, A.2). The PICTW is based on standard microcosting principles to capture staff time, travel time and travel-related expenses, space costs and supply costs. There is also a miscellaneous category where additional expenses can be recorded. Development of the PICTW was informed by the Drug Abuse Treatment Cost Analysis Program (DATCAP) and Substance Abuse Services Cost Analysis Program (SASCAP), validated costing instruments from numerous published substance use-related economic studies (French et al., 2008; Zarkin et al., 2004). The PICTW was administered to all Wave 3 sites (representing 12 justice agencies and their behavioral health partners). Given limited funding for the economic analysis during the first waves of the study, we were not able to administer the PICTW to all sites across all waves. We did collect these data from 4 Wave-1 sites and 2 Wave-2 sites, which we then used to impute an average cost per Baseline activity across all waves and all sites by applying a weighted average based on the number of youth under supervision in the participating JJ systems. We adopted this imputation approach to account for larger JJ systems potentially representing a larger portion of baseline costs.
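The weighted-average imputation described above can be sketched as follows. All figures and site entries are hypothetical; the study weighted observed site costs by the number of youth under supervision in each participating JJ system so larger systems contribute more to the imputed average.

```python
# Hypothetical sketch of the weighted-average imputation of Baseline costs:
# each observed site contributes in proportion to the number of youth under
# supervision in its JJ system. Numbers are illustrative, not study data.

def weighted_average_cost(observed):
    """observed: list of (baseline_cost, youth_under_supervision) pairs."""
    total_youth = sum(youth for _, youth in observed)
    return sum(cost * youth for cost, youth in observed) / total_youth

observed_sites = [
    (9000.0, 500),    # smaller JJ system
    (15000.0, 1500),  # larger JJ system, weighted more heavily
]
print(weighted_average_cost(observed_sites))  # 13500.0
```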
During the Experiment and Maintenance phases, all sites completed a Monthly-Site Check-In (MSC) with the RCs. This monthly survey included an economic cost section that asked respondents about the titles and roles of all participants in the monthly face-to-face meetings, the time invested in each meeting, and the amount of pre- and post-meeting time for planning and follow-up activities. Monthly activities during the Experiment phase involved monthly local change team meetings in person and via phone/virtual.
In-person meetings occurred at the juvenile justice sites (i.e., at a juvenile probation office), requiring BH and RC staff to travel to the site and resulting in relatively higher travel costs in some more rural counties. Transportation costs are estimated as the American Automobile Association's average cost per mile ($0.58) times the number of miles traveled between agencies (and the RC). Total travel costs also factor in staff time spent traveling to Baseline meetings, as captured on the PICTW. The largest component of total Baseline costs is staff time to participate in trainings and other meetings. Some of this time involved pre-meeting preparations and post-meeting follow-up communication between RCs and agency staff. Individual staff time costs are computed for each attendee in these meetings. For consistency and comparability, staff time was valued using national salary estimates for juvenile justice and behavioral health positions (e.g., juvenile probation officer, substance use counselor) obtained from the Department of Labor's Occupational Information Network (O*NET; onetonline.org), a national resource for occupation-specific data linked to the Bureau of Labor Statistics (BLS) (O*NET Online, n.d.). This process involved mapping reported staff titles/roles onto occupations based on the Standard Occupational Classification System and the North American Industry Classification System to support generalizability of the cost results. Supplies and other costs include food/drink, printing/duplication, and transcription services. Space costs are estimated using the online office rental resource cityfeet.com, which provides office rental rates by zip code and by square foot per year. This allowed us to estimate a “cost per square foot/per hour” for the office or conference room space where meetings were held.
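Combining the components above, the micro-costing of a single in-person meeting can be sketched as staff time plus mileage plus space. The wages, miles, and rates below are hypothetical inputs for illustration; only the $0.58 per-mile rate is taken from the text.

```python
# Minimal micro-costing sketch for one in-person meeting, combining staff
# time (valued at occupation-specific wages, including pre/post-meeting
# effort), mileage, and meeting space. All inputs except the per-mile rate
# are hypothetical.

AAA_COST_PER_MILE = 0.58  # per-mile rate used in the study

def meeting_cost(attendees, miles_traveled, space_rate_per_hour, meeting_hours):
    """attendees: list of (hourly_wage, hours_incl_prep_and_followup)."""
    staff_time = sum(wage * hours for wage, hours in attendees)
    travel = miles_traveled * AAA_COST_PER_MILE
    space = space_rate_per_hour * meeting_hours
    return staff_time + travel + space

# Example: two probation officers and one counselor, 20 miles of travel,
# a 2-hour meeting in a $5/hour conference room
print(meeting_cost([(25.0, 3), (25.0, 3), (22.0, 2.5)], 20, 5.0, 2))
```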
Maintenance activities are also tracked on the MSC. Because sites entered this phase at different times, the costs are very preliminary and incomplete. We opted to summarize maintenance costs with the available data, but do not highlight these as providing a complete picture of what maintenance would look like outside of this five-year research study.
3. Results
3.1. Baseline costs
Results of the cost analyses are presented as Baseline (core support activities) costs (Table 1), total costs of implementation intervention by phase and by site (Table 2), average costs by study condition and intervention phase (Fig. 1), and trajectories of implementation costs by study condition across all phases (Fig. 2). Seven unique activities comprise Baseline core support trainings provided to all sites prior to randomization to Core or Enhanced. Table 1 reports mean, median, minimum (min), and maximum (max) costs by stakeholder [Research Center (RC), Juvenile Justice agency (JJ), and Behavioral Health agency (BH)], as well as total core support activities costs across all stakeholders and the percentage of total core support costs incurred by each stakeholder. The RCs incurred the largest costs in all activities given their leading role in providing core support activities to JJ and BH partners. JJ agencies incurred relatively higher costs (nearly twice as high) as their BH partners. This was not unexpected as the intervention was based in the JJ agencies. Needs assessment group interview and goal achievement/goal selection trainings had the highest costs as they required more involvement from staff; monthly site check-in orientation had the lowest cost. Across all core support activities, the average total cost across all sites was $11,083 (median: $10,587; IQR: $7172 to $16,609), representing the value of time and other resources invested in baseline trainings and meetings to prepare for implementation.
Table 1.
Baseline costs representing core support training activities.
| Core Support Activities | Mean | Median | Min | Max |
|---|---|---|---|---|
| *Leadership Orientation Meeting* | | | | |
| Research Center | $1101 | $949 | $204 | $2209 |
| Juvenile Justice | $233 | $158 | $4 | $617 |
| Behavioral Health | $193 | $167 | $0 | $676 |
| *Needs Assessment Group Interview* | | | | |
| Research Center | $1389 | $977 | $359 | $4415 |
| Juvenile Justice | $351 | $319 | $60 | $804 |
| Behavioral Health | $271 | $221 | $0 | $788 |
| *Line Staff Orientation* | | | | |
| Research Center | $949 | $943 | $191 | $2949 |
| Juvenile Justice | $380 | $240 | $89 | $1135 |
| Behavioral Health | $171 | $108 | $0 | $651 |
| *Site Feedback Report Leadership Meeting* | | | | |
| Research Center | $926 | $997 | $14 | $2972 |
| Juvenile Justice | $167 | $113 | $16 | $570 |
| Behavioral Health | $109 | $63 | $0 | $596 |
| *Goal Achievement Training: Goal Selection* | | | | |
| Research Center | $1311 | $909 | $397 | $4368 |
| Juvenile Justice | $427 | $350 | $132 | $1743 |
| Behavioral Health | $182 | $144 | $0 | $505 |
| *Goal Achievement Training: DDDM* | | | | |
| Research Center | $812 | $709 | $260 | $2488 |
| Juvenile Justice | $292 | $259 | $76 | $508 |
| Behavioral Health | $148 | $101 | $0 | $486 |
| *Monthly Check-In Orientation* | | | | |
| Research Center | $31 | $22 | $9 | $73 |
| Juvenile Justice | $21 | $14 | $9 | $95 |
| Behavioral Health | $4 | $0 | $0 | $43 |
| *Total Core Support Activities Cost* | | | | |
| Across All (RC, JJ, BH) | $11,083 | $10,587 | $1104 | $19,399 |
| Research Center (%) | 67 % | 70 % | 89 % | 70 % |
| Juvenile Justice (%) | 22 % | 20 % | 10 % | 25 % |
| Behavioral Health (%) | 11 % | 10 % | 1 % | 5 % |
Notes: RC = Research Center; JJ = Juvenile Justice agency; BH = behavioral health partner.
Table 2.
Total and Average Costs Across Implementation Phases by Research Center (RC).
| Implementation Phase | RC A Core | RC A Enhanced | RC B Core | RC B Enhanced | RC C Core | RC C Enhanced | RC D Core | RC D Enhanced | RC E Core | RC E Enhanced | RC F Core | RC F Enhanced | Average (SD), All | Median [IQR 25 %; 75 %], All |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Baseline | 27,991 (4176) | 32,414 (568) | 11,975 (2933) | 23,330 (4230) | 24,304 (2418) | 28,106 (1395) | 27,487 (545) | 45,629 (3759) | 25,845 (9158) | 36,013 (866) | 48,393 (3222) | 45,324 (3934) | 11,083 (5265) | 10,587 [7172; 16,609] |
| Early Experiment | 4470 (1003) | 7624 (235) | 2754 (149) | 13,487 (3408) | 7137 (3153) | 27,601 (2355) | 10,487 (1969) | 11,154 (1799) | 11,157 (3073) | 13,764 (5130) | 6767 (1625) | 22,697 (1323) | 4091 (3197) | 3462 [1181; 6082] |
| Late Experiment | 3467 (842) | 7531 (952) | 1670 (460) | 10,449 (2316) | 714 (301) | 18,761 (1272) | 1623 (406) | 12,367 (2494) | 14,600 (9272) | 9040 (3766) | 5230 (1160) | 17,991 (1508) | 3042 (3141) | 1774 [576; 5268] |
| Total Experiment | 7938 (1810) | 15,154 (1187) | 4424 (591) | 23,935 (5718) | 7852 (3454) | 46,362 (3624) | 12,110 (1653) | 23,521 (4278) | 25,756 (12,346) | 22,804 (8896) | 11,997 (2582) | 40,687 (2743) | 7134 (6003) | 5385 [2146; 11,063] |
| Total (All Phases) | 35,292 (3684) | 47,568 (1755) | 16,400 (3500) | 47,266 (2242) | 32,155 (3052) | 75,468 (4230) | 39,598 (1671) | 69,151 (5479) | 51,601 (3188) | 58,817 (8030) | 60,390 (5794) | 86,011 (6672) | 18,216 (8683) | 18,358 [11,275; 23,706] |
Notes: Standard deviation (SD) in parentheses. RC = Research Center. IQR = interquartile range.
Fig. 1.
Average Intervention Costs Per Site by Study Condition and Intervention Phase
Notes: Average costs per site adjusted for number of months spent in each phase. Total Costs include sustainment costs.
*Significant difference between Core and Enhanced at p < 0.05.
Fig. 2.
Change over Time in Core and Enhanced Implementation Costs by Phase.
3.2. Costs across implementation phases
Average costs across Baseline, Early Experiment and Late Experiment implementation phases are summarized by site and overall in Table 2. For the purpose of comparing implementation costs by Core and Enhanced study conditions, we present Baseline costs by these conditions as well, but note that randomization to Core or Enhanced did not occur until the end of the Baseline phase. As noted above, for comparability across sites, cost estimates are adjusted to account for site-specific differences in number of months spent in implementation phases. We also calculate an average cost-per-month and present those results in the appendix (Fig. A.3). There is significant variability in average costs across sites for all phases. Baseline represents the largest cost, driven by the staff time associated with the trainings and meetings described in Table 1. The greatest time and resources in the Baseline phase were invested by Site F; the lowest cost for this phase was in Site B.
Implementation costs during the Experiment phases (early, late) are also highly variable both within and across Core and Enhanced sites. Costs during the Early Experiment phase (first 6 months of the 12-month intervention period) range from $2754 for Core agencies in Site B to $27,601 for Enhanced agencies in Site C. Across all sites, Enhanced implementation costs are higher, as expected, given that Enhanced sites feature an Implementation Facilitator guiding implementation goals and encouraging monthly workgroup meetings. The Implementation Facilitator role was covered either by an RC staff member (valued as documented time spent on implementation facilitation activities) or by a consultant (valued per hour or per contract). Local change team attendance was also relatively higher in Enhanced sites. Site A has the lowest costs among Enhanced sites during the Early Experiment phase. On average, costs begin to decline during the second half of the Experiment period. Among Core sites, Late Experiment phase costs range from $714 in Site C to $14,600 in Site E; Enhanced site costs during the Late Experiment phase range from $7531 in Site A to $18,761 in Site C.
Total implementation costs across all RCs range from $16,400 to $60,390 among the Core sites and from $47,266 to $86,011 among the Enhanced sites. The cost data are highly skewed (see IQR), as illustrated in Fig. 1, which shows the mean intervention costs by study condition and implementation phase. Enhanced sites invest approximately $10,000 more in implementation costs on average than Core sites. Costs during the Early and Late Experiment phases were significantly different between Core and Enhanced, but these costs converged around $2500 on average for the entire Maintenance phase as sites exited the Experiment period. These latter costs provide only a preliminary understanding of the activities implemented following the Experiment period given the short duration of the Maintenance phase at the end of the study and should not be considered representative of a typical Maintenance phase.
Fig. A.3. in the appendix shows the average monthly cost by study condition and implementation phase. Across all phases, the average monthly cost is $786 per Enhanced site and $493 per Core site. During Baseline, average monthly costs during Core support trainings and other activities is $1522 per Enhanced site and $1095 per Core site. During the Early Experiment phase, average monthly costs are $901 per Enhanced site and $348 per Core site. These average monthly costs decline in Late Experiment to $711 in Enhanced and $233 in Core.
Fig. 2 illustrates how implementation costs change over the phases of the intervention for Core and Enhanced sites. Although randomization to Core or Enhanced did not occur until the end of the Baseline phase, Enhanced sites had higher Baseline costs relative to Core sites. Once the Experiment phase began, Enhanced sites maintained relatively higher costs than Core sites, but implementation costs for all sites declined over the Experiment period and converged as sites entered Maintenance. Given the baseline differences, it is important to note that the percent reduction in implementation costs across the Experiment period was 83 % for Enhanced sites and 70 % for Core sites.
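The percent-reduction comparison above uses a standard formula, sketched below with hypothetical figures (the inputs are illustrative, not the study's actual phase totals).

```python
# Sketch of the percent-reduction calculation used to compare cost declines
# across the Experiment period. Inputs are hypothetical examples.

def percent_reduction(start_cost, end_cost):
    """Percent decline in costs from the start to the end of a period."""
    return 100 * (start_cost - end_cost) / start_cost

# A site starting at $1,000/month and ending at $170/month declined 83%
print(percent_reduction(1000, 170))  # 83.0
```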
4. Discussion
This study describes results of a comprehensive cost analysis focusing on unique phases and resources of an implementation intervention targeting juvenile justice agencies and their behavioral health agency partners in 34 counties across seven states. This work was conducted as part of the economic analyses of JJ-TRIALS, with the goal of providing costing data to inform stakeholders about the resources required to support interagency collaboration to improve receipt of substance use services among YLS. The main findings of the study provide evidence of increased treatment referrals, initiation, and continuing care across all participants, but only a few significant differences between Core and Enhanced (Wasserman et al., 2021). In brief, Core intervention support in the Baseline phase led to increased receipt of behavioral health services over time for all sites. Compared with the Baseline period, the percentage of youth in need who were referred to treatment increased during the Experiment period, and the increase was greater in sites receiving Enhanced intervention support. However, the percentage of youth in need of treatment who were referred remained disappointingly low, regardless of condition. Referrals declined in the Enhanced condition during the Maintenance phase.
It is also noteworthy that there was considerable variation across sites in treatment referral rates and the impact of the Core and Enhanced implementation activities on referral (Belenko et al., 2022). Engagement and continuing care rates increased significantly between Baseline and Early Experiment but did not persist into Late Experiment phase. Finally, in terms of cascade penetration, earlier YLS cohorts showed greater penetration from Baseline to Early and Late Experiment phases; and the Enhanced sites had greater penetration in the Behavioral Health Cascade relative to Core (Knight et al., 2022).
This study represents one of the first implementation costing studies on the establishment of partnerships between JJ and BH agencies to improve services for youth with substance use disorders. Costs were summarized by phases guided by the EPIS framework, comprising Baseline, Early Experiment, Late Experiment, and Maintenance. We found that the Enhanced sites had higher baseline costs despite all sites receiving the same bundle of core support trainings to prepare for implementation. This was primarily due to Enhanced sites having more participants attending the core support trainings, as noted above. Intervention implementation costs were driven largely by time invested in monthly local change team meetings. The Enhanced sites had the additional cost of an implementation facilitator to help guide decisions and facilitate partnerships between juvenile justice and behavioral health agencies. The Core sites were left to decide meeting schedules on their own and, as a result, did not meet regularly during the Experiment phase. As expected, Enhanced sites had higher implementation costs, but both Enhanced and Core sites showed similar trends of decreasing costs across the Experiment period. These decreasing costs reflected both fewer meetings and lower participation over time. It is possible that once local change teams achieved their goals there was less reason to meet toward the end of the Experiment period. Regarding the Maintenance phase, we had some preliminary cost data showing continued meetings and other activities following the Experiment phase, but these data only suggest that agencies spent about the same amount of time conducting these maintenance meetings regardless of whether they were an Enhanced or Core site.
As a practical application, these results highlight the importance of tracking costs along implementation phases. As far as informing policy, considering that cost burden is noted among the top challenges faced by agencies striving to change existing practices or adopt new practices, having examples of the types of activities and costs associated with an agency-level implementation intervention is evidence policymakers can refer to in developing new strategies to improve service delivery or other goals going forward. Fig. 2 highlights that by the end of Early Experiment into Late Experiment, investments and costs by Core and Enhanced groups were merging. Identifying the causes for reduced resource allocations toward the end of the experiment phase could highlight gaps in the implementation process and suggest areas for re-investment in JJ-BH interagency collaboration.
4.1. Limitations
A few study limitations are notable. In the assessment of Baseline/core support costs, not all Research Center sites completed the PICTW. There were gaps in funding the economic analysis of JJ-TRIALS, but by Wave 3, all sites participated in Baseline cost data collection. As noted in the Methods section, we did have some Baseline cost data from Wave 1 (4 sites) and Wave 2 (2 sites), and used these data to impute costs (adjusting for site-specific YLS populations) for sites across all Waves. Another limitation is the reliance on key informants to summarize staff attendance and time invested in meetings, including pre/post-meeting effort. Because staff time drove Baseline and Experiment phase costs, errors in reporting this information would have a significant impact on the economic results. It is also possible that a similar implementation intervention conducted outside of a research study would have lower costs; notably, the RCs incurred the majority of the intervention-related costs across all phases of the study. Regarding Maintenance, given that this phase only covers up to 6 months post-Experiment, it is unclear whether the trend of decreasing costs over time is generalizable. Finally, although the costs for both Enhanced and Core sites were relatively low at the end of the Experiment phase, we did not explore potential funding mechanisms (reimbursement, financing, grant support) available to support such partnerships between justice and behavioral health agencies. This would be a critical piece in promoting broader adoption of these intervention components and would speak to the feasibility of sustaining these activities.
5. Conclusion
Creating partnerships between juvenile justice and behavioral health agencies is critically important to improving access to substance use disorder treatment and other services for justice-involved youth. Interventions geared toward unifying these siloed systems are needed, but significant barriers remain. Cost burden – long acknowledged as one of the primary barriers – remains a challenge, as agencies may not have the resources or staff readily available to allocate to the additional meetings and other activities required to form and sustain monthly workgroups. Qualitative interviews with JJ-TRIALS JJ agency stakeholders about lessons learned from the study highlight concerns regarding staff burden, the importance of continuing to improve communication across and within agencies, and the value of identifying deficiencies in connecting youth to treatment providers so that solutions can be found proactively (Cawood et al., 2020). Beyond centralized budgets on the justice side, behavioral health agencies lack reimbursement mechanisms for collaborative partnerships, and other challenges, such as high staff turnover in both sectors, will naturally impede the ability to form and sustain such partnerships. As the nation continues to grapple with high rates of substance use disorders – especially opioid use disorders – community-level interventions are being evaluated to improve access to and retention in needed services. Many of the tools and strategies implemented in JJ-TRIALS are being repurposed under the NIH HEAL initiatives, where community engagement and multiagency partnerships are being formalized to target opioid use and overdose (Aldridge et al., 2020; Murphy et al., 2021; Walsh et al., 2020).
From an economic perspective, JJ-TRIALS shows that such partnerships can be attained at relatively modest cost, yet even modest costs may be burdensome enough to make sustaining partnerships across the juvenile justice and behavioral health sectors nonviable. Agencies need to be aware of the size of the investments required to prepare for and implement new strategies that improve Behavioral Health Cascade retention. Future work should track these implementation costs more systematically and formalize sources of funding support that free up staff time, allowing agencies to stay engaged in workgroups that can significantly improve interagency service planning and linkage. Targeted data collection that permits exploration of heterogeneity in resource requirements and costs by setting (e.g., rural vs. urban) and population subgroup is also an important goal for future work.
Supplementary Material
Supplementary data to this article can be found online at https://doi.org/10.1016/j.josat.2025.209721.
Funding sources for research
U01DA036221; U01DA036226; U01DA036233; U01DA036176; U01DA036225; U01DA036224; U01DA036158; R21DA044378; P30DA040500.
Footnotes
CRediT authorship contribution statement
Kathryn E. McCollister: Writing – original draft, Resources, Project administration, Methodology, Investigation, Funding acquisition, Formal analysis, Conceptualization. Diana Bowser: Writing – review & editing, Writing – original draft, Methodology, Investigation, Funding acquisition, Formal analysis, Conceptualization. Jenny E. Becan: Writing – review & editing, Writing – original draft. Danica K. Knight: Writing – review & editing, Writing – original draft. Steven Belenko: Writing – review & editing, Writing – original draft. Angela A. Robertson: Writing – review & editing. Michael L. Dennis: Writing – review & editing, Methodology.
Declaration of competing interest
All authors report grant or contract funding from the National Institute on Drug Abuse (NIDA) for this study. All authors declare no competing interests.
References
- Aarons GA, Brown SA, Hough RL, Garland AF, & Wood PA (2001). Prevalence of adolescent substance use disorders across five sectors of care. Journal of the American Academy of Child & Adolescent Psychiatry, 40(4), 419–426. 10.1097/00004583-200104000-00010
- Aarons GA, Hurlburt M, & Horwitz SM (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23. 10.1007/s10488-010-0327-7
- Aldridge AP, Barbosa C, Barocas JA, Bush JL, Chhatwal J, Harlow KJ, … Morgan JR (2020). Health economic design for cost, cost-effectiveness and simulation analyses in the HEALing communities study. Drug and Alcohol Dependence, 217, Article 108336.
- Belenko S, & Dembo R (2003). Treating adolescent substance abuse problems in the juvenile drug court. International Journal of Law and Psychiatry, 26(1), 87–110. 10.1016/S0160-2527(02)00205-4
- Belenko S, & Logan T (2003). Delivering more effective treatment to adolescents: Improving the juvenile drug court model. Journal of Substance Abuse Treatment, 25(3), 189–211. 10.1016/S0740-5472(03)00123-5
- Belenko S, Sprott JB, & Petersen C (2004). Drug and alcohol involvement among minority and female juvenile offenders: Treatment and policy issues. Criminal Justice Policy Review. 10.1177/0887403403255068
- Belenko S, Dembo R, Rollie M, Childs K, & Salvatore C (2009). Detecting, preventing, and treating sexually transmitted diseases among adolescent arrestees: An unmet public health need. American Journal of Public Health, 99(6), 1032–1041. 10.2105/AJPH.2007.122937
- Belenko S, Knight D, Wasserman GA, Dennis ML, Wiley T, Taxman FS, … Sales J (2017). The Juvenile Justice Behavioral Health Services Cascade: A new framework for measuring unmet substance use treatment services needs among adolescent offenders. Journal of Substance Abuse Treatment, 74, 80–91. 10.1016/j.jsat.2016.12.012
- Belenko S, Dembo R, Knight DK, Elkington KS, Wasserman GA, Robertson AA, … Wiley T (2022). Using structured implementation interventions to improve referral to substance use treatment among justice-involved youth: Findings from a multisite cluster randomized trial. Journal of Substance Abuse Treatment, 140, Article 108829. 10.1016/j.jsat.2022.108829
- Binard J, & Prichard M (2008). Model policies for juvenile justice and substance abuse treatment: A report by reclaiming futures. Robert Wood Johnson Foundation. https://www.ojp.gov/ncjrs/virtual-library/abstracts/model-policies-juvenile-justice-and-substance-abuse-treatment
- Bowser D, Henry BF, Wasserman GA, Knight D, Gardner S, Krupka K, … Robertson A (2018). Comparison of the overlap between juvenile justice processing and behavioral health screening, assessment and referral. Journal of Applied Juvenile Justice Services, 2018, 97–125.
- Bowser DM, Henry BF, & McCollister KE (2021). Cost analysis in implementation studies of evidence-based practices for mental health and substance use disorders: A systematic review. Implementation Science, 16(1), 26. 10.1186/s13012-021-01094-3
- Cawood M, Koontz V, Arrigona N, Robertson A, Blackwell L, Scanu-Hansen T, … Welsh WN (2020). Journal of Applied Juvenile Justice Services. Available online at https://www.npjs.org/resources/journal-of-applied-juvenile-justice-services/2020-issues
- Cidav Z, Mandell D, Pyne J, Beidas R, Curran G, & Marcus S (2020). A pragmatic method for costing implementation strategies using time-driven activity-based costing. Implementation Science, 15(1), 28. 10.1186/s13012-020-00993-1
- Corso PS, Taylor N, Bennett J, Ingels J, Self-Brown S, & Whitaker DJ (2014). Marginal cost analysis of two train-the-trainer models for implementing SafeCare®. Western Journal of Emergency Medicine, 15(5), 623–626. 10.5811/westjem.2014.4.21422
- Elkington KS, Bauermeister JA, & Zimmerman MA (2010). Psychological distress, substance use, and HIV/STI risk behaviors among youth. Journal of Youth and Adolescence, 39(5), 514–527. 10.1007/s10964-010-9524-7
- French MT, Zavala SK, McCollister KE, Waldron HB, Turner CW, & Ozechowski TJ (2008). Cost-effectiveness analysis (CEA) of four interventions for adolescents with a substance use disorder. Journal of Substance Abuse Treatment, 34(3), 272–281. 10.1016/j.jsat.2007.04.008
- Garner BR (2022). From innovative applications of the effectiveness-implementation hybrid trial design to the dissemination, implementation, effectiveness, sustainment, economics, and level-of-scaling hybrid trial design. Frontiers in Health Services, 2. 10.3389/frhs.2022.1007750
- Garner BR, Godley SH, Dennis ML, Hunter BD, Bair CML, & Godley MD (2012). Using pay for performance to improve treatment implementation for adolescent substance use disorders: Results from a cluster randomized trial. Archives of Pediatrics & Adolescent Medicine, 166(10), 938–944. 10.1001/archpediatrics.2012.802
- Garner BR, Lwin AK, Strickler GK, Hunter BD, & Shepard DS (2018). Pay-for-performance as a cost-effective implementation strategy: Results from a cluster randomized trial. Implementation Science, 13(1), 92. 10.1186/s13012-018-0774-1
- Gold HT, McDermott C, Hoomans T, & Wagner TH (2022). Cost data in implementation science: Categories and approaches to costing. Implementation Science, 17(1), 11. 10.1186/s13012-021-01172-6
- Grisso T, & Underwood LA (2004). Screening and assessing mental health and substance use disorders among youth in the juvenile justice system: A resource guide for practitioners. US Department of Justice. https://eric.ed.gov/?id=ED484681
- Hinde JM, Garner BR, Watson CJ, Ramanan R, Ball EL, & Tueller SJ (2022). The implementation & sustainment facilitation (ISF) strategy: Cost and cost-effectiveness results from a 39-site cluster randomized trial integrating substance use services in community-based HIV service organizations. Implementation Research and Practice, 3, Article 26334895221089266. 10.1177/26334895221089266
- Jainchill N (Ed.). (2012). Understanding and treating adolescent substance use disorders: Assessment, treatment, juvenile justice responses. Civic Research Institute.
- Knight DK, Belenko S, Dennis ML, Wasserman GA, Joe GW, Aarons GA, … Wiley TRA (2022). The comparative effectiveness of Core versus Core+enhanced implementation strategies in a randomized controlled trial to improve substance use treatment receipt among justice-involved youth. BMC Health Services Research, 22(1), 1535. 10.1186/s12913-022-08902-6
- Marsden ME, & Straw RS (2000). Substance abuse treatment in adult and juvenile correctional facilities: Findings from the uniform facility data set 1997 survey of correctional facilities. National Clearinghouse for Alcohol and Drug Information. https://eric.ed.gov/?id=ED449405
- Michaud TL, Pereira E, Porter G, Golden C, Hill J, Kim J, … Estabrooks PA (2022). Scoping review of costs of implementation strategies in community, public health and healthcare settings. BMJ Open, 12(6), Article e060785. 10.1136/bmjopen-2022-060785
- Murphy SM, Laiteerapong N, Pho MT, Ryan D, Montoya I, Shireman TI, … McCollister KE (2021). Health economic analyses of the justice community opioid innovation network (JCOIN). Journal of Substance Abuse Treatment, 128, Article 108262.
- Nissen LB, Butts JA, Merrigan D, & Kraft MK (2006). The RWJF reclaiming futures initiative: Improving substance abuse interventions for justice-involved youths. Juvenile and Family Court Journal, 57(4), 39–51. 10.1111/j.1755-6988.2006.tb00130.x
- O*NET Online. (n.d.). National Center for O*NET Development. Retrieved March 1, 2025, from onetonline.org
- Saldana L, Chamberlain P, Bradford WD, Campbell M, & Landsverk J (2014). The cost of implementing new strategies (COINS): A method for mapping implementation resources using the stages of implementation completion. Children and Youth Services Review, 39, 177–182. 10.1016/j.childyouth.2013.10.006
- Salloum RG, Wagner TH, Midboe AM, Daniels SI, Quanbeck A, & Chambers DA (2022). The economics of adaptations to evidence-based practices. Implementation Science Communications, 3(1), 100. 10.1186/s43058-022-00345-8
- Scott CK, Dennis ML, Grella CE, Funk RR, & Lurigio AJ (2019). Juvenile justice systems of care: Results of a national survey of community supervision agencies and behavioral health providers on services provision and cross-system interactions. Health & Justice, 7(1), 11. 10.1186/s40352-019-0093-x
- Teplin LA, Elkington KS, McClelland GM, Abram KM, Mericle AA, & Washburn JJ (2005). Major mental disorders, substance use disorders, comorbidity, and HIV-AIDS risk behaviors in juvenile detainees. Psychiatric Services, 56(7), 823–828. 10.1176/appi.ps.56.7.823
- Teplin LA, Mericle AA, McClelland GM, & Abram KM (2003). HIV and AIDS risk behaviors in juvenile detainees: Implications for public health policy. American Journal of Public Health, 93(6), 906–912. 10.2105/AJPH.93.6.906
- the JJ-TRIALS Cooperative, Knight DK, Belenko S, Wiley T, Robertson AA, Arrigona N, … Leukefeld C (2015). Juvenile justice—Translational research on interventions for adolescents in the legal system (JJ-TRIALS): A cluster randomized trial targeting system-wide improvement in substance use services. Implementation Science, 11(1), Article 57. 10.1186/s13012-016-0423-5
- Tolou-Shams M, Brown LK, Gordon G, & Fernandez I (2007). Arrest history as an indicator of adolescent/young adult substance use and HIV risk. Drug and Alcohol Dependence, 88(1), 87–90. 10.1016/j.drugalcdep.2006.09.017
- Vale L, Thomas R, MacLennan G, & Grimshaw J (2007). Systematic review of economic evaluations and cost analyses of guideline implementation strategies. The European Journal of Health Economics, 8(2), 111–121. 10.1007/s10198-007-0043-8
- Wagner T, Yoon J, Jacobs J, So A, Kilbourne A, Wei Y, & Goodrich D (2020). Estimating costs of an implementation intervention. Medical Decision Making. 10.1177/0272989X20960455
- Wagner TH (2020). Rethinking how we measure costs in implementation research. Journal of General Internal Medicine, 35(2), 870–874. 10.1007/s11606-020-06104-6
- Walsh SL, El-Bassel N, Jackson RD, Samet JH, Aggarwal M, Aldridge AP, … Battaglia TA (2020). The HEALing (helping to end addiction long-term SM) communities study: Protocol for a cluster randomized trial at the community level to reduce opioid overdose deaths through implementation of an integrated set of evidence-based practices. Drug and Alcohol Dependence, 217, Article 108335.
- Wasserman GA, McReynolds LS, Schwalbe CS, Keating JM, & Jones SA (2010). Psychiatric disorder, comorbidity, and suicidal behavior in juvenile justice youth. Criminal Justice and Behavior, 37(12), 1361–1376. 10.1177/0093854810382751
- Wasserman GA, McReynolds LS, Taxman FS, Belenko S, Elkington KS, Robertson AA, … Wiley TRA (2021). The missing link (age): Multilevel contributors to service uptake failure among youths on community justice supervision. Psychiatric Services, 72(5), 546–554. 10.1176/appi.ps.202000163
- White LM, Aalsma MC, Salyers MP, Hershberger AR, Anderson VR, Schwartz K, … McGrew JH (2019). Behavioral health service utilization among detained adolescents: A meta-analysis of prevalence and potential moderators. The Journal of Adolescent Health: Official Publication of the Society for Adolescent Medicine, 64(6), 700–708. 10.1016/j.jadohealth.2019.02.010
- Young DW, Dembo R, & Henderson CE (2007). A national survey of substance abuse treatment for juvenile offenders. Journal of Substance Abuse Treatment, 32(3), 255–266. 10.1016/j.jsat.2006.12.018
- Zarkin GA, Dunlap LJ, & Homsi G (2004). The substance abuse services cost analysis program (SASCAP): A new method for estimating drug treatment services costs. Evaluation and Program Planning, 27(1), 35–43.