Abstract
Characterizing community mental health (CMH) treatment duration and discharge is an important step toward understanding how to better meet client needs. This report describes patterns of treatment duration and discharge among clinicians participating in a state-funded evidence-based treatment (EBT) training initiative. After training and consultation, clinicians (N=376) reported on treatment duration and discharge for their “most complete case.” On average, clinicians delivered 12.4 sessions (SD=5.1) of the treatment. After completing treatment, over half of clinicians (58.7%) continued with regularly scheduled therapy, either using EBT elements or switching to supportive therapy. Clinicians who continued with regularly scheduled therapy delivered the treatment in approximately the same number of sessions as those who discontinued. Results revealed that CMH clinicians often do not discontinue therapy after completing a treatment protocol. These findings suggest it may be essential to better understand clinician decision-making around applying EBTs to their caseloads.
Keywords: community mental health, evidence-based treatment, treatment duration, treatment discharge
In the United States, 7.7 million (16.5%) children had at least one behavioral health disorder in the last year.1 One avenue for accessing mental health services is publicly-funded community mental health (CMH) agencies. In efforts to improve access to and quality of care in CMH, researchers have sought to characterize usual care in CMH to assess strengths and gaps in the implementation of evidence-based treatments (EBTs).2–4 Although research suggests that elements consistent with many EBTs are delivered to clients in CMH, some elements are delivered with very low intensity while others may be rarely delivered (e.g., assigning and reviewing homework).2 As such, stakeholders within CMH have focused on providing EBT training and support to increase use of EBTs within CMH;5 however, these efforts have seen mixed success.6 While evaluating EBT use in CMH, it is important to consider both the context of CMH and how EBTs may be applied in that context. Understanding how EBTs are applied in the context of CMH may provide useful information to tailor EBT implementation efforts.
Although some EBTs were explicitly developed to fit within the constraints of CMH (e.g.,7), many EBTs were not originally designed for the context of CMH. EBTs are typically intended to be focused and time-limited, often delivered in 12–16 weekly sessions lasting 60–90 minutes in efficacy studies.8 In CMH, it has historically been less common for regular weekly sessions to occur, and treatment duration has tended to be longer. One study of CMH treatment utilization found that CMH clinicians provide care for an average of about 23 sessions.2 Another study of treatment utilization patterns among children in CMH (N=30,055) found that only 17% of cases had regular sessions over a six-month period.9 Many children seeking treatment in CMH were seen only once or had infrequent appointments.9 These differences may in part be explained by the financial context of CMH, where reimbursements and clinical revenue may not cover the cost of EBT delivery.10 Differences in CMH treatment may also reflect a variety of differences between CMH and EBT efficacy trials, including differences in client presentation11 and session content.2 CMH clients may present with high comorbidity (i.e., multiple presenting problems) that may not be treated with just one EBT11 or face life stressors (e.g., housing insecurity or job loss) that require attention during EBT sessions.12 Further, CMH clinicians may be required to take on additional roles to support clients, such as engaging in efforts to coordinate external care (i.e., case management).2 CMH clinicians may not feel equipped to address client complexity or crises within EBT protocols,13 thereby prompting clinicians to pause EBT delivery and prolong care as they address client concerns across multiple domains (i.e., both clinical and social).
While research has examined EBT uptake and compared usual care to EBT use,14 little is known about how EBTs are integrated into clinicians’ usual practice in CMH, including how long they deliver EBTs and whether they discharge clients after completing EBTs. Research has largely examined total treatment length in CMH,2,12 but characterizing when and how CMH clinicians apply EBTs in their caseloads, as well as what CMH clinicians do after completing an EBT, may be an important step toward understanding how to better meet CMH clients’ needs in less time. Condensing EBT delivery could make treatment more feasible and affordable for clients while improving cost-effectiveness and treatment reach for CMH systems. This study examines treatment duration and discharge in CMH settings following a state-funded EBT training initiative focused on children and adolescents. Treatment duration and discharge were operationalized by examining how many sessions were used to deliver an EBT, whether clinicians completed delivery of the EBT, and what happened after EBT delivery (e.g., client discharge, continued therapy, booster sessions, etc.). The goal of this manuscript is to advance the understanding of EBT implementation in CMH and provide important context for EBT training and research efforts. To that end, this manuscript describes patterns of treatment duration and discharge among clinicians participating in a state-funded EBT training initiative.
Method
Clinician participants were part of the Washington State-funded training initiative for EBTs in CMH. The initiative, referred to as CBT+,15 builds on other transdiagnostic approaches16 and provides in-person training and remote expert consultation on Cognitive Behavioral Therapy (CBT) for four target presenting problems: anxiety, depression, trauma, and disruptive behavior. Participants attended a three-day in-person training and received six months of twice-monthly, group-based expert consultation by phone. To receive a CBT+ certificate of completion, participants were required to participate in at least 9 of 12 consultation calls, complete a pre-training and post-consultation survey, present at least one case during the consultation calls, and enter data online for cases in which they delivered the treatment.
Data for these analyses came from two self-report surveys collected pre-training and post-consultation over two years of CBT+ training for CMH clinicians (September 2017-September 2019). In the pre-training survey, participants provided demographic (e.g., age) and background information (e.g., theoretical orientation). In the post-consultation survey (approximately seven months post-training), participants evaluated the initiative (e.g., self-rated CBT skill, training/consultation feedback) and reported on treatment delivery duration and client discharge. Given the aim of describing patterns of treatment after training, this study utilizes a single-group, posttest-only design. To assess duration, clinicians were prompted to think of their “most complete client that [they] delivered one of the CBT+ EBTs to during the [six-month] consultation period.” Clinicians were asked to think of their “most complete client” to get a better sense of how clinicians applied the full EBT from start to finish with a client (i.e., the full treatment and not select modules; not cases exclusively focused on crisis management). The researchers felt grounding this question in a “complete case” would reduce bias compared to assessing cases that were not yet completed and may have had a higher proportion of non-EBT sessions solely due to the timing of the survey. The researchers also expected that grounding the question in a specific case would provide more informative data than questions about hypothetical or general caseloads.
Clinicians were asked: 1) how many sessions they spent delivering the CBT+ treatment, and 2) what they did after completing the treatment. When describing activity after completing treatment, clinicians could select multiple options (presented in Table 1). Participant responses were categorized into two main groups, determined by whether clients had consistently scheduled sessions following EBT completion: continued regular treatment and discontinued regular treatment. Clinicians were considered to continue regular treatment if they selected “Continue regularly scheduled therapy using CBT+ treatment elements” or “Switch to supportive therapy.” Clinicians were considered to discontinue regular treatment if they selected “Discharge the client from therapy;” “Plan for or schedule a booster session for the treatment;” or “Keep the door open for client-initiated sessions.”
Table 1.
Post-Treatment Activity
| Continued Regular Treatment | Discontinued Regular Treatment |
|---|---|
| Continue regularly scheduled therapy using CBT+ treatment elements | Discharge the client from therapy |
| Switch to supportive therapy | Plan for or schedule a booster session for the treatment |
| | Keep the door open for client-initiated sessions |
All clinicians (N=461) completed the pre-training survey, and 81.6% (N=376) also completed the post-consultation survey. Thirteen cases were excluded from analyses because clients transitioned to other agencies or types of care (e.g., wraparound services or DBT) or clinicians left their agency, yielding a final sample of 363 clinicians/cases. Based on visual inspection of covariates, survey completers and non-completers did not appear to differ systematically. CBT+ initiative activities were reviewed by the University of Washington Institutional Review Board and determined to be exempt from review.
Analyses were conducted using R version 3.6.0. Descriptive statistics are reported to characterize participating clinicians and to examine treatment duration and client discharge. A chi-square test was used to assess the association between theoretical orientation (CBT vs. other) and continuation of regular therapy. Because both the number of sessions before EBT delivery and the duration of EBT delivery failed the Shapiro-Wilk test of normality, Mann-Whitney-Wilcoxon tests were performed to compare these variables between clinicians who continued and discontinued regular therapy.17
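To make the analytic steps concrete, the sketch below shows how these tests can be run in base R. It is a minimal illustration only, not the study's analysis code: the data frame `cases` and its column names (`orientation_cbt`, `continued_regular`, `ebt_sessions`) are hypothetical stand-ins, and the simulated values are not study data.

```r
# Minimal sketch of the analyses described above (hypothetical data layout).
# 'cases' has one row per clinician/case:
#   orientation_cbt   - "CBT" vs. "Other" primary theoretical orientation
#   continued_regular - "Continued" vs. "Discontinued" regular therapy after the EBT
#   ebt_sessions      - number of sessions spent delivering the EBT

set.seed(1)  # simulated example data for illustration only (not study data)
cases <- data.frame(
  orientation_cbt   = factor(sample(c("CBT", "Other"), 290, replace = TRUE)),
  continued_regular = factor(sample(c("Continued", "Discontinued"), 290,
                                    replace = TRUE, prob = c(0.7, 0.3))),
  ebt_sessions      = pmax(0, round(rnorm(290, mean = 12.4, sd = 5.1)))
)

# Descriptive statistics for EBT delivery duration
mean(cases$ebt_sessions)
sd(cases$ebt_sessions)

# Chi-square test: theoretical orientation x continuation of regular therapy
chisq.test(table(cases$orientation_cbt, cases$continued_regular))

# Shapiro-Wilk test of normality for duration of EBT delivery
shapiro.test(cases$ebt_sessions)

# Mann-Whitney-Wilcoxon test comparing duration between groups
wilcox.test(ebt_sessions ~ continued_regular, data = cases)
```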
Results
Participants were predominantly female (83.7%, n=307) and White (67.8%, n=248). Most had a master’s degree (94.3%, n=345), and a plurality identified their primary theoretical orientation as Cognitive Behavioral (36.2%; n=133). On average, participants were 35.1 (SD=10.3) years old with 3.5 (SD=4.2) years of experience providing psychotherapy.
On average, clinicians reported spending 12.4 sessions (SD=5.1; range: 0–40) delivering the treatment. Approximately one-fifth of clinicians (20.1%; n=73) had not completed the treatment at the time of the survey; these cases are not included in the average treatment duration. Only 21.2% of clinicians (n=77) reported discharging their clients (i.e., discontinuing regularly scheduled sessions). A breakdown of post-treatment activity, including discharge, is presented in Table 2. Continuation of therapy did not vary with theoretical orientation, χ²(21, N = 92) = 16.7, p = 0.7.
Table 2.
Participant Demographics and EBT Delivery
| Participant Demographics | | |
|---|---|---|
| | Mean | SD |
| Age | 35.1 | 10.3 |
| Years providing therapy | 3.5 | 4.2 |
| Average caseload | 27.0 | 16.9 |
| | n | % |
| Female | 307 | 83.7 |
| Race and Ethnicity | ||
| White | 248 | 67.8 |
| Hispanic or Latino | 42 | 11.5 |
| Asian | 16 | 4.4 |
| Black/African American | 15 | 4.1 |
| American Indian or Alaska Native | 3 | 0.8 |
| Native Hawaiian or other Pacific Islander | 3 | 0.8 |
| Multiracial | 28 | 7.7 |
| Other | 11 | 3.0 |
| Education Level | ||
| Bachelor’s | 5 | 1.4 |
| Master’s | 345 | 94.3 |
| Doctoral | 6 | 1.6 |
| Other | 10 | 2.7 |
| Primary Theoretical Orientation | ||
| Cognitive Behavioral Therapy | 133 | 36.2 |
| Integrative/Eclectic | 104 | 28.3 |
| Family Systems | 42 | 11.4 |
| Humanistic/Existential | 38 | 10.4 |
| Interpersonal | 20 | 5.4 |
| Psychodynamic/analytic | 8 | 2.2 |
| Biological | 1 | 0.3 |
| Other | 21 | 5.7 |
| EBT Delivery* | | |
| | M | SD |
| Duration of CBT+ Treatment delivery (sessions) | 12.4 | 5.1 |
| | n | % |
| Discontinued Regular Therapy | 77 | 21.2 |
| Discharged | 68 | |
| Plan for or schedule a booster session for the treatment | 6 | |
| Keep the door open for client-initiated sessions | 19 | |
| Other | 7 | |
| Continued Regular Therapy | 213 | 58.7 |
| Continue regularly scheduled therapy using CBT+ elements | 183 | |
| Switch to supportive therapy | 70 | |
| Did Not Complete Treatment | 73 | 20.1 |
*Participants could select more than one answer choice for post-treatment activity, so percentages will not add up to 100%.
Among clinicians who discontinued regular therapy (n=77), clinicians reported spending 11.7 sessions (SD=3.5; range: 2–20) delivering the treatment. For those who continued with regularly scheduled therapy (n=213), clinicians reported spending 12.9 sessions (SD=5.5; range: 0–40) delivering the treatment. Mann-Whitney-Wilcoxon tests indicated that clinicians who continued regularly scheduled therapy did not take significantly more sessions to deliver the EBT than those who discontinued (p=0.27).
Discussion
EBTs for common mental and behavioral health disorders are increasingly implemented in CMH settings.18 The goal of EBT rollout is to increase access to high-quality mental health care for all populations; however, little attention has been paid to understanding how CMH clinicians apply EBTs to specific cases, including the content of treatment sessions, length of EBT delivery, and rates of client discharge. These findings provide some information about clinician-reported EBT dosage for their most complete case. These results may reflect noted differences between how EBTs were originally designed, as brief, targeted treatments with an emphasis on one presenting problem and a goal of client discharge, and the context of treatment delivery in publicly-funded settings, where clinicians juggle treatment provision and case management to address client complexity and stressors.12
Clinicians reported delivering the treatment in a time-limited way, consistent with many EBT manuals (e.g.,19). This finding contrasts with other studies, which found longer treatment duration in CMH,2,20 and suggests that CMH clinicians may deliver EBTs with pacing similar to efficacy trials in some cases. The differences between previous findings and the current study may be due in part to differences in methods. Whereas previous research has reviewed charts or directly observed practice, the current study relied on clinician self-report, which has been shown to be a valid proxy for observational methods, particularly for more structured EBTs.21 By asking clinicians to report on the length of EBT delivery in their “most complete case,” these results distinguish between focused EBT delivery and total treatment length, whereas other studies assume total treatment length is equivalent to focused EBT delivery. For example, many clinicians reported that they continued regular therapy with CBT+ elements after completing the CBT+ treatment. However, these sessions were not included in the calculation of EBT delivery length because many clinicians anecdotally reported drawing from CBT+ strategies to engage clients in work not directly related to one of CBT+’s target presenting problems. Distinguishing focused EBT delivery from total treatment length allows for a more accurate characterization of EBT delivery in CMH. The findings suggest most clinicians perceived that they were able to provide CBT for target problems within the session range intended by developers, at least for a subset of their clients.
Regarding client discharge, these findings suggest CMH clinicians may face barriers to completing treatment and that, once treatment is completed, it is not standard practice to discharge children from regularly scheduled therapy (i.e., weekly sessions). When asked to report on their “most complete case,” approximately one-fifth of clinicians reported they had not yet completed the CBT+ treatment. This may reflect barriers to completing treatment protocols in CMH, such as client stressors that require attention in session,12 funding challenges,10 or inconsistent attendance.9 Further, only a minority of clients were discharged from regularly scheduled therapy when treatment was completed. Over half (58.7%) of clients continued to receive ongoing therapy. Of note, the majority of cases that continued regularly scheduled sessions did include CBT+ elements (72.3%).
The contextual realities of CMH may explain why clients are not discharged from regularly scheduled therapy. As noted previously, children in CMH may present inconsistently9 and/or have complex and comorbid diagnoses.11 Further, children in CMH are more likely to be of ethnic minority status, have lower socioeconomic status, and report greater numbers of life stressors.22 As a result, the expectations of CMH clinicians may extend beyond traditional treatment delivery. CMH clinicians may continue to see clients after completing EBT delivery to address other client concerns, such as co-morbid problems or case management. CMH clinicians may also continue to see clients to provide support with life stressors, such as experiences of discrimination or housing insecurity. Previous research in CMH has shown that clinicians often engage in case management and care coordination.2
Although CBT+ treatment duration did not significantly differ between clinicians who did and did not discharge clients, these findings call greater attention to the applicability of EBT protocols to some clients in CMH. Others have noted the limited applicability of EBTs to children in CMH.11 The large number of clinicians who reported continuing regular therapy may reflect client complexity. These numbers may also reflect clinicians’ beliefs that EBTs are less relevant for some clients or that EBTs do not allow clinicians to attend to clients’ other needs, such as case management. EBT training initiatives and research efforts would benefit from a greater understanding of how CMH clinicians approach ending EBTs and discharging clients, including decision-making regarding treatment discharge, providing needed case management, and addressing life stressors. Such findings can also help inform understanding of when, and for whom, clinicians continue therapy after EBT delivery.
These findings should be considered within the context of several limitations. Notably, the survey from which these data were collected asked clinicians to think of only one client, specifically their “most complete” client. By reporting on their “most complete” case, clinicians may have been reporting on their best-case scenario. As such, these findings are not meant to represent aggregate clinician behavior and may reflect only the best case out of an average caseload of 27 clients for clinicians in this study. Further, clinicians were not provided guidance about what constituted their “most complete” case, nor did they report additional information about those cases, such as presenting problems, treatment fidelity, or client outcomes. This further limits the generalizability of the findings and the ability to make broader claims about what might be associated with treatment duration and discharge. For example, these findings do not account for treatment fidelity or client outcomes, both of which could affect treatment duration and client discharge. Additionally, data on client activity were collected through clinicians’ retrospective self-report and thus may have been subject to self-report or recall biases.
Implications for Behavioral Health.
Implementation of EBTs has been proposed as a way of improving behavioral health outcomes of children and adolescents receiving treatment in CMH settings. However, efforts to increase EBT implementation have not been fully informed by the CMH context in which providers deliver EBTs to their clients. Findings from this study suggest that CMH clinicians are likely to continue providing therapy following EBT delivery. Stakeholders may need to better understand clinician decision-making around applying EBTs to their caseloads. The field may also need to better adapt EBT implementation efforts to the CMH context, in which EBTs may be part of improving quality of care (as suggested by Park and colleagues11) but do not supplant usual care or necessarily shorten treatment duration.
Disclosures and acknowledgements:
This publication was made possible in part by funding from the Washington State Department of Social and Health Services, Division of Behavioral Health and Recovery (WA DBHR). Ms. Berliner and Dr. Dorsey were paid by WA DBHR to serve as trainers and consultants for the activities outlined in this study. Manuscript preparation was supported by an NIMH fellowship to Mr. Triplett (F31MH124328).
Conflict of Interest Statement
This publication was made possible in part by funding from the Washington State Department of Social and Health Services, Division of Behavioral Health and Recovery (DBHR). Dr. Dorsey and Ms. Berliner were paid by DBHR to serve as trainers and consultants for the activities outlined in this study. The remaining authors declare that they have no conflict of interest.
Footnotes
Previous presentation: Findings not previously presented.
Human Participants: All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent: Informed consent was obtained from all individual participants included in the study.
References
1. Whitney DG, Peterson MD. US National and State-Level Prevalence of Mental Health Disorders and Disparities of Mental Health Care Use in Children. JAMA Pediatrics. 2019;173(4):389–391. doi:10.1001/jamapediatrics.2018.5399
2. Garland AF, Brookman-Frazee L, Hurlburt MS, et al. Mental health care for children with disruptive behavior problems: A view inside therapists’ offices. Psychiatric Services. 2010;61(8):788–795. doi:10.1176/ps.2010.61.8.788
3. Weisz JR, Ugueto AM, Cheron DM, et al. Evidence-based youth psychotherapy in the mental health ecosystem. The Journal of Clinical Child and Adolescent Psychology. 2013;42(2):274–286. doi:10.1080/15374416.2013.764824
4. Edmunds JM, Beidas RS, Kendall PC. Dissemination and implementation of evidence-based practices: Training and consultation as implementation strategies. Clinical Psychology: Science and Practice. 2013;20(2):152–165. doi:10.1111/cpsp.12031
5. Rubin RM, Hurford MO, Hadley T, et al. Synchronizing Watches: The Challenge of Aligning Implementation Science and Public Systems. Administration and Policy in Mental Health and Mental Health Services Research. 2016;43(6):1023–1028. doi:10.1007/s10488-016-0759-9
6. Beidas RS, Williams NJ, Becker-Haimes EM, et al. A repeated cross-sectional study of clinicians’ use of psychotherapy techniques during 5 years of a system-wide effort to implement evidence-based practices in Philadelphia. Implementation Science. 2019;14(1):67. doi:10.1186/s13012-019-0912-4
7. Chorpita BF, Weisz JR. Modular Approach to Therapy for Children with Anxiety, Depression, Trauma, or Conduct Problems (MATCH-ADTC). Satellite Beach, FL: PracticeWise, LLC; 2009.
8. Kendall PC, Flannery-Schroeder E, Panichelli-Mindel SM, et al. Therapy for youths with anxiety disorders: A second randomized clinical trial. Journal of Consulting and Clinical Psychology. 1997;65(3):366–380. doi:10.1037/0022-006X.65.3.366
9. Burley M. Outpatient Treatment Differences for Children Served in Washington’s Public Mental Health System. Olympia, WA; 2009.
10. Beidas RS, Stewart RE, Adams DR, et al. A Multi-Level Examination of Stakeholder Perspectives of Implementation of Evidence-Based Practices in a Large Urban Publicly-Funded Mental Health System. Administration and Policy in Mental Health and Mental Health Services Research. 2016;43(6):893–908. doi:10.1007/s10488-015-0705-2
11. Park AL, Tsai KH, Guan K, et al. Unintended Consequences of Evidence-Based Treatment Policy Reform: Is Implementation the Goal or the Strategy for Higher Quality Care? Administration and Policy in Mental Health and Mental Health Services Research. 2018;45(4):649–660. doi:10.1007/s10488-018-0853-2
12. Guan K, Park AL, Chorpita BF. Emergent Life Events During Youth Evidence-Based Treatment: Impact on Future Provider Adherence and Clinical Progress. The Journal of Clinical Child and Adolescent Psychology. 2019;48(sup1):S202–S214. doi:10.1080/15374416.2017.1295382
13. Marques L, Dixon L, Valentine SE, et al. Providers’ perspectives of factors influencing implementation of evidence-based treatments in a community mental health setting: A qualitative investigation of the training-practice gap. Psychological Services. 2016;13(3):322–331. doi:10.1037/ser0000087
14. Weisz JR, Jensen-Doss A, Hawley KM. Evidence-based youth psychotherapies versus usual clinical care: A meta-analysis of direct comparisons. American Psychologist. 2006;61(7):671–689. doi:10.1037/0003-066X.61.7.671
15. Dorsey S, Berliner L, Lyon AR, et al. A statewide common elements initiative for children’s mental health. The Journal of Behavioral Health Services & Research. 2016;43(2):246–261. doi:10.1007/s11414-014-9430-y
16. Weisz JR, Chorpita BF, Palinkas LA, et al. Testing standard and modular designs for psychotherapy treating depression, anxiety, and conduct problems in youth: A randomized effectiveness trial. Archives of General Psychiatry. 2012;69(3):274–282. doi:10.1001/archgenpsychiatry.2011.147
17. Rochon J, Gondan M, Kieser M. To test or not to test: Preliminary assessment of normality when comparing two independent samples. BMC Medical Research Methodology. 2012;12(1):81. doi:10.1186/1471-2288-12-81
18. McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments. American Psychologist. 2010;65. doi:10.1037/a0018121
19. Beck JS. Cognitive Behavior Therapy: Basics and Beyond. 2nd ed. New York: Guilford Press; 2011.
20. Jensen TK, Holt T, Ormhaug SM. A Follow-Up Study from a Multisite, Randomized Controlled Trial for Traumatized Children Receiving TF-CBT. Journal of Abnormal Child Psychology. 2017;45(8):1587–1597. doi:10.1007/s10802-017-0270-0
21. Brookman-Frazee L, Stadnick NA, Lind T, et al. Therapist-observer concordance in ratings of EBP strategy delivery: Challenges and targeted directions in pursuing pragmatic measurement in children’s mental health services. Administration and Policy in Mental Health and Mental Health Services Research. 2020. doi:10.1007/s10488-020-01054-x
22. Southam-Gerow MA, Chorpita BF, Miller LM, et al. Are children with anxiety disorders privately referred to a university clinic like those referred from the public mental health system? Administration and Policy in Mental Health and Mental Health Services Research. 2008;35(3):168–180. doi:10.1007/s10488-007-0154-7
