Abstract
Implementation initiatives and technology-based resources aim to address barriers to Evidence-Based Practice (EBP) use by creating generalizable techniques that can be used across a variety of youth-serving agencies. However, research has not carefully examined unique differences between agency types or individual programs in readiness to use such technologies and implementation strategies. The current study explored differences between community mental health clinics and child advocacy centers on organizational cultural factors (e.g., change efficacy and commitment to change) relevant to implementing a novel technology-based toolkit that supports delivery of Trauma-Focused Cognitive Behavioral Therapy (TF-CBT). Results indicated that TF-CBT providers from child advocacy centers reported greater commitment to change and more support to use the technology-based system than those from community mental health centers. Findings suggest that implementation initiatives should address the needs of individual agencies and service settings, and that adaptations should be explored to best meet the needs of these settings.
Keywords: Organizational culture, Evidence-based practice, Community mental health, Trauma focused care, Child advocacy
Approximately one in four children in the United States experiences a diagnosable mental health disorder before age 18, a population estimate of 18 million children (Kazdin, 2018). Evidence-based practices (EBPs), defined as the integration of empirically supported interventions with provider clinical expertise and patient characteristics, developmental levels, life experiences, cultural backgrounds, and preferences (American Psychological Association, 2021), provide the best outcomes for patients with mental health disorders (American Psychological Association, 2021; Chambless & Ollendick, 2001; Spring, 2007). However, EBPs are often underutilized by mental health providers (Chorpita et al., 2011; Palinkas et al., 2008) or not delivered with the sustained fidelity that optimizes patient outcomes (Collyer et al., 2019). Within the past decade, many technology-based resources (e.g., apps, mHealth interventions) have been developed to target mental health directly and to support EBP delivery (e.g., Donker et al., 2013; Weisel et al., 2019). One potential strength of these technological resources is their ability to help clinicians deliver EBPs with fidelity and engage clients in treatment (Anton et al., 2020; Ruggiero et al., 2017). However, it is unclear whether organizational factors, such as leadership, support, and tolerance for change, impact clinical providers' adoption of these technologies, or whether these factors vary across settings. The goal of many implementation initiatives is to create broad strategies (e.g., incentives for specific provider behaviors; training, coaching, and consultation practices) that apply across heterogeneous organizations, thereby increasing generalizability. Yet it is also important to examine the role of idiosyncratic, site-specific setting variables that may necessitate the adaptation of implementation strategies to meet an agency's needs (Powell et al., 2015, 2019).
Unique Agency Factors
Mental health services are available to youth in many different forms and settings. Children may receive services at community mental health agencies, private practices, child advocacy centers, and a host of other sites that have unique workflows, priorities, patient volumes, patient demographics, and available funding streams. Thus far, research has not carefully examined differences between agency or setting types that may impact EBP implementation initiatives. This is an important area of study because implementation strategies may need to be tailored to these factors and to specific agency needs to effectively facilitate EBP adoption and sustainment. Two common types of agencies that provide mental health treatment services are: (1) community mental health clinics (CMHCs), which serve patients with diverse presenting mental health problems (Cheung & Snowden, 1990; US Center for Mental Health Services, 2011), and (2) specialty clinics, which primarily focus on one type of presenting problem. Child advocacy centers (CACs) are one example of a specialty program or agency, focusing on forensic and clinical care for children who have experienced child maltreatment (Herbert et al., 2018). CACs coordinate the investigation, treatment, and prosecution of child abuse cases, and as part of national accreditation standards they must directly provide or coordinate trauma-specific mental health treatment for children and families when needed (National Children's Alliance, 2021). In contrast, CMHCs tend to serve diverse patient populations of children and adults with a wide array of mental health conditions. CMHCs and specialty clinics such as CACs may also differ in other ways, such as available funding streams and workload expectations for providers, which may likewise impact implementation initiatives.
A potential difference that has yet to be explored is organizational culture in CMHCs versus CACs. Organizational culture refers to the norms and expectations surrounding employee behavior and procedures within an organization (Glisson & James, 2002; Glisson & Williams, 2015). Culture is an important construct to examine when implementing a new change or intervention because shared organizational beliefs and expectations about the change can shape the beliefs and actions of organization members. Positive attitudes toward EBPs are associated with a constructive organizational culture, that is, one that is supportive and motivating (Feldman, 1993). In contrast, a defensive organizational culture emphasizes employee conformity and dependence, and providers in agencies with a defensive organizational culture report higher levels of resistance to using EBPs (Aarons & Sawitzky, 2006). More broadly, organizational culture can affect how successfully providers implement EBPs: if the culture does not align with a new EBP, implementation can fail for lack of innovation (Aarons et al., 2016; Beidas et al., 2015; Feldman, 1993). Although research has indicated that culture is multi-dimensional (Cooke & Szumal, 1993; Xenikou & Furnham, 1996), there is a dearth of empirical research on the specific dimensions of culture. To date, research has focused on the development of several different scales to measure distinct dimensions of culture. For example, the Organizational Culture Inventory (OCI; Cooke & Lafferty, 1989) consists of 12 cultural domains, whereas the Organizational Beliefs Questionnaire (Sashkin & Flumer, 1985) measures 10 aspects of culture.
Across these measures, six dimensions of culture have been identified, with the strongest being satisfaction needs, or how an organization encourages members to innovate, and task-oriented organizational growth, or the value placed on continued improvement and growth (Xenikou & Furnham, 1996). Research has yet to explore specific factors relevant to these domains or to examine differences in organizational cultural factors between CMHCs and specialty clinics. Identifying these idiosyncratic agency factors is important for understanding whether implementation initiatives need to be tailored to the unique needs of these settings. Our review of the limited literature revealed three potentially important cultural domains, related to satisfaction needs and the value of continued growth, that may differ by agency type: (1) change commitment, or how committed an organization is to making the changes needed to implement a new practice (Shea et al., 2014); (2) change efficacy, or how capable an organization is of implementing a new practice (Shea et al., 2014); and (3) support for using a specific EBP, such as TF-CBT (Cohen & Mannarino, 2016; Cohen et al., 2007). We conceptualize organizational readiness and support as aspects of culture that prior research has shown to impact implementation outcomes (Feldman, 1993).
Current Study
The current study is part of a larger hybrid type 1 effectiveness-implementation trial evaluating an innovative, scalable, tablet-based resource, the SPARK (Supporting Providers and Reaching Kids) toolkit, designed to improve clinician fidelity to Trauma-Focused Cognitive Behavioral Therapy (TF-CBT) and, in turn, patient engagement and outcomes. TF-CBT is an EBP that targets trauma-related symptoms in children and adolescents (Cohen & Mannarino, 2017; Cohen et al., 2007; Anton et al., 2020). The SPARK toolkit aims to improve fidelity to TF-CBT by providing clinicians with nine interactive chapters, one for each module of TF-CBT, each consisting of videos, games, and/or activities that can help engage clients and guide the clinician in delivering the hypothesized mechanisms of change of TF-CBT (Anton et al., 2020). The larger study is a multi-site trial conducted in partnership with agencies delivering TF-CBT to children ages 8–16 years (Anton et al., 2020).
The current study aimed to explore differences between CMHCs and CACs (specialty clinics that often prioritize TF-CBT in their treatment offerings) on specific organizational cultural factors: commitment to change and change efficacy to implement SPARK, a technology-based toolkit to support the use of TF-CBT, as well as organizational support to use TF-CBT in general. Trauma-related concerns are the most common presenting problems among families served by CACs. We therefore hypothesized that, compared with CMHCs, CACs would show higher organizational support and a more supportive culture for implementing a trauma-focused intervention such as TF-CBT, as well as tools to support its use. To test this hypothesis, we compared organizational cultural factors between CACs and CMHCs.
Method
Participants and Procedure
An Institutional Review Board (IRB) reviewed and approved this study. Participants (N = 167) included mental health providers who were recruited from 18 community mental health clinics (69%; n = 118) and 12 child advocacy centers (29%; n = 49) across three southeastern US states.
Providers were eligible to participate if they had previous training in TF-CBT (i.e., completed an 8- to 10-hour asynchronous online continuing education course, intensive in-person training, and/or expert consultation), were full- or part-time employees at their respective agencies, and had obtained at least a Master's degree in social work, counseling, clinical psychology, or a related field. Eligible and interested providers provided consent after attending a study orientation and subsequently completed assessments via a secure online platform. Upon enrollment, participants were e-mailed a link to the online survey, which assessed individual and agency-related factors related to TF-CBT and EBP use and took approximately 30 minutes to complete.
Measures
Demographics
To obtain demographic information, providers were asked to complete questions on race, gender, the type of agency in which they were employed, and professional experience with children.
Organizational Culture Factors to Implement SPARK
Two measures were used to assess providers' perceptions of organizational factors that may relate to adoption of the SPARK toolkit, an evidence-informed toolkit to support the use of TF-CBT; each measure assesses a specific domain under study (change commitment, change efficacy). The first was the Organizational Readiness for Implementing Change scale (ORIC; Shea et al., 2014). This scale consists of 24 items that yield five subscales; however, given the focus of this study (and to minimize participant burden), only the Organization Commitment to Change subscale (5 items) and the Organization Change Efficacy subscale (7 items) were selected. An example item from the Commitment to Change subscale is "People who work here are committed to implementing this change." An example item from the Change Efficacy subscale is "People who work here feel confident that the organization can support people as they adjust to this change." For each question, respondents were instructed to reference SPARK as the 'change' being implemented. Participants indicated how strongly they agreed with each statement on a scale from 1 ("disagree") to 5 ("agree"). Previous research has reported average scores ranging from 2.07 to 4.2 on the change commitment subscale and from 2.14 to 4.3 on the change efficacy subscale (Arthur et al., 2020; Shea et al., 2014). Both the Commitment to Change (α = 0.93) and Change Efficacy (α = 0.95) subscales demonstrated high internal consistency in the current sample.
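The internal-consistency coefficients reported above (Cronbach's α) follow the standard formula α = k/(k-1) × (1 − Σ item variances / total-score variance). A minimal sketch, using entirely hypothetical Likert responses rather than the study's data, is:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) matrix of responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from 6 providers on a 5-item, 1-5 Likert subscale
resp = np.array([
    [4, 5, 4, 5, 4],
    [3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5],
    [2, 2, 3, 2, 2],
    [4, 4, 4, 5, 4],
    [3, 4, 3, 3, 3],
])
alpha = cronbach_alpha(resp)
print(f"alpha = {alpha:.2f}")
```

Values above roughly 0.9, as reported for both ORIC subscales here, indicate high internal consistency.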
Organizational Support for TF-CBT
This measure assessed provider perceptions of organizational factors in implementing TF-CBT. The TF-CBT in Your Organization measure, developed as part of a statewide training initiative (Hanson et al., 2016; Helseth et al., 2020), consists of 18 items on which participants indicate their agreement, from 1 ("strongly disagree") to 5 ("strongly agree"), with statements about organization resources, guidelines, policies, and training to implement TF-CBT. Example items are "In our organization, clinical supervisors expect therapists to adhere to TF-CBT components, techniques, and practices;" "In our organization, peer consultation on TF-CBT is available to all therapists;" and "In our organization, new therapists typically receive basic training in TF-CBT either in house or from external trainers." This scale demonstrated high internal consistency in our sample (α = 0.93).
Data Analysis Plan
First, descriptive statistics were computed to examine organizational factors by site type. Next, a multivariate analysis of variance (MANOVA) was conducted in SPSS (version 28.0) to examine differences by agency type in providers' perceptions of organizational culture factors. Missing data were handled using pairwise deletion (21 cases had missing data: 20 from CMHCs and 1 from a CAC).
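For readers who do not use SPSS, the same omnibus agency-type comparison can be sketched in Python with statsmodels' MANOVA. The sketch below runs on synthetic data; group sizes, subscale means, and the covariance structure are illustrative assumptions, not the study's dataset:

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n_cmhc, n_cac = 100, 46  # hypothetical analyzed group sizes

def simulate(n, means):
    # Three correlated outcomes: change commitment, change efficacy,
    # TF-CBT organizational support (all on an approximate 1-5 metric)
    cov = 0.25 * np.eye(3) + 0.2
    return rng.multivariate_normal(means, cov, size=n)

scores = np.vstack([
    simulate(n_cmhc, [4.1, 4.1, 3.7]),  # illustrative CMHC means
    simulate(n_cac, [4.7, 4.7, 4.4]),   # illustrative CAC means
])
data = pd.DataFrame(scores, columns=["commitment", "efficacy", "tfcbt_support"])
data["agency"] = ["CMHC"] * n_cmhc + ["CAC"] * n_cac

# Omnibus multivariate test of agency-type differences (Wilks' lambda)
fit = MANOVA.from_formula("commitment + efficacy + tfcbt_support ~ agency", data=data)
wilks = fit.mv_test().results["agency"]["stat"].loc["Wilks' lambda"]
print(f"Wilks' lambda = {wilks['Value']:.3f}, p = {wilks['Pr > F']:.4f}")
```

Univariate follow-up tests (one ANOVA per outcome), as reported in the Results, would then be run only if the multivariate test is significant.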
Results
Descriptive Information
The completion rate of the baseline survey was 92%. Providers ranged in age from 23 to 71 years (M = 38, SD = 10.7) and primarily identified as women (83%). The majority of participants were White (58%); 28% reported Black or African American race, and approximately 9% of providers identified as Hispanic or Latinx. Thirty-nine percent of providers had been conducting mental health treatment with children for five or more years, and 29% had been conducting mental health treatment with trauma-exposed children for five or more years. See Table 1. Providers from CMHCs were significantly older than providers from CACs (F(1,151) = 4.58, p = 0.03; M = 39.1 [SD = 10.6] vs. M = 35.1 [SD = 10.7]).
Table 1.
Demographic information
| | Overall sample | | CMHCs | | CACs | |
|---|---|---|---|---|---|---|
| Age, M (SD) | 37.8 (10.8) | | 39.1 (10.6) | | 35.1 (10.7) | |
| | N | % | N | % | N | % |
| Gender | ||||||
| Female | 140 | 83 | 93 | 79 | 46 | 94 |
| Male | 14 | 8 | 11 | 9 | 3 | 6 |
| Agency type | ||||||
| Community mental health clinics | 118 | 69 | ||||
| Child advocacy centers | 49 | 29 | ||||
| Race | ||||||
| White | 100 | 58 | 60 | 51 | 36 | 74 |
| Black/African American | 48 | 28 | 38 | 32 | 9 | 18 |
| Hispanic/Latinx | 15 | 9 | 11 | 9 | 4 | 8 |
| Professional experience with children | ||||||
| Less than 6 months | 5 | 3 | 0 | 0 | 5 | 10 |
| 6 months to 1 year | 13 | 8 | 7 | 6 | 6 | 12 |
| 1 to 3 years | 35 | 21 | 15 | 13 | 19 | 39 |
| 3 to 5 years | 36 | 21 | 29 | 25 | 7 | 14 |
| 5 or more years | 65 | 39 | 53 | 51 | 12 | 25 |
| Professional experience with children and trauma | ||||||
| Less than 6 months | 8 | 5 | 1 | 1 | 7 | 14 |
| 6 months to 1 year | 18 | 11 | 10 | 9 | 8 | 16 |
| 1 to 3 years | 39 | 23 | 22 | 19 | 16 | 33 |
| 3 to 5 years | 40 | 24 | 30 | 25 | 10 | 20 |
| 5 or more years | 49 | 29 | 41 | 35 | 8 | 16 |
Descriptive information about the measures of organizational culture can be found in Table 2. Scores on the Organizational Commitment to Change subscale ranged from 2.2 to 5.0 (M = 4.3), and scores on the Organizational Change Efficacy subscale ranged from 2.4 to 5.0 (M = 4.3). Scores on the TF-CBT in Your Organization scale ranged from 30.0 to 94.0 (M = 70.1). All measures were moderately to highly correlated in the overall sample, as well as by site type, indicating that these measures are related. Specific correlation coefficients can be found in Table 3.
Table 2.
Descriptive information in overall sample and by site type
| Overall sample | Community mental health sites | Child advocacy centers | ||||
|---|---|---|---|---|---|---|
| M | SD | M | SD | M | SD | |
| Organization implementation factors | ||||||
| Change commitment subscale | 4.3 | 0.7 | 4.1 | 0.7 | 4.7 | 0.5 |
| Change efficacy subscale | 4.3 | 0.7 | 4.1 | 0.7 | 4.7 | 0.5 |
| TF-CBT organizational support measure | 70.1 | 12.7 | 65.8 | 11.6 | 78.5 | 10.3 |
Table 3.
Correlation table in overall sample and by site type
| | Overall sample | | | Community mental health sites | | | Child advocacy centers | | |
|---|---|---|---|---|---|---|---|---|---|
| | Change commitment subscale | Change efficacy subscale | TF-CBT org support | Change commitment subscale | Change efficacy subscale | TF-CBT org support | Change commitment subscale | Change efficacy subscale | TF-CBT org support |
| Change commitment subscale | 1 | | | 1 | | | 1 | | |
| Change efficacy subscale | 0.93** | 1 | | 0.93** | 1 | | 0.87** | 1 | |
| TF-CBT organizational support measure | 0.59** | 0.66** | 1 | 0.54** | 0.59** | 1 | 0.33** | 0.57** | 1 |
**p < 0.01
Differences by Agency Type
Results of the MANOVA indicated a statistically significant difference in organizational culture factors based on agency type (F(3, 142) = 15.34, p < 0.001; Wilks' Λ = 0.756, partial η2 = 0.25). Follow-up tests revealed that TF-CBT providers from CACs reported significantly higher perceived organizational commitment to change (i.e., to implement SPARK; F(1, 144) = 24.79, p < 0.001, partial η2 = 0.15) and significantly higher perceived organizational change efficacy (F(1, 144) = 27.29, p < 0.001, partial η2 = 0.16) than providers from CMHCs. TF-CBT providers from CACs also reported significantly higher scores on the TF-CBT in Your Organization scale (F(1, 144) = 42.69, p < 0.001, partial η2 = 0.23), reflecting greater support for TF-CBT implementation than CMHC providers.
Discussion
This study examined differences in organizational culture among providers in CMHCs versus CACs who reported actively using a trauma-focused evidence-based treatment. Factors assessed included provider-perceived organizational commitment to change and change efficacy to implement SPARK, an evidence-informed tool to support the use of TF-CBT, as well as organizational support for using TF-CBT. Overall, providers perceived their sites as ready for change (i.e., SPARK implementation) and as supportive of TF-CBT use in general. However, results indicated differences between providers employed in CACs versus CMHCs. Specifically, TF-CBT trained providers from CACs reported significantly higher perceived organizational commitment to change and change efficacy for implementing SPARK, as well as greater organizational support specific to TF-CBT use, than TF-CBT trained providers from CMHCs. These results have important implications for clinics hoping to implement EBPs and technology-based tools to support EBP delivery.
First, providers in the current study perceived their sites as generally open to change and as places where implementation of new initiatives, techniques, and approaches was possible. This is encouraging and suggests that youth-serving agencies are creating cultures in which EBPs, and the technology-based resources that support EBP use, are generally accepted. These findings align with extant research indicating increased uptake of and positive attitudes toward EBP use in clinical settings (Aarons & Sawitzky, 2006). Scores on the organizational culture measures in the current sample were notably higher than some in the extant literature (Shea et al., 2014) but on par with more recent reports (Arthur et al., 2020).
Results also indicated differences between at least two major agency types that could impact the implementation of specific initiatives and resources and may call for unique strategies or additional resources to achieve adoption. Historically, implementation strategies have aimed to create tools that can be applied broadly across agency types (Powell et al., 2019). As one example, regardless of agency type, it may be necessary to offer specialized assistance when implementing a technology-based resource to troubleshoot technology-specific issues. However, in some instances, it may be necessary to adapt existing implementation strategies. In the current study, providers from CACs reported higher perceived support than providers from CMHCs to use TF-CBT to treat their patients. This may be because the EBP is well aligned with the mission of CACs, which specialize in treating trauma-related presenting problems. It is also possible that providers from CACs had more experience and training in TF-CBT, given that trauma-related presenting problems are their primary focus, making them more comfortable considering a new tool. In contrast, CMHC providers serve a patient population with more diverse presenting problems that require an array of EBPs, which may result in fewer resources and less support for a specific EBP, such as TF-CBT, or for a tool that supports only one specific EBP. Organization leadership and implementation scientists should assess the needs of each organization and tailor implementation strategies to its unique characteristics to maximize adoption and sustained use. For example, prior to implementation, a needs assessment could be conducted to determine how frequently a provider delivers TF-CBT as well as their confidence in their delivery. This information can then guide implementation strategies that meet providers' current needs.
Specifically, providers who rarely use TF-CBT, or who are already very confident in their delivery, may be less inclined to learn and adopt a toolkit to support TF-CBT delivery; assessing these provider needs before choosing an implementation strategy is therefore important.
Additionally, providers from CMHCs perceived their organization as having lower commitment to change and lower efficacy to change than providers from CACs in the context of implementing SPARK. This highlights the need to involve agency leaders in implementation initiatives so that providers see change as a priority among leadership. Previous research has indicated that new systems are more likely to succeed and be sustained when an organization's leadership conveys specific support for them (Farahnak et al., 2019). Based on the results of this study, implementation initiatives in CMHCs may benefit from assessing and addressing providers' perceptions of organizational commitment to change and change efficacy to strengthen buy-in. It may be beneficial for leadership to explain to providers the rationale behind the change (Cohen et al., 2007) and to provide support surrounding the change, such as trainings on the EBP, printed materials (Armenakis et al., 1993), and online resources (Heck et al., 2015). Organization leadership working toward the implementation of an EBP should also be aware of how it may impact specific agency processes. For example, if the change will add work to a clinician's already busy schedule, addressing compensation or benefits for the added work will be important. Partnership between agency leadership and providers to understand workflows and daily tasks, and to incorporate providers' input and concerns about implementing a new EBP or system, has the potential to yield high motivation for organizational change (Cohen & McWilliams, 2020). This partnership can shape how organizational culture is perceived and give providers more confidence that the organization can sustain the change (Cohen & McWilliams, 2020).
Limitations and Future Directions
While several study strengths exist, the current study is not without limitations. First, data were collected cross-sectionally via provider self-report. This could introduce bias, as providers may have entered the study with preexisting attitudes toward their organization, TF-CBT, and the idea of implementing a new tool to assist with TF-CBT delivery. Because neither observational nor qualitative measures were used in this study, future research may benefit from combining provider self-report data with observational field notes and semi-structured interviews to better understand provider perceptions and existing agency resources that may contribute to our understanding of organizational climate. Additionally, it is important to gather information from those holding various roles within an agency, including administrators, supervisors, and front-line staff, as perceptions may vary (Beidas et al., 2016; Hanson et al., 2018). While objective measures of organizational culture, such as recorded naturalistic interactions between organizational members or counts of resources housed in the agency, would have increased confidence in the data, this study provides important information regarding providers' perceptions of their organizational culture, which may inform organization policies to support implementation initiatives. Future research should include organizational culture measures from diverse perspectives within an agency, as well as objective indicators.
Future research should also examine the unique cultural factors of both providers and clients related to integrating technology into therapy. While some technology-enhanced interventions have been shown to be effective with clients from ethnic and racial minority groups (Hantsoo et al., 2018; Rosenbaum et al., 2017), there is a dearth of research on the acceptability and effectiveness of most technology-based or technology-enhanced interventions. Among providers, it is important to assess how cultural background may impact the adoption of technology tools. Individuals from Latinx and African American groups are more likely to use smartphones as their technology of choice, so they may be more open to the use of technology in mental health care to reduce barriers (Anderson, 2015). Additionally, providers from more collectivistic cultures who prioritize interpersonal relationships may have concerns about technology disrupting the therapeutic alliance (Fenichel, 2001; Shore et al., 2006). Finally, providers living in rural areas may be more open to technology, given its necessity for reaching clients at a distance. To increase acceptance of technology-enhanced interventions by both providers and clients, it will be important to assess and empirically evaluate necessary cultural needs and adaptations.
Relatedly, additional contextual factors that may impact provider perceptions of organizational culture and openness to implementing SPARK were not considered in the present study. Data were not systematically collected on provider caseload, so it is unclear whether differences in caseload or clinical expectations between providers from CACs and CMHCs influenced their views of agency culture. In formative work for the larger parent study, average caseloads varied from fewer than 10 clients per week to more than 90. Such differences could affect provider perceptions and willingness to implement SPARK, as those with larger caseloads may have less time to participate in SPARK training or to use SPARK with their TF-CBT treatment cases. Future research should consider such contextual factors, for example by accommodating providers with greater time demands.
Because data were collected at a single timepoint for each partnering agency, it is unclear whether organizational commitment to change, change efficacy, or support to use TF-CBT changed over time. It is also notable that experience with TF-CBT varied across providers: some were only recently trained and new to TF-CBT delivery, whereas others had been using TF-CBT for many years. Future research should prospectively examine organizational change commitment, change efficacy, and support to determine how they may shift over time or impact provider attitudes toward EBPs. Finally, the current study is the first of its kind to identify differences between agency types in organizational culture. Future studies are needed to further explore idiosyncratic agency factors (e.g., workflow, infrastructure) that may influence technology-based resource uptake and inform tailoring of implementation strategies.
Conclusion
This study examined provider perceptions of specific organizational culture factors (commitment to change and change efficacy to use a technology-based toolkit supporting the delivery of TF-CBT, as well as support for using TF-CBT in general) and whether these perceptions differed between providers in community mental health agencies and child advocacy centers. Results indicated that TF-CBT trained providers from CACs perceived significantly greater commitment and efficacy to change in using the SPARK toolkit, as well as greater support for delivering TF-CBT, than TF-CBT trained providers from CMHCs. Identifying perceptions of organizational culture before the onset of implementation initiatives may help adapt those initiatives to maximize success. After assessing the culture, organizations can use that information to tailor implementation strategies to specific needs, for example by determining whether training should differ depending on the EBP, the targeted disorder or symptoms, and/or other factors such as provider workflow and patient volume. Tailoring implementation strategies to the unique needs of an agency may lead to more successful EBP adoption and sustainment, thereby improving the quality of mental health care.
Funding
This study was supported by National Institute of Mental Health (NIMH) Grant R01 MH110620-01A1 (PI: Ruggiero). Views expressed herein are those of the authors and do not necessarily reflect those of NIMH or respective institutions.
Footnotes
Conflict of interest We have no conflicts of interest to disclose.
References
- Aarons GA, Green AE, Trott E, Willging CE, Torres EM, Ehrhart MG, & Roesch SC (2016). The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: A mixed-method study. Administration and Policy in Mental Health and Mental Health Services Research, 43(6), 991–1008.
- Aarons GA, Sommerfeld DH, & Willging CE (2011). The soft underbelly of system change: The role of leadership and organizational climate in turnover during statewide behavioral health reform. Psychological Services, 8(4), 269–281. 10.1037/a0026196
- Aarons GA, & Sawitzky AC (2006). Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychological Services, 3(1), 61–72. 10.1037/1541-1559.3.1.61
- American Psychological Association (2021). Policy statement on evidence-based practice in psychology. Professional Practice Guidelines. Retrieved from: https://www.apa.org/practice/guidelines/evidence-based-statement
- Anderson M (2015). Racial and ethnic differences in how people use mobile technology. Pew Research Center. Retrieved from: https://www.pewresearch.org/fact-tank/2015/04/30/racial-and-ethnic-differences-in-how-people-use-mobile-technology/
- Anton MT, Ridings LE, Hanson R, Davidson T, Saunders B, Price M, Kmett Danielson C, Chu B, Dismuke CE, Adams ZW, & Ruggiero KJ (2020). Hybrid type 1 randomized controlled trial of a tablet-based application to improve quality of care in child mental health treatment. Contemporary Clinical Trials, 94, 106010. 10.1016/j.cct.2020.106010
- Armenakis AA, Harris SG, & Mossholder KW (1993). Creating readiness for organizational change. Human Relations, 46(6), 681–703. 10.1177/001872679304600601
- Arthur, Christofides N, & Nelson G (2020). Educators' perceptions of organisational readiness for implementation of a pre-adolescent transdisciplinary school health intervention for inter-generational outcomes. PLoS ONE, 15(1), e0227519. 10.1371/journal.pone.0227519
- Beidas RS, Marcus S, Aarons GA, Hoagwood KE, Schoenwald S, Evans AC, & Mandell DS (2015). Predictors of community therapists' use of therapy techniques in a large public mental health system. JAMA Pediatrics, 169(4), 374–382.
- Beidas RS, Williams NJ, Green PD, Aarons GA, Becker-Haimes EM, Evans AC, Rubin R, Adams DR, & Marcus SC (2016). Concordance between administrator and clinician ratings of organizational culture and climate. Administration and Policy in Mental Health and Mental Health Services Research, 45(1), 142–151. 10.1007/s10488-016-0776-8
- Chambless DL, & Ollendick TH (2001). Empirically supported psychological interventions: Controversies and evidence. Annual Review of Psychology, 52(1), 685–716. 10.1146/annurev.psych.52.1.685
- Cheung FK, & Snowden LR (1990). Community mental health and ethnic minority populations. Community Mental Health Journal, 26(3), 277–291. 10.1007/BF00752778
- Chorpita BF, Daleiden EL, Ebesutani C, Young J, Becker KD, Nakamura BJ, Phillips L, Ward A, Lynch R, Trent L, Smith RL, Okamura K, & Starace N (2011). Evidence-based treatments for children and adolescents: An updated review of indicators of efficacy and effectiveness. Clinical Psychology: Science and Practice, 18(2), 154–172. 10.1111/j.1468-2850.2011.01247.x
- Cohen MB, & McWilliams J (2020). Overcoming resistance to change. The Palgrave Handbook of Organizational Change Thinkers. 10.1007/978-3-319-49820-1_6-1
- Cohen JA, & Mannarino AP (2016). Evidence-based intervention: Trauma-focused cognitive behavioral therapy for children and families. Parenting and Family Processes in Child Maltreatment and Intervention. 10.1007/978-3-319-40920-7_6
- Cohen JA, Mannarino AP, Perel JM, & Staron V (2007). A pilot randomized controlled trial of combined trauma-focused CBT and sertraline for childhood PTSD symptoms. Journal of the American Academy of Child and Adolescent Psychiatry, 46(7), 811–819. 10.1097/chi.0b013e3180547105
- Collyer H, Eisler I, & Woolgar M (2019). Systematic literature review and meta-analysis of the relationship between adherence, competence and outcome in psychotherapy for children and adolescents. European Child & Adolescent Psychiatry, 29(4), 417–431. 10.1007/s00787-018-1265-2
- Cooke RA, & Lafferty JC (1989). Organizational culture inventory. Plymouth, MI: Human Synergistics.
- Cooke RA, & Szumal JL (1993). Measuring normative beliefs and shared behavioral expectations in organizations: The reliability and validity of the organizational culture inventory. Psychological Reports, 72(3 Suppl), 1299–1330. 10.2466/pr0.1993.72.3c.1299
- Donker T, Petrie K, Proudfoot J, Clarke J, Birch M-R, & Christensen H (2013). Smartphones for smarter delivery of mental health programs: A systematic review. Journal of Medical Internet Research, 15(11), e247. 10.2196/jmir.2791
- Farahnak LR, Ehrhart MG, Torres EM, & Aarons GA (2019). The influence of transformational leadership and leader attitudes on subordinate attitudes and implementation success. Journal of Leadership & Organizational Studies, 27(1), 98–111. 10.1177/1548051818824529
- Feldman SP (1993). How organizational culture can affect innovation. In Hirschhorn L & Barnett CKE (Eds.), The psychodynamics of organizations: Labor and social change (pp. 85–97). Temple University Press.
- Glisson C, & James LR (2002). The cross-level effects of culture and climate in human service teams. Journal of Organizational Behavior, 23(6), 767–794. 10.1002/job.162
- Glisson C, & Williams NJ (2015). Assessing and changing organizational social contexts for effective mental health services. Annual Review of Public Health, 36(1), 507–523.
- Hanson RF, Saunders BE, Peer SO, Ralston E, Moreland AD, Schoenwald SK, & Chapman J (2018). Community-based learning collaboratives and participant reports of interprofessional collaboration, barriers to and utilization of child trauma services. Children and Youth Services Review, 94, 206–314. 10.1016/j.childyouth.2018.09.038
- Hanson RF, Schoenwald S, Saunders BE, Chapman J, Palinkas LA, Moreland AD, & Dopp A (2016). Testing the community-based learning collaborative (CBLC) implementation model: A study protocol. International Journal of Mental Health Systems. 10.1186/s13033-016-0084-4
- Hantsoo L, Criniti S, Khan A, Moseley M, Kincler N, Faherty LJ, Epperson CN, & Bennett IM (2018). A mobile application for monitoring and management of depressed mood in a vulnerable pregnant population. Psychiatric Services, 69(1), 104–107. 10.1176/appi.ps.201600582
- Heck NC, Saunders BE, & Smith DW (2015). Web-based training for an evidence-supported treatment: Training completion and knowledge acquisition in a global sample of learners. Child Maltreatment, 20(3), 183–192.
- Helseth SA, Peer SO, Are F, Korell AM, Saunders SK, Chapman JE, & Hanson RF (2020). Sustainment of trauma-focused and evidence-based practices following learning collaborative implementation. Administration and Policy in Mental Health and Mental Health Services Research. 10.1007/s10488-020-01024-3
- Herbert JL, Walsh W, & Bromfield L (2018). A national survey of characteristics of child advocacy centers in the United States: Do the flagship models match those in broader practice? Child Abuse & Neglect, 76, 583–595. 10.1016/j.chiabu.2017.09.030
- Kazdin AE (2018). Annual research review: Expanding mental health services through novel models of intervention delivery. Journal of Child Psychology and Psychiatry, 60(4), 455–472. 10.1111/jcpp.12937
- National Children's Alliance (2021). NCA's standards for accredited members. https://www.nationalchildrensalliance.org/ncas-standards-for-accredited-members/
- Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, & Mandell DS (2015). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research, 44(2), 177–194. 10.1007/s11414-015-9475-6
- Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, McHugh SM, & Weiner BJ (2019). Enhancing the impact of implementation strategies in healthcare: A research agenda. Frontiers in Public Health, 7, 3. 10.3389/fpubh.2019.00003
- Rosenbaum DL, Piers AD, Schumacher LM, Kase CA, & Butryn ML (2017). Racial and ethnic minority enrollment in randomized clinical trials of behavioural weight loss utilizing technology: A systematic review. Obesity Reviews, 18(7), 808–817. 10.1111/obr.12545
- Ruggiero KJ, Saunders BE, Davidson TM, Lewsky Cook D, & Hanson R (2017). Leveraging technology to address the quality chasm in children's evidence-based psychotherapy. Psychiatric Services, 68(7), 650–652. 10.1176/appi.ps.201600548
- Sashkin M, & Fulmer R (1985). Pillars of excellence: Organizational beliefs questionnaire. Bryn Mawr, PA: Organizational Design and Development.
- Shea CM, Jacobs SR, Esserman DA, Bruce K, & Weiner BJ (2014). Organizational readiness for implementing change: A psychometric assessment of a new measure. Implementation Science. 10.1186/1748-5908-9-7
- Spring B (2007). Evidence-based practice in clinical psychology: What it is, why it matters; what you need to know. Journal of Clinical Psychology, 63(7), 611–631. 10.1002/jclp.20373
- US Center for Mental Health Services (2011). The comprehensive community mental health services for children and their families program: Evaluation findings. Annual report to Congress. Retrieved from: https://store.samhsa.gov/system/files/pep13-cmhi2011.pdf
- Weisel KK, Fuhrmann LM, Berking M, Baumeister H, Cuijpers P, & Ebert DD (2019). Standalone smartphone apps for mental health—a systematic review and meta-analysis. NPJ Digital Medicine, 2(1), 1–10. 10.1038/s41746-019-0188-8
- Xenikou A, & Furnham A (1996). A correlational and factor analytic study of four questionnaire measures of organizational culture. Human Relations, 49(3), 349–371. 10.1177/001872679604900305
