Author manuscript; available in PMC: 2019 Dec 1.
Published in final edited form as: Clin Soc Work J. 2018 Oct 15;46(4):271–280. doi: 10.1007/s10615-018-0687-0

The Availability of Supervision in Routine Mental Health Care

Mimi Choy-Brown 1, Victoria Stanhope 2
PMCID: PMC6426317  NIHMSID: NIHMS1509700  PMID: 30906079

Abstract

Clinical supervision is an embedded resource for practice quality in community mental health organizations. Supervision has been found to increase provider competence and decrease stress. In addition, supervision has been associated with service user outcomes including decreased depressive symptoms. However, little is known about the availability and nature of supervision in real world settings. The primary aims of this study were to identify available supervision and the extent to which contextual factors are related to that availability. The data source for this study was a survey of providers (N=273) administered as part of a multi-state, multi-site (N=14) NIMH-funded trial. Supervision was measured by hours per week (quantity) and by utilization of best practice activities (content). Univariate statistics, chi-square tests, independent samples t-tests, and ANOVA were used to assess supervision content and quantity and to examine subgroup differences. Participants reported an average of 2.17 hours of supervision per week and 28.6% of participants endorsed best practice content. Supervision quantity varied significantly across sites (p<.05) and program type (p<.05) while content did not. Individual role within the organization had a significant relationship with reported supervision content (p<.001). In these settings, organizations are exercising discretion in how to utilize supervision within the available time. Supervision time also varied by program type, increasing with the intensity of services. Findings demonstrate that reports of availability vary according to position within the organization and the intensity of services within a given program type. Implications for workforce development, access to quality services, and implementation of evidence-based practices are discussed.

Keywords: clinical supervision, mental health services


Clinical supervision is widely believed to be integral to ongoing learning, support, and quality clinical practice with people seeking therapeutic services. The provision of supervision has been a critical component in the education and development of mental health professionals (Borders et al., 2014). Supervisory models to support clinicians in their work with people seeking services abound in the literature (Bernard & Goodyear, 2014; Shulman, 1993) and knowledge of supervision continues to grow (Bearman, Schneiderman, & Zoloth, 2017; Sewell, 2017). In the administration of public mental health services, programmatic service delivery models have included clinical supervision, though rarely compensated for it explicitly, and many have required the provision of supervision to staff interacting with people seeking services (e.g., Commission on Accreditation of Rehabilitation Facilities [CARF]) (Hoge, Migdole, Farkas, Ponce, & Hunnicutt, 2011). In addition, the growing area of implementation science has pointed to the need for ongoing educational support for providers to translate evidence-based practices into their interactions with people seeking services (Beidas et al., 2013; Gleacher et al., 2011).

Supervision has been significantly related to improving staff and service user outcomes in behavioral health settings (Bambling, King, Raue, Schweitzer, & Lambert, 2006; Bearman et al., 2013; Henggeler, Schoenwald, Liao, Letourneau, & Edwards, 2002; Mor Barak, Travis, Pyun, & Xie, 2009; Schoenwald, Mehta, Frazier, & Shernoff, 2013; Schoenwald, Sheidow, & Chapman, 2009). In clinical evidence-based practice trials, clinical supervision models have been manualized and have improved implementation outcomes such as treatment adherence and fidelity (Henggeler et al., 2002; Henggeler & Schoenwald, 1998; Martino et al., 2006; Schoenwald et al., 2009). In some cases, post-training supervisory coaching has been found to be more important than the quality of the training itself because it allows providers to practice and receive feedback (Salas, Tannenbaum, Kraiger, & Smith-Jentsch, 2012; Schoenwald et al., 2013; Sholomskas et al., 2005). Elements of evidence-based best practices in supervision have been articulated in the behavioral health literature and include observation of actual practice (i.e., live supervision with a supervisor providing feedback in vivo), use of direct observation or session recordings (Milne & Reiser, 2011), and use of client feedback and outcomes (Worthen & Lambert, 2007) to inform the supervision process. Despite the established utility and expectation of supervision, knowledge of best practice supervision availability for providers working in community mental health services remains limited (Dorsey et al., 2018). Evidence for supervision effectiveness is even less robust in non-behavioral health settings (e.g., child welfare) (Carpenter, Webb, & Bostock, 2013).

In public mental health settings, supervision has been understood to encompass three primary functions: administration, support, and education (Kadushin & Harkness, 2002). The hierarchical, casework model of supervision used primarily in these agency settings often utilizes individual, group, or peer supervision modalities and has been sensitive to organizational contextual factors (Bogo & McKnight, 2006). Specifically, the service setting type (e.g., school, hospital, or social service agency) has driven availability of supervision, along with organizational policies, complexity of the service systems, and workload burden (Berger & Mizrahi, 2001; Bogo & McKnight, 2006). In addition, upwards of half of all behavioral health services use task-shifting program models that employ supervisors with clinical degrees to oversee staff without clinical degrees who provide the direct services (Hoge et al., 2007).

Many challenges to the provision of supervision have existed at multiple levels (e.g., policy, organizational, or individual) with increasing complexity combined with shrinking resources in the field (Hoge, et al., 2011). Non-profit organizations, and increasingly co-located integrated health settings, have often been the providers of public mental health services. These organizations have been operating within a policy environment focused on measuring the outcomes (rather than the process) of the work with people seeking services (Mosley & Smith, 2018). This focus on accountability via outcomes measurement has required organizations to expand their capacity to record their work with people seeking services and has ultimately increased the documentation requirements for providers.

States primarily fund community mental health services and the professionalization requirements for the workforce vary by state. Within a given state funding context, organizations may be directed to hire people with licensure and credentials and be required to provide supervision. Some states have also endeavored to bolster supervision practice by offering additional training and support for clinical supervisors as well as by specifying particular supervision guidelines (e.g., Connecticut Workforce Collaborative on Behavioral Health Supervision Competency Development Initiative; Supervision Competency Workforce Initiative, 2009). Within these state policy contexts, organizations may offer robust clinical supervision that fulfills licensure requirements to attract a limited qualified workforce (e.g., in New York State one of the requirements for a clinical social work license is 3 years of supervised practice experience). In addition, particular programs (e.g., Assertive Community Treatment; New York State Office of Mental Health, 2007) may have different requirements for supervision as part of the fidelity to the program model, which could influence supervisory practices in one program over another.

In response to shifting service contexts, the format of supervision has evolved (e.g., individual versus group, and formal versus informal or ad hoc supervision time) (Schoenwald et al., 2009). In addition, supervision content is shifting. Supervisees do not always have access to supervision focused on clinical practice (Bearman et al., 2013; Ellis, Berger, Ayala, Swords, & Siembor, 2013). Administrative supervision dominates limited supervision time, and supervisors feel “ill-equipped” for additional responsibilities (Dill & Bogo, 2009; Hoge, Wolf, Migdole, Cannata, & Gregory, 2016). This is a critical issue of service quality that could result in potentially alarming clinical autonomy for an untrained workforce operating in an increasingly complex system of care (Authors, 2015a; Substance Abuse and Mental Health Services Administration [SAMHSA], 2012; 2013). This administrative dominance may be attributed to states relying on requirements rather than financial incentives for supervision activities, as supervision has remained an unreimbursed requirement for service delivery. Such incentive structures may increase the focus on more administrative tasks (e.g., documentation for risk management and billing) and limit clinical practice supervision time. Supervisors carry the risk of vicarious (and sometimes direct) liability for any incompetent services (or omission of services) that constitute malpractice. Thus, supervisors are legally responsible for all activities of any staff member, volunteer, or intern they oversee (National Association of Social Workers [NASW], 2004; NASW & Association of Social Work Boards, 2012). Given the potential impact of the organizational context on available supervision practice, it remains unknown how much discretion supervisors have to shape the supervision process.

Research has been mixed on access to adequate supervision. One national study reported that, on average, respondents working in behavioral health settings had access to effective supervision as defined by a 14-item measure of supervision effectiveness (Laschober, de Tormes Eby, & Sauer, 2012). However, another study found that 93% of its sample of supervisees from a variety of community settings (i.e., 50% behavioral health, 16% university-based settings, 7% schools) had experienced inadequate supervision (Ellis et al., 2013). Yet another study found that the majority of supervisors and supervisees in children’s community mental health services in San Diego County were satisfied with the amount of time spent on various supervision functions but felt that evidence-based practice techniques were not included consistently enough (Accurso, Taylor, & Garland, 2011). In addition, the 2004 National Social Work Workforce study reported decreased supervision and increased workload (Whitaker, Weismiller, & Clark, 2006). While some recommendations on how to bolster supervision to ensure safe and effective care have been made, questions remain about whether supervision is an available resource for clinical practice improvement in routine services (Dorsey et al., 2018; Hoge, Migdole, Cannata, & Powell, 2014). Knowledge is limited about routine supervision practices, the use of best practice content, the quantity of supervision time devoted to clinical practice, and supervision models that are effective in integrating best practices into usual care (Dorsey et al., 2013). As such, this study seeks to describe what supervision is available in routine practice within community mental health settings.

Methods

Data Source

This descriptive study examined data drawn from the baseline survey administered as part of a multi-state, federally funded randomized controlled trial testing the effectiveness of Person-Centered Care Planning (PCCP). Community mental health research sites (N=14) were randomly selected in two northeastern states to participate in the trial. The community mental health sites collectively served over 8,000 service users and provided a multitude of services including crisis intervention, individual and group therapy, residential, and case management services. Participating research sites nominated leaders, supervisors, and direct care staff to be recruited for participation in the parent study trial. Inclusion criteria consisted of working for at least one year in a participating program and, in addition: for leaders (N=49), overseeing PCCP implementation; for supervisors (N=81), overseeing service planning, the target service activity for the PCCP intervention; and for direct service providers (N=143), being identified as a change agent (e.g., a leader among staff) and reporting to a participating supervisor. Each supervisor nominated two direct service providers.

Each of the participating providers received an email introducing them to the study from the leadership in the organization. Subsequently, surveys were administered via an electronic link embedded in an email from the research study team. The 45-minute online survey was administered to research sites one month prior to the delivery of the PCCP intervention between October of 2014 and November of 2015. LimeSurvey software was used for survey development and data collection. The overall response rate for the survey was 89%. The human subjects committee from the authors’ institution approved all study protocols and the parent study was registered as a clinical trial. More information about the parent study methods has been published elsewhere (Authors, 2015b).

Measures

This study examined provider reports of supervision (quantity and content) and contextual factors (research site, organizational role, and program type).

Supervision quantity was measured using two continuous items: the number of hours spent in supervision in an average week (“In an average week, how many hours do you spend in supervision?”) and the percentage of supervision time focused on clinical versus administrative tasks (“On average, what percentage of supervision time is spent on administrative versus clinical practice content?”). Supervision quantity was then calculated by multiplying the number of supervision hours by the percentage of supervision time focused on clinical work. Three different surveys were administered based on the three possible participant roles (i.e., leader, supervisor, and direct care provider). The question stem texts varied, but all stems directed participants to comment on direct care supervision. Leaders were asked about the amount of supervision time and the formats utilized for direct care providers in their agency. Supervisors were asked about the amount of time their direct supervisees spent in supervision and the formats they utilized with their supervisees. Direct care providers were asked to report on their own experiences in supervision.
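As a minimal sketch of this derivation (variable names are illustrative, not the study’s actual item labels), the quantity variable is simply the product of weekly supervision hours and the clinical share of that time:

```python
import pandas as pd

# Hypothetical survey extract; column names are illustrative only.
responses = pd.DataFrame({
    "supervision_hours_per_week": [3.0, 1.5, 4.0],
    "percent_clinical": [50, 80, 60],  # percent of supervision time on clinical (vs. administrative) content
})

# Supervision quantity = weekly supervision hours weighted by the clinical share of that time.
responses["supervision_quantity"] = (
    responses["supervision_hours_per_week"] * responses["percent_clinical"] / 100.0
)
print(responses)
```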

Supervision content was derived from one categorical item asking participants to endorse all applicable supervision formats or activities from a list of 11 options (i.e., individual, group, staff meeting, case presentations, live supervision, direct observations or recorded sessions, client feedback, peer supervision, consultant supervision, outside/independent, and informal). For direct care providers, the question stem was: “What formats for supervision do you receive?” Supervisors reported on the formats or activities available to their supervisees, and leaders reported on the formats or activities received by direct care providers at their agency. A dichotomous variable was created from this item, coded “yes” if the participant endorsed at least one of three possible best practice supervision activities from the list of 11 options itemized in the survey.
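A minimal sketch of how such an indicator might be constructed, assuming each respondent’s endorsed formats are stored as a list (the labels below paraphrase the survey options):

```python
# Best practice activities among the 11 listed formats (per the review described below):
# live supervision, direct observation or recorded sessions, and client feedback/outcomes.
BEST_PRACTICE = {
    "live supervision",
    "direct observations or recorded sessions",
    "client feedback",
}

def endorses_best_practice(endorsed_formats):
    """Return 1 if at least one best practice activity was endorsed, 0 otherwise."""
    return int(bool(BEST_PRACTICE & set(endorsed_formats)))

# Example: a respondent endorsing individual, group, and live supervision counts as "yes".
print(endorses_best_practice(["individual", "group", "live supervision"]))  # 1
print(endorses_best_practice(["individual", "staff meeting", "informal"]))  # 0
```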

These three best practice supervision content activities were identified through a multistage integrative literature review (Whittemore & Knafl, 2005) that drew on SAMHSA’s National Registry of Evidence-Based Programs and Practices (NREPP) to identify supervision characteristics associated with evidence-based practices, and on a review of the supervision literature for a broader perspective on best practice supervision activities. Inclusion of evidence-based practices in the review was determined using a four-stage process utilizing the NREPP search tool:

  1. Evidence-based practices (EBP) were identified for mental health and substance use treatment of service users older than 13 years of age in outpatient or community settings.

  2. A quality assessment review of SAMHSA descriptions retained practices with Readiness for Dissemination and Quality of Evidence index scores of 3 or higher.

  3. SAMHSA information was reviewed for the specification or emphasis on supervision practices.

  4. A literature search was conducted to identify published work related to identified EBPs and supervision factors (Figure 1 presents a process flow diagram).

Figure 1. Multistage Identification of Supervision Using SAMHSA’s National Registry of Evidence-based Programs and Practices

From this process, 12 EBPs were identified that utilized three different supervision models for the interventions: Motivational Interviewing, Cognitive-Behavioral Therapy, and Multi-Systemic Therapy (Henggeler & Schoenwald, 1998; Martino et al., 2006; Milne, 2009; Milne & Reiser, 2017). Taking a modified common elements approach (Chorpita, Becker, & Daleiden, 2007), these supervision manuals were reviewed for distilled components including the supervisory practice structure (i.e., quantity), content (e.g., activities), data source (e.g., taped sessions), characteristics, and available impact research. Next, these components were compared across the three manuals, and shared elements were identified, including opportunities for observation of actual practice and constructive feedback based on that practice (Authors, 2016). This was consistent with the literature on evidence-based supervision practices (Bearman et al., 2013; Milne & Dunkerley, 2010) and best practice supervision (Borders et al., 2014). In particular, best practice supervision content included live or direct observation, recording of EBP use with service users, or use of service user outcomes data to inform the session.

Other variables were research site and organizational role, which were identified for each participant through administrative data prior to survey administration and were not included as items in the survey. In the survey, participants provided information about demographics, caseload, and program type. Program type was measured by one categorical item asking respondents to select from the list of program types the one that best described their program (i.e., community support, Assertive Community Treatment (ACT), outpatient therapy, young adult services, residential services, other). Caseload was measured by one continuous item asking direct care providers to report on their own caseload and supervisors/leaders to report on the direct care providers’ caseloads on average.

Analytic Strategy

Univariate statistics identified the available supervision quantity and content in community mental health services. Independent samples t-tests, analyses of variance, and chi-square analyses identified statistically significant differences across research site (N=14), program type (N=6), organizational role (N=3), and state (N=2). This survey had limited missing data. Cases with missing data were deleted list-wise from analyses.
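A minimal sketch of these comparisons using common Python tooling (the data below are simulated placeholders standing in for the survey extract; the original analyses may have been conducted in other statistical software):

```python
import numpy as np
import pandas as pd
from scipy import stats

# Simulated provider-level data; variable names are illustrative, not the study's actual ones.
rng = np.random.default_rng(0)
n = 273
df = pd.DataFrame({
    "supervision_quantity": rng.gamma(shape=2.0, scale=1.1, size=n),  # clinical supervision hours/week
    "best_practice": rng.integers(0, 2, size=n),                      # endorsed best practice content (0/1)
    "state": rng.integers(0, 2, size=n),
    "site": rng.integers(0, 14, size=n),
    "role": rng.choice(["leader", "supervisor", "direct care"], size=n),
})

# Independent samples t-test: supervision quantity between the two states.
groups_by_state = [g["supervision_quantity"] for _, g in df.groupby("state")]
print(stats.ttest_ind(*groups_by_state))

# One-way ANOVA: supervision quantity across research sites (analogous for program type and role).
groups_by_site = [g["supervision_quantity"] for _, g in df.groupby("site")]
print(stats.f_oneway(*groups_by_site))

# Chi-square test of independence: best practice supervision content by organizational role.
print(stats.chi2_contingency(pd.crosstab(df["role"], df["best_practice"])))
```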

Results

Sample Characteristics

The sample (N=273) comprised leaders (N=49), supervisors (N=81), and direct care staff (N=143). The majority of participants were female (71.1%) and white (65.9%). The mean age of the sample was 42.6 years. It was an experienced sample, with an average of 13.89 years in mental health services and 7.53 years of tenure at their current organization. Approximately half of the participants held master’s degrees (55.3%), a third had bachelor’s degrees (33.8%), and others had less education. On average, supervisors oversaw 10.40 staff (SD=9.05). Table 1 outlines sample demographics in more detail by role.

Table 1.

Sample Demographics by Role

All (N=273)  Leaders (N=49)  Supervisors (N=81)  Direct Care (N=143)
N/M  %/SD  N/M  %/SD  N/M  %/SD  N/M  %/SD
Gender (N=272)
 Male 78 28.6% 18 36.7% 21 25.90% 39 27.27%
 Female 194 71.1% 31 63.3% 60 74.10% 103 72.03%
Race (N=265)
 Non-White 85 31.2% 5 10.2% 30 37.00% 55 38.46%
 White 180 65.9% 44 89.8% 51 63.00% 86 60.14%
Hispanic (N=263) 11 4.0% 1 2.0% 2 2.60% 8 5.60%
Education (N=272)
 High School 29 10.7% 3 6.1% 4 4.90% 22 15.38%
 College 92 41.0% 10 20.4% 19 23.50% 63 44.06%
 Graduate 151 55.3% 36 73.5% 58 71.60% 57 39.86%
Program Type
 Community Sup 85 31.1% 21 42.90% 19 23.50% 45 31.47%
 ACT 33 12.1% 1 2.00% 13 16.00% 19 13.29%
 Outpatient 50 18.3% 16 32.70% 13 16.00% 21 14.69%
 Young adult 24 8.8% 0 0.00% 10 12.30% 14 9.79%
 Residential 35 12.8% 0 0.00% 17 21.00% 18 12.59%
 Other 46 16.8% 11 22.40% 9 11.10% 26 18.18%
Age in Years (N=268) 42.60 12.36 53.10 10.48 41.94 11.26 39.27 11.59
Years in MH (N=271) 13.89 10.12 24.08 10.02 14.32 8.33 10.15 8.52
Years at Agency (N=270) 7.53 7.28 13.20 9.51 8.17 6.37 5.16 5.55
Caseload 28.02 49.21 25.87 7.08 36.07 84.55 24.16 22.83

Note. Sample size specified when missing data present. Community Sup = community support services; ACT = Assertive Community Treatment; Outpatient = outpatient therapy; Young adult = young adult services; MH = mental health

Available Supervision Quantity

On average, participants reported 2.17 hours of supervision quantity (i.e., supervision time devoted to clinical content; SD=1.93), with a mode of 30 minutes and a median of 1.60 hours. Table 2 presents descriptive statistics for supervision quantity and content by role and program type. Even assuming that this time encompassed primarily clinical topics (e.g., not educational content) and that supervisors need the current status of each person served, with average caseloads of 24.16 (SD=22.83) as reported by direct care staff, supervisors have only approximately 5 minutes per person served each week. On average, 58.9% of supervision time was used for clinically focused content while the remaining 41.1% was used for administrative matters. Analyses of variance demonstrated significant differences in available clinical supervision time across research sites (p<.05) and program types (p<.05; see Table 2 for means and Table 3 for results). Independent samples t-tests identified a significant difference in mean supervision quantity between the two state contexts (p<.01): one state mean was 2.83 hours (SD=2.57) and the other was 1.90 (SD=1.54). Across both states, mean clinical hours for organizations ranged from 1.40 (SD=1.08) to 3.39 (SD=2.45).
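As a quick check of that per-person estimate (using only the means reported above):

```python
# Back-of-envelope calculation from the reported sample means.
weekly_clinical_minutes = 2.17 * 60   # 2.17 hours of clinically focused supervision per week
mean_caseload = 24.16                 # mean caseload reported by direct care staff
print(round(weekly_clinical_minutes / mean_caseload, 1))  # ~5.4 minutes per person served per week
```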

Table 2.

Results of Chi-Square Tests and Descriptive Statistics for Supervision by Role and Program Type

Supervision Quantity Supervision Content
M SD N (%)
All (N=273) 2.17 1.93 78 (28.6)
Role1
Leaders (49) 1.65 1.07 22 (44.9)
Supervisors (81) 2.43 2.32 30 (37.0)
Direct Care (143) 2.12 1.79 26 (18.2)
Program Type2
Community Support (77) 2.12 1.78 30 (35.3)
ACT (31) 3.03 2.23 9 (27.3)
Outpatient (46) 1.53 1.02 20 (40.0)
Young Adult (24) 2.39 1.55 9 (37.5)
Residential (31) 2.54 2.01 13 (37.1)
Other (38) 1.80 2.59 13 (28.3)

Note. 1 Supervision content by Role: χ2 (2) = 16.81, p<.001. 2 Does not add up to full sample due to missing data; Supervision content by Program Type: χ2 (5) = 4.40, p=.494.

Table 3.

One-Way Analyses of Variance for Supervision Quantity by Role, Program Type, and Research Site

Source df SS MS F p
Role
 Between Groups 2 12.799 6.399 1.725 0.18
 Within Groups 229 849.367 3.709
 Total 231 862.166
Program Type
 Between Groups 5 50.045 10.009 2.785 0.018
 Within Groups 226 812.121 3.593
 Total 231 862.166
Research Site
 Between Groups 13 92.089 7.084 2.005 0.021
 Within Groups 218 770.076 3.532
 Total 231 862.166

Available Supervision Content

As defined, best practice supervision content was minimally available: only 18.2% of direct care staff and 28.6% of all participants endorsed access to at least one of the best practice supervision content activities. Chi-square analyses revealed no significant differences in supervision content across states (χ2 (1) = .297, p=.586), research sites (χ2 (13) = 16.62, p=.217), or program types (χ2 (5) = 4.40, p=.494). However, significant differences in supervision content were found across roles within the organization (χ2 (2) = 16.81, p<.001). Table 2 presents descriptive findings for supervision content by program type and role.

Discussion

These study findings suggest that providers have access to an adequate dose of supervision quantity, with an average of over two hours per week focused on clinical content. This time includes all potential modalities for contact (e.g., staff meetings, individual, and informal supervision time). In addition, respondents indicated that they spent an additional 1.5 hours per week on average in supervision focused on administrative content. These findings differ in total quantity and in percentage of time devoted to clinical content from previously reported findings of about 50 minutes every other week (Dorsey et al., 2018). Findings here suggest an adequate amount of supervision time as compared to New York State clinical licensing standards (New York State Education Department, 2017) or Connecticut requirements of one hour of supervisory contact per 40 hours of clinical contact (Supervision Competency Workforce Initiative, 2009). However, caseload burden may have limited supervisors’ ability to talk with providers about their interactions with people seeking services each week. The average time constraint calculated in this study indicates a potential challenge to supervisors’ critical function of overseeing service interactions.

Findings suggest that while the quantity of supervision met recommended standards, supervision content consistent with best practice supervision standards or with supervision activities utilized in efficacy trials of evidence-based practice was not widely available in routine services in these community mental health settings. Best practice supervision content was available to a limited number of participants in this sample, with under one fifth of providers reporting access. This finding is lower than previous findings that at least one third of providers report observation or review of their work informing supervisory feedback (Laschober et al., 2012). This opportunity for constructive feedback based on review of providers’ practice, whether from the service user or supervisor perspective, was a consistent element in the specified supervision models used in clinical trials. Feedback has been considered an essential part of successful learning (Milne, 2009). For a workforce primarily relying upon on-the-job learning (SAMHSA, 2013), this could signal a key missed opportunity and a clear contextual barrier to integrating new practices into available mental health services. Kolb’s (1984) Experiential Learning Theory has proposed that skills and knowledge are acquired through a process of practical experience, reflection, conceptualization, and planning. To inform this learning process and meet service user needs, supervisors and providers have relied on provider self-report and progress notes (Accurso et al., 2011). Discussion of provider self-report and progress notes has potential strengths for understanding the clinical encounter, but can also obscure key information given the nature of the supervisory relationship (Noelle, 2003). Opportunities for external practice review and feedback may provide insights into areas for improvement and share the substantial responsibility for practice quality with others. Additionally, supervisors make decisions about how to use their time in supervision and address the potential tensions between the records of practice and actual practice quality. Supervisory decision-making about what and how much supervision to provide may also be sensitive to contextual factors, including considerations of potential liability.

This study also examined how supervision quantity and content varied as a function of the state, research site, program, and role that participants occupied within their organizations. Reported supervision content largely depended on who was asked, with leaders being significantly more likely to endorse the availability of best practice supervision activities. This significant difference in supervision content, but not in quantity, could indicate a disconnect between leadership perceptions and the on-the-ground realities of service provision. Leaders may have a general sense of how much supervision takes place in their organizations, but not what it covers or how it is conducted. Alternatively, leadership responses could have represented awareness of at least one best practice supervision activity available within the programs as opposed to availability throughout the organization. Beyond these role differences, supervision content did not vary significantly across the states, research sites, or program types within this sample, potentially indicating a consistent unavailability of feedback based on observation or review of practice.

On the other hand, supervision quantity was found to vary by contextual factors including state, research site, and program type. State agencies typically administer a significant portion of community mental health services funding and are in a position to influence how supervisors’ time is used within their systems. However, the mechanisms of state influence on supervision, and how states monitor or incentivize supervision quantity and content, remain unclear. The variation among participating sites suggests that organizations are also making choices about supervision quantity, but have less often prioritized the use of best practice supervision content during that time. Programmatic variation in supervision quantity seems to mirror program enrollment criteria for severity of need. For example, outpatient programs received the least supervision whereas ACT teams received the most. Alternatively, supervision hours could have varied by program according to the amount of staff’s clinical training (e.g., outpatient therapy providers presumably have more clinical training and therefore receive less supervision).

These variations suggest that supervision quantity is malleable with regard to shifting contextual factors and that key stakeholders make choices about supervision dose and content. The range among direct care providers indicates variation between and within supervisors themselves in supervision quantity and content. Such findings suggest that supervisors also exercised discretion in how much time they spent with their staff and how they used it. Another study found that staff perceived supervision time alternately as either support for or scrutiny of their work, which was influenced by the process and content of supervision time (Authors, 2015c). Further uncovering the implications of these supervisory choices represents an important opportunity to understand the role of supervisors in shaping direct practice.

In the administration of community mental health services, supervisors have considerable responsibility and discretion over the service user experience of services. In environments that are over-saturated with work, policy makers and organizations often cannot realistically expect all of the stated policy requirements to be met (Lipsky, 2010). In fact, bureaucratic effectiveness has often relied on managers and supervisors to calibrate policy to the on-the-ground landscape (Evans, 2011; Lipsky, 2010). In particular, during an implementation effort to translate a new practice into their work, supervisors’ understanding of that practice and their motivation in facilitating that understanding among their staff will potentially affect how they use their discretion to influence practice. Given the stress and overburdened work levels, supervisors in these bureaucracies cannot meet all requirements and thus must prioritize their time and energy (Lipsky, 2010). Should a conflict arise between the supervisory and organizational or external policy environments, the supervisor will ultimately be in a key proximal position to facilitate or inhibit the new change and to make sense of the critical practice elements within the intervention. According to previous research (Rapp et al., 2010), this has perhaps been particularly true for the translation of complex interventions that entail underlying value shifts (e.g., recovery-oriented practices) in addition to concrete practice changes (Whitley, Gingerich, Lutz, & Mueser, 2009). This study supports the notion that contextual factors contribute to the availability of best practice supervision, given the difference of over two hours per week in supervision quantity between some sites and the range of 15-55% of staff endorsing access to best practice supervision content. However, mechanisms to improve that availability remain underdeveloped. Research and practice knowledge suggest that supervisors affect practice quality and that a better understanding of supervisory mechanisms (e.g., supervision quantity and content) would help bridge the gap between supervision and practice quality improvement (Milne, 2009).

Limitations

This study has contributed new knowledge of the availability of supervision quantity and content. However, findings must be understood in the context of the limitations of the study. In particular, the evaluation of statistically significant differences among variables has not considered the complex picture inherent in the real world context of these organizations. Community mental health services are incredibly diverse and complex, operating in equally diverse and complex systems of care. Future research employing multivariate analyses would strengthen our understanding of these relationships. These findings also represented participants’ self-reports of their experiences with supervision, which are subject to potential response bias. In addition, these study data did not include alternative activities that states, organizations, and programs were engaged in to support the quality of their practice, such as the implementation of other related interventions (e.g., motivational interviewing). Lastly, these findings support the need for further analyses examining all of these factors together to understand the contextual fabric of influences on supervision: why and how it matters, and how we can better maximize the opportunities available in this already embedded resource.

Conclusions and Implications

These study findings about the supervision quantity and content (as defined in this study) can contribute to future research, social work education, practice, and policy. Future research investigating supervisors and supervision should consider the influence of contextual factors within the system, organization, and program. In particular, the simultaneous consideration of the context, supervision, provider practice, and service user experiences using multivariate analytic strategies would further our understanding of the relationships among these variables. Given the paucity of best practice supervision literature, further research is urgently needed to better understand how organizations and providers approach practice quality support in routine care.

Additionally, for social work education, social work supervisors have played significant roles as field instructors supporting the profession’s “signature pedagogy” of field learning. While field education differs from practice supervision in key ways (e.g., purpose), these findings offer insight into available supervision quantity and best practice supervision content within these field settings. The integrative review of known best practice supervision indicated that the provision of feedback based on actual review of provider-service user interactions has been a critical component of improving and sustaining quality practice. However, this study’s sample did not report consistently receiving such feedback. Practice quality may improve if supervisors and practitioners focus on the content over the quantity of supervision time. For practitioners, students, and supervisors, the consideration is how one might access and integrate opportunities for feedback into supervision to support practice.

Beyond individual ethical responsibilities, these findings have implications for policy makers, organizational leaders, administrators, and educators. In addition, research has been increasingly examining the potential value of simulated practice interactions for learning and feedback (Bearman et al., 2017; Bogo, Shlonsky, Lee, & Serbinski, 2014). Further knowledge of simulation applications in low-resource settings is needed. Funders of community mental health services have historically prioritized their spending to be devoted to interactions with service users (e.g., fee for service billing). Tension continues to exist between supporting quality versus breadth of service provision within increasingly limited budgets and waning political support. The integration of best practice supervision that is known to improve supervisee learning and practice quality may require increased staff time and organizational infrastructure for data collection. Without resolution, these unmet learning needs could continue to frustrate workforce training and retention and shift the responsibility for practice quality downwards from policy makers to organizations and, ultimately, to supervisors and individual providers. Financial difficulties may exacerbate these workforce issues, constrain organizational quality assurance activities, and limit opportunities for frontline staff to receive best practice supervision content. In turn, this could then contribute to poor quality of care, which has had considerable consequences for people seeking services in public mental health settings (Institute of Medicine, 2015). Further examination and introduction of incentives for integrating best practices into clinical supervision, which is an already embedded resource for workforce training, could be critical for addressing gaps in the quality of care.

Acknowledgments

This research was supported by grant funding from the National Institute of Mental Health (F31MH110120-01A1 Examining Supervision as an Implementation Strategy to Improve Provider Adoption of Evidence-Based Practice & R01MH099012 Person-Centered Care Planning and Service Engagement).

Biography

Mimi Choy-Brown is an assistant professor at the University of Minnesota with over a decade of practice experience. She received her PhD from NYU and her research interests are in community mental health services, supervision, implementation science, and developing strategies to improve and sustain quality service provision.

References

  1. Accurso EC, Taylor RM, & Garland AF (2011). Evidence-based practices addressed in community-based children’s mental health clinical supervision. Training and Education in Professional Psychology, 5, 88–96. doi: 10.1037/a0023537
  2. Authors, 2015a
  3. Authors, 2015b
  4. Authors, 2015c
  5. Authors, 2016
  6. Bambling M, King R, Raue P, Schweitzer R, & Lambert W (2006). Clinical supervision: Its influence on client-rated working alliance and client symptom reduction in the brief treatment of major depression. Psychotherapy Research, 16, 317–331. doi: 10.1080/10503300500268524
  7. Bearman SK, Schneiderman RL, & Zoloth E (2017). Building an evidence base for effective supervision practices: An analogue experiment of supervision to increase EBT fidelity. Administration and Policy in Mental Health and Mental Health Services Research, 44, 293–307. doi: 10.1007/s10488-016-0723-8
  8. Bearman SK, Weisz JR, Chorpita BF, Hoagwood K, Ward A, Ugueto AM, & Bernstein A (2013). More practice, less preach? The role of supervision processes and therapist characteristics in EBP implementation. Administration and Policy in Mental Health and Mental Health Services Research, 40, 518–529. doi: 10.1007/s10488-013-0485-5
  9. Beidas RS, Edmunds JM, Cannuscio CC, Gallagher M, Downey MM, & Kendall PC (2013). Therapists’ perspectives on the effective elements of consultation following training. Administration and Policy in Mental Health and Mental Health Services Research, 40, 507–517. doi: 10.1007/s10488-013-0475-7
  10. Berger C & Mizrahi T (2001). An evolving paradigm of supervision within a changing health care environment. Social Work in Health Care, 32(4), 1–18. doi: 10.1300/J010v32n04_01
  11. Bernard JM & Goodyear RK (2014). Fundamentals of clinical supervision (5th ed.). NJ: Pearson.
  12. Bogo M & McKnight K (2006). Clinical supervision in social work. The Clinical Supervisor, 24, 49–67. doi: 10.1300/J001v24n01_04
  13. Bogo M, Shlonsky A, Lee B, & Serbinski S (2014). Acting like it matters: A scoping review of simulation in child welfare training. Journal of Public Child Welfare, 8, 70–93. doi: 10.1080/15548732.2013.818610
  14. Borders LD, Glosoff HL, Welfare LE, Hays DG, DeKruyf L, Fernando DM, & Page B (2014). Best practices in clinical supervision: Evolution of a counseling specialty. The Clinical Supervisor, 33, 26–44. doi: 10.1080/07325223.2014.905225
  15. Carpenter J, Webb CM, & Bostock L (2013). The surprisingly weak evidence base for supervision: Findings from a systematic review of research in child welfare practice (2000–2012). Children and Youth Services Review, 35(11), 1843–1853. doi: 10.1016/j.childyouth.2013.08.014
  16. Chorpita BF, Becker KD, & Daleiden EL (2007). Understanding the common elements of evidence-based practice. Journal of the American Academy of Child and Adolescent Psychiatry, 46, 647–652. doi: 10.1097/chi.0b013e318033ff71
  17. Dill K & Bogo M (2009). Moving beyond the administrative: Supervisors’ perspectives on clinical supervision in child welfare. Journal of Public Child Welfare, 3, 87–105. doi: 10.1080/15548730802695105
  18. Dorsey S, Kerns SE, Lucid L, Pullmann MD, Harrison JP, Berliner L, … Deblinger E (2018). Objective coding of content and techniques in workplace-based supervision of an EBT in public mental health. Implementation Science, 13(19). doi: 10.1186/s13012-017-0708-3
  19. Dorsey S, Pullmann MD, Deblinger E, Berliner L, Kerns SE, Thompson K, … Garland AF (2013). Improving practice in community-based settings: A randomized trial of supervision – study protocol. Implementation Science, 5(89), 1–11. doi: 10.1186/s13012-018-0754-5
  20. Ellis MV, Berger L, Ayala EE, Swords BA, & Siembor M (2013). Inadequate and harmful clinical supervision: Testing a revised framework and assessing occurrence. The Counseling Psychologist, 42, 434–472.
  21. Evans T (2011). Professionals, managers and discretion: Critiquing street-level bureaucracy. The British Journal of Social Work, 41, 368–386. doi: 10.1093/bjsw/bcq074
  22. Gleacher AA, Nadeem E, Moy AJ, Whited AL, Albano AM, Radigan M, … Hoagwood KE (2011). Statewide CBT training for clinicians and supervisors treating youth: The New York State evidence-based treatment dissemination center. Journal of Emotional and Behavioral Disorders, 19, 182–192. doi: 10.1177/1063426610367793
  23. Henggeler S & Schoenwald S (1998). Multisystemic therapy supervisor manual: Promoting quality assurance at the clinical level. Unpublished.
  24. Henggeler S, Schoenwald S, Liao J, Letourneau E, & Edwards D (2002). Transporting efficacious treatments to field settings: The link between supervisory practices and therapist fidelity in MST programs. Journal of Clinical Child and Adolescent Psychology, 31, 155–167. doi: 10.1207/S15374424JCCP3102_02
  25. Hoge MA, Migdole S, Cannata E, & Powell DJ (2014). Strengthening supervision in systems of care: Exemplary practices in empirically supported treatments. Clinical Social Work Journal, 42, 171–181. doi: 10.1007/s10615-013-0466-x
  26. Hoge MA, Migdole S, Farkas MS, Ponce AN, & Hunnicutt C (2011). Supervision in public sector behavioral health: A review. The Clinical Supervisor, 30, 183–203. doi: 10.1080/07325223.2011.604276
  27. Hoge MA, Morris JA, Daniels AS, Stuart GW, Huey LY, & Adams N (2007). An action plan for behavioral health workforce development: A framework for discussion. Rockville, MD: Substance Abuse and Mental Health Services Administration, U.S. Department of Health and Human Services.
  28. Hoge MA, Wolf J, Migdole S, Cannata E, & Gregory FX (2016). Workforce development and mental health transformation: A state perspective. Community Mental Health Journal, 52, 323–331. doi: 10.1007/s10597-015-9953-6
  29. Institute of Medicine. (2015). Psychosocial interventions for mental and substance use disorders: A framework for establishing evidence-based standards. Washington, DC: The National Academies Press.
  30. Kadushin A & Harkness D (2002). Supervision in social work (4th ed.). New York, NY: Columbia University Press.
  31. Kolb DA (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.
  32. Laschober TC, de Tormes Eby LT, & Sauer JB (2012). Clinical supervisor and counselor perceptions of clinical supervision in addiction treatment. Journal of Addictive Diseases, 31, 382–388. doi: 10.1080/10550887.2012.735599
  33. Lipsky M (2010). Street-level bureaucracy: Dilemmas of the individual in public services. New York, NY: Russell Sage Foundation.
  34. Martino S, Ball SA, Gallon SL, Hall D, Garcia M, Ceperich S, … Hausotter W (2006). Motivational interviewing assessment: Supervisory tools for enhancing proficiency. Salem, OR: Northwest Frontier Addiction Technology Transfer Center, Oregon Health and Science University.
  35. Milne DL (2009). Evidence-based clinical supervision: Principles and practice (1st ed.). Chichester, West Sussex, UK: John Wiley & Sons Ltd.
  36. Milne D & Dunkerley C (2010). Towards evidence-based clinical supervision: The development and evaluation of four CBT guidelines. The Cognitive Behaviour Therapist, 3, 43–57. doi: 10.1017/S1754470X10000048
  37. Milne D & Reiser RP (2011). Observing competence in CBT supervision: A systematic review of the available instruments. The Cognitive Behaviour Therapist, 4, 89–100. doi: 10.1017/S1754470X11000067
  38. Milne DL & Reiser RP (2017). A manual for evidence-based CBT supervision. Hoboken, NJ: John Wiley & Sons Ltd.
  39. Mor Barak M, Travis D, Pyun H, & Xie B (2009). The impact of supervision on social work outcomes: A meta-analysis. Social Service Review, 83, 3–32. doi: 10.1086/599028
  40. Mosley JE & Smith SR (2018). Human service agencies and the question of impact: Lessons for theory, policy, and practice. Human Service Organizations: Management, Leadership & Governance, 42, 113–122. doi: 10.1080/23303131.2018.1425953
  41. National Association of Social Workers & Association of Social Work Boards. (2012). Best practice standards in social work supervision. Retrieved from http://www.socialworkers.org
  42. New York State Education Department. (2017). LCSW license requirements. Retrieved from http://www.op.nysed.gov
  43. New York State Office of Mental Health. (2007). Assertive Community Treatment guidelines. Retrieved from https://www.omh.ny.gov
  44. Noelle M (2003). Self-report in supervision. The Clinical Supervisor, 21(1), 125–134. doi: 10.1300/J001v21n01_10
  45. Rapp CA, Etzel-Wise D, Marty D, Coffman M, Carlson L, Asher D, … Holter M (2010). Barriers to evidence-based practice implementation: Results of a qualitative study. Community Mental Health Journal, 46, 112–118. doi: 10.1007/s10597-009-9238-z
  46. Salas E, Tannenbaum SI, Kraiger K, & Smith-Jentsch KA (2012). The science of training and development in organizations: What matters in practice. Psychological Science in the Public Interest, 13, 74–101. doi: 10.1177/1529100612436661
  47. Schoenwald S, Mehta T, Frazier S, & Shernoff E (2013). Clinical supervision in effectiveness and implementation research. Clinical Psychology: Science and Practice, 20, 44–59. doi: 10.1111/cpsp.12022
  48. Schoenwald S, Sheidow A, & Chapman J (2009). Clinical supervision in treatment transport: Effects on adherence and outcomes. Journal of Consulting and Clinical Psychology, 77, 410–421. doi: 10.1037/a0013788
  49. Sewell K (2017). Theoretically grounded, evidence-informed clinical supervision for the SNAP programs: A model in development. The Clinical Supervisor, 36, 340–359. doi: 10.1080/07325223.2017.1352549
  50. Sholomskas DE, Syracuse-Siewert G, Rounsaville BJ, Ball SA, Nuro KF, & Carroll KM (2005). We don’t train in vain: A dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. Journal of Consulting and Clinical Psychology, 73, 106–115. doi: 10.1037/0022-006X.73.1.106
  51. Shulman L (1993). Interactional supervision. Washington, DC: National Association of Social Workers Press.
  52. Substance Abuse and Mental Health Services Administration. (2012). Behavioral health, United States (HHS Publication No. [SMA] 13–4797). Rockville, MD: Substance Abuse and Mental Health Services Administration.
  53. Substance Abuse and Mental Health Services Administration. (2013). Report to Congress on the nation’s substance abuse and mental health workforce issues. Rockville, MD: Substance Abuse and Mental Health Services Administration.
  54. Supervision Competency Workforce Initiative. (2009). Enhancing supervisory skills in Connecticut’s behavioral health workforce. Retrieved from https://www.cwcbh.org/projects/supervisor_competency
  55. Whitaker T, Weismiller T, & Clark E (2006). Assuring the sufficiency of a frontline workforce: A national study of licensed social workers. Executive summary. Washington, DC: National Association of Social Workers.
  56. Whitley R, Gingerich S, Lutz WJ, & Mueser KT (2009). Implementing the Illness Management and Recovery program in community mental health settings: Facilitators and barriers. Psychiatric Services, 60, 202–209. doi: 10.1176/appi.ps.60.2.202
  57. Whittemore R & Knafl K (2005). The integrative review: Updated methodology. Journal of Advanced Nursing, 52, 546–553. doi: 10.1111/j.1365-2648.2005.03621.x
  58. Worthen VE & Lambert MJ (2007). Outcome oriented supervision: Advantages of adding systematic client tracking to supportive consultations. Counselling and Psychotherapy Research, 7, 48–53. doi: 10.1080/14733140601140873
