Abstract
Background
Effective clinical supervision is necessary for high-quality care in community-based substance use disorder treatment settings, yet little is known about current supervision practices. Some evidence suggests that supervisors and counselors differ in their experiences of clinical supervision; however, the impact of this misalignment on supervision quality is unclear. Clinical information monitoring systems may support supervision in substance use disorder treatment, but the potential use of these tools must first be explored.
Aims
First, this study examines the extent to which misaligned supervisor-counselor perceptions impact supervision satisfaction and emphasis on evidence-based treatments. This study also reports on formative work to develop a supervision-based clinical dashboard, an electronic information monitoring system and data visualization tool providing real-time clinical information to engage supervisors and counselors in a coordinated and data-informed manner, help align supervisor-counselor perceptions about supervision, and improve supervision effectiveness.
Methods
Clinical supervisors and frontline counselors (N=165) from five Midwestern agencies providing substance abuse services completed an online survey using Research Electronic Data Capture (REDCap) software, yielding a 75% response rate. Validated quantitative measures of supervision effectiveness were administered, along with qualitative items assessing perceptions of a supervision-based clinical dashboard.
Results
Through within-dyad analyses, misalignment between supervisor and counselor perceptions of supervision practices was negatively associated with satisfaction with supervision and with the reported frequency of discussing several important clinical supervision topics, including evidence-based treatments and client rapport. Participants indicated the most useful clinical dashboard functions and reported important benefits and challenges of using the proposed tool.
Discussion
Clinical supervision tends to be largely an informal and unstructured process in substance abuse treatment, which may compromise the quality of care. Clinical dashboards may be a well-targeted approach to facilitate data-informed clinical supervision in community-based treatment agencies.
Introduction
Despite consensus on a number of evidence-based treatments for substance use disorders (SUD), most individuals with SUD do not receive the high-quality, evidence-based care needed for recovery.1,2 The National Institute on Drug Abuse 2016–2020 Strategic Plan3 states that our nation has a significant and ongoing treatment gap for SUD; less than 12% of those in need of treatment for SUD receive it at a specialty substance abuse facility,4 and many of those entering specialty treatment programs are not given access to current evidence-based treatments.5 This constitutes a critical implementation gap, created in large part by a system in which addiction counselors often do not have the formal training and licensure required in most healthcare fields.6–8 This reality highlights the need for clinical supervision, the formal process of providing professional support by which counselors further develop clinical knowledge and skills.7,9 Effective clinical supervision accounts for up to 16% of the variance in client recovery outcomes, signaling the need for data-driven clinical supervision efforts that improve the quality of evidence-based treatment for SUD.10,11 Unfortunately, data-informed supervision in SUD treatment, a process characterized by ongoing measurement of actionable outcomes to guide clinical quality improvement, is “virtually non-existent” in the United States, underscoring the need for measurement-based tools to facilitate this process.12,13
While some scholars have explored the perceptions of supervisors and counselors regarding clinical supervision, many knowledge gaps remain involving this important dyadic relationship. Relative to counselor reports of supervision intensity, supervisors report more weekly interaction with their counselors and greater diversity of modes of supervision interaction (e.g., one-on-one, group, phone, email), indicating a misalignment in perceptions between supervisors and counselors regarding clinical supervision practices.14 It is unclear, however, whether this misalignment—the extent to which supervisors and their respective counselors differ in their perceptions of the current practices and quality of clinical supervision—impacts outcomes such as supervision satisfaction and emphasis on skill development in supervision.
Given the importance of clinical supervision in developing the skills necessary to ensure high-quality delivery of evidence-based treatments, Laschober and colleagues argue that the goal must be to achieve “very effective” clinical supervision ratings, a level rarely found in community-based SUD services.14 More research is needed to better understand the mechanisms involved in the supervision process, including strategies to improve the effectiveness of clinical supervision.15–17 The research that does exist suggests that systematic clinical supervision tools may help align supervisor and counselor perceptions and improve supervision effectiveness.14 One such tool would be a supervision-based “clinical dashboard”, an information monitoring system and data visualization tool providing real-time clinical information to engage supervisors and counselors in a coordinated and data-informed manner. The use of clinical dashboards and similar tools in healthcare,18,19 including mental health settings,20–23 is well-documented; however, little research has explored this concept in the SUD field. This study uniquely examines the potential utility of a supervision-focused clinical dashboard in the SUD field.
Current Study
This study was conducted within a network of five addiction service agencies active within the Community Academic Partnership on Addiction (CAPA), a collaborative effort between university and community addiction treatment leaders to facilitate the integration of evidence-based treatments, practices, and strategies into routine service delivery. In this setting, the current study addresses two main research questions. First, we seek to answer the following question: “Do discrepancies between supervisors and their respective counselors regarding current supervision practices predict lower supervision satisfaction and reduced focus on providing evidence-based treatments?” Therefore, this study addresses the extent to which misaligned perceptions between supervisors and counselors impact supervision outcomes.
Second, we examine supervisor and counselor perceptions about a proposed data-informed supervision tool to address gaps in clinical supervision, as identified by both the current study and prior research.6,12,14,17 Here, the study seeks to answer the following question: “Is a supervision-based clinical dashboard feasible, and which specific functions should be included to best improve clinical supervision quality?” Therefore, this study also reports on formative work to develop a supervision-based clinical dashboard tool to help align supervisor-counselor perceptions and improve supervision effectiveness.
Methods
Participants
The sample comprised 165 full-time service providers from five Midwestern addiction service agencies who self-identified as primarily frontline direct service providers (66%), clinical supervisors or managers (24%), administrators (2%), or other (9%). More than one-third (n = 60; 38%) of participants reported being current supervisors; the remainder (n = 98; 62%) did not have any supervisory responsibilities. The sample included 102 supervisor-counselor dyads (i.e., matched pairs of supervisors and their respective counselors). Demographic information on the sample is presented in Table 1.
Table 1.
Demographics of the sample (N = 165)
| Variables | Total N | Total % | Total M (SD) | Counselor N | Counselor % | Counselor M (SD) | Supervisor N | Supervisor % | Supervisor M (SD) |
|---|---|---|---|---|---|---|---|---|---|
| Gender | 161 |  |  | 96 |  |  | 60 |  |  |
| Female | 112 | 70 |  | 69 | 72 |  | 40 | 67 |  |
| Male | 49 | 30 |  | 27 | 28 |  | 20 | 33 |  |
| Race | 159 |  |  | 98 |  |  | 60 |  |  |
| White | 123 | 77 |  | 73 | 74 |  | 46 | 77 |  |
| Black or African-American | 32 | 20 |  | 20 | 20 |  | 11 | 18 |  |
| Other | 11 | 7 |  | 5 | 5 |  | 3 | 5 |  |
| Program Type | 161 |  |  | 97 |  |  | 59 |  |  |
| Outpatient | 70 | 44 |  | 43 | 44 |  | 27 | 46 |  |
| Inpatient | 31 | 19 |  | 22 | 23 |  | 9 | 15 |  |
| Residential | 8 | 5 |  | 5 | 5 |  | 2 | 3 |  |
| Other | 52 | 32 |  | 27 | 28 |  | 21 | 36 |  |
| Currently Have Client Caseload | 163 |  |  | 98 |  |  | 60 |  |  |
| Yes | 122 | 75 |  | 84 | 86 |  | 34 | 57 |  |
| No | 41 | 25 |  | 14 | 14 |  | 26 | 43 |  |
| Level of Education | 160 |  |  | 96 |  |  | 59 |  |  |
| High School or less | 14 | 9 |  | 9 | 9 |  | 4 | 7 |  |
| Associates degree | 11 | 7 |  | 9 | 9 |  | 2 | 3 |  |
| Bachelors degree | 44 | 28 |  | 29 | 30 |  | 14 | 24 |  |
| Masters degree | 91 | 57 |  | 49 | 51 |  | 39 | 66 |  |
| Doctoral degree | 0 | 0 |  | 0 | 0 |  | 0 | 0 |  |
| Age | 149 |  | 42.4 (13.9) | 92 |  | 40.7 (14.8) | 53 |  | 45.3 (12.0) |
| Years in Current Position | 161 |  | 4.4 (4.5) | 96 |  | 3.9 (4.6) | 60 |  | 5.33 (4.5) |
| Years in SUD Field | 160 |  | 8.5 (8.7) | 96 |  | 6.9 (8.2) | 59 |  | 11.3 (9.0) |
Participants worked in one of the five CAPA agencies providing SUD services. These agencies consisted of a social service provider of housing, employment, and health programs for homeless individuals (n = 53; 32%), an outpatient substance abuse treatment center (n = 52; 32%), an alcohol and drug treatment program in correctional treatment settings (n = 27; 16%), a family-centered behavioral healthcare provider for women with addiction (n = 20; 12%), and an inpatient drug and alcohol rehabilitation center (n = 13; 8%).
Procedures
We administered an online survey via Research Electronic Data Capture (REDCap) software24 to participants identified by their organization to be either supervisors or supervisees involved in the delivery of addiction services. An initial email invitation was sent, followed by up to three follow-up email invitations at two-week intervals to those who had not yet completed the survey. Participants who completed the online survey received a $15 Amazon.com gift card. We obtained a 75% response rate to the online survey.
The study was approved by the Washington University Human Research Protection Office (Institutional Review Board ID #: 201508163; Approved 9/15/15; Renewed 7/8/16). This research was funded through the Center for Dissemination and Implementation in the Institute for Public Health at Washington University in St. Louis. The authors used the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines as a checklist for reporting the current observational study.
Measures
Perceptions of clinical supervision practices were measured with the Manchester Clinical Supervision Scale (MCSS-26), an instrument with established psychometric properties.25 The MCSS-26 assessed supervisors’ and counselors’ perceptions of the impact of supervision on normative (i.e., ongoing monitoring, evaluation, and quality control), formative (i.e., development of professional knowledge and clinical skills), and restorative (i.e., peer support and encouragement) domains of clinical supervision.26 This scale included mirrored items for both supervisors and counselors (e.g., “I provide my supervisee with valuable advice”; “My supervisor provides me with valuable advice”). Items were rated on a Likert scale of 0 (Strongly disagree) to 4 (Strongly agree). Internal consistency for this scale was excellent (α = .94 for counselors; α = .90 for supervisors). To calculate supervisor-counselor misalignment, the absolute value of the difference between supervisor and counselor perceptions of clinical supervision practices (i.e., MCSS-26 scores) within each dyad was computed through pairwise comparisons.
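To illustrate the misalignment index, the following minimal Python sketch computes the within-dyad absolute difference in MCSS-26 scores; the DataFrame, column names, and values are hypothetical and purely illustrative of the calculation described above, not the study data.

```python
import pandas as pd

# Hypothetical dyad-level data: one row per supervisor-counselor dyad,
# with each member's mean MCSS-26 rating (0-4 scale). Values are illustrative.
dyads = pd.DataFrame({
    "dyad_id": [1, 2, 3],
    "supervisor_mcss": [3.1, 2.8, 3.4],
    "counselor_mcss": [2.9, 3.2, 2.5],
})

# Misalignment = absolute within-dyad difference in perceived supervision practices.
dyads["misalignment"] = (dyads["supervisor_mcss"] - dyads["counselor_mcss"]).abs()
print(dyads)
```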
Perceptions of the need for and usefulness of a range of clinical dashboard functions were examined using a modified version of the Clinical Dashboard Questionnaire.22 This measure included items on access to key clinical care information (e.g., “To what extent do you know the current stage of change for your clients?”; 0=To a very small extent; 4=To a very great extent), the current presence or absence of various supervision capabilities (e.g., “Is there currently a system to help you manage your goals and responsibilities?”; 0=No; 1=Yes), and the perceived usefulness of obtaining certain functions or capabilities through a clinical dashboard (e.g., “How helpful would it be to obtain information on treatment session attendance rate via the dashboard display screen?”; 0=Not at all helpful; 4=Very helpful). Internal consistency for this scale was also excellent (α = .93).
Additionally, the online survey included items on (1) the frequency, duration, and structure of supervision meetings at the agency (e.g., “How frequently do you meet with your supervisor for 1-on-1 supervision?”); (2) the frequency of discussing various supervision topics (e.g., evidence-based treatments/interventions, caseload review, client rapport, goal setting; 0=Never; 4=Always); (3) satisfaction with the quality of their current supervision (0=Very dissatisfied; 4=Very satisfied); and (4) demographic information. Finally, the online survey included two open-ended items assessing the potential benefits and challenges associated with the use of an electronic tool to support data-informed clinical supervision.
Analytic Plan
Quantitative data
Data on current supervision practices were first analyzed descriptively through frequencies and means. Analysis of variance (ANOVA) and Pearson’s correlation analyses were used to test the associations between supervisor and counselor perceptions of supervision. Multiple linear regression analyses were used to determine the relationship between alignment of supervisor-counselor perceptions of clinical supervision practices and the supervision-related outcomes of supervisor satisfaction, counselor satisfaction, and reported topics discussed in supervision meetings. Missing data were minimal for each variable (<10%).
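As a rough illustration of this regression approach, the sketch below fits a multiple linear regression of counselor satisfaction on dyad misalignment plus the covariates named in the Results using Python's statsmodels; the variable names and values are hypothetical placeholders rather than the study dataset, and the model is simplified for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dyad-level data; names and values are illustrative only.
# c_* = counselor-reported covariates, s_* = supervisor-reported covariates.
df = pd.DataFrame({
    "counselor_satisfaction": [3, 2, 4, 1, 3, 2, 4, 3],
    "misalignment":           [0.2, 0.9, 0.1, 1.3, 0.4, 0.8, 0.0, 0.5],
    "c_frequency":            [3, 1, 4, 1, 2, 2, 4, 3],
    "c_duration":             [2, 1, 3, 1, 2, 2, 3, 2],
    "c_agenda":               [1, 0, 1, 0, 0, 1, 1, 1],
    "s_agenda":               [1, 0, 1, 0, 1, 1, 1, 0],
})

# Multiple linear regression of counselor satisfaction on misalignment,
# controlling for perceived frequency, duration, and agenda covariates.
model = smf.ols(
    "counselor_satisfaction ~ misalignment + c_frequency + c_duration + c_agenda + s_agenda",
    data=df,
).fit()
print(model.summary())
```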
Qualitative data
Responses to open-ended items were coded using NVivo qualitative coding software and analyzed using directed content analysis.27 This was an iterative process combining inductive and deductive approaches, whereby prior research14–20 informed a general a priori template of predefined codes, while new codes were assigned to emerging response categories during coding.28 The coding scheme was refined and finalized through frequent research team meetings.
Results
Quantitative Findings
Descriptive analyses of supervision practices
Supervisors (M = 3.04; SD = .48) and counselors (M = 2.90; SD = .62) had moderately favorable perceptions of current supervision practices. In supervisors, normative (M = 2.91; SD = .63), formative (M = 3.06; SD = .50), and restorative (M = 3.12; SD = .57) domains of clinical supervision were each rated moderately high. In counselors, normative (M = 2.83; SD = .67), formative (M = 2.89; SD = .73), and restorative (M = 2.97; SD = .68) domains were also each rated moderately high. Supervisors (M = 2.75; SD = .90) and counselors (M = 2.83; SD = 1.27) reported being moderately satisfied with their current supervision.
Analysis of variance in supervision practices
Figure 1 illustrates reports of the frequency, duration, and structure of current supervision practices. Ratings are presented separately for supervisors and counselors; ANOVA tests indicated that there were no significant differences between supervisors and counselors (ps ≥ .112), although counselors tended to report less frequent, shorter, and less structured supervision meetings as compared to supervisors.
Figure 1.
Perceptions of current supervision practices
Figure 2 presents the relative frequency with which various topics were reportedly discussed in supervision meetings and indicates significant differences between supervisors and counselors. Although supervisors were more likely than counselors to report discussing evidence-based treatments (p < .01), client relationship/rapport (p < .01), and professional development (p < .05), the two groups were largely similar in their perceptions of discussing task assignment, goal setting, and caseload review.
Figure 2.
Frequency of supervision topics discussed in meetings
Note. *p<.05 **p<.01 ***p<.001.
Correlation and multiple regression analyses of supervisor-counselor dyads alignment
Within the 102 supervisor-counselor dyads in the sample, perceptions of supervision practices on the MCSS-26 were not significantly correlated, r(100) = −.067, p = .504. To determine the relationship between supervisor-counselor alignment and supervision satisfaction, we controlled for covariates of the primary outcome of supervision satisfaction (based on significant bivariate correlations with the outcome) in the multiple linear regression analyses. For counselors’ satisfaction with supervision, we controlled for counselor perceptions of supervision frequency, duration, and presence of an agenda, as well as supervisor perceptions of presence of an agenda. For supervisors’ satisfaction with supervision, we controlled for counselor perceptions of supervision frequency and duration, as well as supervisor perceptions of duration, structure, and presence of an agenda. Even after accounting for all covariates, misalignment between supervisor and counselor perceptions of supervision practices was significantly negatively associated with satisfaction with supervision among both counselors (β = −.595, t = −2.93, p = .014) and supervisors (β = −.388, t = −2.37, p = .039). This misalignment also predicted counselors reporting a reduced frequency of discussing professional development (p = .009), ethical questions (p = .009), goal setting (p < .001), therapy/interventions (p = .016), and client rapport (p = .001). Nonetheless, within-dyad misalignment did not predict the reported frequency of discussing personal issues, organizational policy, or off-task topics unrelated to supervision (e.g., weather, family concerns).
Analysis of variance in perceptions about clinical supervision dashboard
Toward the end of the online survey, we defined a supervision-based clinical dashboard for participants by providing a generic description of its core purpose; the aim was to provide a general understanding of the proposed tool without overly influencing (or constricting) participants’ perceptions and responses. Participants were then given the opportunity to indicate which types of functions, or components, they would find most useful in a supervision-based clinical dashboard. Figure 3 presents the reported usefulness of each potential dashboard function, again indicating significant differences between supervisors and counselors.
Figure 3.
Usefulness of potential dashboard functions
Note. *p<.05 **p<.01 ***p<.001.
ANOVA tests indicated that, relative to counselors, supervisors reported higher ratings on the usefulness of a dashboard to compare agencies on client outcomes (p < .01), manage feedback on supervision quality (p < .001), and provide a formal agenda for supervision (p < .01). Predominantly, however, supervisors and counselors reported similar perspectives on the usefulness of most potential dashboard functions, particularly for the functions of reporting on attendance rate, treatment completion rate, current stage of change, and reduction in substance use since entering treatment. Overall, managing goals and responsibilities was reported to be the most useful potential function, followed by summary information on clients’ attendance rate, current stage of change, and reduction in substance use. When prompted with a description of the proposed electronic clinical supervision dashboard, 90% of counselors and 95% of supervisors indicated that such a tool could be helpful in supporting clinical supervision.
Qualitative Findings
The online survey had two open-ended questions about the dashboard: (1) Describe two potential benefits associated with the use of a supervision-based clinical dashboard, and (2) Describe two potential problems associated with the use of a supervision-based clinical dashboard. Table 2 outlines the key themes of potential benefits and challenges to using a clinical supervision dashboard. Eight themes emerged, including Applicability of the tool, Appropriateness of data captured, Efficiency/time, Impact on client care, Impact on counselor, Impact on counselor-supervisor relationship/communication, Perception of increased visibility, and Structure of supervision. Definitions and representative quotes regarding benefits and challenges are included for each theme.
Table 2.
Key themes regarding potential benefits and challenges of a supervision dashboard
| Theme | Definition | Representative Quotes |
|---|---|---|
| Applicability of the tool | Comments related to the usefulness or applicability of a dashboard within a setting, program, or among a specific audience. | Benefit: “It would also be neat to see trends of clients in general over time - for example, trends in successful completion, average stay of clients in treatment, etc” |
|  |  | Challenge: “Needs to not be cumbersome and will have to add value without distracting from an already overloaded job” |
| Appropriateness of data captured | Comments related to the type/display of data or information collected and/or available to collect via the dashboard. | Benefit: “Supervisors need a better understanding of the work load the counselors have and need to see each site has different needs for clients” |
|  |  | Challenge: “Could make agency outcomes more of a priority than staff development” |
| Efficiency/time | Comments that relate to any aspect of time and/or clinical efficiency. This may include perceptions of how use of the dashboard may impact the time available for clinical care and supervision. | Benefit: “One place to find everything needed for supervision, no papers having to be filled out so more concise and structured in that way, less time if everything is computerized out of the typical work schedule” |
|  |  | Challenge: “We don’t need one more thing to have to do unless it really will save us time and will allow us to do our jobs more efficiently…We constantly add more and nothing gets taken away” |
| Impact on client care | Comments related to how the use of a dashboard might alter, impact, or inform how clinicians interact with clients. | Benefit: “Knowing more about my clients in an easy to work with format. Having that knowledge on hand at any given time would be a great help for me” |
|  |  | Challenge: “Pressure to achieve goals of the supervisor may take away from treatment to clients” |
| Impact on counselor | Comments related to how the dashboard impacts counselor’s clinical care or professional development, including his/her ability to track supervision goals, be motivated, or be organized in clinical care. | Benefit: “Also with a Dashboard it would be nice to chart progress on different professional development goals. Similar to a treatment plan for the staff” |
|  |  | Challenge: “It may be distracting… another way for supervisors to keep track of us and monitor us, which may make us feel more constricted in our counseling approaches” |
| Impact on counselor-supervisor relationship/communication | Comments related to how the use of a dashboard might alter, change, or impact the communication and/or interpersonal relationship between a supervisor and his/her supervisees. | Benefit: “The most helpful thing about the dashboard could be that it could help me and my supervisor make sure out goals for supervision are the same (I often find we have two different agendas and goals for supervision)” |
|  |  | Challenge: “Dashboard could make supervisor/supervisee feel that their interaction on there is enough and it may cut back face to face supervision” |
| Perception of increased visibility | Comments that specify how use of a dashboard will increase visibility of treatment information (e.g., ease of access to information), outcomes (e.g., visualization of data), and clinician proficiency (e.g., compare self to others). May include both positive (e.g., sharing clinical information) and negative (e.g., reducing privacy) perceptions. | Benefit: “A benefit of a ‘dashboard’ is a quick reference look at… identified areas of client care and supervision. Secondly… it could help clinicians identify any weaknesses they may have and open dialogue about ways to improve” |
|  |  | Challenge: “Supervision is personal and displaying it on the work dashboard seems like an invasion of privacy” |
| Structure of supervision | Comments that relate to how supervision is currently structured at an organization and how use of a dashboard might change or impact that structure. | Benefit: “Better organization for supervisions sessions and less unproductive time during sessions” |
|  |  | Challenge: “That supervision would become boring/unhelpful due to the rigid structure” |
Discussion
Our findings suggest that clinical supervision tends to be largely an informal and unstructured process, characterized by sparse meetings, infrequent use of an agenda, and only intermittent discussion of supervision topics central to the development of addiction counselors (e.g., treatment skills building, establishing client rapport, enhancing professional development). These gaps may squander important opportunities to foster addiction counselors’ training and professional development, roles that clinical supervision is intended to serve. In line with research by Laschober and others,14 our findings indicate that supervisors may rate the impact of supervision more favorably than counselors, reflecting an important disconnect between roles. In addition to supervisor-counselor perceptions of supervision not being correlated, the within-dyad discrepancies in perceptions are associated with lower satisfaction with supervision and less frequent discussion of important supervision topics. This also builds on prior research indicating that misalignment between provider perceptions and agency-level characteristics negatively impacts provider morale and organizational learning climate for evidence-based practice implementation.29
Further, supervisors and counselors reported only moderately favorable perceptions of their current clinical supervision experiences. While evidence of clinical utility is needed, a tool such as a dashboard may help provide performance feedback in supervision, improve communication within supervisor-counselor dyads, and facilitate discussion of evidence-based treatments within clinical supervision. When presented with a range of potential functions and components for the dashboard, supervisors and counselors reported that a tool to manage supervision activities, track professional development progress, and visualize data on clinical outcomes (e.g., treatment attendance rate, retention through the first month of treatment) would be the most useful to their practice. Qualitative data aligned with these findings, indicating enthusiasm for an electronic supervision tool providing ready access to trends in client outcomes and ongoing monitoring of counselors’ fidelity to evidence-based treatments.
Our findings on the perceived need and receptivity for measurement-based tools in community-based addiction service agencies suggest that a clinical dashboard, or a data visualization “information monitoring system”, may be a well-targeted approach to facilitating data-informed clinical supervision. Such a dashboard would focus on monitoring counselor treatment quality and client outcomes, providing a one-screen display of aggregate client data on key performance indicators (e.g. treatment completion rates, attendance rates) that could be summarized at the client, counselor, or agency level. For optimal integration with the existing workflow, the dashboard should be connected in real-time (or near real-time) with the agency electronic health record system, improving the usefulness and sustainability of the tool. Integration of such a tool with electronic health records would communicate a clear organizational expectation for using a data-driven approach to improve evidence-based treatment implementation.30
Although such a tool has not yet been developed to support clinical supervision in SUD, other technology developers and researchers in the SUD field have structured electronic clinical dashboards to resemble a dashboard of a car for frontline clinicians, prioritizing simple display and rapid comprehension of key clinical indicators (e.g., frequency of drug use, frequency of attending mutual self-help groups).31 Similar to this model, the supervision-based electronic dashboard would likely include a front page screen to provide a snapshot of core indicators, including treatment attendance and completion rates, current stage of change, and reduction in substance use since entering treatment. Users could access additional information, including trends over time and targeted supporting resources (e.g., Treatment Improvement Protocols; TIP Series), simply by clicking on the icon of the relevant clinical indicator.
Through the use of basic filtering options, users would also be able to “zoom in” to specific indicators of a particular client or “zoom out” to aggregate indicators at the levels of the counselor, supervisor, or agency. For lower-tech agencies, dashboard summaries of the front page screen could be printed and shared in individual or group supervision meetings or at program or agency-wide meetings. However, supervision-based dashboards are likely to be most useful when both supervisors and counselors have access to the electronic tool in near real-time and when the dashboard is integrated with the agency electronic health record system to limit the extent to which counselors must manually enter clinical data for the dashboard to be updated and clinically relevant.
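As a rough illustration of this zooming behavior, the following Python sketch aggregates hypothetical client-level key performance indicators at the counselor and agency levels and filters to a single client; the schema, identifiers, and values are illustrative assumptions, not a specification of the proposed dashboard.

```python
import pandas as pd

# Hypothetical client-level KPI table; column names and values are illustrative.
kpis = pd.DataFrame({
    "agency": ["A", "A", "A", "B"],
    "supervisor_id": [10, 10, 11, 20],
    "counselor_id": [101, 102, 103, 201],
    "client_id": [1, 2, 3, 4],
    "attendance_rate": [0.90, 0.75, 0.60, 0.85],
    "completed_treatment": [1, 0, 0, 1],
})

# "Zoom out": aggregate indicators at the counselor or agency level.
by_counselor = kpis.groupby("counselor_id")[["attendance_rate", "completed_treatment"]].mean()
by_agency = kpis.groupby("agency")[["attendance_rate", "completed_treatment"]].mean()

# "Zoom in": filter to a single client's indicators.
client_view = kpis.loc[kpis["client_id"] == 2]
print(by_counselor, by_agency, client_view, sep="\n\n")
```

In practice, such aggregation would draw on data already captured in the agency electronic health record rather than manual counselor entry, consistent with the integration approach described above.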
Further, to achieve maximum impact on supervision effectiveness, the clinical dashboard should address all three domains of clinical supervision—normative, formative, and restorative. For instance, the dashboard could improve (a) normative supervision by providing measurement feedback on treatment fidelity and client engagement outcomes to identify counselors in need of extra supervision or assistance, (b) formative supervision by offering web-based training resources to further develop knowledge and skills for effective treatment delivery, and (c) restorative supervision by increasing avenues for supervisor-counselor communication and support through a HIPAA-compliant central hub for real-time messaging and file sharing.26
Limitations
Despite the many strengths of this study—including the use and integration of both qualitative and quantitative data, the strong psychometric properties of the measures, the high response rate, and an innovative, diagnostic yet solutions-focused approach to the important issue of clinical supervision—several limitations must be acknowledged. First, because some counselors had the same supervisor, the calculation of supervisor-counselor misalignment required using some supervisors’ data multiple times. While this partial dependence in the data can threaten the validity of the findings, we conducted sensitivity analyses that temporarily removed supervisors appearing more often in the data. All of these analyses yielded results and interpretations similar to those presented above, reducing concern about potential bias and indicating the robustness of our findings and conclusions.
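As one possible form of such a check, the sketch below refits a simplified satisfaction model after retaining a single dyad per supervisor and compares the misalignment coefficients; the variable names, values, and model specification are hypothetical and simplified relative to the analyses reported above.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dyad-level data: 'supervisor_id' repeats when one supervisor has several counselors.
df = pd.DataFrame({
    "supervisor_id":          [10, 10, 10, 11, 12, 12, 13, 14],
    "misalignment":           [0.2, 0.9, 0.4, 0.1, 1.3, 0.8, 0.0, 0.5],
    "counselor_satisfaction": [3, 2, 3, 4, 1, 2, 4, 3],
})

# Sensitivity check: refit a simplified satisfaction model keeping only one dyad
# per supervisor, then compare the misalignment coefficient to the full-sample fit.
full = smf.ols("counselor_satisfaction ~ misalignment", data=df).fit()
reduced = smf.ols(
    "counselor_satisfaction ~ misalignment",
    data=df.drop_duplicates(subset="supervisor_id"),
).fit()
print(full.params["misalignment"], reduced.params["misalignment"])
```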
Second, although the five agencies included were diverse in their structure and approach to SUD service provision, they were all active organizations within a community-academic network (i.e., CAPA). As a result, the generalizability of these results to SUD agencies more broadly cannot be guaranteed. Further, because the supervisees in this study work in a community-based SUD service setting rather than a clinical training program, our results may not generalize to younger and less experienced trainees in the SUD field. In addition, while we presented participants with a general description of the proposed clinical dashboard and a wide range of potential components, we obtained their perspectives in the absence of a prototype dashboard, which may have limited the comprehensiveness and applicability of their responses regarding the clinical dashboard. Finally, although our 75% response rate is quite high for research in community-based service settings, we are unable to assess how our participants may have differed from the 25% of those approached who did not participate. As with most studies, our findings are subject to some degree of nonresponse bias; however, a response rate of 75% can be considered “very good” and at relatively minimal risk of nonresponse bias.32
Implications and Future Directions
In summary, our findings indicate substantial discrepancies between supervisors and counselors in their perceptions of supervision processes in SUD treatment. This argues for a data-informed approach to improving supervision processes by enhancing the normative (i.e., measurement-based outcomes feedback), formative (i.e., skill building and professional development), and restorative (i.e., supervisor-counselor communication and support) domains of clinical supervision. A supervision-based electronic dashboard represents one potential solution to the supervision gaps identified in this study by leveraging clinically relevant data to improve care. Regarding the proposed electronic dashboard, we found adequate alignment between supervisors and counselors in the perceived usefulness of various potential dashboard functions. Our data inform future dashboard development efforts and the field more broadly, suggesting that the most useful dashboards will (1) display visual data on key outcomes, including client attendance rate, current stage of change, and reductions in substance use, and (2) help manage goals and responsibilities for supervisors and counselors. Thoughtful design, development, and evaluation of data-informed supervision tools such as electronic dashboards are greatly needed. Future research should build on these initial findings, focusing on the clinical utility of electronic dashboards and exploring the degree to which positive effects on client outcomes operate through improved clinical supervision.
Evidence suggests that a clinical dashboard may bolster adherence to guidelines and has the potential to improve client outcomes.18 However, before any judgment on utility can be made, empirical evidence linking the use of dashboards to the use of evidence-based approaches or improved clinical outcomes is needed. Future research on dashboards should consider issues that are central to clinical supervision, such as training and professional development of counselors, rather than exclusively focusing on monitoring and providing feedback on client outcomes. Therefore, it may be prudent to build in features to supplement the performance feedback function, including fidelity monitoring tools, evidence-based treatment training resources, and supervisor-counselor communication supports. Such a tool may help to align supervisor-counselor perceptions and enhance normative, formative, and restorative domains of clinical supervision, which could significantly improve the quality and effectiveness of clinical supervision in substance use disorder treatment programs.
Acknowledgments
We would like to acknowledge Emily Kryzer, MSW, MPH for her integral role in the qualitative coding and analysis of these data.
Funding and Support
This research was supported by a Small Grants Program through the Center for Dissemination and Implementation in the Institute for Public Health at Washington University in St. Louis. Dr. Baumann is also funded by the Dissemination and Implementation Research Core (DIRC) of the Washington University Institute of Clinical and Translational Sciences (NCRR UL1RR024992) and the Implementation Research Institute (NIMH R25 MH080916).
Contributor Information
Alex T. Ramsey, Department of Psychiatry, Washington University School of Medicine
Ana Baumann, Brown School of Social Work, Washington University in St. Louis
David Patterson Silver Wolf, Brown School of Social Work, Washington University in St. Louis
Yan Yan, Division of Public Health Sciences, Washington University School of Medicine
Ben Cooper, Institute for Public Health, Washington University in St. Louis
Enola Proctor, Brown School of Social Work, Washington University in St. Louis
References
- 1. The National Center on Addiction and Substance Abuse at Columbia University. An SBIRT implementation and process change manual for practitioners. 2012. Available at: http://www.casacolumbia.org/sites/default/files/files/An-SBIRT-implementation-and-process-change-manual-for-practitioners.pdf.
- 2. Miller WR, Sorensen JL, Selzer JA, Brigham GS. Disseminating evidence-based practices in substance abuse treatment: A review with suggestions. Journal of Substance Abuse Treatment. 2006;31(1):25–39. doi: 10.1016/j.jsat.2006.03.005.
- 3. National Institute on Drug Abuse (NIDA). Strategic Plan 2016–2020. Available at: https://www.drugabuse.gov/sites/default/files/2016-2020nidastrategicplan.pdf.
- 4. SAMHSA. National Survey on Drug Use and Health: Detailed Tables 2014. Available at: https://www.samhsa.gov/data/sites/default/files/NSDUH-DetTabs2014/NSDUH-DetTabs2014.pdf.
- 5. Knudsen HK, Abraham AJ, Roman PM. Adoption and implementation of medications in addiction treatment programs. Journal of Addiction Medicine. 2011;5:21–27. doi: 10.1097/ADM.0b013e3181d41ddb.
- 6. Rothrauff-Laschober TC, de Tormes Eby LT, Sauer JB. Effective clinical supervision in substance use disorder treatment programs and counselor job performance. Journal of Mental Health Counseling. 2013;35(1):76. doi: 10.17744/mehc.35.1.50n6w37328qp8611.
- 7. Powell DJ. Clinical supervision and professional development of the substance abuse counselor. DIANE Publishing; 2010.
- 8. Woo SM, Hepner KA, Gilbert EA, Osilla KC, Hunter SB, Muñoz RF, Watkins KE. Training addiction counselors to implement an evidence-based intervention: strategies for increasing organizational and provider acceptance. Cognitive and Behavioral Practice. 2013;20(2):232–44. doi: 10.1016/j.cbpra.2012.03.004.
- 9. Powell DJ. Clinical supervision in alcohol and drug abuse counseling: principles, models, methods. John Wiley & Sons; 2004.
- 10. Callahan JL, Almstrom CM, Swift JK, Borja SE, Heath CJ. Exploring the contribution of supervisors to intervention outcomes. Training and Education in Professional Psychology. 2009;3(2):72.
- 11. Wheeler S, Richards K. The impact of clinical supervision on counsellors and therapists, their practice and their clients: a systematic review of the literature. Counselling and Psychotherapy Research. 2007;7(1):54–65.
- 12. Accurso EC, Taylor RM, Garland AF. Evidence-based practices addressed in community-based children’s mental health clinical supervision. Training and Education in Professional Psychology. 2011;5(2):88. doi: 10.1037/a0023537.
- 13. Carroll KM, Rounsaville BJ. A vision of the next generation of behavioral therapies research in the addictions. Addiction. 2007;102(6):850–62. doi: 10.1111/j.1360-0443.2007.01798.x.
- 14. Laschober TC, de Tormes Eby LT, Sauer JB. Clinical supervisor and counselor perceptions of clinical supervision in addiction treatment. Journal of Addictive Diseases. 2012;31(4):382–8. doi: 10.1080/10550887.2012.735599.
- 15. Beidas RS, Kendall PC. Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice. 2010;17(1):1–30. doi: 10.1111/j.1468-2850.2009.01187.x.
- 16. Schoenwald SK, Mehta TG, Frazier SL, Shernoff ES. Clinical supervision in effectiveness and implementation research. Clinical Psychology: Science and Practice. 2013;20(1):44–59.
- 17. Dorsey S, Pullmann MD, Deblinger E, Berliner L, Kerns SE, Thompson K, Unützer J, Weisz JR, Garland AF. Improving practice in community-based settings: a randomized trial of supervision-study protocol. Implementation Science. 2013;8(1):1. doi: 10.1186/1748-5908-8-89.
- 18. Dowding D, Randell R, Gardner P, Fitzpatrick G, Dykes P, Favela J, Hamer S, Whitewood-Moores Z, Hardiker N, Borycki E, Currie L. Dashboards for improving patient care: review of the literature. International Journal of Medical Informatics. 2015;84(2):87–100. doi: 10.1016/j.ijmedinf.2014.10.001.
- 19. Egan M. Clinical dashboards: impact on workflow, care quality, and patient safety. Critical Care Nursing Quarterly. 2006;29(4):354–61. doi: 10.1097/00002727-200610000-00008.
- 20. Bickman L. A measurement feedback system (MFS) is necessary to improve mental health outcomes. Journal of the American Academy of Child and Adolescent Psychiatry. 2008;47(10):1114. doi: 10.1097/CHI.0b013e3181825af8.
- 21. Chorpita BF, Daleiden EL, Bernstein AD. At the intersection of health information technology and decision support: Measurement Feedback Systems… and beyond. Administration and Policy in Mental Health and Mental Health Services Research. 2016;43(3):471–7. doi: 10.1007/s10488-015-0702-5.
- 22. Daley K, Richardson J, James I, Chambers A, Corbett D. Clinical dashboard: use in older adult mental health wards. The Psychiatrist Online. 2013;37(3):85–8.
- 23. Lyon AR, Lewis CC, Boyd MR, Hendrix E, Liu F. Capabilities and characteristics of digital measurement feedback systems: results from a comprehensive review. Administration and Policy in Mental Health and Mental Health Services Research. 2016;43(3):441–66. doi: 10.1007/s10488-016-0719-4.
- 24. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics. 2009;42(2):377–81. doi: 10.1016/j.jbi.2008.08.010.
- 25. Winstanley J, White E. The MCSS-26©: Revision of The Manchester Clinical Supervision Scale© using the Rasch Measurement Model. Journal of Nursing Measurement. 2011;19(3):160–78. doi: 10.1891/1061-3749.19.3.160.
- 26. Proctor B. Supervision: a cooperative exercise in accountability. In: Marken M, Payne M, editors. Enabling and ensuring supervision in practice. National Youth Bureau, Council for Education and Training in Youth and Community Work; 1986.
- 27. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qualitative Health Research. 2005;15(9):1277–88. doi: 10.1177/1049732305276687.
- 28. Creswell J, Klassen A, Plano Clark V, Smith K. Best practices for mixed methods research in the health sciences. 2011. Available at: https://www2.jabsom.hawaii.edu/native/docs/tsudocs/Best_Practices_for_Mixed_Methods_Research_Aug2011.pdf.
- 29. Ramsey AT, van den Berk-Clark C, Patterson D. Provider-agency fit in substance abuse treatment organizations: implications for learning climate, morale, and evidence-based practice implementation. BMC Research Notes. 2015;8:194. doi: 10.1186/s13104-015-1110-3.
- 30. Patterson D, Ramsey AT. Do organizational expectations influence workers’ implementation perceptions? Psychological Services. 2016;13(4):428. doi: 10.1037/ser0000090.
- 31. Cucciare MA, Weingardt KR, Humphreys K. How internet technology can improve the quality of care for substance use disorders. Current Drug Abuse Reviews. 2009;2:256–262. doi: 10.2174/1874473710902030256.
- 32. Babbie ER. Survey Research Methods: Second Edition. Cengage Learning; 1990.



