Author manuscript; available in PMC: 2019 Mar 21.
Published in final edited form as: Subst Abus. 2017 Oct 18;39(2):162–166. doi: 10.1080/08897077.2017.1380743

Openness to adopting evidence-based practice in public substance use treatment in South Africa using task shifting: Caseload size matters

Jessica F Magidson a, Jasper S Lee b, Kim Johnson c, Warren Burnhams d, J Randy Koch e, Ron Manderscheid f, Bronwyn Myers c,g
PMCID: PMC5862765  NIHMSID: NIHMS909664  PMID: 28934063

Abstract

Background

In response to the lack of coverage for substance use treatment in the Western Cape province of South Africa, the local government expanded funding for evidence-based practices (EBPs) for treating substance use. Yet, little is known about provider and staff attitudes towards adopting EBPs in this setting, which is particularly relevant where task shifting of clinical care increases demands on paraprofessional providers. This study aimed to (1) assess attitudes towards adopting EBPs among a range of staff working in substance use treatment in Cape Town under a task shifting model; and (2) evaluate factors associated with openness towards adopting EBPs in this setting.

Methods

Staff (n=87) were recruited from 11 substance use treatment clinics. Demographics and job-related characteristics were assessed. Staff perceptions of organizational factors were assessed using the TCU Organizational Readiness for Change (ORC) scale. The dependent variable, attitudes towards adopting EBPs, was assessed using the Evidence-Based Practice Attitude Scale (EBPAS).

Results

This study is one of the first to administer the EBPAS in South Africa and found good internal consistency (total score: α=.82). In a multivariable model adjusting for site and factors associated with EBPAS total score at the bivariate level, only smaller caseload size was associated with greater openness to adopting EBPs (B=1.62, SE=.73; t=2.21; p<.05).

Conclusions

As pressure to scale up implementation of EBPs in South African substance use treatment services intensifies, additional efforts are needed to understand barriers to adopting EBPs in this setting. Supporting staff adoption of EBPs in resource-limited settings may require additional resources to limit staff caseloads in the context of task shifting.

Keywords: implementation science, substance use treatment, evidence based practice, international addiction, task shifting

1. Introduction

The Western Cape (WC) province of South Africa has faced alarming increases in rates of substance use in the past two decades1. Further, coverage for substance use treatment is less than 5% of the need in this area2,3. In response, the City of Cape Town and the Western Cape province expanded funding for evidence-based substance use treatment, which has been the core of the public health response to the growing substance use epidemic in Cape Town4. Further, given shortages of trained health care workers and limited accessibility of services to address the substance use epidemic in Cape Town5,6, there have been pressures to use “task shifting” models of care, whereby clinical responsibilities are distributed, when appropriate, to less specialized members of the team7.

Despite the priority of increasing access to evidence-based substance use treatment in publicly-funded treatment centers, there have been few efforts to understand provider and staff attitudes towards adopting evidence-based practices (EBPs) for substance use in this setting in the context of task shifting models. Prior qualitative research examining substance use treatment providers’ (n=21) perspectives on barriers to implementing a new performance evaluation system identified organizational barriers including staff burden, limited time to adopt new practices, and limited data analysis skills. Participants also emphasized the need to have providers invested in the proposed change8. Other research also identified organizational barriers to implementing EBPs, such as indecision about the motivation to change, which was found to be influenced by job level and role9. As pressure to scale up implementation of EBPs in South African substance use treatment services intensifies, additional efforts are needed to better understand barriers and openness to adopting EBPs in this setting, including attention to individual factors in the context of task shifting10.

As such, the primary aims of this study were to (1) assess attitudes towards adopting EBPs among a diverse group of staff (directors, supervisors, counselors, and support staff) using task shifting models in 11 substance use treatment centers in Cape Town, South Africa; and (2) evaluate factors associated with greater openness towards adopting EBPs in this setting.

2. Material and Methods

2.1 Procedures

As part of the evaluation of a new performance measurement system for South African substance use treatment services11, staff at participating facilities were asked to complete a survey on organizational functioning and openness to adopting new EBPs. The WC province was purposively selected as a pilot implementation site because its publicly-funded substance use treatment programs serve diverse population groups with different patterns of drug use and have different types of treatment infrastructure. At each implementing treatment site, staff were approached through the director/treatment manager, who identified people to be trained in the performance evaluation system being evaluated as part of this study. Staff surveyed across sites included directors, supervisors, counselors, and support staff; at least one supervisor/director and one counselor were recruited at each site. All staff surveyed had clinical responsibilities/caseloads: under the task shifting model of care delivery7, even support staff carried a clinical caseload, conducting intake assessments, coordinating referrals to more specialized care, and often delivering brief interventions to retain people in care (i.e., “retention counselling”).

This performance measurement system was first implemented in 13 treatment facilities (three residential and 10 outpatient) in the WC in 2014. At 11 of these sites (one residential and all 10 outpatient sites), staff selected to implement the new performance evaluation system completed the assessment instruments detailed below, administered in person using pen-and-paper forms. EBPs in this setting primarily include cognitive behavioral therapy (CBT), motivational interviewing/motivational enhancement therapy, and twelve-step facilitation.

Staff who agreed to participate were asked to provide written informed consent prior to self-completing the questionnaire. The South African Medical Research Council Ethics Committee reviewed and approved the study protocol. Participants were given a R50 (~USD 5 at time of study) voucher to thank them for their participation.

2.2 Assessments

2.2.1 Demographic questionnaire

A demographic questionnaire assessed age, gender, and education level.

2.2.2 Job-related characteristics

A job-related characteristics survey assessed job role (program director/supervisor, counselor, or support staff, i.e., administrative clerks who also conduct clinical intakes and brief interventions), time in current job (<1 year; 1–3 years; >3 and <5 years; 5+ years), and number of current clients on caseload (1–20; 21–40; >40).

2.2.3 TCU Organizational Readiness for Change (ORC) scale

The TCU Organizational Readiness for Change (ORC) scale assessed staff perceptions of organizational readiness for change and has previously been used in South Africa9. The measure assessed staff perceptions of four organizational factors: Motivation for Change (33 items), Adequacy of Resources (31 items), Staff Attributes (31 items), and Organizational Climate (30 items)12. Items are rated on a 5-point scale ranging from “disagree strongly” (1) to “agree strongly” (5). A subset of items is assessed independently of the subscales, including whether “CBT guides much of your counseling”, which was used as our indicator of individual CBT orientation. For this item, responses of “disagree strongly”, “disagree”, or “uncertain” were coded as “no” and responses of “agree” or “agree strongly” were coded as “yes”. In the current study, internal consistency ranged from acceptable to excellent: Motivation for Change α=.92; Adequacy of Resources α=.79; Staff Attributes α=.87; Organizational Climate α=.70.
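The dichotomization rule described above can be sketched as a small helper; the function and constant names are illustrative, not from the TCU ORC materials:

```python
# Illustrative sketch of the coding rule described in the text: 5-point
# ORC responses to the "CBT guides much of your counseling" item are
# collapsed into a yes/no indicator of individual CBT orientation.

AGREE_RESPONSES = {"agree", "agree strongly"}                   # coded "yes"
DISAGREE_RESPONSES = {"disagree strongly", "disagree", "uncertain"}  # coded "no"

def cbt_orientation(response: str) -> str:
    """Collapse a 5-point ORC response into the yes/no CBT indicator."""
    response = response.lower().strip()
    if response in AGREE_RESPONSES:
        return "yes"
    if response in DISAGREE_RESPONSES:
        return "no"
    raise ValueError(f"Unexpected response: {response!r}")
```

Note that “uncertain” responses are grouped with “no”, so the indicator marks only staff who affirmatively endorsed CBT as guiding their counseling.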

2.2.4 Evidence-Based Practice Attitude Scale (EBPAS)

The Evidence-Based Practice Attitude Scale (EBPAS) is a 15-item self-report measure assessing attitudes towards adopting EBPs13 and was the main outcome in the current study. Participants rated the degree to which they agreed with each statement on a 5-point scale from 0 (not at all) to 4 (to a very great extent). The EBPAS comprises four subscales (Appeal of EBP, Requirements of adopting EBP, Openness to new practices, and Divergence of usual practice from EBP), which combine into a total score reflecting the extent to which respondents are open to adopting EBPs. We followed prior research that used the composite score of the EBPAS13–17. In this study, internal consistency was good (EBPAS total score: α=.82).

2.3 Statistical Analysis

First, bivariate analyses examined the relationship between each of the variables listed above and the dependent variable (EBPAS total score). Next, a multivariable linear regression was conducted, entering variables related to EBPAS score at p ≤ .05 in bivariate analyses and adjusting for treatment site. All analyses were run in SPSS v.24.
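As an illustration of this modeling strategy, the sketch below fits an ordinary least squares model with dummy-coded caseload categories on synthetic data. The original analyses were run in SPSS v.24; the variable names, effect sizes, and data here are entirely hypothetical:

```python
# Rough Python analogue of the analysis strategy described above (the
# original analyses were run in SPSS v.24); the data below are synthetic
# and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 87  # sample size matching the study

# Synthetic predictors: caseload category (0 = <20, 1 = 20-40, 2 = >40)
# and age; synthetic EBPAS total score with an arbitrary built-in effect.
caseload = rng.integers(0, 3, size=n)
age = rng.normal(37.3, 10.9, size=n)
ebpas = 11 - 0.8 * caseload + 0.02 * age + rng.normal(0, 2, size=n)

# Dummy-code caseload with ">40" (category 2) as the reference group.
d_small = (caseload == 0).astype(float)   # <20 clients
d_medium = (caseload == 1).astype(float)  # 20-40 clients

# Design matrix: intercept, caseload dummies, continuous covariate.
X = np.column_stack([np.ones(n), d_small, d_medium, age])
coef, *_ = np.linalg.lstsq(X, ebpas, rcond=None)
```

With “>40 clients” as the reference category, the coefficient on each dummy (`coef[1]`, `coef[2]`) estimates the adjusted difference in mean EBPAS score for that caseload group, mirroring the contrasts reported in Table 2.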

3. Results

3.1 Sample characteristics

Table 1 presents the full sample characteristics (n=87). The sample represented a diverse range of roles, including program directors (n=7), clinical supervisors (n=3), counselors (n=46), and support staff who conduct clinical intakes (n=28). Mean age was 37.3 years (SD=10.9) and 68% were female; 45% did not complete high school. Just under half (47%) had been in their current job 3 years or less, and approximately one quarter (24%) had over 40 patients on their caseload. See Table 1 for all demographic and clinical variables.

Table 1.

Descriptive characteristics and relation to EBPAS total score in study sample.

Variable % (N) Mean EBPAS score (M, SD)* Relation to EBPAS total score (statistic) Relation to EBPAS total score (p value)
Age (Mean, SD) 37.3, 10.9 -- r=.26 p=.024
Gender F(1, 69)=.67 p=.42
 Male 26% (23) 10.54 (2.54)
 Female 68% (59) 10.03 (2.34)
Education level F(5, 66)=1.03 p=.41
 Did not complete high school/no degree 45% (39) 9.85 (2.20)
 Completed high school 13% (11) 9.77 (2.17)
 Postgraduate diploma in addictions care 2% (2) 13.67 (0)
 Bachelors degree 21% (18) 10.40 (2.59)
 Graduate degree 8% (7) 11.32 (3.27)
 Other 8% (7) 9.40 (2.61)
Role F(2, 71)=7.12 p=.002
 Program director/Supervisor 12% (10) 11.88 (2.18)
 Counselors 53% (46) 10.45 (2.24)
 Support staff who do intakes 32% (28) 8.78 (2.31)
Time in current job F(3, 69)=.50 p=.68
 <1 year 23% (20) 9.65 (2.36)
 1–3 years 24% (21) 10.55 (1.61)
 3.1–5 years 30% (26) 10.39 (2.12)
 >5 years 21% (18) 10.19 (3.41)
Number of clients F(2, 65)=4.18 p=.02
 <20 47% (41) 10.81 (1.91)
 20–40 14% (12) 10.21 (2.43)
 >40 24% (21) 9.07 (2.41)
CBT orientation F(1, 71)=11.69 p=.001
 No/Uncertain 23% (20) 8.38 (2.34)
 Yes 68% (59) 10.58 (2.26)
TCU Organizational Readiness for Change (Mean, SD)
 Motivation for Change 30.6 (6.8) r=.03 p=.81
 Adequacy of Resources 35.1 (5.6) r=.15 p=.20
 Staff Attributes 39.6 (4.9) r=.32 p=.006
 Organizational Climate 35.5 (4.0) r=.27 p=.02
* Note. Percentages may not add up to 100 due to missing data. Mean EBPAS score provided for categorical variables only.

3.2 Bivariate analyses

At the bivariate level, the individual-level factors related to EBPAS total score were older age (r=.26, p=.024), individual CBT orientation (F(1, 71)=11.69, p=.001), smaller caseload (F(2, 65)=4.18, p=.02), and job role (F(2, 71)=7.12, p=.002): support staff who conduct clinical intakes were less open to adopting EBPs than counselors and clinical supervisors/program directors. At the organizational level, there were no significant differences in EBPAS scores across sites (F(10, 63)=1.26; p=.27) or between residential and outpatient programs (F(1, 72)=.20; p=.66). Among the ORC scales, only more favorable organizational climate (r=.27, p=.02) and staff attributes (r=.32, p=.006) were significantly related to EBPAS scores. See Table 1 for all bivariate relationships with EBPAS total score.

3.3 Multivariable model

Variables related to the dependent variable (EBPAS total score) at p ≤ .05 in bivariate analyses (age, CBT orientation, job role, number of clients, organizational climate, organizational staff attributes) were entered into the multivariable linear regression, adjusting for treatment site. In this model, only having fewer than 20 clients was associated with greater openness towards EBPs (B=1.62, SE=.73; t=2.21; p<.05). See Table 2 for full multivariable linear regression results.

Table 2.

Multivariable linear regression model of attitudes towards evidence based practice

Variable B, SE 95%CI t p
Number of clients*
 <20 clients 1.62, .73 .15, 3.08 2.21 .03
 20–40 clients 1.23, .81 −.40, 2.86 1.51 .14
Job role**
 Counselor .47, .69 −.92, 1.86 .68 .50
 Supervisor 1.26, 1.19 −1.13, 3.65 1.05 .30
Age −.02, .04 −.09, .06 −.45 .65
CBT orientation (yes) .20, .84 −1.48, 1.88 .24 .81
TCU Organizational climate .15, .09 −.03, .34 1.65 .11
TCU Staff Attributes .07, .08 −.09, .22 .83 .41
Note. Dependent variable: EBPAS total score. Results do not differ whether or not site is adjusted for; to be conservative, site is adjusted for in these results.

* Reference group is >40 clients.

** Reference group is support staff.

4. Discussion

This study evaluated attitudes towards adopting EBPs among staff in 11 public substance use treatment facilities in Cape Town, South Africa, and examined factors related to greater openness to implementing EBPs in this setting. To our knowledge, the current study is one of the first to administer the EBPAS in South Africa. In the multivariable model, only smaller caseload size was associated with greater openness to adopting EBPs: staff with fewer clients were more open to adopting EBPs than staff with larger caseloads. Among staff in the current study, 24% had over 40 clients on their caseload. In resource-limited treatment settings such as these, staff are typically overburdened by high numbers of clients and have few resources to meet each patient’s needs18, presenting a barrier to adopting EBPs.

This study contributes to our understanding of attitudes towards EBP adoption in a low-resource setting using the EBPAS. Although the measure performed well in terms of internal consistency (α=.82), it has not previously been validated in this setting. The measure has been used in several international settings outside the US, including the Netherlands19, Spain15, and Greece20; however, it has seen little validation in resource-limited sub-Saharan African contexts. Future research is needed to more systematically validate the EBPAS in this context, particularly in light of the recent rollout of evidence-based substance use treatment4.

As one might expect, staff members who endorsed CBT as guiding their approach were more likely to be open to adopting EBPs more generally. Yet, previous research has shown that among substance use treatment providers in South Africa who endorsed CBT as guiding their approach, approximately two-thirds had received no formal training in CBT18. These findings confirm the need to continue to develop the substance use treatment workforce in South Africa by providing training and supervision for delivering EBPs21, and monitoring of fidelity to evidence-based models of treatment. Yet, findings also point to the need to manage caseload size to maximize the benefit of training and supervision efforts.

There were numerous factors unrelated to attitudes towards EBPs, including gender, education level, time in current job, staff perceptions of the organization’s readiness for change, and adequacy of resources. It was surprising that some of these factors were unrelated to attitudes towards adopting EBPs given prior research suggesting their relevance19,22,23. However, this may reflect the diverse range of staff roles sampled, as a lack of organizational resources may not be a barrier identified by clinicians and support staff. Further, the organizational variables assessed reflected staff perceptions of the organization, not necessarily reports from the organization’s administration or leadership. A gap between organization-wide approaches to EBPs and the perceptions of clinicians on the ground has been identified in other substance use treatment research24.

Findings must be interpreted in light of other study limitations, including a relatively small sample comprised predominantly of counselors (53%). Although we statistically adjusted for job role in the multivariable model, caseload size is likely most impactful for counselors providing direct clinical care; however, given expectations around task shifting at these sites, all staff surveyed, including support staff, had a clinical role. Future work with a larger sample in each job role category may be better able to determine the factors associated with openness towards EBPs for each type of role. Staff also reported on the organizational variables; future directions include using organization-level measures that do not rely on staff reporting, for instance observational methods to assess organizational climate. Interpretation of the EBPAS scales would be further enhanced by qualitative evaluations of barriers to adopting new evidence-based practices; although qualitative research was conducted as part of this initiative25, it was separate from the current study. Further research would benefit from mixed methods evaluations of factors influencing readiness to adopt new evidence-based practices, and from determining government officials’ attitudes towards implementing EBPs, as this buy-in will ultimately guide policy efforts. Finally, the staff assessed were those implementing a new performance evaluation system at each site, and we did not track the number of staff at each site who were not enrolled; as such, we do not know whether the staff surveyed were representative of the larger population of treatment staff. We also did not systematically assess the degree of prior EBP implementation or the fidelity of EBP delivery, which would be important factors to assess in future research.

Conclusions

Current findings suggest that staff caseload size is an important consideration when promoting adoption of new EBPs: program managers should monitor caseload sizes and administrative burden across substance use treatment staff. This is particularly relevant in under-resourced settings, where a growing workforce of paraprofessional, non-specialized community health workers is assuming greater responsibility for unmet mental health and substance use treatment needs through task shifting. Policy efforts to improve implementation of EBPs may require additional resources to limit the number of clients on each staff member’s caseload and to address administrative burden, as the greater the number of clients assigned to a counselor, the more administrative work is required, even for clients who are not actively engaged in care.

Acknowledgments

Funding

This research was funded by the Western Cape Department of Social Development (WC-DoSD). Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the WC-DoSD. The WC-DoSD had no further role in the study design; in the collection, analysis and interpretation of data; in the writing of the report; or in the decision to submit the paper for publication. Dr. Magidson’s work on this manuscript was supported by the National Institutes of Health [K23DA041901]. The work on this publication was also funded by the South African Medical Research Council.

Footnotes

Throughout the paper, we use the term “task shifting” to encompass both task shifting and task sharing. “Task sharing” involves delineating specific roles and responsibilities for each provider within a clinical team, including important ways in which higher level providers support training and supervision of less specialized providers.

Author Contributions

JM and BM conceptualized this manuscript and wrote the first draft in collaboration with JL, KJ, and WB. RK and RM critically edited and revised the current draft. All authors participated in the research and/or manuscript preparation. All authors have approved the final manuscript.

References

1. Dada S, Plüddemann A, Parry C, et al. Monitoring alcohol & drug abuse trends in South Africa (July 1996–December 2011). SACENDU Res Brief. 2012;15(1):1–14.
2. Meade CS, Towe SL, Watt MH, et al. Addiction and treatment experiences among active methamphetamine users recruited from a township community in Cape Town, South Africa: A mixed-methods study. Drug Alcohol Depend. 2015;152:79–86. doi:10.1016/j.drugalcdep.2015.04.016.
3. Myers B, Kline TL, Doherty IA, Carney T, Wechsberg WM. Perceived need for substance use treatment among young women from disadvantaged communities in Cape Town, South Africa. BMC Psychiatry. 2014;14(1):1. doi:10.1186/1471-244X-14-100.
4. Gouse H, Magidson JF, Burnhams W, et al. Implementation of cognitive-behavioral substance abuse treatment in sub-Saharan Africa: Treatment engagement and abstinence at treatment exit. PLoS One. 2016;11(1):e0147900. doi:10.1371/journal.pone.0147900.
5. Myers B, Louw J, Fakier N. Alcohol and drug abuse: removing structural barriers to treatment for historically disadvantaged communities in Cape Town. Int J Soc Welf. 2008;17(2):156–165.
6. Myers BJ, Louw J, Pasche SC. Inequitable access to substance abuse treatment services in Cape Town, South Africa. Subst Abuse Treat Prev Policy. 2010;5(1):28. doi:10.1186/1747-597X-5-28.
7. World Health Organization. Task shifting: rational redistribution of tasks among health workforce teams: global recommendations and guidelines. Geneva, Switzerland: WHO Press; 2007.
8. Myers B, Petersen Z, Kader R, et al. Identifying perceived barriers to monitoring service quality among substance abuse treatment providers in South Africa. BMC Psychiatry. 2014;14(1):1. doi:10.1186/1471-244X-14-31.
9. Bowles S, Louw J, Myers B. Perceptions of organizational functioning in substance abuse treatment facilities in South Africa. Int J Ment Health Addict. 2011;9(3):308–319. doi:10.1007/s11469-010-9285-2.
10. Joe GW, Broome KM, Simpson DD, Rowan-Szal GA. Counselor perceptions of organizational factors and innovations training experiences. J Subst Abuse Treat. 2007;33(2):171–182. doi:10.1016/j.jsat.2006.12.027.
11. Myers B, Govender R, Koch JR, Manderscheid R, Johnson K, Parry CDH. Development and psychometric validation of a novel patient survey to assess perceived quality of substance abuse treatment in South Africa. Subst Abuse Treat Prev Policy. 2015;10(1). doi:10.1186/s13011-015-0040-3.
12. Lehman WE, Greener JM, Simpson DD. Assessing organizational readiness for change. J Subst Abuse Treat. 2002;22(4):197–209. doi:10.1016/s0740-5472(02)00233-7.
13. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74. doi:10.1023/b:mhsr.0000024351.12294.65.
14. Aarons GA, Glisson C, Green PD, et al. The organizational social context of mental health services and clinician attitudes toward evidence-based practice: A United States national study. Implement Sci. 2012;7(1):56. doi:10.1186/1748-5908-7-56.
15. De Paúl J, Indias S, Arruabarrena I. Adaptation of the Evidence-Based Practices Attitude Scale in Spanish child welfare professionals. Psicothema. 2015;27(4):341–346. doi:10.7334/psicothema2015.67.
16. Himelhoch S, Riddle J, Goldman HH. Barriers to implementing evidence-based smoking cessation practices in nine community mental health sites. Psychiatr Serv. 2014. doi:10.1176/appi.ps.201200247. http://ps.psychiatryonline.org/doi/abs/10.1176/appi.ps.201200247. Accessed April 27, 2016.
17. Suttle C, Chillinor K, Thompson R, et al. Attitudes and barriers to evidence-based practice in optometry educators. Optom Vis Sci. 2015;92(4):514–523. doi:10.1097/OPX.0000000000000550.
18. Sodano R, Watson DW, Rataemane S, Rataemane L, Ntlhe N, Rawson R. The substance abuse treatment workforce of South Africa. Int J Ment Health Addict. 2010;8(4):608–615. doi:10.1007/s11469-009-9245-x.
19. van Sonsbeek MAMS, Hutschemaekers GJM, Veerman JW, Kleinjan M, Aarons GA, Tiemens BG. Psychometric properties of the Dutch version of the Evidence-Based Practice Attitude Scale (EBPAS). Health Res Policy Syst. 2015;13(1). doi:10.1186/s12961-015-0058-z.
20. Melas CD, Zampetakis LA, Dimopoulou A, Moustakis V. Evaluating the properties of the Evidence-Based Practice Attitude Scale (EBPAS) in health care. Psychol Assess. 2012;24(4):867–876. doi:10.1037/a0027445.
21. Pasche S, Kleintjes S, Wilson D, Stein DJ, Myers B. Improving addiction care in South Africa: Development and challenges to implementing training in addictions care at the University of Cape Town. Int J Ment Health Addict. 2015;13(3):322–332. doi:10.1007/s11469-014-9537-7.
22. Aarons GA, Glisson C, Hoagwood K, et al. Psychometric properties and U.S. national norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychol Assess. 2010;22(2):356–365. doi:10.1037/a0019188.
23. Saldana L, Chapman JE, Henggeler SW, Rowland MD. The Organizational Readiness for Change scale in adolescent programs: Criterion validity. J Subst Abuse Treat. 2007;33(2):159–169. doi:10.1016/j.jsat.2006.12.029.
24. Koch JR, Breland A. Behavioral healthcare staff attitudes and practices regarding consumer tobacco cessation services. J Behav Health Serv Res. 2015. doi:10.1007/s11414-015-9477-4.
25. Myers B, Williams PP, Johnson K, Govender R, Manderscheid R, Koch JR. Providers’ perceptions of the implementation of a performance measurement system for substance abuse treatment: A process evaluation of the Service Quality Measures initiative. S Afr Med J. 2016;106(3):308. doi:10.7196/SAMJ.2016.v106i3.9969.
