Author manuscript; available in PMC 2013 Oct 31. Published in final edited form as: Child Maltreat. 2011 Dec 5;17(1). doi: 10.1177/1077559511430722

Initial Implementation Indicators From a Statewide Rollout of SafeCare Within a Child Welfare System

Daniel J Whitaker 1, Kerry A Ryan 1, Robert C Wild 1, Shannon Self-Brown 1, John R Lutzker 1, Jenelle R Shanley 1, Anna M Edwards 1, Erin A McFry 1, Colby N Moseley 1, Amanda E Hodges 1
PMCID: PMC3814171  NIHMSID: NIHMS515022  PMID: 22146860

Abstract

There is a strong movement toward implementation of evidence-based practices (EBP) in child welfare systems. The SafeCare parenting model is one of few parent-training models that addresses child neglect, the most common form of maltreatment. Here, the authors describe initial findings from a statewide effort to implement the EBP, SafeCare®, into a state child welfare system. A total of 50 agencies participated in training, with 295 individuals entering training to implement SafeCare. Analyses were conducted to describe the trainee sample, to describe initial training and implementation indicators, and to examine correlates of initial training performance and implementation indicators. The quality of SafeCare uptake during training and implementation was high, with trainees performing very well on training quizzes and role-plays and demonstrating high fidelity when implementing SafeCare in the field (performing over 90% of expected behaviors). However, the quantity of implementation was generally low, with relatively few providers (only about 25%) implementing the model following workshop training. There were no significant predictors of training or implementation performance once corrections for multiple comparisons were applied. The Discussion focuses on challenges to large-scale, system-wide implementation of EBP.

Keywords: dissemination/implementation, parenting, child welfare services/child protection, neglect


Many child welfare systems are moving to adopt structured, standardized, evidence-based approaches to working with families, as evaluations of existing unstructured services have generally failed to find positive service effects (e.g., Chaffin, Bonner, & Hill, 2001; Westat, 2002). One promising evidence-based practice (EBP) being implemented in several child welfare agencies is the SafeCare® model. SafeCare is a behaviorally based parent training model that targets parents of children aged 0–5. SafeCare content focuses on home safety, child health, and parent–child interactions (Lutzker & Bigelow, 2002), using behavioral techniques and a structured approach to parent training. By addressing health and safety along with positive parenting, SafeCare addresses risk factors for both child neglect and physical abuse.

There is a considerable and growing evidence base for SafeCare (Whitaker, Lutzker, Self-Brown, & Edwards, 2008). Most recently, randomized trials within the child welfare system and with families at risk for maltreatment have shown positive results in favor of SafeCare. For example, Chaffin and colleagues conducted a statewide trial within the child welfare system comparing SafeCare to services as usual (SAU). Findings indicated reduced recidivism for families with children aged 0–5 receiving SafeCare (hazard ratio = .74; Chaffin, Hecht, Bard, Silovsky, & Beasley, in press). Reduced recidivism was also found for a subset of over 350 Native American families receiving SafeCare compared to those receiving SAU (Chaffin, Bard, Bigfoot, & Maher, under review). In both studies, the overwhelming majority of cases statewide (90%) involved neglect, and 76% involved neglect alone, supporting SafeCare's impact on child neglect. Two other trials in Oklahoma have found positive effects of SafeCare on parent engagement and retention (Damashek, Doughty, Ware, & Silovsky, 2011), parent satisfaction and perceived cultural relevance of the intervention (Chaffin et al., under review; Silovsky et al., 2011), and even provider job burnout (Aarons, Fettes, Flores, & Sommerfeld, 2009) and retention (Aarons, Sommerfeld, Hecht, Silovsky, & Chaffin, 2009).

This report presents initial data from a statewide rollout of SafeCare in one state. SafeCare training is provided by the National SafeCare Training and Research Center (NSTRC) using a rigorous training model that includes workshop training and intensive ongoing support, which is critical in establishing and maintaining fidelity (Fixsen, Naoom, Blasé, Friedman, & Wallace, 2005; Joyce & Showers, 2002). The goals of this report are (a) to describe the individuals and organizations that participated in a statewide implementation of SafeCare; (b) to report on SafeCare training and implementation indicators and their correlates (e.g., demographics, field of study, work experience); and (c) to discuss the challenges and barriers to this implementation.

Method

Statewide SafeCare Implementation Plan

State funds were received in 2008 for training providers of family preservation services to conduct SafeCare. In this particular state, as in many others, most direct child welfare services are provided by private agencies following a child maltreatment investigation conducted by public child welfare workers. Accordingly, private providers were the most appropriate choice for SafeCare training. To build capacity for large-scale training and support, NSTRC recruited and trained a group of contracted employees to provide SafeCare training throughout the state, with NSTRC faculty and staff providing supervision and quality control.

The implementation was designed so that each agency had staff trained both to deliver SafeCare (termed home visitors) and to conduct ongoing coaching, which consists of regular fidelity monitoring with feedback. Coaching is a standard part of NSTRC's implementation model (Whitaker et al., 2008) and is critical for implementation with fidelity (Fixsen et al., 2005). Each agency thus received training for at least one coach and at least one home visitor. Coaches received intensive supervision from SafeCare trainers until they demonstrated high adherence (fidelity) to the coaching model (85% or greater) and thereafter received monthly supervision and monitoring from a SafeCare trainer for up to 6 months. The data reported here pertain only to the home visitor portion of training, which all trainees received.

Recruitment of Agencies for Training

Recruitment of agencies was a joint effort between NSTRC and the state agency. NSTRC hired a full-time coordinator to assist with marketing, recruitment, and training coordination. Recruitment efforts were focused on, but not limited to, agencies with active contracts to provide family preservation services in one or more counties. After early challenges with recruiting agencies for training and securing appropriate referrals for trained agencies, state staff and NSTRC faculty traveled in early 2010 to each of the state service regions to discuss SafeCare and its implementation. Directors of contracted agencies in each region were invited, along with regional and county child welfare staff. Following these orientations, agencies interested in training contacted the SafeCare program coordinator; as a result, no data exist on agencies that were not interested in SafeCare training. Agency directors were asked to complete an agency questionnaire as part of the application process. Questionnaires were received from 37 of the 50 agencies that received training; 13 did not return the questionnaire. Trainees from agencies that did not return the questionnaire did not differ on the primary outcomes from trainees at agencies that did. The questionnaire included questions about the agency, familiarity with procedures to be used in the implementation (e.g., using a structured intervention, observing sessions), preparation for training, and how the agency became interested in SafeCare training. All trainees included in this report were trained between May 2009 and July 2011. Agencies trained an average of 5.8 staff (SD = 3.4).

Training

SafeCare training consisted of a pretraining orientation, a 5-day workshop on the SafeCare home visiting model (attended by all trainees), and an additional half day devoted to completing the paperwork required by the state. Workshop training consisted of didactics, modeling of skills (live or via video), skill practice by trainees, and supervised role-plays (four, one for each module). Trainees were required to meet mastery criteria for the skills in each module before proceeding to coached fieldwork. Coaching was conducted weekly until providers were certified (four sessions with acceptable fidelity of 85% or greater) and monthly thereafter. This was the standard NSTRC training model used in all implementations at the time of this project (small changes have been made subsequently). To proceed with implementation, trainees were required to complete all 5 days of workshop training and to reach acceptable fidelity of 85% or greater on all role-plays and module quizzes.

Trainee Sample

A total of 295 individuals from the 50 agencies enrolled in training. All trainees were asked to complete a brief demographic survey prior to training, which included questions about sex, age, race, education and field of study, experience working with families at risk for child maltreatment, and whether the provider had ever completed training in another EBP (yes/no) and, if so, which ones. The survey also included the Evidence-Based Practice Attitude Scale (EBPAS; Aarons, 2004), a 15-item scale that assesses provider attitudes toward adopting EBP. The overall reliability of the EBPAS was good (α = .80). Although the EBPAS includes four subscales, only the total score is presented here because a factor analysis did not reveal the same dimensions as originally described by Aarons (2004). For purposes of the progress indicators described below, trainees were followed as long as they were actively pursuing SafeCare cases. Providers were deemed inactive if they left their agency (n = 28), reported they were no longer pursuing SafeCare cases (n = 38), or simply failed to communicate with NSTRC for 6 or more months (n = 87). The mean number of months providers were active was 10.8 (SD = 6.3), with a range of 1.3–30.5 months. At the time of this writing, 49% (n = 144) of providers were active and 51% (n = 151) were inactive, though inactive providers may have delivered SafeCare at some point.
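For readers who want to see the internal-consistency statistic reported above (α = .80) computed concretely, the following is a minimal sketch of Cronbach's alpha on simulated responses. The data, variable names, and item layout are hypothetical illustrations under the stated assumptions, not the authors' analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items (15 for the EBPAS)
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical data: 212 respondents answering 15 items that share one latent factor.
rng = np.random.default_rng(0)
latent = rng.normal(0, 1, size=(212, 1))                 # shared "attitude" component
responses = latent + rng.normal(0, 1, size=(212, 15))    # 15 noisy item scores
print(f"alpha = {cronbach_alpha(responses):.2f}")        # ~.93 for these simulated items
```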

Training and Implementation Progress Indicators

Several measures were collected as indicators of providers' uptake of training and implementation. At the individual level, performance measures included performance on role-plays and quizzes during workshop training, in-field fidelity scores, and whether certification was achieved once implementation began.

Workshop role-plays

Role-play scores were assessed during workshop training by SafeCare trainers using standardized fidelity checklists. To complete training, trainees were required to perform role-plays for each SafeCare module to an 85% mastery criterion. When a trainee failed a role-play, he or she was given feedback and another opportunity to demonstrate mastery. We recorded each participant's initial role-play scores and number of role-play failures.

Quizzes

Following workshop training and prior to implementation, trainees were required to complete quizzes for each of the four SafeCare modules. Quizzes were “take-home,” and thus trainees were allowed to use all SafeCare materials provided during training. Each of the four quizzes includes approximately 30 items that test conceptual knowledge of the module’s practices and procedures. For example, in the Safety module quiz, trainees must correctly classify various hazards; in the Parent-Child Interaction module quiz, trainees must identify why ignoring minor misbehaviors is important. Trainees who submitted a quiz and did not pass at the 85% mastery criterion were allowed to retake the failed questions.

Implementation and fidelity measures

Upon completion of the workshop portion of training, trainees began implementing SafeCare with families referred from the county child welfare agencies or other appropriate sources (e.g., prevention cases diverted from child protective services, foster parents). Each home visitor was instructed to begin SafeCare with one family until they met certification criteria. As part of the implementation plan, all sessions were audio-recorded for fidelity monitoring purposes, and at least two of the first four sessions for each home visitor were observed live for added support. Fidelity was monitored with standardized checklists by the SafeCare Trainer or a certified on-site coach, who reviewed the entire session. Each fidelity checklist contains about 30 items that correspond to specific behaviors providers are expected to perform during the session.

Because trainees were trained on a rolling basis beginning in May 2009, they were “active” SafeCare providers for different periods of time. In addition, some providers and some entire agencies became inactive at different points (e.g., because a provider left the agency or because an agency was no longer interested in pursuing SafeCare referrals). As is common in large-scale implementations, there were delays in implementation for various reasons (e.g., lack of SafeCare referrals), and in some cases the delays were extreme; the median number of days between completing training and the first in-field session was 79 (range 8–506). With these challenges in mind, three primary variables were examined as measures of in-field implementation: (a) whether any SafeCare sessions had been performed (yes/no); (b) whether the provider reached home visitor certification (yes/no) by completing any four sessions with 85% fidelity or greater (the sessions did not have to be consecutive); and (c) the mean fidelity score across the first four sessions (the minimum required for certification), as scored on the fidelity checklists described above (0–100%).
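To make these three in-field indicators concrete, the following is a minimal sketch of how a session fidelity score and the certification decision described above could be computed. The function names, data, and exact checklist size are hypothetical illustrations rather than NSTRC's actual scoring procedures.

```python
from typing import List

CERTIFICATION_THRESHOLD = 85.0   # percent fidelity required per session
SESSIONS_REQUIRED = 4            # sessions at or above threshold for certification

def session_fidelity(items_performed: int, items_expected: int) -> float:
    """Percent of expected checklist behaviors the home visitor performed."""
    return 100.0 * items_performed / items_expected

def is_certified(session_scores: List[float]) -> bool:
    """Certified once any four sessions (not necessarily consecutive) reach 85%."""
    return sum(score >= CERTIFICATION_THRESHOLD for score in session_scores) >= SESSIONS_REQUIRED

def mean_initial_fidelity(session_scores: List[float]) -> float:
    """Mean fidelity across the first four observed sessions."""
    first_four = session_scores[:4]
    return sum(first_four) / len(first_four)

# Hypothetical provider scored on a ~30-item checklist in each of four sessions.
scores = [session_fidelity(28, 30), session_fidelity(26, 30),
          session_fidelity(29, 30), session_fidelity(27, 30)]
print(round(mean_initial_fidelity(scores), 1), is_certified(scores))  # 91.7 True
```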

Results

Description of Agencies

Of the 50 agencies trained, 34 (68%) were located in an urban setting and 16 (32%) in a rural area. Most agencies (78%) reported already serving populations at risk for abuse or neglect, but some were new to serving such populations. Forty-nine percent of agencies had been serving families for 10 or more years, whereas 22% were relatively new, having served families for 5 years or less. Agencies used a mix of full-time, part-time, and contract staff and employed a median of 22 direct service staff (range 2–213) who provided a range of services. Agencies were most likely to have heard about SafeCare through the county or state child welfare offices (62%). Fewer than a third (32.4%) of agencies indicated they had ever implemented a structured intervention, and just over 60% reported that they already conducted live observation of sessions for supervision.

Description of Trainees

Demographic data were missing for 63 individuals who did not complete demographic forms; thus, descriptive data are available for only 232 individuals. Table 1 provides demographic information for the individuals who participated in training. The sample was primarily female and African American. Most had advanced degrees, and there was substantial diversity in field of study (psychology, social work, counseling, etc.). The sample was a mix of new and experienced providers who, overall, expressed positive attitudes toward EBP. Although about half of trainees reported having been trained in a specific EBP, a review of the specific EBPs they named suggested that some trainees had misunderstood the question.

Table 1.

Characteristics of Staff Trained in SafeCare

Variable: N (%) or M (SD), n (total N = 295)
Sex
 Female 214 (88.1)
 Male 29 (11.9)
Age 39.8 (SD = 10.8), n = 216
Race
 African American 141 (60.0)
 White 78 (33.1)
 Latino 6 (2.6)
 Other 10 (4.3)
Education
 Bachelor’s degree 33 (13.9)
 Master’s or PhD 205 (86.1)
Discipline
 Social work 61 (26.3)
 Psychology 53 (22.8)
 Counseling 40 (17.2)
 Other 78 (33.6)
Ever trained in an evidence-based practice? 105 (50.0)
Years work experience
 Less than 1 year 73 (35.6)
 1–5 years 84 (41.0)
 5+ years 48 (23.2)
Attitudes toward EBP 4.21 (SD = .50), n = 212
Mean quiz score (0–100) 93.7 (SD = 3.4), n = 293
Mean role-play score (0–100) 93.6 (SD = 3.3), n = 265
Conducted any SafeCare sessions 75/295 (25.4)
Certified as home visitor 66/295 (22.4)
Overall initial in-field fidelity score (0–100) 92.4 (SD = 5.6), n = 77
In-field fidelity, health module 93.4 (SD = 4.3), n = 20
In-field fidelity, safety module 93.0 (SD = 3.9), n = 19
In-field fidelity, parent–child interaction module 92.3 (SD = 5.2), n = 48
In-field fidelity, parent–infant interaction module 92.2 (SD = 5.3), n = 24

SafeCare Training and Implementation Indicators

The bottom of Table 1 shows descriptive statistics for the five primary training and implementation progress indicators. The mean workshop quiz and role-play scores were both very high. Failures on the quizzes and role-plays were rare (quizzes: 2.1%, 24 of 1,149; role-plays: 1.4%, 14 of 1,024). Note that trainees complete role-plays with the assistance of an outline, and thus failures are relatively rare. Regarding implementation, only 25.4% of trainees conducted any SafeCare sessions, and 22.4% reached certification. In-field fidelity for home visitors who implemented SafeCare, as scored by SafeCare trainers (who either attended the session live or listened to an audio recording), was quite high at over 90%, and scores were very similar across SafeCare modules.

We examined correlations among six demographic factors (i.e., sex, age, having a graduate degree, primary discipline, prior training in EBP [yes/no], and EBPAS score) and five training and implementation indicators (quiz scores, role-play scores, any implementation, certification, and in-field fidelity). Of the 30 correlations, only 6 were significant at the traditional α level of .05, and none were significant at the Bonferroni-corrected α level of .0016 (.05/30). Among those significant at .05, age was inversely related to role-play scores (r = −.18, p < .01, n = 196), quiz scores (r = −.21, p = .002, n = 215), and in-field fidelity (r = −.36, p = .007, n = 54), with younger trainees performing better than older trainees on each measure. Females were more likely than males to implement SafeCare (r = .14, p = .03, n = 243) and to reach certification (r = .13, p = .05, n = 243). Primary discipline was related to implementing SafeCare, such that counselors were more likely to implement than noncounselors (r = .17, p = .007, n = 232). We also examined whether the two workshop indicators (role-play and quiz scores) were related to the three implementation indicators (any implementation, home visitor certification, and mean in-field fidelity). Quiz scores were related to in-field fidelity (r = .33, p = .003, n = 77). Role-play scores were related to all three implementation indicators: any implementation (r = .13, p = .03, n = 265), certification (r = .18, p = .004, n = 265), and in-field fidelity (r = .24, p = .04, n = 69).
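For readers who wish to apply the same style of screening to their own data, the following is a minimal sketch of a single Pearson correlation evaluated against both the conventional .05 level and a Bonferroni-corrected threshold (.05 divided by the 30 tests). The simulated data and variable names are hypothetical and are not the study data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical predictor/outcome pair (e.g., trainee age and quiz score).
rng = np.random.default_rng(1)
age = rng.normal(40, 11, size=215)
quiz = 94 - 0.07 * age + rng.normal(0, 3, size=215)

N_TESTS = 30                        # 6 predictors x 5 indicators
ALPHA = 0.05
BONFERRONI_ALPHA = ALPHA / N_TESTS  # divide alpha by the number of tests

r, p = pearsonr(age, quiz)
print(f"r = {r:.2f}, p = {p:.4f}")
print("significant at .05:", p < ALPHA)
print("significant after Bonferroni correction:", p < BONFERRONI_ALPHA)
```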

Discussion

This article describes initial implementation data from a statewide rollout of SafeCare within a child welfare system, with mixed findings on implementation progress. Workshop-based indicators suggest high performance during training, and observed in-field fidelity scores were excellent. However, overall levels of implementation were low, with relatively few providers conducting any SafeCare sessions and even fewer reaching certification. Thus, the story of the implementation to date seems to be one of high quality but low quantity. The only consistent predictors of training or implementation performance were age and role-play performance, which predicted subsequent implementation and fidelity. However, these associations were not significant when adjusted for multiple comparisons, and the magnitude of the correlations was generally small, calling their importance into question.

Challenges: Why Low Quantity?

For successful large-scale implementation, several factors must work in concert: there must be a workforce that is willing and able to implement the new practice, organizations that support the practice, and system-level factors that allow the practice to occur (Beidas & Kendall, 2010; Fixsen et al., 2005). In the current implementation, we believe the first two factors were in place, as about 300 providers were trained over a 2-year period. We also believe that system-level issues have kept the implementation from reaching its full potential to date. Through follow-up phone calls with providers, we heard many anecdotal accounts that referral sources were not sufficiently informed about SafeCare and that providers had not received appropriate referrals. Despite presentations to county staff and group e-mails from the state office, many providers reported that the local child welfare staff who made referrals remained unaware of SafeCare. Problems such as this are compounded in larger states (the current state is in the top quarter of U.S. states in population), where there are many counties and many individuals within each county to educate about SafeCare and its implementation.

In addition, there appeared to be issues with how SafeCare was implemented in relation to existing service programs. SafeCare was added as a “new” service that required a specific referral for SafeCare services, rather than being integrated into an existing program. In other words, county staff who refer families for in-home parent training services were given SafeCare as an additional referral option, and thus SafeCare competed with existing programs better known to county child welfare staff (e.g., Parent Aide, Wrap-Around Services, Homestead). Further complicating the referral issue, it took several months for SafeCare to be integrated into the electronic referral system used by most counties. Had SafeCare been integrated into an existing service program (perhaps one that did not use a structured intervention model), we might have seen a steadier flow of referrals. However, this was not possible because SafeCare required more frequent visits than most service programs allowed and could not simply be delivered in the context of those programs. These points highlight the need for careful planning at all levels (individual, organizational, and system) before an implementation begins so that it can proceed smoothly (Fixsen et al., 2005).

Study Limitations

There are a number of limitations to the study methods and results. First, there was a fair amount of missing data, and it is not known whether participants who completed demographic surveys differed from those who did not. Second, although observed fidelity was high, relatively few participants provided fidelity data, and it is unknown whether those who did not implement would have performed as well as those who did. Similarly, because training was voluntary, agencies that chose to participate in SafeCare training may represent a biased sample; for example, they may be more inclined to adopt EBPs. A third limitation concerns the outcome measures, particularly the use of quizzes and role-plays as outcomes. Participants were allowed to use materials to complete quizzes and to use outlines to conduct role-plays, and thus scores were high and there were relatively few failures. Still, there was enough variability in role-play scores to predict later in-field fidelity. Last, we have discussed the important role that we believe system-level factors played in influencing the implementation. However, none of these system-level factors were measured systematically, and thus the degree of their impact cannot be determined with certainty.

Summary and Conclusions

The data we present here are not unlike data from other implementations that demonstrate the difficulties of implementing new practices. In this implementation, there was no cost to providers for training and support (other than the personnel time required to attend training), yet the level of implementation was low. Thus, even free training and support were not sufficient to produce broad-scale implementation without addressing organizational and system-level variables, an issue that has been noted by many authors (Beidas & Kendall, 2010; Fixsen et al., 2005). A broader approach in which “communities of practice” are developed to facilitate change at the community and organizational levels (Glisson & Schoenwald, 2005) may be needed to fully effect implementation. Controlled implementation trials are needed to understand how different training approaches influence implementation uptake and fidelity.

Acknowledgments

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the Georgia Department of Human Services contract numbers 42700-040-0000005261 and 42700-040-0000003341 and CDC grant R18CE0001479-01.

Footnotes

Reprints and permission: sagepub.com/journalsPermissions.nav

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

References

1. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research. 2004;6:61–74. doi: 10.1023/b:mhsr.0000024351.12294.65.
2. Aarons GA, Fettes DL, Flores LE, Sommerfeld DH. Evidence-based practice implementation and staff emotional exhaustion in children’s services. Behaviour Research and Therapy. 2009;47:954–960. doi: 10.1016/j.brat.2009.07.006.
3. Aarons GA, Sommerfeld D, Hecht D, Silovsky J, Chaffin M. The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology. 2009;77:270–280. doi: 10.1037/a0013223.
4. Beidas RS, Kendall PC. Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice. 2010;17:1–30. doi: 10.1111/j.1468-2850.2009.01187.x.
5. Chaffin M, Bard D, Bigfoot DS, Maher EJ. A comparative outcome study of home-based services for American Indian parents in child welfare. Manuscript under review.
6. Chaffin M, Bonner BL, Hill RF. Family preservation and family support programs: Child maltreatment outcomes across client risk levels and program types. Child Abuse & Neglect. 2001;25:1269–1289. doi: 10.1016/s0145-2134(01)00275-7.
7. Chaffin M, Hecht D, Bard D, Silovsky JF, Beasley WH. A statewide trial of the SafeCare home-based services model with parents in child protective services. Pediatrics. In press. doi: 10.1542/peds.2011-1840.
8. Damashek A, Doughty D, Ware L, Silovsky J. Predictors of client engagement and attrition in home-based child maltreatment prevention services. Child Maltreatment. 2011;16:9–20. doi: 10.1177/1077559510388507.
9. Fixsen DL, Naoom SF, Blasé KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network; 2005.
10. Glisson C, Schoenwald SK. The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Mental Health Services Research. 2005;7:243–259. doi: 10.1007/s11020-005-7456-1.
11. Joyce BR, Showers B. Student achievement through staff development. 3rd ed. Alexandria, VA: ASCD; 2002.
12. Lutzker JR, Bigelow KM. Reducing child maltreatment: A guidebook for parent services. New York, NY: Guilford; 2002.
13. Silovsky J, Bard D, Chaffin M, Hecht DB, Burris L, Owora A, Lutzker J. Prevention of child maltreatment in high risk rural families: A randomized clinical trial with child welfare outcomes. Children and Youth Services Review. 2011;33:1435–1444.
14. Westat. Evaluation of family preservation and reunification programs: Final report. Washington, DC: U.S. Department of Health and Human Services; 2002.
15. Whitaker DJ, Lutzker JR, Self-Brown S, Edwards AE. Implementing an evidence-based program for the prevention of child maltreatment: The SafeCare® program. Report on Emotional & Behavioral Disorders in Youth. 2008;8:55–62.
