Abstract
Parent training programs, with a range of empirical support, are available to improve parenting skills and reduce child behavior problems. Yet, little is known about programs provided in typical communities. This pilot study's purpose was to identify and describe parent programs—and the agencies that provide them—in one midsized Midwestern city. The sample included 21 program directors and 25 practitioners employed by 19 agencies. Data were gathered using structured phone interviews. Of the 35 programs represented, 37.1% were developed by the agency, while close to two thirds were previously developed interventions. Only a small number of the parent programs identified were classified into the category of strong empirical support; however, several included hallmarks often associated with empirically supported parent programs.
A wealth of knowledge demonstrates a linkage between parenting behaviors and child emotional and behavioral development. However, it is acknowledged that environmental factors, such as socioeconomic status and race, often moderate this relationship (English et al., 2005; Lansford, Deater-Deckard, Dodge, Bates, & Pettit, 2004; Luthar & Zelazo, 2003; Paulussen-Hoogeboom, Stams, Hermanns, & Peetsma, 2007). Effective parenting often serves as a protective factor for children, while harmful parenting places a child at risk for a range of problems.
Despite the strong association between parenting and child outcomes, Sanders, Markie-Dadds, and Turner (2003) have astutely pointed out that “parents generally receive little preparation beyond the experience of having been parented, with most learning on the job through trial and error” (p. 1). Through the use of internal and external supports, many parents are able to tackle the challenges without reliance on formal parent training programs. However, others may be less adequately prepared to meet their child's needs, struggle to parent children with difficult behaviors, or have a greater risk of engaging in ineffective or harmful parenting practices.
Parents in need of additional supports may seek formal parent training programs on their own, be referred by a friend or family member, or be mandated to receive services by public social service agencies, such as child welfare. Parents may participate in a variety of parenting programs from an array of providers. However, not all parent training programs are equally effective (Barth et al., 2005), and surprisingly little is known about the nature of parent training programs provided in the community. This study sought to describe the community agencies, and the parent programs they provide, along a continuum of empirical support. For purposes of this study, we did not differentiate between parent programs whose participants self-selected into services, were mandated to attend, or included a combination of the two.
Interventions to change parent behavior, protect children, improve developmental outcomes, and reduce parental stress are of utmost importance. A wide array of parent programs are available; however, the level of evidence available to support their effectiveness varies greatly (Barth et al., 2005; Hurlburt, Barth, Leslie, Landsverk, & McCrae, 2007). Parent Management Training (PMT; Kazdin, 2005; Patterson, Chamberlain, & Reid, 1982; Patterson, Reid, & Eddy, 2002), the Incredible Years (Webster-Stratton, 1984, 1998; Webster-Stratton & Hammond, 1997), Parent-Child Interaction Therapy (PCIT; Capage, McNeil, Foote, & Eyberg, 1998; Eyberg, Boggs, & Algina, 1995; Eyberg & Robinson, 1982), the Positive Parenting Program (Triple P; Sanders et al., 2003), and multisystemic therapy (MST; Borduin et al., 1995) have emerged as parent interventions with strong empirical support.
Although an extensive review of parent behavioral interventions is outside the scope of this paper and has been reported elsewhere (Brestan & Eyberg, 1998; Farmer, Compton, Burns, & Robertson, 2002; Kazdin & Weisz, 1998), outcomes from randomized clinical trials that have tested these interventions will be reviewed here briefly. Kazdin (2005) has summarized the extensive empirical evidence demonstrating the effectiveness of PMT. He reported that parent participation in PMT reduces behavior problems for both children and adolescents, and those benefits have been maintained for 1 or 2 years following the completion of the intervention. PCIT has also been shown to reduce child behavior problems in clinical populations (Eyberg et al., 1995) and maintain the improvements over multiple years (Hood & Eyberg, 2003). Incredible Years, a video-based program, has been identified as a model program by Blueprints for Violence Prevention (Center for the Study and Prevention of Violence, n.d.). It has demonstrated effectiveness at reducing child behavior problems and increasing parenting skills for cases in which the child is at risk of conduct disorder (Webster-Stratton, 1998; Webster-Stratton & Hammond, 1997). Triple P has solid empirical evidence demonstrating its ability to reduce child behavior problems and parental stress among a variety of parent populations, including parents of children with early onset conduct disorder, parents at risk of child maltreatment, depressed mothers, and parents experiencing marital conflict (Sanders, Markie-Dadds, Tully, & Bor, 2000; Sanders et al., 2003; Sanders & McFarland, 2000). Finally, MST, which has also been identified as a model program by Blueprints for Violence Prevention (Center for the Study and Prevention of Violence, n.d.) and an exemplary program by the Office of Juvenile Justice and Delinquency Prevention (n.d.), has been shown to improve outcomes across multiple domains for juvenile offenders and their families (Henggeler, Mihalic, Rone, Thomas, & Timmons-Mitchell, 1998; Henggeler, Schoenwald, Borduin, Rowland, & Cunningham, 1998).
Although each parenting program is unique, especially in regard to service modality, these programs have many elements in common. Hallmarks of these interventions include that they:
- are guided by social learning theory,
- seek to improve parent-child relationships,
- teach the use of praise and rewards to encourage positive behavior, and
- use behavioral approaches to respond to difficult child behaviors (Hurlburt et al., 2007).
Individualized family assessment and ongoing evaluation of programs are also important elements (Henggeler, Schoenwald, et al., 1998; Kazdin, 2005). Assessment ensures that the intervention is the best fit to meet the specific needs of the family, and evaluation involves continuous monitoring to ensure that the intervention is working. Practitioner or facilitator training requirements, ongoing supervision, and clear practice guidelines or a manual are additional similarities (Barth et al., 2005). Practice guidelines may or may not include a predetermined number of sessions. While some empirically supported interventions (ESI) are implemented in a set number of sessions, others allow for more flexibility based on the individualized assessment of need.
Sensitivity to the child's age and the participation of both children and parents in the intervention are also hallmarks of empirically supported programs. Participation of the child in the intervention, in some form, is common and may take the form of inclusion in the session with the parent, homework assignments for the parent to practice skills with his or her child between sessions, or offering a concurrent child program. Empirically supported parent programs also account for age, a determinant in parent training (Dishion & Patterson, 1992) because of the variation in the needs of parents of children in differing developmental stages.
Despite several parent programs having strong empirical support, little is known about how widely available they are to parents in the community. This is a problem for several reasons: (a) at-risk parents may not be receiving the best interventions, (b) public monies may be wasted by funding unsupported programs in the community, and (c) researchers comparing promising parenting programs to “usual care” may be unaware of what that care consists of. This pilot study sought to determine the extent to which empirically supported parent training programs were being implemented in one midsized Midwestern city, and to describe the organizations and practitioners currently providing the training.
This study answered the following research questions: (a) What are the characteristics of the agencies and practitioners providing parent training? (b) What are the characteristics of the parent training programs being provided? and (c) To what extent are the identified parent training programs empirically supported?
Methods
Sampling Frame
The IMPROVE project (Interventions for Multi-sector Provider Enhancement; Dr. Arlene Stiffman, personal communication), the local child protective service agencies, and an Internet search were used to identify agencies that potentially provide parent programs. The IMPROVE project included a recently accumulated comprehensive database of community services, including agencies providing parent training interventions. Programs with a very narrowly defined client base (e.g., teen mothers or homeless individuals) or offering services determined to be counseling or therapy (and not parent training) were excluded.
Procedures
All procedures and materials were approved by the Human Research Protection Office at Washington University in St. Louis. Respondents were assured of their confidentiality and the voluntary nature of their participation. They were provided with contact information for study personnel and the principal investigator.
Recruitment
A three-stage recruitment process was used to protect the employment status of the participants and respect the hierarchy of the organizations. First, an introductory letter from the project coordinator was sent to the director of each targeted agency. The letter provided information about the study and requested permission to contact the program director(s) and practitioner(s) affiliated with the agency's parent program(s). The letter was followed by a phone call 3-5 days later. Once initial permission was obtained and program directors were identified, the program directors were contacted by phone and faxed an informational letter to elicit their consent for study participation. Second, following the completion of their interview, program directors were asked to identify practitioners whom they supervise within the parent programs they oversee. Third, these practitioners were mailed an introductory letter and phoned 3–5 days later to solicit their participation. Respondents were paid $10 for their time.
Data collection
Data were gathered using structured phone interviews, mainly consisting of closed-ended questions. The study protocol included two interview modules: one tailored to program directors and one tailored to practitioners. The program director interviews typically lasted between 45 minutes and one hour, while the practitioner interviews typically lasted 30 minutes. Telephone interviews were conducted by three trained, master's-level research assistants and the doctoral-level project coordinator.
To reduce participant burden, program-level data were collected only for a maximum of three programs per program director. If a respondent was the director of more than three programs, he or she was asked questions regarding the characteristics of the three largest programs he or she oversaw. Additionally, demographic, educational, and employment data were gathered on each respondent.
Measures
The data presented in this article were derived from questions designed specifically for this project. They inquired about the content of the parent training program(s), with particular emphasis on the hallmarks of empirically supported programs, training and supervision, organizational structure and context, and population served. The interview protocols also included three adapted standardized instruments, the analysis of which is presented elsewhere (Schurer, Kohl, & Bellamy, in press). The standardized scales included the Organizational Readiness for Change survey and the Survey of Organizational Functioning (both developed by the Institute of Behavioral Research, Texas Christian University) and the Evidence-Based Practice Attitude Scale (EBPAS; Aarons, 2004). Finalization of the interview protocol included reviewing the questions with a program director and two practitioners who were not in the study sample pool; this was to obtain feedback about their initial reaction to questions, including their perception as to what each question was inquiring about. Revisions were made in response to their valuable feedback.
As noted previously, the majority of questions were closed-ended; however, a few questions inquired about numbers (such as the number of clients served). Additionally, directors were asked what specific child ages the program targeted. This response was later classified by developmental stage: infant, preschooler, elementary school aged, or middle/high school aged. It was possible for a program to target more than one age (e.g., if a director noted that a program targeted parents of children 0 to 4 years old, the program was classified as targeting both infants and preschool children).
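To make this classification rule concrete, the following is a minimal sketch of how a reported target age range can be mapped onto the developmental stages used in this study. The function and variable names are illustrative assumptions; the study coded these responses by hand rather than with software.

```python
# Illustrative sketch of the age-to-stage classification described above.
# Stage boundaries follow the categories reported in Table 2; all names
# here are hypothetical and not part of the study's actual procedure.

STAGES = {
    "infant": (0, 2),                # 0-2 years
    "preschool": (2, 4),             # 2-4 years
    "elementary": (5, 11),           # 5-11 years
    "middle_high_school": (12, 18),  # 12-18 years
}

def classify_target_ages(min_age, max_age):
    """Return every developmental stage overlapped by a program's target age range."""
    return {
        stage
        for stage, (low, high) in STAGES.items()
        if min_age <= high and max_age >= low
    }

# Example from the text: a program targeting children 0-4 years old is
# classified as serving both infants and preschool children.
assert classify_target_ages(0, 4) == {"infant", "preschool"}
```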
Response Rate
Table 1 summarizes the response rates. Twenty-seven agencies were initially identified, and their directors were contacted. After initial contact and clarification, 3 were excluded upon learning that they did not currently have a parent program meeting the inclusion criteria. This left 24 total agencies; of these, 3 agency directors did not respond to repeated attempts to secure agreement to participate. The remaining 21 (87.5%) did agree to have their agency participate and provided names and contact information for at least 1 program director (PD).
Table 1. Response Rates

| Category | Total possible | Total represented | Response rate |
|---|---|---|---|
| Agencies | 24 | 19 | 79.2% |
| Program directors | 28 | 21 | 75.0% |
| Practitioners | 42 | 25 | 60.0% |
Some agencies employed multiple PDs; this resulted in 28 PDs for study recruitment, of whom 21 eventually agreed to participate and were interviewed. The remaining 25% either declined participation (10.7%) or never responded to recruitment efforts (14.3%). Two agencies had 2 PDs interviewed, resulting in 19 total agencies (79.2% of total valid agencies recruited) being represented in the sample.
Recruitment of practitioners proved to be more challenging, as the 21 PDs had varying numbers of corresponding practitioners (range: 0 to 10), and several insisted on study personnel not contacting their staff directly. Therefore, 11 practitioners were informed about the study by their PDs, but did not receive follow-up by the research team. Forty-seven practitioners were mailed an introductory letter and received follow-up phone calls. In this process, 5 were either found no longer to be at the agency or not actually participating in parent programs. Of the remaining 42 possible respondents, 25 (60%) agreed to participate and were interviewed. The remaining 40% either declined participation or never responded to recruitment efforts.
ESI/Not ESI Classification
Because very few parent programs have been rigorously tested and validated (Barth et al., 2005), external evaluation information was not available for most of the programs in the sample. This made assessing their potential efficacy very challenging. The need for refinement of the traditional ESI/Not ESI classification system is highlighted by Kazdin (2004). However, even if his recommended categories were utilized, nearly all of the programs in the sample would be classified as “not evaluated,” providing no further insight into how they compare to the gold standards in the field, such as PCIT or PMT.
To better capture this aspect of the programs, a systematic process was developed to categorize each program's level of ESI, or, in other words, to determine how closely it resembled empirically supported interventions. Each program was therefore assigned a rating denoting how potentially efficacious it appeared to be, based on hallmarks of evidence-based parent interventions described in the literature (Barth et al., 2005; Hurlburt et al., 2007). Hallmarks included the presence of a manual, required training from an outside program expert, an age-specific curriculum, and an individual assessment that informed service provision. Scientific ratings from the California Evidence-Based Clearinghouse for Child Welfare (CEBC; n.d.) were also taken into consideration. The level of ESI ratings ranged from 1, loosely defined and structured interventions, to 4, well-established ESIs. If the program was PMT, MST, PCIT, or the Incredible Years (the leading ESI parent programs; Barth et al., 2005; CEBC, n.d.), it was assigned the highest rating of 4 (i.e., a well-established, empirically supported intervention). From there, a series of decision points were used to rate the program (summarized in the sketch following this list):
1. If a program did not have a manual, it received the lowest rating of 1. Programs that had a manual but were internally developed by the agency and lacked an age-specific focus, an individual assessment, or both also received a 1 (i.e., loosely defined and structured interventions).
2. If a program was externally developed, required outside training, delivered an age-specific curriculum, and included an individual assessment, it received a 3, indicating a parenting intervention with some hallmarks of ESI.
3. All other programs received a rating of 2, signifying interventions that have some structure. These programs may or may not have been developed by the agency, but none of them required external training; all had a manual or curriculum but lacked an age-specific curriculum, an individual assessment, or both.
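The decision points above amount to a simple rule-based classifier. The sketch below restates that logic in code purely for illustration; the Program fields and function name are hypothetical assumptions, and the study applied these rules manually rather than with software.

```python
# Minimal sketch of the four-level ESI rating logic described above.
# The Program fields and function name are illustrative assumptions.
from dataclasses import dataclass

WELL_ESTABLISHED = {"PMT", "MST", "PCIT", "Incredible Years"}

@dataclass
class Program:
    name: str
    has_manual: bool
    agency_developed: bool
    requires_external_training: bool
    age_specific: bool
    individual_assessment: bool

def rate_esi_level(p: Program) -> int:
    """Return a rating from 1 (loosely defined) to 4 (well-established ESI)."""
    if p.name in WELL_ESTABLISHED:
        return 4  # leading ESI parent programs are rated 4 outright
    if not p.has_manual:
        return 1
    if p.agency_developed and not (p.age_specific and p.individual_assessment):
        return 1  # manualized but internally developed and missing key hallmarks
    if (not p.agency_developed and p.requires_external_training
            and p.age_specific and p.individual_assessment):
        return 3  # externally developed with several hallmarks of ESI
    return 2      # has some structure but lacks other hallmarks
```

Under these rules, for example, an agency-developed program with a manual, an age-specific curriculum, and an individual assessment would fall through to a rating of 2, since only externally developed programs with required outside training can reach a 3.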
Data Analysis
All analyses were conducted using SPSS® 15.0. Frequencies and descriptive statistics were calculated. Associations between respondents’ positions and level of education were analyzed using bivariate correlations, chi-square tests, and t tests.
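The analyses themselves were run in SPSS 15.0; for readers who prefer open-source tools, the fragment below is a rough Python equivalent of the same descriptive and bivariate steps. The DataFrame and its column names are hypothetical placeholders, not study data.

```python
# Rough open-source equivalent of the SPSS analyses described above
# (frequencies, descriptives, bivariate correlations, chi-square, t tests).
import pandas as pd
from scipy import stats

respondents = pd.DataFrame({
    "position": ["director", "practitioner", "director", "practitioner"],
    "masters_or_higher": [1, 1, 1, 0],
    "years_in_field": [12.0, 10.5, 8.0, 15.0],
})

# Frequencies and descriptive statistics
print(respondents["position"].value_counts())
print(respondents["years_in_field"].describe())

# Bivariate correlation between education level and experience
print(respondents[["masters_or_higher", "years_in_field"]].corr())

# Chi-square test of association between position and education level
table = pd.crosstab(respondents["position"], respondents["masters_or_higher"])
chi2, p_chi, dof, expected = stats.chi2_contingency(table)

# Independent-samples t test comparing years of experience by position
directors = respondents.loc[respondents["position"] == "director", "years_in_field"]
practitioners = respondents.loc[respondents["position"] == "practitioner", "years_in_field"]
t_stat, p_t = stats.ttest_ind(directors, practitioners)

print(f"chi-square p = {p_chi:.3f}, t-test p = {p_t:.3f}")
```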
Results
General Context: Agency and Staff Characteristics
Agencies
Nineteen agencies were represented in this study. Most of the agencies (84.2%) were characterized by respondents as a “nonprofit children/family service agency.” The three other agency settings (15.8%) were described by program directors as a hospital, a community mental health clinic, and a church. In addition to their parent training programs, many (57.9%) of the agencies offered other programs, meaning that they were multiservice organizations. Parenting programs served between 8 and 200 families per month (M = 52.7, Mdn = 32.5, mode = 30). The number of distinct parent programs offered by agencies varied. Six (31.6%) agencies provided one or two programs, while 4 (21.1%) provided three programs, 2 (10.5%) provided four programs, and 1 (5.3%) provided six distinct parenting programs.
In general, agencies had diversified funding streams—although all of them received funding from the state. Federal funding was received by 83.3% of the agencies, and 44.4% also received funding from city or county governments. However, given the complex nature of government appropriations, it is unclear whether, in every case, the local and state money was in addition to federal funding. Almost two thirds (63.2%) of the agencies were partially funded by the United Way, and they all received funding from various other private foundations. Additionally, 94.7% generated funding via fundraising efforts and individual donations.
Parent Program Characteristics
Information was gathered on a total of 35 programs, offered by the 19 agencies described above. Programs varied widely in how long they had been operating. The mean length of operation was 11.4 years (Mdn = 7.96). However, the modal length of operation was only 1 year, with a range of 1 to 58 years.
Funding
In addition to agency funding sources, many of the individual programs were funded independently, further complicating the funding landscape. Twenty-seven (77.1%) of the programs were funded by program-specific grants to the agency. A quarter (25.7%) also reported having a contract to provide parenting services to clients of the city or county child welfare agency. One fifth (20%) of the programs were partially funded by Medicaid reimbursements as well as client fees, usually set on a sliding scale. Additionally, 14.3% were reimbursed for services by private insurance companies, and 40% of the programs were funded by still other sources, such as directly from the federal government or from private donations and fundraising efforts.
Staff
The overwhelming majority (91.3%) of respondents were female, and most (73.9%) were Caucasian. While not significantly so, program directors were more likely to be Caucasian than were practitioners (81.0% versus 68.0%).
Both the program directors and practitioners interviewed were well educated and had had substantial experience in the field of parent training. Eighty-six percent (n = 18) of the program directors and 64.0% (n = 16) of practitioners held a master's degree or higher. Participants most commonly held degrees in the area of social work (50.0% of all respondents), education (17.4%), or psychology (13.0%). Other degree disciplines included counseling (6.5%), nursing (4.3%), medicine (2.2%), business (2.2%), and “other” (4.3%). Many of the respondents had been working in the area of parent training and at their present jobs for a considerable length of time. On average, program directors and practitioners had been at their present jobs for over 5 years (range 1 month to 15 years) and had over 10 years of experience in the field of parent training (range 10 months to 41 years). Program directors reported having slightly more experience, though not significantly so—and, all in all, the practitioners interviewed were remarkably experienced. Unexpectedly, no statistically significant differences in education and practice experience were found between program directors and practitioners.
Modality and setting
Most of the programs favored groups as the primary modality of teaching parenting competency. Almost half (45.7%) of the 35 programs exclusively interacted with parents in a group setting, and 34.3% mixed both group and individual meetings. Only one fifth (20%) of the programs were delivered solely to an individual parent or couple. This was reflected in the fact that only 2 of the programs (5%) were home-based interventions. Seven (20%) of the programs did occasionally visit the client's home in addition to delivering services at the agency, but the majority (74.3%) interacted with parents only outside of their homes.
Despite having little contact with the families in their homes, over 70% of the programs included clients’ children at some point in the intervention, a characteristic common to empirically supported programs. Three programs (8.6%) heavily involved the children, and 22 programs (62.9%) had children occasionally participate in sessions. Ten programs (28.6%) did not involve the children.
Client characteristics
The parent programs in the sample specifically targeted populations expected to need additional parenting guidance, such as neglectful (65.7%) and physically abusive (62.9%) parents, as well as parents of children with behavior problems (57.1%). Program directors of 12 programs (34.3%) also reported specifically targeting foster parents. Twenty-five programs (71.4%) also targeted "other" populations, most often "low-income parents" (28%, n = 7) and "teen parents" (20%, n = 5).
Characteristics of the program content
In addition to specific parent populations, most programs (77%, n = 27) also specifically targeted parents with children of a particular age, meaning that program material was most applicable to problems and developmental issues of a particular child age group. While directors of 4 programs (14.8% of those claiming to target a particular age) actually indicated that their program was appropriate for parents with children of all developmental stages (0–18 years), most focused on younger children (see Table 2). Two thirds (66.7%) had curriculum specific to parenting preschool-aged children (2–4 years old). Over half (51.9%) discussed parenting infants (0–2 years old), and 17 programs (48.6%) were targeted at parents of elementary school-aged children (5–11 years old). Directors of 8 programs responded that issues regarding parenting adolescents (12–18 years old) were also covered. One agency, serving pregnant teens, ran 3 programs (11.1%) that also covered prenatal care and development.
Table 2. Child Ages Targeted by Age-Specific Programs

| Age of child | No. programs a | % programs |
|---|---|---|
| Prenatal | 3 | 11.1 |
| Infants (0–2 yrs.) | 14 | 51.9 |
| Preschool (2–4 yrs.) | 18 | 66.7 |
| Elementary school (5–11 yrs.) | 17 | 48.6 |
| Middle and high school (12–18 yrs.) | 8 | 29.6 |
| All ages | 4 | 14.8 |

a Dichotomized variables based on themes in qualitative data given about the 27 parent programs.
Most programs also included content on more general parenting skills. Directors of all the programs responded that their intervention covered the use of praise and rewards as well as how to set limits, and they had the goal of increasing parent-child engagement—elements common to parent programs with strong empirical support. Most programs also covered how to prevent misbehavior (97.1%), typical child development (94.3%), and improving children's social skills (91.4%). A little over three fourths (77.1%) of the programs also provided information and skills about how to reduce children's aggressive behavior.
Programs’ Level of Evidence and Rigorous Development
Incorporating the hallmarks of empirically supported interventions (ESIs)
Our sample of 35 programs included 13 (37.1%) that were developed by the agency, meaning that close to two thirds of the programs were "off the shelf" interventions, presumably with more empirical or theoretical grounding and testing. Some of the agency-developed programs did, however, have a manual or formal, written curriculum; in fact, 85.7% of all programs utilized a manual (see Table 3). For most of the externally developed programs, staff received outside training on the program (81.0%, 17 of the 21 with valid responses), and 82.4% of those 17 programs (n = 14) received ongoing external supervision for the program (see Table 3).
Table 3. Parent Program Characteristics

| Program characteristics | No. programs | % programs |
|---|---|---|
| Developed by agency | 13 | 37.1 a |
| Practitioners received outside training | 17 | 81.0 b |
| Practitioners receive ongoing external supervision | 14 | 82.4 c |
| Utilizes a manual | 30 | 85.7 a |
| Set number of sessions (mode = 6, range: 3–48) | 22 | 62.9 a |
| Uses an individual family assessment to help planning | 28 | 80.0 a |
| Includes an evaluation of outcomes | 31 | 88.6 a |

a All programs (N = 35). b Externally developed programs (n = 21; 1 of 22 respondents indicated "Don't Know"). c Programs with external training (n = 17).
Another mark of formalization is whether a program has a set number of sessions; nearly two thirds (62.9%) of the programs did. The modal program length was six sessions, with a range of 3 to 48 sessions (see Table 3). Programs without a dictated length typically served parents on a more individual basis and considered the intervention complete when certain goals were attained or when it was mutually agreed upon to end services.
Alongside a formalized structure, it is also important that parent programs make some attempt to tailor the intervention to the needs of the client. One way of doing this is to conduct an individual family assessment during the engagement process to help inform service planning. Four fifths (80%) of the programs featured such an assessment (see Table 3). These assessments often included standardized measures, administered repeatedly, that served as baselines for evaluating program outcomes.
Most programs (88.6%, n = 31) reported including an evaluation of outcomes, though a wide assortment of techniques were employed. In an open-ended response to the question “Describe the evaluation technique used,” 90% of the 30 programs for which answers were available used repeated measures, most often in the form of pre- and postintervention tests. Over half (53.3%) also featured standardized measures, such as the Parenting Stress Index or the Child Behavior Checklist. For about one sixth of the programs that evaluated outcomes, program directors also mentioned using direct observation of parent-child interactions, general client satisfaction questionnaires, and/or child behavior assessments.
ESI rating and implementation
Only 4 programs (11.4%) were well-established ESIs. These included PCIT, MST, and the Incredible Years. Seven (20.0%) of the programs had some hallmarks of empirically supported parenting interventions, earning a level 3 rating. These programs were not developed by the individual agency, required external training and monitoring, and included age-specific curricula and individual family assessments. However, more than two thirds (68.6%, n = 24) of the parent programs being provided in the community did not appear to have been rigorously designed or evaluated, earning a rating of 1 or 2 (see Table 4).
Table 4. Level of ESI Ratings for Parent Programs (N = 35)

| Level of ESI | Description | No. programs a | % programs |
|---|---|---|---|
| 4 | Well-established ESI | 4 | 11.4 |
| 3 | Interventions that have some hallmarks of ESI | 7 | 20.0 |
| 2 | Interventions that have some structure | 15 | 42.9 |
| 1 | Loosely defined and structured interventions | 9 | 25.7 |

a Some agencies provided multiple programs with various levels of empirical support.
Discussion
This pilot study was the first stage in a larger program of research focused on the transportability of empirically supported parent training programs, aimed at improving parenting capacities and reducing child behavior problems, from mental health service delivery settings to social service settings. Its purpose was to gain a better understanding of the parent training programs being offered in the community and to develop a picture of the agencies currently delivering these services. The vast majority of our sample consisted of nonprofit child and family service agencies providing multiple services with extremely diverse funding streams. The parent training programs delivered by these agencies varied considerably in how long they had been operating: while some had been running for decades, the modal program age was only 1 year. One possible explanation for this finding, though not testable in this study, is that funding supports for this type of program are often time limited, resulting in frequent program turnover. This would highlight the importance of addressing sustainability issues early in the implementation process. Alternatively, this finding could hint at an increased interest in providing parent programs, a willingness to try new types of interventions, or both.
This potential propensity toward innovation is a promising finding. Despite the well-documented complexities of implementing empirically supported interventions (e.g., Fixsen, Naoom, Blase, & Friedman, 2005; Mullen, Bledsoe, & Bellamy, 2008; Proctor et al., 2007), some agencies are making efforts to offer efficacious interventions to the parents they serve. A major finding of this study, however, was the lack of scientifically validated interventions being offered to parents in the community, many of whom are considered by the programs to be at risk of maltreating their children. Only 4 of the parent programs provided by sample agencies met the criteria for well-established, empirically supported parent training programs. Most programs (68.6%) earned ESI-level ratings of only 1 or 2 out of 4, meaning that most of the programs being delivered are loosely structured interventions that do not contain many of the hallmarks shown to make parenting programs more efficacious. Additional mixed-methods research is necessary to examine how agencies make decisions about the programs they provide.
Limitations
It is important to point out some limitations encountered in this study. The overall sample size of participating agencies was small. Therefore, we may not have had ample power to detect meaningful differences. While the response rate of program directors was good, fewer practitioners agreed to participate. It is possible that younger, less-experienced practitioners were among those who did not respond, potentially resulting in an overrepresentation of experienced practitioners.
Although 4 agencies reported using known interventions with strong empirical support, the nature of the data did not allow us to ascertain fidelity to the model. Agencies may have used only certain components of the program—or, alternatively, they may have used the model in its entirety without strictly adhering to guidelines put forth by the intervention developers. A treatment manual or formalized practice guidelines are an important hallmark of empirically supported parent training programs. The vast majority of programs in this study (85.7%) relied on a manual. Without empirical validation of the intervention, however, the availability of a manual does not ensure the quality of the material contained within the manual. Furthermore, the inclusion of multiple hallmarks of empirically supported parent programs cannot be construed as definite evidence of program effectiveness. It is likely that the reason programs achieve successful outcomes is the amalgamation of content, service delivery modality, and organizational context. Despite these limitations, this study does have important implications.
Implications
The study's findings help fill gaps in knowledge about what constitutes "usual care" in community-based parent training programs and the organizational context in which these programs are delivered. To determine whether programs are effective in a given service setting, rigorous empirical testing is necessary. Knowledge about "usual care" and the organizational context that may affect implementation, such as that obtained in this pilot project, can inform, for instance, decisions regarding an agency's readiness to adopt an ESI. For example, it is much easier to implement new ESIs if programs and practitioners are accustomed to working with elements common to these interventions, such as manuals, fidelity checks, and assessments. In addition, the findings provide some indication of the level of penetration that ESIs have achieved in a typical midsized city. Our findings demonstrate that, by and large, the community-based social service system has yet to adopt interventions with proven efficacy. To increase the uptake of empirically validated programs, dissemination and implementation strategies may need to be tailored to the specific context of the system. Our findings begin to flesh out some of these specifics.
In addition, the study's methodology of assessing a program's level of empirical support based on multiple "hallmarks" of an ESI provides a more nuanced way of characterizing a program than a dichotomous supported-versus-unsupported framework. This could allow program directors and practitioners to determine how their current program offerings compare to rigorously tested programs and how much change may be required to implement an ESI. It could also give researchers and policy makers a way to chart agencies' progress in offering potentially more efficacious treatment. Clearly, given the pilot nature of this study, this methodology is not fully developed, and it may need additional validation before being used more widely.
It was hypothesized that demographic and educational differences in staff would be found based on position (program director versus practitioner) and that, in general, staff would be fairly inexperienced. To our surprise, all levels of staff employed at the sample agencies had similar levels of experience and education. Again, this may help facilitate the implementation of empirically supported interventions, though it may also translate into staff who are entrenched in current practices and reluctant to change. The effect of staff experience and education on readiness to change and adopt new interventions warrants further exploration.
In conclusion, study results provide a current picture of the parent training programs available in the community and the agencies that provide them. However, to most effectively influence service policy and practice, findings from this study must be used in conjunction with other empirical evidence. Knowledge about the most salient service needs and organizational context can be combined to better inform decisions about the service delivery setting in which ESIs should be implemented to achieve optimal implementation effectiveness, intervention effectiveness, and program sustainability.
Acknowledgments
Support for this project was provided by the Center for Mental Health Services Research at the George Warren Brown School of Social Work, Washington University, through an award from the National Institute of Mental Health (5P30 MH068579). The authors wish to thank the staff of the Children's Division of the Missouri Department of Social Services in St. Louis. We also acknowledge Penny Stein for her helpful comments on an earlier version of this manuscript.
Contributor Information
Patricia L. Kohl, Brown School of Social Work, Washington University.
Jennifer Schurer, Brown School of Social Work, Washington University.
Jennifer L. Bellamy, School of Social Service Administration, University of Chicago.
References
- Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research. 2004;6:61–74. doi: 10.1023/b:mhsr.0000024351.12294.65.
- Barth RP, Landsverk J, Chamberlain P, Reid JB, Rolls JA, Hurlburt MS, et al. Parent training programs in child welfare services: Planning for a more evidence-based approach to serving biological parents. Research on Social Work Practice. 2005;15:353–371.
- Borduin CM, Mann BJ, Cone LT, Henggeler SW, Fucci BR, Blaske DM, et al. Multisystemic treatment of serious juvenile offenders: Longer term prevention of criminality and violence. Journal of Consulting and Clinical Psychology. 1995;63:569–578. doi: 10.1037//0022-006x.63.4.569.
- Brestan EV, Eyberg SM. Effective psychosocial treatments of conduct-disordered children and adolescents: 29 years, 82 studies and 5,272 kids. Journal of Clinical Child Psychology. 1998;27:180–189. doi: 10.1207/s15374424jccp2702_5.
- California Evidence-Based Clearinghouse for Child Welfare. Scientific rating scale. Retrieved January 27, 2008, from http://www.cachildwelfareclearinghouse.org/scientific-rating/scale.
- Capage LC, McNeil CB, Foote R, Eyberg SM. Parent-child interaction therapy: An effective treatment for young children with conduct problems. The Behavior Therapist. 1998;21:137–138.
- Center for the Study and Prevention of Violence. Blueprints model programs overview. Retrieved March 18, 2008, from the University of Colorado at Boulder Web site: http://www.colorado.edu/cspv/blueprints/modelprograms.html.
- Dishion TJ, Patterson GR. Age effects in parent training outcomes. Behavior Therapy. 1992;23:719–729.
- English DJ, Upadhyaya MP, Litrownik AJ, Marshall JM, Runyan DK, Graham JC, et al. Maltreatment's wake: The relationship of maltreatment dimensions to child outcomes. Child Abuse & Neglect. 2005;29:597–619. doi: 10.1016/j.chiabu.2004.12.008.
- Eyberg SM, Boggs S, Algina J. Parent–child interaction therapy: A psychosocial model for the treatment of young children with conduct problem behavior and their families. Psychopharmacology Bulletin. 1995;31:83–91.
- Eyberg SM, Robinson EA. Parent-Child Interaction Therapy: Effects on family functioning. Journal of Clinical Child Psychology. 1982;11:130–137.
- Farmer EMZ, Compton SN, Burns BJ, Robertson E. Review of evidence base for treatment of childhood psychopathology: Externalizing disorders. Journal of Consulting and Clinical Psychology. 2002;70:1267–1302. doi: 10.1037//0022-006x.70.6.1267.
- Fixsen DL, Naoom SF, Blase KA, Friedman RM. Implementation research: A synthesis of the literature. University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network; Tampa, FL: 2005.
- Henggeler SW, Mihalic SF, Rone L, Thomas CR, Timmons-Mitchell J. Blueprints for violence prevention, Book 6: Multisystemic therapy. Center for the Study and Prevention of Violence; Boulder, CO: 1998.
- Henggeler SW, Schoenwald SK, Borduin CM, Rowland MD, Cunningham PB. Multisystemic treatment of antisocial behavior in children and adolescents. Guilford Press; New York: 1998.
- Hood K, Eyberg SM. Outcomes of Parent–Child Interaction Therapy: Mothers' reports on maintenance three to six years after treatment. Journal of Clinical Child and Adolescent Psychology. 2003;32:419–429. doi: 10.1207/S15374424JCCP3203_10.
- Hurlburt MS, Barth RP, Leslie LK, Landsverk JA, McCrae J. Building on strengths: Current status and opportunities for improvement of parent training for families in child welfare. In: Haskins R, Wulczyn F, editors. Using research to improve policy and practice. Brookings Institution Press; Washington, DC: 2007. pp. 81–106.
- Kazdin AE. Evidence-based treatments: Challenges and priorities for practice and research. Child and Adolescent Psychiatric Clinics. 2004;13:923–940. doi: 10.1016/j.chc.2004.04.002.
- Kazdin AE. Parent Management Training: Treatment for oppositional, aggressive and antisocial behavior in children and adolescents. Oxford University Press; New York: 2005.
- Kazdin AE, Weisz JR. Identifying and developing empirically supported child and adolescent treatments. Journal of Consulting and Clinical Psychology. 1998;66:19–36. doi: 10.1037//0022-006x.66.1.19.
- Lansford JE, Deater-Deckard K, Dodge KA, Bates JE, Pettit GS. Ethnic differences in the link between physical discipline and later adolescent externalizing behaviors. Journal of Child Psychology and Psychiatry. 2004;45:801–812. doi: 10.1111/j.1469-7610.2004.00273.x.
- Luthar SS, Zelazo LB. Research on resilience: An integrative review. In: Luthar SS, editor. Resilience and vulnerability: Adaptation in the context of childhood adversities. Cambridge University Press; New York: 2003. pp. 510–549.
- Mullen EJ, Bledsoe SE, Bellamy JL. Implementing evidence-based social work practice. Research on Social Work Practice. 2008;18:325–338.
- Office of Juvenile Justice and Delinquency Prevention. OJJDP model programs guide. Retrieved July 14, 2008, from http://www.dsgonline.com/mpg2.5//mpg_index.htm.
- Patterson GR, Chamberlain P, Reid JB. A comparative evaluation of parent training procedures. Behavior Therapy. 1982;13:638–650.
- Patterson GR, Reid JB, Eddy MJ. A brief history of the Oregon model. In: Reid JB, Patterson GR, Snyder J, editors. Antisocial behavior in children and adolescents: A developmental analysis and model for intervention. American Psychological Association; Washington, DC: 2002.
- Paulussen-Hoogeboom MC, Stams G, Hermanns JMA, Peetsma TTD. Child negative emotionality and parenting from infancy to preschool. Developmental Psychology. 2007;43(2):438–453. doi: 10.1037/0012-1649.43.2.438.
- Proctor EK, Knudsen KJ, Fedoravicius N, Hovmand P, Rosen A, Perron B. Implementation of evidence-based practice in community behavioral health: Agency director perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2007;34(5):479–488. doi: 10.1007/s10488-007-0129-8.
- Sanders MR, Markie-Dadds C, Tully L, Bor W. The Triple P–Positive Parenting Program: A comparison of enhanced, standard and self-directed behavioral family intervention for parents of children with early onset conduct problems. Journal of Consulting and Clinical Psychology. 2000;68:624–640.
- Sanders MR, Markie-Dadds C, Turner KMT. Theoretical, scientific and clinical foundations of the Triple P–Positive Parenting Program: A population approach to the promotion of parenting competence. Parenting Research and Practice Monograph No. 1. 2003. Retrieved February 14, 2008, from http://www.triplep.net/files/pdf/Parenting_Research_and_Practice_monograph_No.1.pdf.
- Sanders MR, McFarland ML. The treatment of depressed mothers with disruptive children: A controlled evaluation of cognitive behavioural family intervention. Behavior Therapy. 2000;31:89–112.
- Schurer J, Kohl PL, Bellamy JL. Organizational context and readiness for change: A study of community-based parenting programs in one Midwestern city. Administration in Social Work. In press.
- Webster-Stratton C. A randomized trial of two parent training programs for families with conduct-disordered children. Journal of Consulting and Clinical Psychology. 1984;52:666–678. doi: 10.1037//0022-006x.52.4.666.
- Webster-Stratton C. Preventing conduct problems in Head Start children: Strengthening parenting competencies. Journal of Consulting and Clinical Psychology. 1998;66:715–730. doi: 10.1037//0022-006x.66.5.715.
- Webster-Stratton C, Hammond M. Treating children with early onset conduct problems: A comparison of child and parent training interventions. Journal of Consulting and Clinical Psychology. 1997;65:93–99. doi: 10.1037//0022-006x.65.1.93.