Abstract
Policy-makers are looking to after-school programs to improve family and child well-being and are searching for evidence-based ways to improve the quality of after-school programs. This study examines whether the Good Behavior Game, a behavior management curriculum designed for school classrooms, can be easily migrated to academically-focused after-school programs. Our results are based on program observations, qualitative interviews, and ratings of implementation fidelity. We provide a description of the structure and activities in these after-school programs, then identify challenges to implementing and evaluating classroom-based interventions in the after-school setting.
Keywords: after-school, intervention, behavior management, evaluation
Introduction
Policy-makers and educators in the U.S. and other countries increasingly recognize the positive role that after-school programs and extended school day services can play in improving children’s academic, social, behavioral, and health outcomes (Afterschool Alliance, 2004; Mahoney & Zigler, 2006). While these after-school or extended day programs afford numerous opportunities to promote healthy development, research from the U.S. on the impacts of these programs is mixed. Some studies report positive effects of after-school programs on children’s development (Mahoney, Lord, & Carryl, 2005; Posner & Vandell, 1994), while others report few benefits (James-Burdumy et al, 2005; NICHD, 2004). One reason for variation in the effects of after-school programs is variation in the quality of after-school program activities. While there has not been a large, nationally representative assessment of after-school program quality in the U.S., research on U.S. after-school program infrastructure suggests there is room for quality improvement. For instance, many after-school programs in the U.S. pay low wages, have widely varying staff education levels, offer only part-time jobs, and, not surprisingly, have high rates of staff turnover (Seppanen et al, 1993). Without adequate knowledge, infrastructure, and incentives, it will be difficult for after-school program staff to provide the high quality programming that is necessary to achieve beneficial child outcomes. Therefore, researchers have begun looking for evidence-based strategies to improve the quality of after-school programs.
Unfortunately, research on effective strategies to improve the quality of after-school programs is limited. While both the structures and the curriculums in after-school programs tend to differ from those of the traditional school day, there are enough similarities between the two environments that it may be possible to take interventions designed to improve classroom quality and adapt them for the after-school setting. Identifying evidence-based practices from the school improvement literature that can be easily adapted for after-school programs may be an effective way to rapidly increase the program improvement options available to after-school programs.
This study contributes to our knowledge about the feasibility of migrating classroom-based interventions into after-school programs. Data for this study comes from a pilot project in which we implemented the Good Behavior Game (GBG), a classroom-based behavior management system, in eight after-school programs in the U.S. Effective behavior management practices are an important foundation for any kind of youth programming. Behavior management strategies may be particularly necessary for after-school programs because these programs must elicit cooperation from children who have already been through a full school day. In addition, after-school programs often serve children who are at risk of academic or behavioral problems, thereby increasing the importance of a systematic strategy of behavior management (Walker et al, 1996).
The after-school programs in this study were all 21st Century Community Learning Center (21st CCLC) programs, which are federally funded programs designed to boost academic achievement and foster healthy development among at-risk students. We focus on academically-oriented after-school programs in this initial study because we anticipate easier migration when the structures and behavioral expectations in the after-school programs are more “school-like”. While the research for this study comes from the U.S., the implications of our findings should apply to extended school day programs in other countries as well, provided the programs have a structure similar to those described in this study.
Using both qualitative and quantitative data, this study provides insight into our experiences implementing a classroom-based intervention in after-school programs. We present a description of the 21st CCLC programs in our sample. As researchers consider strategies to improve after-school program quality, having a clear sense of the structure and content of different types of after-school programs is essential. We also answer two research questions. First, what are the barriers to implementing the GBG in after-school programs? We document the barriers to implementation that were most salient for program staff, directors, and implementation coaches. To show the extent to which these barriers interfere with implementation, we provide descriptive information on the after-school programs’ level of fidelity with various components of the GBG. Second, what kinds of approaches are needed to effectively evaluate the effects of classroom-based interventions in after-school programs? During data collection, we became particularly interested in the challenges posed by within-program variation in the quality of after-school program activities and staff. We document the extent of within-program variation in after-school program activity quality and discuss the implications for evaluation efforts.
While the sample size for this study is small, there is widespread interest in information on evidence-based practices that can improve the quality of after-school programs. Because of the growing interest in after-school programming, researchers from many related areas are moving into this field. We hope our experiences and findings provide timely assistance to others as they design related projects.
Migrating a School-Based Approach to After-School Programs: The Good Behavior Game
The Good Behavior Game (Barrish, Saunders, & Wolf, 1969) is a simple, empirically supported application of behavior management that improves children’s classroom behavior (see Tingstrom, Sterling-Turner, & Wilczynski, 2006 for a review). The Good Behavior Game (GBG) has been shown to reduce teacher reports of aggression for at-risk boys (Kellam et al, 1994) and to reduce the chances that boys start smoking (Kellam & Anthony, 1998). The GBG has also been combined with other types of interventions to improve child outcomes. For instance, the combination of the GBG with enhanced academic and social curriculums may reduce the likelihood of suspensions and conduct disorder diagnoses (Ialongo, Poduska, Werthamer, & Kellam, 2001), and the combination of the GBG with a parenting component and social skills training may have longer-term effects by delaying first arrests and marijuana use (Eddy, Reid, & Fetrow, 2000). The GBG has been used to influence child behavior in a number of applications from normal classrooms to special education settings with children ranging from preschool to middle school and even with populations of autistic or mentally ill children (Embry, 2002; Tingstrom, Sterling-Turner, & Wilczynski, 2006). The most recent data demonstrates that high quality implementation of the GBG in first-grade classrooms can reduce problem behavior, delinquency, and substance abuse in early adulthood, particularly for males (Kellam, et al., 2008).
The GBG is designed to use a combination of positive peer pressure and positive reinforcement to elicit good behavior from children. In our study, we adapted Embry and colleagues’ (2003) “PAX” version of the GBG. The intervention is relatively simple. During the initial set-up of the GBG, children and teachers use role-playing and story-telling to develop a shared understanding of the behaviors found in an unpleasant school and those found in a wonderful school. Children are then placed on teams. On a daily basis, teachers are expected to announce that the class will be playing the GBG for a specified amount of time. During this time, the number of misbehaviors (or ‘spleems’) is counted. Teams receive positive reinforcement if members of their team behave well. These reinforcements are minor but fun rewards, such as 60 seconds of making animal noises, pencil tapping, stretches, or making faces. Because children enjoy these activities, the GBG often elicits positive peer pressure, with children encouraging their team members to behave well.
Once children have learned to play the game, many other components can be added. For instance, children are assigned roles on teams, such as a “Tootler” (versus a “tattler”) whose job is to write positive reinforcement notes to children who are behaving well. Teachers begin to play “The Secret Game,” an unannounced game intended to generalize the GBG to a longer portion of the day. Embry (2002) has likened the effects of the game to a behavioral vaccine that helps prevent problematic behavior. Therefore, not only is the GBG important for establishing a foundation on which other curriculums can be laid, but teaching children to make good behavior management decisions in a positive environment may have long-term developmental impacts.
Differences between Classrooms and After-School Programs that May Influence the Implementation of Classroom-Based Interventions in After-School Programs
If the activities and behavioral expectations in after-school programs are similar to those found in the classrooms where the GBG has been evaluated, then migration of the GBG into after-school programs should be easy and we should see similar results. If, however, the activities and behavioral expectations are different in after-school programs, the migration may be more difficult and/or the GBG may be less effective. While there will always be challenges implementing a new intervention and fidelity must be carefully monitored (Dumas et al, 2001), classroom-based studies indicate that the challenges to implementing the GBG are not large enough to interfere with its success in school classrooms. Thus in this study we are looking specifically for barriers to implementation that are unique to after-school programs.
After-school programs have traditionally been more informal and recreational than the school day, with opportunities for physical activity, mentoring, social activities, and sometimes homework help. These programs tend to have staff with fewer credentials than school teachers, and activities are sometimes held in outside spaces or large, open spaces such as cafeterias and gyms. But after-school programs are increasingly being used for academic support and enrichment for at-risk youth (Perkins & Borden, 2003; Smith, 2007). The U.S. government spends nearly $1 billion per year on 21st Century Community Learning Centers (21st CCLC), which are academically-oriented after-school programs for at-risk youth (U.S. Department of Education, 2007). The 21st CCLC programs often use school teachers as program staff, and they frequently have children engaged in teacher-directed academic lessons and homework. These activities often entail traditional “school-like” behavior, such as sitting at desks or tables quietly, raising hands to speak, and listening while teachers direct lessons and monitor progress. These activities are similar to activities of the school day where the GBG effectively reduces behavior disturbances. These similarities should make the migration of the GBG into academically-oriented after-school programs relatively easy.
Before we present our results, we include a description of the after-school programs in our study. We focus primarily on whether the activities and behavioral expectations in these after-school programs are “school-like”. This description provides necessary context for our findings about barriers to implementing GBG in after-school programs. While these programs are not a representative sample, the description also provides some insight into the structure of today’s academically-oriented after-school programs.
In addition to behavioral expectations and activities, there are other significant differences between after-school programs and school classrooms that may make the migration of classroom-based interventions into after-school programs challenging. While the literature on migrating evidence-based interventions into existing after-school programs is limited, we can draw from the wider literature about after-school programs to anticipate several potential issues.
Staff turnover rates in after-school programs are estimated to be around 35 – 40% annually (Larner, Zippiroli, & Behrman, 1999; Seppanen, deVries, & Seligson, 1993). These rates are much higher than rates of teacher turnover, which are about 15% in the public schools (National Center for Education Statistics, 2005). Staff turnover is problematic not only because a new staff member needs to learn the GBG, but because of the disruption to the entire program each time a staff member leaves (Halpern, 1999). While some interventions, such as the GBG, may make it easier for new staff to enter the program because there are clear behavioral expectations and strategies for handling misbehavior, programs with high levels of staff turnover may not have the capacity to consistently implement an intervention.
Research also indicates that children’s attendance at 21st CCLC programs can be quite sporadic (James-Burdumy, et al, 2005). Child turnover can be problematic as well for interventions like the GBG, because each new child will need to be taught the game. While having a clear behavior management system in place may help children make the transition to the program quickly, at some point teaching new children the game may overwhelm staff.
Classroom teachers typically have college degrees and teacher certifications, plus experience implementing curriculums designed by others. Education levels of after-school program staff vary widely, with some programs employing school teachers while others employ staff with much lower education levels or rely heavily on community volunteers. Varying levels of staff education may influence the ability of after-school programs to adopt classroom-based interventions; however, the direction of this effect is unclear. Staff with less education may have less experience learning new curriculums and implementing them in their classrooms, making it harder for them to implement the GBG. Or staff with less formal education may be more open to learning new techniques that will help them improve their program, while certified teachers may be less interested in adopting new approaches if they have already learned or developed their own strategy for handling misbehavior.
A key difference between after-school programs and the school classrooms in which the GBG has been tested is the number of adults working with each group of children. During the school day, there is often one teacher leading a classroom that consistently includes the same children. In after-school programs, there are often many staff members, and children often move frequently from one group to another throughout the afternoon. The multiple staff members in after-school programs may make evaluation more challenging. For instance, a measure of implementation fidelity for a classroom indicates the extent to which the teacher implemented the curriculum. For an after-school program, the fidelity of multiple staff members may need to be assessed and somehow aggregated up to the program level. Because the children are often exposed to many different staff members, measuring their exposure to the intervention is challenging.
Similarly, the effects of the intervention on the quality of the program may be more challenging to measure in the after-school setting. Programs can be assessed on both structural measures (e.g., student:staff ratios) and process measures (e.g., warmth, consistency in the environment, and exposure to learning opportunities) (Marshall, 2004). These process measures are important for children’s development (Vandell & Wolfe, 2000) and are often the target of interventions. But generating one measure of program warmth can be challenging if there is substantial within-program variation in the warmth of the staff members.
Methodological research has grappled with measuring both individual and settings-level constructs (e.g. Chan, 1998). Assessment of settings-level change is a more complex enterprise than the assessment of change at the individual level. Often researchers characterize a setting by aggregating individual reports of behavior such as staff practices. Shinn (2000) argues that appropriate assessment of settings should include not only aggregate scores but also measures of the amount of variation in the setting. Because after-school programs often include multiple staff members facilitating several different types of activities during the program, these issues about settings-level measurement become salient.
The amount of variation in quality within after-school programs has not been well documented; therefore, the field has little information about whether this is a problem that needs to be considered when designing evaluations. Perhaps program directors set the tone for their programs and programs have little internal variation in staff quality. In their extensive research on the evidence-based program PATHS, Greenberg and colleagues found that school principals’ acceptance of and enthusiasm for a school-wide social-emotional learning program has significant effects on program fidelity (Kam, Greenberg, & Walls, 2003). However, it may also be that programs have significant internal variation in quality, particularly in after-school programs that include staff with varying education levels or programs experiencing leadership turnover. Our data provides evidence about the amount of within-program variation in our sample of after-school programs, contributing to the ongoing debate about how to assess overall program quality when programs vary internally (e.g. Smith & Hohmann, 2005).
The Current Study
We use qualitative interviews with after-school program staff, directors, and intervention coaches, as well as quantitative data from program observations and on-site coach observations, to understand the strengths and weaknesses of moving classroom-based interventions into after-school programs. We begin by describing the 21st CCLC programs that we observed, in order to understand whether these academically-oriented after-school programs are structured in “school-like” ways, thereby facilitating the transfer of classroom-based interventions. Then we answer the following research questions: (1) What are the barriers to implementing GBG in the after-school setting and did programs achieve implementation fidelity? and (2) What approaches to measurement will allow us to effectively evaluate the effects of classroom-based interventions in after-school programs? In particular, how much do activities within a program vary in quality and how does this influence evaluation?
Our results about the feasibility of migrating GBG into the after-school setting are based on our experiences with this pilot project. A large scale randomized trial of the effects of GBG in after-school programs is currently underway, but results will not be available for several years due to the scope of the evaluation project. Findings from the pilot project on barriers to migrating the GBG into after-school programs and issues to consider in evaluating these settings are timely because there is considerable interest among researchers and foundations in conducting evaluations to identify best practices in after-school programming, and there is pressure on program providers to document the effects of their programs.
The Sample
The data for this study come from the LEGACY (Leading, Educating, Guiding a Community of Youth) Together project. This was a pilot project in which we implemented the GBG in eight 21st Century Community Learning Center programs during the 2006 – 2007 academic year. The after-school programs in our sample were located in both urban and rural counties. All of the programs served primarily poor and minority students, who are disproportionately represented among students needing additional academic assistance and enrichment. The implementation sites received an initial training, all necessary start-up materials, and weekly on-site consultation as they implemented the GBG. The pilot was designed to: a) test the feasibility of conducting a large scale, randomized evaluation of GBG in after-school programs, and b) provide information about the quality of our measures and their means and standard deviations in order to conduct power analyses for the full evaluation project. Because we could test many of our measures (e.g. program quality) without the expense of implementation, we also collected data at four after-school programs in which we did not implement the GBG (measurement-only sites).
Purposive sampling was used to select the programs. Eight of the after-school programs were run by our community partner, who is the provider of 21st CCLC after-school programs in two counties in Central Pennsylvania. In each county, we randomly divided four programs into measurement-only and implementation sites. We also implemented the GBG in four after-school programs from a third county in order to ensure variation in program structure and content in our sample. Our final sample consisted of eight implementation programs and four measurement-only programs.
Because this was a pilot project, a randomized, case-flow design was used with four after-school programs implementing the GBG in the fall of 2006 and four additional after-school programs implementing the GBG in early 2007. As on-site coaching tends to increase fidelity to an intervention (Fixsen et al, 2005), the after-school programs implementing the GBG in the fall received on-site coaching for 10 – 12 weeks. These four after-school programs provided us with data on whether the sites could consistently implement the GBG. The four after-school programs implementing GBG in the winter received on-site coaching for only two to four weeks,1 but still provided us with information about barriers to implementation and evaluation.
Data
Multiple data collection techniques were used including qualitative interviews and formal program observations. Table 1 summarizes the data collection techniques and the research questions they inform.
Table 1.
Data Collection
Type of Data | Collected by | Type of sites collected from | Number of times each site was observed | Information used to inform
---|---|---|---|---
Observational data on program quality and activities | Trained observers | All intervention and measurement sites (12) | 4 intervention sites: 3 times; 4 intervention sites: 1 time; 4 measurement-only sites: 3 times | Description of after-school programs; challenges with evaluation
Observational data on implementation of GBG | Intervention coaches | 4 intervention sites (10 – 12 weeks); 4 intervention sites (2 – 4 weeks) | Weekly | Ability to implement GBG
Qualitative interviews with intervention coaches, site directors, and after-school staff | Researchers | All 3 coaches and all 8 directors or staff from all 8 intervention sites | Once at the end of the intervention | Barriers to implementation; challenges with evaluation
To collect data on program quality and structure, trained observers visited all 12 of the after-school programs to collect baseline observational data. In addition, we conducted follow-up observations for eight after-school programs (four intervention programs and four measurement-only programs). The eight after-school programs with follow-up observations were each observed two more times during the school year for a total of 28 program observations (12 + 8 + 8). Two observers were sent to each program observation to ensure reliability; observers used a series of rating tools that are described below.
In addition, the intervention coaches who provided weekly on-site consultation to the after-school programs recorded the program’s level of implementation fidelity, staff enthusiasm for the GBG, and counts of student misbehavior. Our implementation data focuses on the four after-school programs from which we have 8 – 12 weeks of implementation data; data from the four after-school programs that were followed for a shorter period of time do not provide reliable evidence about whether after-school programs can sustain implementation. These programs are included in our results, however, because they provide useful information about program structure and activities, quality, and barriers to initial implementation.
We also conducted qualitative interviews with all three implementation coaches and with fourteen after-school program staff and directors from these programs. The semi-structured interviews lasted from 30 minutes to two hours and were conducted at the after-school programs or over the telephone.
Analyses and Measures
Given the small sample size, analyses for this paper are descriptive. The following section outlines the methods and measures used to address each of our research questions.
Research Question 1: Barriers to implementation and implementation fidelity
Data on barriers to implementation come from the qualitative interviews. After-school program staff, directors, and the implementation coaches were asked about barriers to implementation, including motivational barriers, adequacy of the training and support, space or timing issues, children’s interest, and the appropriateness of the GBG for after-school programs. These barriers were categorized according to whether they were barriers to implementing the curriculum in any setting (school or after-school) or whether they were barriers to implementation that were unique to after-school programs. The results section focuses on barriers that are unique to implementing the GBG in after-school programs.
To establish the extent to which sites were able to implement the GBG, on-site coaches recorded the level of implementation fidelity they observed at programs during their weekly consultations. Descriptive information on trends in implementation shows the extent to which after-school programs implemented the GBG. To mask the performance of particular after-school programs, data on implementation is aggregated across programs within a county (therefore one line on a graph represents the average implementation score of two after-school programs). Because programs in different counties received consultation for varying lengths of time, data on implementation is presented by county.
We present data on three measures of implementation fidelity. First we show a count of the number of fundamental GBG components that programs used, such as use of the scoreboard, use of prizes, playing secret games, counting misbehavior accurately, and responding to misbehavior with low negative emotion. The maximum score is 8. Sites were also rated on their implementation of advanced Good Behavior Game components, such as using quiet signals to help students monitor their behavior and using notes to reward positive behavior. These components may be used sporadically by programs to enhance the game, but are not necessary for basic implementation. Programs were rated for each of eight advanced components based on whether the coach saw a lot (2), some (1), or none (0) of these components, for a maximum score of 16. Finally, coaches rated the enthusiasm of the staff for the GBG on a scale of 1 to 4 (with 4 indicating high enthusiasm).
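To make the scoring rules concrete, the following is a minimal illustrative sketch of how the three fidelity ratings combine. The component names and the sample ratings are hypothetical, and this is not the project’s actual scoring code; the sketch simply instantiates the rules described above (count of fundamental components, max 8; eight advanced components rated 0–2, max 16; enthusiasm rated 1–4).

```python
# Illustrative sketch of the GBG fidelity scoring described in the text.
# Component names and the sample visit data are hypothetical.

FUNDAMENTAL = ["scoreboard", "prizes", "secret_games", "accurate_counts",
               "low_negative_emotion", "team_rosters", "timed_games", "clear_start"]

def fundamental_score(components_used):
    """Count of fundamental GBG components a coach observed (max 8)."""
    return sum(1 for c in FUNDAMENTAL if c in components_used)

def advanced_score(ratings):
    """Sum of eight advanced-component ratings: 0 (none), 1 (some), 2 (a lot); max 16."""
    assert len(ratings) == 8 and all(r in (0, 1, 2) for r in ratings)
    return sum(ratings)

# One hypothetical weekly coach visit: five fundamental components in use,
# some advanced components, and moderately high staff enthusiasm.
visit = {
    "fundamental": {"scoreboard", "prizes", "accurate_counts",
                    "low_negative_emotion", "timed_games"},
    "advanced": [2, 1, 0, 1, 0, 0, 1, 0],
    "enthusiasm": 3,  # 1-4 scale, with 4 indicating high enthusiasm
}

print(fundamental_score(visit["fundamental"]))  # 5 (out of 8)
print(advanced_score(visit["advanced"]))        # 5 (out of 16)
```

In the actual analyses, weekly scores like these would then be averaged across the two programs in a county, as described above.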
Research Question 2: Approaches to measuring quality in after-school programs: Within-program variation?
Within-program variation in quality is assessed using the Promising Practices Rating Scale (PPRS) developed by Deborah Vandell and colleagues (Policy Studies Associates, 2005; Vandell et al., 2004). Trained observers watched three activities at each after-school program and rated each activity on six components: supportive relations with adults, supportive relations with peers, over-control, chaos, student engagement, and appropriate structure. Each component was rated on a scale of 1 – 4. A full PPRS score was created for each activity by averaging across these six components. Negative components were reverse coded so that higher scores reflect higher program quality. Because three activities were observed for each program, we can compare the quality of these three activities to generate information about within-program variation in quality. Given the small sample size, we document within-program variation in quality by counting the number of programs whose PPRS scores differ by 0.5 or more across their three observed activities.
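The PPRS computation and the within-program variation criterion can be sketched as follows. The ratings are hypothetical, and the sketch assumes the negative components (over-control and chaos) are reverse coded as 5 minus the rating on the 1–4 scale:

```python
# Sketch of the PPRS scoring and within-program variation check described above.
# Activity ratings are hypothetical; reverse coding assumed to be (5 - rating).

NEGATIVE = {"over_control", "chaos"}  # reverse-coded so higher = better quality

def pprs_score(ratings):
    """Average of six component ratings (each 1-4), with negatives reverse-coded."""
    adjusted = [(5 - r if name in NEGATIVE else r) for name, r in ratings.items()]
    return sum(adjusted) / len(adjusted)

def high_within_program_variation(activity_scores, threshold=0.5):
    """True if a program's observed activities differ in PPRS score by >= threshold."""
    return max(activity_scores) - min(activity_scores) >= threshold

# One hypothetical activity: low over-control and chaos yield high reversed scores.
activity = {"adult_relations": 4, "peer_relations": 3, "over_control": 1,
            "chaos": 2, "engagement": 3, "structure": 4}
score = pprs_score(activity)   # (4 + 3 + 4 + 3 + 3 + 4) / 6 = 3.5

# PPRS scores for a program's three observed activities.
program = [3.5, 3.0, 2.8]
print(high_within_program_variation(program))  # True: range of 0.7 >= 0.5
```

A program like the hypothetical one above, with a 0.7-point spread across its activities, would count toward the within-program variation tally.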
Description of the After-School Programs in this Sample
The 21st CCLC programs in our sample began immediately after school and lasted from two to three hours. A few of the programs operated fewer than five days per week. A typical program schedule included a short snack time, an hour of homework, an hour of academic, social-emotional, and/or computer lessons, followed by either dismissal or dinner. Several programs had one day per week for fun activities, such as arts and crafts, Girl Scouts, playing in the gym, or dance instruction. One program ran clubs every day instead of providing homework time. But for a substantial proportion of the after-school program time, children were engaged in academic work. These academic activities were typically teacher-directed. Though some programs used large spaces like gyms and cafeterias, many of the programs split the students into small groups, and homework and lessons were done in regular classrooms, adding to the school-like environment of the after-school programs. Several of the programs used teachers from the school day to staff the programs, further adding to the school-like environment. Only a small minority of activities involved the substantially less structured activities that are often associated with after-school programs, such as playing in the gym.
Because these programs are not a representative sample of 21st CCLC programs, the program description above should not be generalized to all 21st CCLC programs. Instead, it is meant to provide a context for our findings about implementation barriers and evaluation challenges. In academically-oriented after-school programs such as these, many (though not all) activities are structured and have behavioral expectations that resemble a traditional school day. These similarities increase the likelihood of successful migration of interventions that were developed and tested in classrooms into after-school programs.
Results
Research Question 1: What are the barriers to implementing the GBG in after-school programs and can after-school programs achieve implementation fidelity?
Overall, there were few problems implementing the GBG in these after-school programs. Some staff members reported being excited to learn the intervention initially, while others reported skepticism and concern that the GBG would just mean more work. As they learned about the GBG and tried it, most staff members reported liking it and many reported wanting to use it in the future. For instance, one staff person noted:
“… initially… I was skeptical, and needed to see how it was going to play out, and then when I saw how well it motivated the students, then I thought, okay well, I’ll have to be a little bit more open-minded (laughter).”
Staff reported that children enjoyed the GBG as well. One staff person noted:
“I think they enjoy playing the games, and I think they know the consequences, and I think it establishes teamwork as well, very much so.”
Another added that the GBG created a positive environment because in part, children were encouraging each other to behave well:
“…it took them a little bit to get used to it, but after the first couple of weeks, you can hear the kids telling the other team members… ‘Don’t spleem! Don’t spleem!’ and they kind of encouraged each other that way….”
When asked about barriers to implementing the GBG in after-school programs, staff reported some challenges such as facilitating a lesson while simultaneously tracking and recording bad behaviors (known as ‘spleems’ in the GBG). Some also reported that giving the children rewards for good behavior immediately, as the curriculum initially suggested, was sometimes disruptive. Challenges like these, however, are not unique to implementing the GBG in after-school programs since classroom teachers would likely report these challenges as well.
As we describe below, after-school program staff did identify some issues that were unique challenges to implementing the GBG in after-school programs. Areas that we focus on in our results include: the role of the GBG during less academic time, small group sizes, older children, the use of teachers as after-school program staff, and child and staff turnover. Understanding these challenges will be helpful in the future as we design training materials and implement the GBG in other after-school programs, but the barriers generally did not reach a magnitude that inhibited the successful implementation of the GBG in after-school programs such as these. Indeed, many of the after-school programs that reported the barriers listed below also reported interest in using the GBG in the future.
After-school programs often struggle to balance their interest in supporting children’s academic achievement with their interest in meeting children’s other developmental needs. Finding a quality improvement tool that helps staff meet these different goals is challenging. One challenge to using the GBG in after-school programs was that staff members associated the GBG with enforcing classroom-like behavior. This may be because it was easy and effective to use the GBG during homework and academic lessons, or perhaps because the training materials still included many references to schools. Staff members were less inclined to use the GBG during their non-academic times. One staff member touched on this topic several times:
“I think it’s during academics, it’s fine. Because it helps you keep control… But then, preferably, I don’t like to do it at lunch time and snack time because when they’re first coming in, yes they’re a little hard to get settled, but they’ve been through so much, so you need to be a little understanding.”
When asked whether the GBG could be used during more recreational activities, this staff person responded:
“It can be done, but it’s just hard… You know we tried it with table games… and it was like everybody took their turn and, you know, but they didn’t look like they was having no fun! ….. Like they don’t want to do it in, you know this is their fun time, they don’t want to do the homework time thing.”
While these 21st CCLC after-school programs had an academic focus, some of the staff members felt the children needed more time for free play. Indeed, this concern reflects a larger issue about the developmental needs of children during their after-school hours. When children get to after-school programs, they have already been through the whole school day. They may be tired, and particularly if they do not have many recreation opportunities during school, they may need unstructured time to play or exercise. In programs that associated the GBG with academic activities and school-like behavior, the GBG got caught up in this larger debate about children’s developmental needs during after-school time.
Similarly, another teacher noted the more informal roles and relationships that after-school program staff often had with students:
“I like to get involved and play the games with the students in the gym... and if I’m doing that and having fun, … I don’t think the students would appreciate it if I stood up there with a check sheet, and like, I’m watching you, … so that’s why I just never really wanted to do it in the gym.”
The GBG could be adapted to these more recreational activities by defining what counts as bad behavior differently, and the training could include a video showing staff implementing the GBG in a recreational activity. The advanced features of the GBG may be particularly appropriate for less structured activities. For example, the GBG includes hand signals that help students transition from one activity to the next or that inform children how loud their voices can be during each activity. But the broader idea that staff may want a different tool for academic time than for less structured time is important. Implementers should be cognizant of this concern and highlight ways that some components of the GBG might be useful in less formal contexts to provide continuity in behavior management strategies, while other components of the GBG may be less desirable in informal settings.
The less formal relationships between after-school program staff and children noted above are also an important part of after-school programs. Some (though not all) after-school programs have much lower student to teacher ratios than the school-day and the mentoring and individual help that children receive can be a clear benefit of after-school programs. In our pilot, the more deeply engaged the staff members were in an activity with the children, the harder it was to facilitate the GBG. Many staff felt the GBG was most effective during homework time, to settle the children down and keep them focused. But one staff member who provided substantial one-on-one homework help to her students noted that it was difficult to provide this individualized instruction while also facilitating the GBG for the larger group:
“and so many kids have homework, and they need so much help from us… we’ll be helping four kids at once with their homework, it was kind of hard to do it then…”
In addition to having difficulties playing the GBG when working one-on-one with students, staff reported challenges using the GBG when their group sizes fell below six students, something that may not be common during the school day but that may happen in after-school programs. Developing and documenting an adaptation that keeps the key elements of the GBG but allows it to be successfully played with a small group of children will be important.
Staff reported mixed success using the GBG with older children. One teacher who found the GBG effective with older children reported that the “silly” prizes in the GBG provided the older students with a chance to be “goofy” again. However, staff at another program reported having a group of fifth grade students who did not respond to the GBG at all:
“I think GBG works really awesome with the younger kids, but the older kids…. they’ll intentionally make spleems, because it’s fun, … and they know we’re timing them, and there will be kids that don’t care if they ruin it for their teams, … to be honest, it was kind of a frustrating experience with the older students, but the younger students, I think it works absolutely well…”
Strategies such as placing children on a “team of 1” or on mixed-age teams should be further developed when training and supporting staff who work with older children. In addition, the ages of the children in the program should be considered when evaluating the effectiveness of the GBG in after-school programs.
Just as some children did not respond to the GBG, some staff members were not enthusiastic about implementing it. When asked which staff members were least enthusiastic about the GBG, one coach indicated that it was the staff members who were least enthusiastic about their jobs and least effective with the children:
“… you would think the ones that would struggle a little bit more [with bad behavior], they would use this as a tool to boost it, but in this sense, … it actually kept them where they were…. And I think that just had a lot to do, again, with like attitude and motivation of the job overall.”
The coach went on to say:
“… my experience in watching and coaching, the more enthusiastic the staff were, the kids became just like that. I mean, they were modeling for the kids the excitement of the game, and … the competition of winning the game. So… there was improvement from the ones that needed improvement, but there could have been… more improvement.”
While variation in motivation is not unique to after-school programs, in programs that use school-day teachers to staff the after-school program, this motivation issue may be compounded. Classroom teachers already receive trainings on new curriculums as part of their day jobs; some teachers expressed frustration with the frequency of these trainings and may be resistant to additional trainings and curriculums for the after-school program. One staff member, when asked what could be done to win over staff members who were not initially interested, said:
“… there’s still going to be a major percentage that are going to go against the grain and not do what they’re told to do, … and I don’t know whether it’s just because they’ve been in the industry longer or just, I don’t know, they’re just fed up with the administration telling them what to do and how to do it.”
In programs staffed with teachers, then, it may be particularly important to carefully manage how teachers learn about the GBG or any other quality improvement initiative, to learn about the other curriculums on which these teachers are being trained, and to find ways to create a positive group dynamic early in the training to help reduce any initial resistance and frustration. It is important to note that not all teachers were resistant to the intervention. One coach who worked with programs that relied solely on classroom teachers to staff their after-school programs reported:
“They were welcoming of it. They … were like ‘whatever you can give us that’s going to help us, we’ll take it’…. They were inquisitive.”
Indeed several classroom teachers reported taking some of the components of the GBG and migrating them back to their daytime classrooms.
Providing training and coaching to classroom teachers who worked in after-school programs was also challenging. While classroom teachers may have an easy time implementing classroom-based interventions because many have experience learning and implementing curriculums, they are much harder to access for training and coaching. As one coach noted:
“I had this one teacher.. [tell] me, ‘You know, I got here at 6 o’clock this morning and it’ll be 6:30 when I go home.’… It’s like that’s all they do: they come in, they teach school during the day, and then they have after-school. So it’s very difficult… how do you consult with them?”
The coach went on to say:
“… the best thing that I could have done was stay after the program was over and offer… to chat with them if they needed me. Which, a lot of them just, by that time in the day, they just want to get the heck out of there! … they’re tired, and you know, they want to go home, they have lesson planning to do for the regular school day, and … they’re ready to leave.”
Even setting up qualitative interviews with staff and directors in programs that relied on school teachers was challenging due to their extremely busy schedules. Thus, having classroom teachers staff after-school programs does not necessarily mean an easier time implementing a classroom-based curriculum in after-school programs.
One of the key differences between after-school programs and schools is the amount of child and staff turnover that some programs experience. In the pilot project, some sites experienced limited child and staff turnover, and new staff and children were taught the GBG relatively easily. When asked whether it was easy or difficult to teach the GBG to children who joined the program, a staff member reported:
“I wouldn’t say it was easy but it wasn’t hard. It was just more work…. You had to take at least 15 to 20 minutes to explain the game to them.”
The staff member went on to say:
“it was worth it, it was just, it was just tiresome. You know you do, you get tired…. It’s worth it because once the kid is taught then he just falls right into place with the other kids.”
Therefore, small amounts of turnover seem manageable, but it is very likely that there is a threshold above which staff and child turnover will make continuing this type of intervention challenging.
Beyond these main themes, after-school program staff reported a series of minor adaptations, such as advice on how to make teams effective when children are often choosing different activities, are in different groups, or only attend the program inconsistently.
Our quantitative data on implementation fidelity and staff enthusiasm largely support the qualitative finding that it was relatively easy for staff to implement this classroom-based curriculum in after-school programs. Data recorded by coaches during their visits to the after-school programs show that programs from both counties implemented the fundamental elements of the GBG relatively consistently (see Figure 1). Programs were less consistent in their use of the advanced GBG elements (see Figure 2), with some programs using some of the elements and others skipping most of them. These components help create a “culture of GBG” but are not essential for successful implementation. Within the school context, teachers have typically used the advanced elements, so we hypothesized that they could also be used in after-school programs; however, given the short length of this pilot, programs did not implement the advanced elements consistently. Programs that were fundamentally better organized, with regular schedules and staff meetings, were better able to implement the GBG (Carmack, Smith, et al., in progress). Despite skipping the advanced elements, staff members were generally enthusiastic about the GBG throughout the intervention (see Figure 3).
Figure 1. Implementation of Fundamental GBG Elements.
Source: Legacy After-School Developmental Project, 2006 – 2007.
Notes: Implementation of fundamental elements measured on a scale of 0 (implemented no fundamental elements) to 8 (implemented all 8 fundamental elements). Each county line averages data from two after-school programs in order to mask the performance of individual programs.
Figure 2. Implementation of Advanced GBG Elements.
Source: Legacy After-School Developmental Project, 2006 – 2007.
Notes: Implementation of advanced elements measured on a scale of 0 – 16: coaches rated each of 8 advanced elements as 0 (none), 1 (some), or 2 (a lot). Each county line averages data from two after-school programs in order to mask the performance of individual programs.
Figure 3. Staff Enthusiasm for the Good Behavior Game.
Source: Legacy After-School Developmental Project, 2006 – 2007.
Notes: Enthusiasm assessed by GBG coach on a scale of 1 (low enthusiasm) to 4 (high enthusiasm). Each county line averages data from two after-school programs, in order to mask the performance of individual programs.
Research Question 2: What approaches to measurement and evaluation are needed to effectively evaluate the effects of classroom-based interventions in after-school programs?
During this pilot project we tested many measures of program quality that we are using in the full evaluation, thereby gaining experience evaluating the quality of after-school programs. Generally, we found few barriers to collecting data on the implementation of the GBG or on outcomes of interest, such as children’s knowledge of behavioral expectations or their actual observed behavior.
However, a central difference between evaluating a classroom and evaluating an after-school program is the number of staff in the setting. A classroom typically consists of one main teacher and potentially an aide, with whom children stay for most of the day. The after-school programs in our pilot study employed from three to more than fifteen after-school program staff. Instead of staying with one teacher for the whole day, children often moved around: in different peer groups, in different activities, and with different teachers throughout the afternoon.
Table 2 documents the amount of within-program variation in the quality of activities that we observed. Programs were coded as having within-program variation in quality if there was a difference of at least 0.5 on the Promising Practices Rating Scale (a 1 – 4 scale) between activities within a program. For example, if Activity 1 received a score of 3.5 on the PPRS, Activity 2 received a 3.0, and Activity 3 received a 2.8, the maximum within-program difference was 0.7. Conservative and upper-bound estimates of within-program variation in quality were obtained by applying different strategies for aggregating across the two observers’ scores.
Table 2.
Number of Programs with Substantial Within-Program Variation in Quality across Activities
| | PPRS Averaged Across Raters | Rater 1 | Rater 2 | Both Raters Report Variation |
|---|---|---|---|---|
| # of programs with variation at Time 1 (out of 12) | 8 | 7 | 6 | 5 |
| # of programs with variation at Time 2 (out of 8) | 4 | 3 | 5 | 2 |
| # of programs with variation at Time 3 (out of 8) | 6 | 6 | 2 | 3 |
| Sum of Time 1 – Time 3 variation | 18 | 16 | 13 | 10 |
| Total # of program observations across the 3 times | 28 | 28 | 27 | 27 |
| % of observations with variation | 64% | 57% | 48% | 37% |
Source: Legacy After-School Developmental Project 2006 – 2007
Notes: For each program, three activities were observed and rated on the Promising Practices Rating Scale. Each activity was rated by two observers (Rater 1 and Rater 2). Programs were observed on a maximum of three occasions (Time 1, Time 2, and Time 3) over the course of the intervention. Programs are coded as having substantial within-program variation in quality across activities if the PPRS scores for the three activities differ by ≥ 0.5 (on a scale of 1 – 4).
For 37% of program observations, both raters’ scores indicate that they observed activities within the same program that varied by at least 0.5 on the PPRS. When looking at each rater’s scores separately, about half of the programs included this amount of variation. Using an upper-bound estimate, in which within-program variation was assessed after averaging the PPRS scores across both raters, nearly two-thirds of program observations included within-program variation in quality. The inter-rater reliability on this scale was 0.71, placing the “true” amount of variation somewhere between these lower- and upper-bound estimates, but clearly indicating a considerable amount of within-program variation in quality. This within-program variation has several implications for evaluation, which are discussed below.
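To make the coding rule concrete, the threshold check and the two rater-aggregation strategies described above can be sketched in a few lines of Python. This is our illustration, not the study’s analysis code: the function names are ours, and the second rater’s scores are hypothetical.

```python
def max_spread(ppr_scores):
    """Largest difference among a program's activity ratings (PPRS, 1-4 scale)."""
    return max(ppr_scores) - min(ppr_scores)

def has_substantial_variation(ppr_scores, threshold=0.5):
    """Flag a program observation whose activity ratings differ by >= threshold."""
    return max_spread(ppr_scores) >= threshold

# Worked example from the text: three activities rated 3.5, 3.0, and 2.8
activities = [3.5, 3.0, 2.8]
spread = round(max_spread(activities), 2)        # 0.7, the maximum within-program difference
flagged = has_substantial_variation(activities)  # True, since 0.7 >= 0.5

# Two ways to aggregate across the two observers:
rater1 = [3.5, 3.0, 2.8]  # scores from the example above
rater2 = [3.8, 3.4, 3.4]  # hypothetical second rater (spread 0.4, below threshold)

# Conservative (lower-bound) estimate: both raters must flag the program.
conservative = has_substantial_variation(rater1) and has_substantial_variation(rater2)  # False

# Upper-bound estimate: average the raters' scores per activity, then check.
averaged = [(a + b) / 2 for a, b in zip(rater1, rater2)]  # [3.65, 3.2, 3.1]
upper_bound = has_substantial_variation(averaged)         # True (spread 0.55)
```

As the hypothetical example shows, the averaged-scores strategy can flag a program that only one rater would flag individually, which is why it yields the higher (upper-bound) counts in Table 2.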
Discussion
Overall, we are optimistic that the transfer of classroom-based interventions to after-school programs – particularly academically-oriented programs – seems feasible. Staff reported that children enjoyed playing the GBG and the combination of teamwork, positive peer pressure, and positive reinforcers improved children’s behavior.
As expected, some challenges that are unique to implementing the GBG in after-school programs also emerged. After-school staff struggle with striking a balance between meeting children’s academic needs and their needs for free play and mentoring relationships. Interventions designed for school classrooms may need significant adaptations to work during activities that are more recreational. Training materials need to be developed that show the GBG being used in less academic activities and implementation coaches should be prepared to help programs as they identify ways to use the GBG during these less academic times.
Because both staff and children associated the GBG with the academic component of the program, there was some resistance to using it during more informal program time. This resistance was often associated with staff members’ broader ambivalence about requiring such “school-like” behavior during the after-school hours. After-school program staff recognize the importance of the individualized attention and informal mentoring roles that are often possible in after-school programs. Classroom-based interventions that interfere with staff members’ ability to provide this one-on-one attention and informal mentoring may meet staff resistance. Interventions need to be adapted in ways that help staff members use the curriculum when needed without interfering in these less formal but developmentally important roles.
We did not have any recreation programs or purely mentoring programs in our sample, providing us with little evidence about whether these programs can successfully implement and use the GBG. It would be interesting, in future research, to see whether staff members in less academic programs could implement the GBG. It is possible that staff struggle with the back and forth between academic and mentoring roles when they try to perform both, but that staff in purely recreational programs might be able to adapt the GBG and implement it without the tensions we observed in our sample.
Working with certified teachers who were also employed during the school day presented both advantages and challenges. Teachers easily learned the curriculum, likely due to their prior experience with curriculums and their knowledge of the theory behind them. But accessing the teachers for training, coaching, and data collection was challenging. Some suggestions for working with teachers in after-school programs include using program time for training and coaching, providing training during the summer, and designating a staff member as master interventionist to be the liaison for the research and intervention team. The master interventionist should not be the program director, as directors are often dealing with multiple daily issues.
In addition, while some teachers and program staff were open to learning a new tool that might help them, some school teachers were frustrated with the request to learn another curriculum, given the pressures on school teachers during the day and the frequency with which teachers in some schools are asked to learn and implement new curriculums for their day jobs. Managing this dynamic during the early stages of the intervention is extremely important. Careful recruitment efforts that make the intervention seem like a benefit, not a burden or a penalty for poor performance, may reduce this resistance.
Smaller challenges that were noted included using the GBG with small group sizes and with older children. Some adaptations to the curriculum for after-school programs should include revamping the rewards so they are meaningful for older children, describing ways to use GBG with smaller group sizes, and encouraging children to take leadership roles so that the GBG can be played while staff members are working one-on-one with students.
Finally, the presence of multiple staff members in the setting creates a clear challenge to effectively evaluating the effects of classroom-based interventions in after-school programs. Our data showed that many programs included within-program variation in quality across activities and staff members. In addition, children often move from one staff member to another over the course of the program, thereby exposing them to various levels of implementation fidelity and programming. This variation is not surprising, but researchers are still grappling with how to address these questions well in research on after-school programs (Smith & Hohmann, 2005).
First, we need to think carefully about how to document the overall quality of an after-school program when multiple staff members are present. Some measurement tools ask program observers to provide one score for the whole program. When multiple staff members are present, this requires observers to essentially average in their heads across the various contexts they have observed to generate a program-level score. Other tools require observers to record one score for each staff member or for each activity observed, but further research is necessary to understand the best way to aggregate from the individual staff person or activity to the program level. Various strategies can be used to aggregate the data, from selecting the scores of the staff member with whom children spend the most time to averaging across scores in the environment. Or it may be that variation within a program is not something to handle statistically, but has substantive meaning.
Second, this within-program variation has implications for data collection aimed at measuring changes in after-school program quality. If different staff members or different types of activities are observed pre-intervention and post-intervention, changes in the measure of program quality may not be caused by a true change in the quality of programming provided, but instead may be caused by simply measuring less effective staff pre-intervention and more effective staff post-intervention. With a large enough sample of programs, this measurement error should be random and should not influence results. But studying multiple programs is expensive. Evaluators need to intentionally manage the way their observers choose which program staff, program activities, and even days of the week they will observe to ensure that any error is randomly distributed and that program effects can be identified with a reasonable sample size.
The findings from this study must be tempered by some limitations. First, as in most pilot projects, our sample size was small; a larger study is necessary to fully test the implementation and effectiveness of the GBG in after-school programs. Second, our data on implementation fidelity come from only four after-school programs, making our results about implementation fidelity encouraging but not conclusive.
The findings from this study provide support for the notion that interventions designed and evaluated in school classrooms can be migrated into academically-oriented after-school programs. As more programs are set up in schools, and as governments increase their use of after-school programs to support academic achievement, it is likely that the number of programs to which classroom-based interventions could easily transfer is growing. While further research is warranted, we encourage researchers who are looking for ways to improve the quality of after-school programs to see whether interventions that have been successful in schools may meet their needs. This type of science migration may be an efficient and expedient way to provide the after-school field with effective quality improvement tools.
Policy and practice implications.
- The migration of programs designed and tested in school-day classrooms into after-school programs can be successful, particularly when programs have an academic focus. After-school programs seeking evidence-based ways to improve their program quality may find helpful tools in the education field.
- There is still ambivalence among after-school program staff about the goals of after-school programs and how best to meet children’s social, emotional, physical, and academic needs.
- When school teachers are employed in after-school programs, identifying times when they are available for training and technical assistance is challenging.
- There can be considerable variation in the quality of activities and staff members in the same after-school program. Evaluators must be aware of this variation as they design their evaluation methods.
Acknowledgments
We are grateful for support for this project from the William T. Grant Foundation (Grant #7392) and the Children, Youth and Families Consortium at Pennsylvania State University. We are also extremely grateful to the after-school program staff and children for participating in this project, and to the undergraduate and graduate students, staff, and coaches who worked hard to implement the intervention and collect the data.
Biographies
Dr. Kathryn Hynes is an assistant professor of Human Development and Family Studies at Pennsylvania State University. Her research focuses on the work-family decisions of parents and their implications for children, with particular attention given to the influence of social policies, social institutions, and current economic and social conditions on both parental decision-making and child outcomes.
Dr. Emilie Phillips Smith is a professor of Human Development and Family Studies and the director of the Center for Family Research in Diverse Contexts at Pennsylvania State University. Dr. Smith’s work is in the area of preventing youth problem behavior and promoting positive child and family development. Dr. Smith is particularly interested in approaches that develop partnerships across the home, school, and community contexts with attention to socio-cultural influences. She has been involved in a number of large-scale local and national prevention initiatives.
Dr. Daniel F. Perkins is a professor of family and youth resiliency and policy at the Pennsylvania State University. Dr. Perkins work involves teaching, research, and outreach through the Penn State Cooperative Extension Service. His scholarship involves the integration of practice and research into three major foci: (1) Positive Youth Development – decrease risks and increase skills and competencies of youth through evidence-based programs; (2) Healthy Family Development – increase resiliency through evidence-based, strength-based educational programming; and (3) Community Collaboration – promote strategies for mobilizing communities in support of children, youth, and families.
Footnotes
This paper was presented at the Society for Community Research and Action conference, Pasadena, CA, June 2007.
The short coaching period and implementation data collection period for the last four sites was due to the timeframe of the pilot project.
Contributor Information
Kathryn Hynes, Assistant Professor of Human Development & Family Studies, Pennsylvania State University, S118 Henderson Building, University Park, PA, 16802, kbh13@psu.edu Phone: 814-863-6422.
Emilie Phillips Smith, Professor of Human Development & Family Studies, Pennsylvania State University.
Daniel Perkins, Professor of Family and Youth Resiliency and Policy, Pennsylvania State University.
References
- Afterschool Alliance. America after 3pm: A household survey on afterschool in America, executive summary. 2004 Accessed on-line at: http://www.afterschoolalliance.org/press_archives/america_3pm/Executive_Summary.pdf.
- Barrish H, Saunders M, Wolf M. Good behavior game: Effects of individual contingencies for group consequences on disruptive behavior in a classroom. Journal of Applied Behavioral Analysis. 1969;2:119–124. doi: 10.1901/jaba.1969.2-119. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bodilly S, Beckett MK. Making out-of-school-time matter: Evidence for an action agenda. Santa Monica, CA: RAND Corporation; 2005. Available on-line at: http://www.wallacefoundation.org/WF/KnowledgeCenter/KnowledgeTopics/Out-Of-SchoolLearning/MakingOutofSchoolTimeMatter.htm. [Google Scholar]
- Chan D. Functional relations among constructs in the same content domain at different levels of analysis: A typology of composition models. Journal of Applied Psychology. 1998;83:234–246. [Google Scholar]
- Dumas JE, Lynch AM, Laughlin JE, Smith EP, Prinz RJ. Promoting intervention fidelity: Conceptual issues, methods, and preliminary results from the EARLY ALLIANCE Prevention Trial. American Journal of Preventive Medicine. 2001;20:38–47. doi: 10.1016/s0749-3797(00)00272-5. [DOI] [PubMed] [Google Scholar]
- Durlak JA, Weissberg RP. The Impact of After-School Programs That Promote Personal and Social Skills. Chicago, IL: Collaborative for Academic, Social, and Emotional Learning; 2007. [Google Scholar]
- Eddy J, Reid J, Fetrow R. An elementary school-based prevention program targeting modifiable antecedents of youth delinquency and violence: Linking the interest of families and teachers (LIFT) Journal of Emotional and Behavioral Disorders. 2000;8:165–176. [Google Scholar]
- Embry D. The Good Behavior Game: A best practice candidate as a universal behavioral vaccine. Clinical Child and Family Psychology Review. 2002;5:273–297. doi: 10.1023/a:1020977107086. [DOI] [PubMed] [Google Scholar]
- Embry DD, Straatemeier G, Richardson C, Lauger K, Mitich JE. The PAX Good Behavior Game. Center City, MN: Hazelden; 2003. [Google Scholar]
- Halpern R. After-school programs for low income children: Promise and challenges. The Future of Children. 1999;9:81–95. [PubMed] [Google Scholar]
- Ialongo N, Poduska J, Werthamer L, Kellam S. The distal impact of two first grade preventive interventions on conduct problems and disorder in early adolescence. Journal of Emotional and Behavioral Disorders. 2001;9:146–160. [Google Scholar]
- James-Burdumy S, Dynarski M, Moore M, Deke J, Mansfield W, Pistorino C. When Schools Stay Open Late: The National Evaluation of the 21st Century Community Learning Centers Program: Final Report. 2005 Available at http://www.ed.gov/ies/ncee.
- Kam CM, Greenberg M, Walls CT. Examining the role of implementation quality in school-based prevention using the PATHS curriculum. Prevention Science. 2003;4:55–63. doi: 10.1023/a:1021786811186. [DOI] [PubMed] [Google Scholar]
- Kellam S, Anthony J. Targeting early antecedents to prevent tobacco smoking: Findings from an epidemiologically based randomized field trial. American Journal of Public Health. 1998;88:1490–1495. doi: 10.2105/ajph.88.10.1490. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kellam SG, Brown HC, Poduska JM, Ialongo NS, Wang W, Toyinbo P, Petras H, Ford C, Windham A, Wilcox HC. Effects of a universal classroom behavior management program in first and second grades on young adult behavioral, psychiatric, and social outcomes. Drug and Alcohol Dependence. 2008;95:S5–S28. doi: 10.1016/j.drugalcdep.2008.01.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kellam SG, Rebok GW, Ialongo N, Mayer LS. The course and malleability of aggressive behavior from early first grade into middle school: Results of a developmental epidemiologically-based preventive trial. Journal of Child Psychology & Psychiatry and Allied Disciplines. 1994;35:259–281. doi: 10.1111/j.1469-7610.1994.tb01161.x. [DOI] [PubMed] [Google Scholar]
- Kellam SG, Ling X, Merisca R, Brown CH, Ialongo N. The effect of the level of aggression in the first grade classroom on the course and malleability of aggressive behavior into middle school. Development and Psychopathology. 1998;10:165–185. doi: 10.1017/s0954579498001564. [DOI] [PubMed] [Google Scholar]
- Larner MB, Zippiroli L, Behrman RE. When school is out: Analysis and recommendations. The Future of Children. 1999;9:4–20. [PubMed] [Google Scholar]
- Mahoney JL, Lord H, Carryl E. An ecological analysis of after-school program participation and the development of academic performance and motivational attributes for disadvantaged children. Child Development. 2005;76:811–825. doi: 10.1111/j.1467-8624.2005.00879.x. [DOI] [PubMed] [Google Scholar]
- Mahoney J, Zigler E. Translating science to policy under the No Child Left Behind Act of 2001: Lessons from the national evaluation of the 21st-Century Community Learning Centers. Journal of Applied Developmental Psychology. 2006;27:282–294. [Google Scholar]
- Marshall N. The quality of early child care and children’s development. Current Directions in Psychological Science. 2004;13:165–168. [Google Scholar]
- National Center for Education Statistics. Special Analysis 2005: Mobility in the teacher workforce. The Condition of Education. 2005 Accessed on-line 6/20/09 at http://nces.ed.gov/programs/coe/2005/analysis/sa07.asp.
- NICHD Early Child Care Research Network. Are child developmental outcomes related to before and after-school care arrangements? Results from the NICHD Study of Early Child Care. Child Development. 2004;75:280–295. doi: 10.1111/j.1467-8624.2004.00669.x. [DOI] [PubMed] [Google Scholar]
- Perkins DF, Borden LM. Key elements of community youth development programs. In: Villarruel FA, Perkins DF, Borden LM, Keith JG, editors. Community youth development: Practice, policy, and research. Thousand Oaks: Sage Publications; 2003. pp. 327–340. [Google Scholar]
- Policy Studies Associates, Inc and Wisconsin Center for Education Research. Study of Promising After-School Programs: Observation Manual for Site Verification Visits. 2005 Available on-line at http://www.wcer.wisc.edu/childcare/pdf/pp/observation_manual_spring_2005.pdf.
- Posner J, Vandell D. Low income children’s after-school care: Are there beneficial effects? Child Development. 1994;65:440–456. [PubMed] [Google Scholar]
- Rosenthal R, Vandell DL. Quality of care of school-aged child care programs: Regulatable features, observed experiences, child perspectives, and parent perspectives. Child Development. 1996;67:2434–2445. [PubMed] [Google Scholar]
- Scott-Little C, Hamann MS, Jurs SG. Evaluations of after-school programs: A meta-evaluation of methodologies and narrative synthesis of findings. American Journal of Evaluation. 2002;23:387–419. [Google Scholar]
- Seppanen P, Love J, deVries D, Bernstein L, Seligson M, Marx F. National Study of Before- and After-School Programs. 1993 Available on-line at http://eric.ed.gov/ERICDocs/data/ericdocs2/content_storage_01/0000000b/80/25/bc/b0.pdf.
- Shinn M. Community psychology: Methods of study. In: Kazdin AE, editor. Encyclopedia of psychology. Vol. 2. Washington, DC: American Psychological Association; New York: Oxford University Press; 2000. pp. 215–219. [Google Scholar]
- Smith EP. The role of afterschool programs in promoting positive youth development. Journal of Adolescent Health. 2007;43:219–220. doi: 10.1016/j.jadohealth.2007.06.010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Smith C, Hohmann C. Full findings from the Youth PQA validation study. High/Scope Educational Research Foundation. 2005 Accessed 6/29/09 on-line at: https://secure.highscope.org/file/EducationalPrograms/Adolescent/ResearchEvidence/WebFinalYouthPQATechReport.pdf.
- Tingstrom DH, Sterling-Turner HE, Wilczynski SM. The Good Behavior Game: 1969–2002. Behavior Modification. 2006;30:225–253. doi: 10.1177/0145445503261165. [DOI] [PubMed] [Google Scholar]
- U.S. Department of Education. 21st Century Community Learning Centers: Purpose. 2007 Accessed on-line 4/30/07 at http://www.ed.gov/programs/21stcclc/index.html.
- Vandell DL, Reisner ER, Brown BB, Pierce K, Dadisman K, Pechman EM. The study of promising after-school programs: Descriptive report of the promising programs. Report to the Charles Stewart Mott Foundation. 2004 [Google Scholar]
- Walker H, Horner R, Sugai G, Bullis M, Sprague J, Bricker D, Kaufman M. Integrative approaches to preventing antisocial behavior patterns among school-age children and youth. Journal of Emotional and Behavioral Disorders. 1996;4:194–209. [Google Scholar]