Author manuscript; available in PMC: 2020 Apr 4.
Published in final edited form as: J Technol Hum Serv. 2019 Apr 4;37(4):293–314. doi: 10.1080/15228835.2019.1588190

Using Connected Technologies in a Continuous Quality Improvement Approach in After-school Settings: The PAX Good Behavior Game

Yasemin Cava-Tadik 1, Emilie Phillips Smith 2, Dian Yu 3, Megan Leathers 4, Jaelyn R Farris 5
PMCID: PMC6936749  NIHMSID: NIHMS1018053  PMID: 31889926

Abstract

This demonstration study explored the use of connected technologies in a continuous quality improvement (CQI) approach to implementing evidence-based practices in after-school programs. A focus group with staff indicated enjoyment of the technology and offered feedback for future development. Ecological momentary assessments (EMA) were gathered daily. Three randomized conditions were compared across 4 programs and 12 staff implementing the PAX Good Behavior Game (PAX GBG). ANOVA, post-hoc Tukey, and chi-square analyses indicated that the tech-enhanced condition showed greater use of scoreboards than the in-person condition, while the two were similar in game length. Both were superior to the control condition in the use of behavioral strategies, highlighting the promise of technology in capacity-building.

Keywords: after-school, coaching, mobile-technology, ecological momentary assessment, continuous quality improvement, technical assistance, implementation, prevention, children, staff

The Significance of After-school

After-school is a critical time for youth: the majority of illegal behavior by minors in the United States occurs between 3 pm and 7 pm (OJJDP Statistical Briefing, 2018; Snyder & Sickmund, 2006). An important issue facing families is the supervision of their children after school. According to the Bureau of Labor Statistics (2018), over 71% of mothers with children under age 18 work outside the home, as do both parents in 62% of married-parent families. After-school programs have blossomed in the United States (U.S.) because they provide important support for students and families, often helping to connect them to critical resources within the community (Maynard, Peters, Vaughn & Sarteschi, 2013; E. Smith & Bradshaw, 2017a).

Over the past two decades, research has demonstrated the benefits of after-school programs for youth development (Hillman et al., 2014; Huang, Gribbons, Kim, Lee & Baker, 2000; Newell, Zientek, Tharp, Vogt & Moreno, 2015; Posner & Vandell, 1994, 1999; E. Smith, Witherspoon, & Osgood, 2017b; Zarrett, Abraczinskas, Skiles Cook, Wilson & Ragaban, 2018). However, evaluating the effectiveness of after-school programs is difficult due to limited reporting of intervention fidelity data and inadequate training and support (Maynard et al., 2013). A recent study by E. Smith, Osgood, Oh & Caldwell (2018) emphasized the importance of implementation fidelity in fostering quality after-school programs that promote positive youth development.

Using Evidence-Based Practices in Quality After-School Programs

After-school is a significant context in which fostering the professional development of adult staff can lead to higher quality after-school programs (Akiva, Li, Martin, Horner, & McNamara, 2017; C. Smith et al., 2012; E. Smith, Osgood, Oh & Caldwell, 2018). Quality in after-school programs has often been operationalized as supportive relationships, intentional programming, and strong partnerships, drawing on a body of qualitative and quantitative studies (Eccles & Gootman, 2002; Katoaka & Vandell, 2013; Palmer, Anderson, & Sabatelli, 2009; Yohalem & Wilson-Ahlstrom, 2010). Moreover, researchers have found that after-school programs (ASPs) implementing evidence-based practices are the most beneficial in terms of youth outcomes (Durlak, Weissberg & Pachan, 2010; Gottfredson et al., 2004; Pierce, Bolt & Vandell, 2010; Tebes et al., 2007).

Unfortunately, there are challenges to implementing evidence-based practices and fostering adult staff professional development in order to establish quality programs. The demands of training, turnover among staff members, and lack of interest in organized activities can make implementation difficult (Kennedy, Wilson, Valladares & Bronte-Tinkew, 2007; Lauver, Little, & Weiss, 2004). Fortunately, technology-based approaches are being developed in the field to foster a process of continuous quality improvement (CQI), that is, assessment, training, coaching, and data feedback, approaches that have demonstrated effects on adult staff (Browne, 2015; Fixsen, Blase, Metz & Van Dyke, 2013; Sheldon, Arbreton, Hopkins & Grossman, 2010). Scholars surmise that initiatives that build the capacity of youth program leaders and staff to engage in data-informed processes result in increased staff program monitoring and improved practices (Halgunseth et al., 2012; Wandersman et al., 2008). The value of these approaches is that they go beyond single workshops to investigate how to implement and sustain quality evidence-based practices in after-school settings. As such, connected technologies (e.g., web-based training, text prompts and messages, and rapid data input and feedback) could be helpful, convenient tools for continuous staff training and support.

Technology and After-school Learning Settings

Today, the vast majority of people have access to technological devices. While 70% of young adults and 14% of seniors used the internet in 2000, 96% of young adults and 58% of older adults reported using it in 2015 (Serrano et al., 2017). From 2008 to 2014, broadband internet access decreased by 12%, while internet access via mobile devices increased by 53% (Serrano et al., 2017). The current study focuses on understanding how technology can be applied to implementing evidence-based practice in after-school programs. This topic is under-researched and could provide more convenient and economical solutions for the field (Branscum, Housley, Bhochhibhoya & Hayes, 2016).

Mobile technology supplies novel approaches to learning (Boticki, Baksa, Seow & Looi, 2015; Zhang, Wang, de Pablos, Tang & Yan, 2015). It fosters more collaborative and flexible learning environments in terms of when and where learning can occur (Boticki et al., 2015; Crompton, Burke, Gregory & Grabe, 2016; Peña-Ayala & Cárdenas, 2016; Shuib, Shamshirband & Ismail, 2015; Wu et al., 2012; Zydney & Warner, 2016). Connected technologies, such as online learning, video conferencing, and text messages allow learners to connect to each other and to experts (Hew & Brush, 2007) and have been used effectively in the fields of education and prevention to provide training and technical assistance and to promote beneficial practices (Downer, Kraft-Sayre, & Pianta, 2009; Vernon-Feagans, Kainz, Hedrick, Ginsberg & Amendum, 2013; Warren, et al., 2006). Mobile learning makes information and online resources more accessible to learners, and fosters greater social interaction (Suanpang, 2012).

In a study of higher-quality and lower-quality after-school sites, the majority of staff and managers at the lowest-performing programs reported inadequate access to technology (e.g., internet and computers), whereas staff and managers at the highest-performing programs reported satisfactory access (Khashu & Dougherty, 2007). Even when staff members have access to technology, they do not routinely use it, often for lack of sufficient time to use it properly (Clark, 2001). However, when staff members receive training on how to use technology, they tend to integrate it into their routine practices (Adada, 2007).

Technology might also be beneficial in improving the effectiveness of youth-serving after-school programs. In fact, some research has already shown improved academic performance and reduced substance use in after-school settings via technology-assisted or technology-based programs (Craig et al., 2013; Hu et al., 2012; Schinke, Schwinn, & Ozanian, 2005). Schinke et al. (2005) used a 10-session CD-ROM prevention program that raised awareness of the harmful effects of alcohol through games, videos, and graphics. Results of this study demonstrated that the tech-enhanced program helped to reduce substance use and increase awareness of the harmful impact of alcohol use (Schinke et al., 2005). However, adults in youth-serving organizations require support and technical assistance in order to build their capacity to integrate evidence-based practices into after-school settings (Junge & Manglallan, 2001).

The PAX Good Behavior Game: Fostering Settings Change in After-School

One example of an evidence-based practice tested in after-school is the Good Behavior Game, a universal prevention program designed 40 years ago to increase academic performance and to decrease disruptive behaviors among elementary school-aged children (Barrish, Saunders, & Wolf, 1969; Kellam et al., 2008; E. Smith et al., 2017b; Weis, Osborne, & Dean, 2015). The Paxis Institute has developed a pre-packaged, manualized, commercialized version called the PAX Good Behavior Game (PAX GBG; "pax" means peace in Latin). Referred to as a "behavioral vaccine" (Embry, 2002; Embry, 2004), PAX GBG is a peer-assisted intervention (Kohler & Strain, 1990) that relies on interdependent group contingencies to improve children's behavior and self-regulation (Maggin, Johnson, Chafouleas, Ruberto, & Berggren, 2012; Wright & McCurdy, 2012). The strategy is based on social learning theory (Bandura, 1977), in which desirable behaviors are reinforced, encouraged, and maintained by modeling appropriate behaviors and by providing contingent group activity rewards for self-regulation and peer group co-regulation. Moreover, in addition to adults providing appropriate structure and support, "positive peer pressure" or "peer co-regulation" also plays a key role (Embry, 2002), because children can encourage their teammates to engage in pro-social behaviors that benefit both themselves and the group.

In previous research, when PAX GBG was properly implemented in ASP, the programs improved in quality (i.e. less adult harshness, increased appropriate structure, adult support, and engagement) and participating youth reported more prosocial behavior (i.e., sharing, caring, and listening), and less hyperactivity (E. Smith et al., 2018; E. Smith, et al., 2017b). Overall, proper implementation of the evidence-based PAX GBG in the after-school context improved program quality and youth developmental outcomes.

Even an evidence-based intervention, however, can yield diminished returns if poorly run (Durlak & DuPre, 2008). Technology may be a way to combat the barriers to effectively implementing evidence-based programs during after-school hours. In terms of approaches to implementing evidence-based practices effectively and efficiently, the Interactive Systems Framework is a conceptual model suggesting that the prevention support system (PSS; technical assistance to staff) and the prevention delivery system (PDS; staff implementing prevention programming) are critical to effective implementation (ISF; Wandersman et al., 2008). Moreover, Sheldon, Arbreton, Hopkins and Grossman's (2010) model for continuous quality improvement (CQI) demonstrates the value of consistent training, on-site observation, and coaching (i.e., the prevention support system), as well as ongoing program data collection, analysis, and feedback, in fostering optimal levels of implementation. In traditional models of training and implementation, experts provide face-to-face training to staff in order to foster their use of evidence-based practices (Bertram, Blase, & Fixsen, 2015; Flaspohler, Lesesne, Puddy, Smith & Wandersman, 2012). However, it can be challenging to ensure proper implementation due to the physical distance between the ASP sites and the trainers and the limited money and time available for additional support personnel. Ecological momentary assessments (EMA), in which staff provide "in the moment, real time" data on implementation and receive regular feedback on their practice, might prove helpful in fostering implementation of evidence-based practices (Smyth & Stone, 2003). Thus, using mobile technology to implement evidence-based practices may be an effective strategy, in that mobile technology might be less expensive and equally or more effective than face-to-face training, at least in some settings (Foster & Jones, 2006; Fullard, Fowler, & Gray, 1987; Walters, Wright, & Shegog, 2006).

In this pilot demonstration study, we test the effectiveness of using technology to support after-school program staff in implementing evidence-based practices. In doing so, we build upon the ISF prevention support system conceptual model to test whether training using connected technologies (i.e. online training videos, text message reminders and feedback) can be as effective as face-to-face approaches in fostering quality implementation of evidence-based practices (the prevention delivery system). The conceptual model guiding this study is shown in Figure 1.

Figure 1.

Figure 1.

Conceptual model of CQI approach

In our conceptual model of utilizing technology for effective implementation of PAX GBG in ASP, the training (the Prevention Support System) is delivered via mobile devices provided to staff, on which staff enter brief ecological momentary assessment data (Smyth & Stone, 2003) on the frequency and duration of their implementation immediately following the conclusion of daily programming. Coaches examine staff data and observe staff practice using onsite videos (PSS); within a one-week interval, coaches provide feedback to staff to further improve practice (Prevention Delivery System). In essence, our model suggests that effective training and continuous quality improvement can be achieved via mobile technology without the training experts being on site. This model is compared to traditional in-person modalities and to program sites with no additional technical assistance, comprising a treatment-as-usual control group.

Research Questions

With the increasing popularity of and access to mobile technology, this study explores the effectiveness of technology in fostering implementation of evidence-based practices through the training of after-school program staff. We were specifically interested in:

  1. The degree to which technology could be used to enhance the training and technical assistance provided to after-school caregivers using PAX GBG. We provide both descriptive qualitative information and quantitative data to address this question.

  2. Whether technological approaches could be as effective as in-person training and technical assistance in fostering implementation of PAX GBG and superior to a treatment-as-usual control condition. We conduct multiple statistical tests to examine initial efficacy of technological support.

Given that our previous research has demonstrated the benefits of quality implementation of PAX GBG for both the after-school setting and youth developmental outcomes, our research questions concern the degree to which technology can foster implementation of PAX GBG. In our previous work, training and technical assistance were provided in person, which allowed our coaches to develop considerable trust and sensitivity to the issues facing staff. In this study, we were curious to learn whether technology, which might be perceived as slightly less personal but is more accessible (e.g., videos for training and boosters, more instantaneous provision of implementation data), might be as effective as our in-person approaches. We provide qualitative and quantitative descriptive information on staff perceptions of the use of PAX GBG and the support delivered via connected technologies. We also explore whether both tech-enhanced and in-person implementation of PAX GBG were superior to the normal routines of after-school programs (i.e., a control condition potentially using evidence-based practices).

Methods

This study is part of the LEGACY Together After-school Research Project. LEGACY Together was a multi-site randomized control trial examining the PAX Good Behavior Game. Subsequent to our randomized trial, the aims of this smaller pilot demonstration study were to test the feasibility and implementation of a technological approach to training and coaching in PAX GBG, which was found to be effective in the randomized trial.

We partnered with a small, community-based foundation within a 30-mile radius of the university that provided free after-school programming to children at four sites: three in housing projects and one in a local church. The program provider was located in a small city with a population of 46,320, where 18% of residents were living below the poverty level. Approximately 93% of the city was demographically White/European American, 3% Black or African American, and the remaining 4% Hispanic or Latino, Mixed Race, Asian, or Other. Having sites proximal to the university allowed us to deliver both the in-person and technological components. A total of 123 children were enrolled in the programs, which ran from approximately 3–6 pm every weekday.

Design

An experimental design was used in which programs were randomly assigned to condition via a coin toss conducted with participating staff from each program. To gather information on the smartphone app prototype and more fully test the innovative technological approach in this pilot project, two sites were assigned to the tech-enhanced coaching condition, one to the in-person coaching condition, and one to the business-as-usual control condition. All staff at the tech-enhanced and in-person coaching sites received two trainings in PAX GBG: one at the beginning of December after pre-data collection, and a refresher in January.

The in-person model included multiple interactive 3-hour training sessions and 8 weekly coaching sessions that occurred onsite at the program. Because after-school staff are often part-time with other job or educational demands, the coach was flexible in meeting staff before or after the program, often providing some assistance with routine activities and modeling of PAX GBG during programming. The in-person site received a coach visit once a week over 8 weeks for an entire program day. The coach provided the staff with a calendar upon which the staff could indicate how many strategies were used, the number of games played, the number of teams that “won,” and the prizes given to winners as ways of verifying and monitoring implementation. The coach would discuss with the staff their progress or any issues that arose.

The tech-enhanced sites were involved in testing a prototype of a smartphone application and website designed to assist after-school staff in their implementation of PAX GBG. The training model was similar to that of the in-person site in that training was provided in person, in order to prepare staff for the future use of the technology. The distinction was that coaching was provided via a process in which staff used project-provided smartphones to view online videos of programmatic strategies and to videotape their use of the PAX GBG strategies (including only consenting children). These videos were then uploaded to a secure, password-protected project coaching website, where the PAX GBG coach at the university could view the videos and provide brief weekly feedback on the site's progress. Additionally, the tech-enhanced sites entered daily data on the provided mobile devices regarding their implementation of the PAX GBG strategies and game. Staff at tech-enhanced sites also received an online weekly Feedback Report every Monday morning summarizing both their personal progress and the site's overall progress. The template form for entering daily data was called the Staff Daily Data Entry Log (Daily Data Log for short). The tech-enhanced sites entered their daily data into the Daily Data Log via a smartphone application that stored the data on a secure, password-protected website (i.e., for security, data were not stored on the smartphones). The print and electronic versions of the log were identical. The Daily Data Log comprised 7 questions related to PAX GBG implementation, such as which strategies staff used, whether they played a game, and if so, for how long, with how many teams, and whether a team won a prize. Staff were entered into a weekly random drawing for a gift card, but only if they had entered daily data for every day of program operations during the week.
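The structure of the Daily Data Log described above can be sketched as a simple record type. The field names and values below are illustrative assumptions for exposition, not the actual wording of the study's form.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DailyDataLog:
    """One staff member's end-of-day EMA entry (hypothetical field names)."""
    staff_id: str
    date: str                                              # program day, e.g. "2015-02-09"
    kernels_used: List[str] = field(default_factory=list)  # PAX strategies used that day
    played_game: bool = False                              # was a PAX GBG game played?
    game_minutes_bin: Optional[str] = None                 # "3-5", "6-9", "10-15", "16-25", "26-30", ">30"
    num_teams: Optional[int] = None                        # number of teams in the game
    used_scoreboard: bool = False                          # was the scoreboard recorded today?
    prize_awarded: Optional[bool] = None                   # did the winning team(s) get a prize?

# Example entry for one program day:
entry = DailyDataLog(
    staff_id="S01", date="2015-02-09",
    kernels_used=["Pax Quiet", "Pax Language"],
    played_game=True, game_minutes_bin="6-9",
    num_teams=4, used_scoreboard=True, prize_awarded=True,
)
```

A record like this maps one-to-one onto the 7 questions the text describes, which is what makes daily summaries and weekly feedback reports straightforward to compute.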

Staff at the business-as-usual site, along with staff at the other sites, had received a 1–2 day training in PAX GBG from another source a year prior to this pilot study; thus, some staff might have recalled the strategies and used them, though the probability of this may have been low (Fixsen et al., 2013). We therefore assess the degree to which the business-as-usual control site was using PAX GBG strategies without additional coaching or technical assistance.

Participants

Staff were engaged in the decision for their sites to participate in this novel community-engaged research study (E. Smith et al., 2014). The program director and staff agreed for their programs to participate and signed consent forms. The total number of staff members enrolled in the study was 12. During the course of the pilot, two staff left the program (one at a tech-enhanced site and the other at the business-as-usual site) and one joined (at the business-as-usual site). Table 1 describes the demographic characteristics of program staff. The majority of the staff members reported their level of educational attainment as four years of college. The staff members were overwhelmingly female (91.7%) and White (91.7%). With regard to age, half of the staff members self-reported their age to be older than 45 years. Staff were initially assessed and trained in December of one academic year, and implementation was conducted during the spring semester of the following year, for approximately 12 weeks. Even though the group sizes were unequal, each of the four sites had 2–3 staff members who served 25–35 children in kindergarten through 6th grade.

Table 1.

Demographic Characteristics of the Staff Members

Variable              Total           Tech-based      In-person       Control
                      N  % or M(SD)   N  % or M(SD)   N  % or M(SD)   N  % or M(SD)

Staff characteristics
Gender
  Female              11  94.4%       5  83.3%        2  100.0%       4  100.0%
  Male                 1   5.6%       1  16.7%
Ethnicity/Race
  White               11  94.4%       5  83.3%        2  100.0%       4  100.0%
  African-American     1   5.6%       1  16.7%
Education
  Some college         1   8.3%                                       1  25.0%
  Two-year college     2  11.1%       2  33.3%
  Four-year college    4  47.3%       1  16.7%        2  100.0%       1  25.0%
  Some graduate        3  19.4%       2  33.3%                        1  25.0%
  Master's degree      2  13.9%       1  16.7%                        1  25.0%
Age
  18–21                1   8.3%                                       1  25.0%
  22–25                3  22.2%       1  16.7%                        2  50.0%
  26–35                2  22.2%       1  16.7%        1  50.0%
  36–45
  >45                  6  47.3%       4  66.7%        1  50.0%        1  25.0%

Procedures and Measures

PAX GBG is a cooperative game played among teams of children who earn group-based activity rewards by minimizing off-task behavior. Teams comprise 4–5 children who vary in age, gender, and other behavioral characteristics to maximize the potential for each team to succeed. Teams displaying 3 or fewer misbehaviors within the given time period (usually 3–30 minutes) "win" the game. PAX GBG is a manualized, pre-packaged, user-friendly version developed by the Paxis Institute with additional components, including a common language and symbolism that adults and children use to encourage on-task behaviors and socially sanction off-task behaviors (Embry et al., 2010). PAX GBG also includes a number of strategies or "kernels" that are used to promote positive behavior (Table 2). Staff reported, either on paper or on mobile technologies, the number of strategies they utilized daily. Across the pilot period, the number of kernels was summed across the staff at each site to examine overall use of the behavioral strategies.

Table 2.

PAX GBG Strategies

Strategy              Meaning

Peer Tootle Notes     Written note of behaviorally-oriented praise by peers
Staff Tootle Notes    Written note of behaviorally-oriented praise by adult staff
Go Pax/Stop Spleems   Visual cues encouraging good behaviors and ceasing disruptive behaviors
Beat the Timer        Timed transition period for wrap-up and cleanup
Pax Quiet             Visual and sound cues reminding youth to be quiet
Pax Language          Verbal cues reminding youth of positive language
Pax Voices            Verbal cues reminding youth of appropriate speaking volume in different settings (e.g., varying times at which whispering, talking, or loud talking might be allowed contingent upon setting; for example, 10-foot voices would be allowed on the playground)
Pax Hands             Verbal cues reminding youth to keep hands to themselves
Pax Feet              Verbal cues reminding youth to walk quietly

The use of the PAX GBG scoreboard was reported daily by each staff member and coded as yes/no. Staff were expected to record the score of the PAX GBG on the calendar-like scoreboard each day throughout the week. The scoreboard is an indicator of implementation fidelity in that it contains the team names and members and tracks the number of wins for each team. PAX GBG typically begins with shorter games and gradually increases in length as youth begin to better self-regulate. However, games longer than 30 minutes are more difficult to monitor and less likely to be contingently rewarded. The minutes of PAX GBG play per game were reported by each staff member and coded as "3–5 minutes", "6–9 minutes", "10–15 minutes", "16–25 minutes", "26–30 minutes", and "over 30 minutes".

Following implementation, a focus group was held with all of the sites to learn about staff perceptions of implementing PAX GBG using the connected technologies. Nine after-school staff with differing levels of access to the technology component participated. Staff talked for almost 2 hours with a faculty member and project staff who were not involved in the pilot about their experiences with PAX GBG and the tech enhancement. The group members were candid and honest about their thoughts regarding PAX GBG overall and the technology component. Some of the findings of the focus group are reported in this paper.

Analyses

The focus group data provided insight into the experiences of the staff receiving various modes of technical assistance, invaluable for future refinement of the technological protocols. In addition to this descriptive data, statistical analyses were conducted comparing the three conditions (tech-enhanced, in-person, and control) on several implementation indicators: use of PAX GBG scoreboards, the number of PAX GBG games played per condition, the minutes of PAX GBG play per game, and the total use of PAX GBG kernels. Moreover, we conducted pairwise chi-square tests between each pair of conditions and corrected the alpha based on the number of tests conducted to avoid Type I error. Given that there were different numbers of staff in different conditions, the one-way ANOVA analyses were conducted by summing the number of games played by each staff member throughout the pilot study and then conducting an analysis of variance to look for differences between conditions.

For categorical and ordinal variables, we used chi-square tests to test the differences between conditions. For interval-level data (e.g., the number of games played, the sum of PAX GBG kernels), the conditions were compared using one-way ANOVA, and post-hoc Tukey tests were used to compare the differences between each pair of conditions.
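As a rough sketch, the interval-level analyses described above can be reproduced with SciPy. The per-staff game counts below are invented for illustration and are not the study's raw data.

```python
# Sketch of the interval-level analyses: sum games per staff member,
# run a one-way ANOVA, then post-hoc Tukey HSD (requires SciPy >= 1.8).
# The counts below are hypothetical, not the study's data.
from scipy.stats import f_oneway, tukey_hsd

control   = [1, 1]              # total games played per staff member
in_person = [10, 23]
tech      = [15, 37, 20, 32]

# Omnibus one-way ANOVA across the three conditions
f_stat, p_anova = f_oneway(control, in_person, tech)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")

# Post-hoc Tukey HSD for all pairwise comparisons
tukey = tukey_hsd(control, in_person, tech)
print(tukey.pvalue)             # 3x3 matrix of pairwise p-values
```

The same two-step pattern (omnibus test, then corrected pairwise follow-ups) applies to each interval-level indicator in the paper.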

Results

The current study focused on the following research questions:

  1. The degree to which technology could be used to enhance the training and technical assistance provided to after-school caregivers using PAX GBG, using both descriptive qualitative information and quantitative data to address this question.

  2. Whether technological approaches could be as effective as in-person training and technical assistance in fostering implementation of PAX GBG and superior to a treatment-as-usual control condition. We conduct multiple statistical tests to examine initial efficacy of technological support.

Staff were amenable to using ecological momentary assessments (EMA) via connected technologies such as online-videos, and the use of text messages for data entry and feedback on weekly implementation. The following section describes their perceptions from the focus groups.

Description of Staff Perceptions of Connected Technologies in Fostering CQI

Overall, the staff liked the app developed for entering data and receiving feedback. They reported that inputting the information for the daily implementation logs was easy. The after-school staff recognized that the technology fostered more accountability by requiring them to enter information in daily texts at the end of daily programming, which was automatically uploaded to the database. Staff also indicated that they enjoyed recording their activities on video and that "the kids liked it too."

However, in some ways the technology was cumbersome: the lack of wireless signal in some of their buildings limited their ability to upload videos of their implementation. They wanted to be "in the moment" with children and not distracted by the phones and technology (staff were given the option to upload videos at the end of the day's programming). They were also concerned that parents might enter and not understand the role of the cellphones, and in general they would have preferred tablets, which would be larger and easier to read. While some staff saw the value of learning collectively via sharing videos, others did not, particularly when few children were consented and could not be included in the videos. Sites doing well in implementation began to find the text messages with feedback on implementation redundant and impersonal. In general, staff desired personalized feedback and raised the idea of a "virtual coach," inquiring, "Could we set the coach up on a tripod?" Though the initial response to using technology indicated greater accountability and opportunity for learning, staff highlighted the value of personalized feedback and personal coaching using technology. Importantly, staff emphasized that the in-person training on the technology, in which the coach was introduced, was critical to bonding and affinity with the coach, whom they perceived as quite supportive. Even with these areas for future product and training development, the following statistical analyses examine implementation across the tech-enhanced, in-person, and control conditions.

Overall, the statistical analysis revealed that both the in-person condition and the tech-enhanced condition outperformed the treatment-as-usual control condition in the use of PAX GBG scoreboards, the number of games played, the minutes of PAX GBG play per game, and the sum of use of PAX GBG kernels.

Use of Scoreboard

The first indicator of implementation to be analyzed was whether staff used the PAX GBG scoreboard; these results are shown in Table 3. The chi-square test revealed a significant difference in the use of the scoreboard among the conditions (χ2 = 15.44, p < .001). Pairwise chi-square tests between each pair of conditions were also conducted, with alpha corrected (α = .05/3 = .017) based on the number of tests in order to avoid Type I error. The in-person condition did not differ from the control condition in use of the scoreboard (χ2 = 2.14, p = .14). The tech-enhanced condition, which required daily data entry, outperformed both the in-person condition (χ2 = 9.30, p < .017) and the control condition (χ2 = 17.95, p < .017).
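The pairwise comparisons with a Bonferroni-corrected alpha can be sketched as follows. The yes/no day counts are hypothetical stand-ins chosen only to roughly mimic the reported percentages of days the scoreboard was used, not the study's actual tallies.

```python
# Pairwise chi-square tests on scoreboard use (days used vs. not used),
# with alpha Bonferroni-corrected for 3 comparisons. Counts are illustrative only.
from itertools import combinations
from scipy.stats import chi2_contingency

days = {                              # [days used, days not used]
    "control":       [13, 57],        # ~18.6% of days
    "in-person":     [18, 39],        # ~31.6% of days
    "tech-enhanced": [77, 61],        # ~55.8% of days
}

alpha = 0.05 / 3                      # corrected alpha ≈ .017

for a, b in combinations(days, 2):
    chi2, p, dof, _ = chi2_contingency([days[a], days[b]])
    verdict = "significant" if p < alpha else "n.s."
    print(f"{a} vs {b}: chi2({dof}) = {chi2:.2f}, p = {p:.4f} ({verdict})")
```

Note that `chi2_contingency` applies Yates' continuity correction to 2x2 tables by default, so statistics from this sketch need not match the paper's exact values.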

Table 3.

Descriptive Results of Use of Scoreboard

Strategy Control In-Person Tech-Enhanced
Scoreboard 18.6%b 31.6%a 55.8%a, b

Note. Values are the percentage of days on which the scoreboard was used. Conditions sharing a superscript letter (a, b) differ significantly from one another.
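The pairwise comparisons above apply a Bonferroni-style alpha correction (.05/3 ≈ .017) to three 2×2 chi-square tests. A minimal pure-Python sketch of that procedure follows; the counts and the approximate critical value are illustrative assumptions, not the study's raw data.

```python
# Pairwise 2x2 chi-square tests with a Bonferroni-corrected alpha,
# mirroring the procedure described in the text. All counts below are
# hypothetical, NOT the study's raw data.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, r, col in ((a, row1, col1), (b, row1, col2),
                        (c, row2, col1), (d, row2, col2)):
        exp = r * col / n          # expected count under independence
        stat += (obs - exp) ** 2 / exp
    return stat

n_tests = 3
alpha = 0.05 / n_tests             # ~= .017, as in the text
CRIT_DF1 = 5.73                    # approx. chi-square critical value, df = 1, alpha ~= .017

# Hypothetical (days scoreboard used, days not used) per condition
conditions = {"control": (16, 70), "in_person": (25, 54), "tech": (48, 38)}

for c1, c2 in [("tech", "in_person"), ("tech", "control"), ("in_person", "control")]:
    a, b = conditions[c1]
    c, d = conditions[c2]
    stat = chi_square_2x2(a, b, c, d)
    verdict = "significant" if stat > CRIT_DF1 else "n.s."
    print(f"{c1} vs {c2}: chi2 = {stat:.2f}, {verdict} at alpha = {alpha:.3f}")
```

The critical value 5.73 approximates χ2(df = 1) at α ≈ .017; in practice, a library routine such as scipy.stats.chi2_contingency would return exact p-values instead.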

Number of Games Played

Because there were two sites in the tech-enhanced condition but only one site each in the in-person and control conditions, the number of staff members differed across conditions. Analyses therefore summed the number of games played by each staff member and used an analysis of variance to test for mean differences between conditions. The descriptive and comparison results are shown in Table 4. A one-way ANOVA revealed a marginally significant difference in the mean number of games played across conditions (F(2, 7) = 4.60, p = .074). A post-hoc Tukey test showed that the trend was driven by a marginally significant difference between the mean number of games played in the control versus tech-enhanced conditions (p = .064). All other pairwise comparisons were non-significant, suggesting that the number of games played did not differ significantly between the in-person and tech-enhanced conditions (Table 4).

Table 4.

The Mean Number of Games and Range Played Per Staff Member across Conditions

Condition        Number of Staff Members   M (SD)          Range    F
Control          2                         1.00 (0.00)     1–1      4.60†
In-person        2                         16.50 (9.19)    10–23
Tech-enhanced    4                         26.00 (11.11)   15–37

Note. † p < .10. * p < .05. ** p < .01. *** p < .001.
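The omnibus test above is a one-way ANOVA on per-staff game totals. A self-contained sketch of the F computation follows, using hypothetical games-played counts (not the study's data); the post-hoc Tukey step is omitted.

```python
# One-way ANOVA on per-staff game counts, mirroring the analysis in the
# text. The three groups below are hypothetical, NOT the study's raw data.

def one_way_anova(groups):
    """Return the F statistic and (df_between, df_within) for a list of samples."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group sizes times squared mean deviations
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group mean
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, (df_between, df_within)

# Hypothetical games-played totals per staff member in each condition
control = [1, 1]
in_person = [10, 23]
tech_enhanced = [15, 22, 30, 37]

f_stat, (df1, df2) = one_way_anova([control, in_person, tech_enhanced])
print(f"F({df1}, {df2}) = {f_stat:.2f}")
```

With a statistics library available, the same omnibus test could be run with scipy.stats.f_oneway, which also returns the p-value.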

Minutes of PAX GBG Play Per Game

Ideally, when children are learning to play the game, shorter periods of 3–5 or 6–9 minutes are preferred, allowing for more immediate, positively reinforcing activities, with the minutes of play increased gradually. The descriptive results are shown in Table 5. A chi-square test revealed that time spent playing games differed across conditions (χ2 = 61.84, p < .001). Pairwise chi-square tests for each pair of conditions were also conducted, with alpha corrected for the number of tests (α = .05/3 = .017) to control Type I error. The majority of tech-enhanced and in-person sites played the game for 3–5 minutes (39.4% for the in-person condition and 34.4% for the tech-enhanced condition) or 6–9 minutes (57.6% for the in-person condition and 47.9% for the tech-enhanced condition). At these levels of play, there was no statistically significant difference between these two conditions (χ2 = 4.42, p = .11). When a game was played at the control sites, it was played for 10–15 minutes almost 67% of the time and for 16–25 minutes 33% of the time. The length of the game in the control condition differed significantly from both the in-person (χ2 = 32.86, p < .017) and tech-enhanced conditions (χ2 = 43.51, p < .017).

Table 5.

Distribution of Minutes of Games Played across Conditions

Strategy Control In-Person Tech-Enhanced

3–5 min 0% 39.4% a 34.4% a
6–9 min 0% 57.6% a 47.9% a
10–15 min 66.7% b, c 3.0% b 17.7% c
16–25 min 33.3% b, c 0% b 0% c
26+ min 0% 0% 0%
Note. a No statistically significant difference between in-person and tech-enhanced (χ2 = 4.42, p = .11). b Statistically significant difference between in-person and control (χ2 = 32.86, p < .017). c Statistically significant difference between tech-enhanced and control (χ2 = 43.51, p < .017).

Use of PAX GBG Kernels

The sum of the use of kernels (i.e., Peer Tootle Notes, Staff Tootle Notes, Go PAX/Stop Spleems, Beat the Time, PAX Quiet, PAX Voices, PAX Language, PAX Hands & PAX Feet) per staff member per day was compared across conditions. The descriptive and comparison results are shown in Table 6. A one-way ANOVA revealed statistically significant differences in the mean use of PAX GBG kernels across conditions (F(2, 226) = 12.44, p < .001). Post-hoc tests further revealed that both the in-person condition (p < .001) and the tech-enhanced condition (p < .001) outperformed the control condition in the number of helpful kernels used to promote self-regulation. However, there was no significant difference between the in-person and tech-enhanced conditions (p = .165).

Table 6.

Sum of Use of Kernels across Conditions

Condition        M          SD     F
Control          4.07 a, b  2.00   12.44***
In-person        5.79 a     1.84
Tech-enhanced    5.80 b     1.60

Note. † p < .10. * p < .05. ** p < .01. *** p < .001. When two conditions share the same superscript letter, they are significantly different from one another.

Summary and Discussion

Our work demonstrates that connected technologies can be used to foster implementation of PAX GBG, providing prompts and gathering real-time data on implementation of after-school programming for both the staff and youth in these community-based organizations. Using ecological momentary assessments delivered through a smartphone application, an SMS (text message) reminder system, and website feedback, tech-enhanced implementation of PAX GBG was observed to be similar to or better than the in-person condition and superior to the control condition.

The use of a scoreboard is a clear indicator that sites are using PAX GBG and tracking student behavior. There was a significant difference in the use of the scoreboard among the conditions: the tech-enhanced condition, with daily and weekly mobile data prompts and reminders, demonstrated better implementation than both the in-person and control conditions. The staff in the in-person and tech-enhanced conditions also reported a larger mean number of games per staff member; however, these differences were only marginally significant. Lastly, we examined the minutes of game play, for which shorter games with gradual increases in length are indicated, while longer games usually reflect less focused, unrewarded play. The control condition differed significantly from both the in-person and tech-enhanced conditions, evidencing longer games of 10–25 minutes, while both the tech-enhanced and in-person conditions most frequently reported shorter games of 3–5 and 6–9 minutes. There were no statistically significant differences between the in-person and tech-enhanced conditions on game length. Additionally, both the in-person and tech-enhanced conditions outperformed the control condition in using helpful kernels or strategies, such as verbal cues for walking softly, using soft voices, and visual symbols of praise and encouragement for good behavior. The in-person and tech-enhanced sites were similar in their use of these helpful behavioral management strategies that promote youth self-regulation.

In terms of staff perceptions of the technology, staff were open to using it when they had adequate internet access but did not want it to distract from their important job of supporting children in after-school. Though open to using technology, staff suggested improving the technological protocol to allow a "virtual coach," who could be present via technology and facilitate more personalized feedback on their implementation.

In this small pilot study in after-school programs, we built integrated systems of online training and technical assistance coupled with real-time data for rapid monitoring and adaptation of best practices in community-based organizations. Although there is little prevention research comparing technology-based with in-person training, a few previous studies suggest that online training can be as effective as in-person training (Becker, Bohnenkamp, Domitrovich, Keperling & Ialongo, 2014) or better (Gold, 2001). With appropriate training and technical assistance, we also found that these connected technologies could promote accountability and implementation, and the mobile prompts might have facilitated superior performance. However, the feedback and assistance should be interesting, novel, personalized, and helpful. The goal of this phase of the project was to begin testing alternative, hopefully less resource-intensive approaches to training and technical assistance using technology, particularly for mobile, often part-time, paraprofessional yet caring after-school staff. Considering the costs of staff resources, space, and potential travel to sites that may be local or more distant, technology, with some personalization, holds promise of being more cost-effective in implementing evidence-based practices.

As the model implies with the multi-directional arrows between the PSS and the PDS, support is best provided when informed by real-time observed or staff-shared examples and data. The mere act of data collection might also attune staff to the strategies being used or not used. Mobile technology with data prompts, feedback, and sharing holds promise for fostering implementation of evidence-based practices.

Limitations and Directions for Future Research

Based on staff feedback in the focus groups, after-school programs that obtain video/photo consent from families during the enrollment process could ensure adequate numbers of children are included in the video-observational technical assistance process. To use technology effectively, after-school programs would also need consistent and effective internet access. Importantly, as promising as technology is, the CQI process must still help staff feel personally supported.

Our statistical analyses also had limitations. There were demographic differences among the sites in each condition: the tech-enhanced sites had a greater proportion of 6th-grade and Caucasian students, and their staff members were overwhelmingly female and White. Also, this study focused on a small number of purposively selected sites, thus limiting the generalizability of these findings to other sites or cities. However, this was a pilot study utilizing random assignment to condition to demonstrate the acceptability and effectiveness of mobile applications in fostering the implementation of evidence-based practices. With randomization, even in this small context, differences in implementation are less likely due to uncontrolled variables and more attributable to the experimental design (Shadish, Cook, & Campbell, 2002). This tech-enhanced method of training and coaching could be more accessible and affordable for after-school programs, which often struggle to find the time, funding, and trained staff members to effectively implement evidence-based programs.

As this was a pilot study with only one after-school organization, further research is needed to study this tech-enhanced training and continuous improvement system in additional, larger after-school programs with more staff. Even so, this work demonstrates that the future use of technology in after-school settings to promote positive youth development is well worth further exploration.

Acknowledgments

The design, measures, and procedures of this study were approved by the university Institutional Review Board (IRB). We acknowledge funding support from the William T. Grant Foundation [Grant #8529], the Wallace Foundation [Grant #20080489], and the National Institute on Drug Abuse [Grant #R01 DA025187]. We thank Dr. Dawn P. Witherspoon of The Pennsylvania State University for assistance with the focus group data analysis, which greatly improved the manuscript. We are also grateful to the staff, parents, and children whose participation made this study possible.

Footnote:

Yasemin Cava Tadik, Ph.D. Candidate yc25066@uga.edu / Phone: 706–542-4939 Human Development and Family Science, University of Georgia 373 Dawson Hall, 305 Sanford Dr., Athens, GA 30602

Dr. Emilie Smith, Ph.D. Emilie.Smith@uga.edu / Phone: 706–542-4831 Distinguished Professor and Department Head, Human Development and Family Science, University of Georgia 123 Dawson Hall, 305 Sanford Dr., Athens, GA 30602

Dian Yu, Ph.D. Candidate dy73046@uga.edu / Phone: 706–542-4905 Human Development and Family Science, University of Georgia 123 Dawson Hall, 305 Sanford Dr., Athens, GA 30602

Megan Leathers, B.S. pearl419@gmail.com Research Technologist Psychology / Crime, Law and Justice, The Pennsylvania State University

Dr. Jaelyn R. Farris, Ph.D. jrfarris@ysu.edu / Phone: 330–941-3406 Assistant Professor, Department of Psychology, Youngstown State University One University Plaza, Youngstown, OH 44555

Contributor Information

Yasemin Cava-Tadik, University of Georgia.

Emilie Phillips Smith, University of Georgia.

Dian Yu, University of Georgia.

Megan Leathers, The Pennsylvania State University.

Jaelyn R. Farris, Youngstown State University

References

  1. Adada NN (2007). The role of technology in teachers' professional development. The University of Southern Mississippi.
  2. Akiva T, Li J, Martin K, Horner C, & McNamara A (2017). Simple interactions: Piloting a strengths-based and interaction-based professional development intervention for out-of-school time programs. Child & Youth Care Forum, 46(3), 285–305. doi:10.1007/s10566-016-9375-9
  3. Bandura A (1977). Social learning theory. Englewood Cliffs, NJ: Prentice Hall.
  4. Barrish HH, Saunders M, & Wolf MM (1969). Good behavior game: Effects of individual contingencies for group consequences on disruptive behavior in a classroom. Journal of Applied Behavior Analysis, 2(2), 119–124.
  5. Becker KD, Bohnenkamp J, Domitrovich C, Keperling JP, & Ialongo NS (2014). Online training for teachers delivering evidence-based preventive interventions. School Mental Health, 6(4), 225–236.
  6. Bertram R, Blase K, & Fixsen D (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477–487.
  7. Boticki I, Baksa J, Seow P, & Looi CK (2015). Usage of a mobile social learning platform with virtual badges in a primary school. Computers & Education, 86, 120–136. doi:10.1016/j.compedu.2015.02.015
  8. Branscum P, Housley A, Bhochhibhoya A, & Hayes L (2016). A formative evaluation of healthy heroes: A photo comic book-social cognitive theory based obesity prevention program. Journal of Health Education Teaching, 7(1), 52–63.
  9. Browne D (2015). Growing together, learning together. New York, NY: Wallace Foundation.
  10. Clark T (2001). Virtual schools: Trends and issues, a study of virtual schools in the United States. Phoenix, AZ: WestEd and Western Illinois University.
  11. Craig SD, Hu X, Graesser AC, Bargagliotti AE, Sterbinsky A, Cheney KR, & Okwumabua T (2013). The impact of a technology-based mathematics after-school program using ALEKS on student's knowledge and behaviors. Computers & Education, 68, 495–504. doi:10.1016/j.compedu.2013.06.010
  12. Crompton H, Burke D, Gregory KH, & Grabe C (2016). The use of mobile learning in science: A systematic review. Journal of Science Education and Technology, 25(2), 149–160. doi:10.1007/s10956-015-9597-x
  13. Downer JT, Kraft-Sayre ME, & Pianta RC (2009). Ongoing, web-mediated professional development focused on teacher-child interactions: Early childhood educators' usage rates and self-reported satisfaction. Early Education and Development, 20(2), 321–345. doi:10.1080/10409280802595425
  14. Durlak JA, & DuPre EP (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350. doi:10.1007/s10464-008-9165-0
  15. Durlak JA, Weissberg RP, & Pachan M (2010). A meta-analysis of after-school programs that seek to promote personal and social skills in children and adolescents. American Journal of Community Psychology, 45(3–4), 294–309. doi:10.1007/s10464-010-9300-6
  16. Eccles JS, & Gootman JA (2002). Features of positive developmental settings. In Eccles JS & Gootman JA (Eds.), Community programs to promote youth development (pp. 86–118). Washington, DC: National Academies Press.
  17. Employment characteristics of families summary. (2018, April 19). Bureau of Labor Statistics. Retrieved July 28, 2018, from https://www.bls.gov/news.release/famee.nr0.htm
  18. Embry DD (2002). The Good Behavior Game: A best practice candidate as a universal behavioral vaccine. Clinical Child and Family Psychology Review, 5(4), 273–297.
  19. Embry DD (2004). Community-based prevention using simple, low-cost, evidence-based kernels and behavior vaccines. Journal of Community Psychology, 32(5), 575–591.
  20. Embry DD, Richardson C, Schaffer K, Rosen A, Darney D, Kelly B, et al. (2010). PAX Good Behavior Game (3rd ed.). Tucson, AZ: PAXIS Institute.
  21. Fixsen D, Blase K, Metz A, & Van Dyke M (2013). Statewide implementation of evidence-based programs. Exceptional Children, 79(2), 213–230.
  22. Flaspohler P, Lesesne C, Puddy R, Smith E, & Wandersman A (2012). Advances in bridging research and practice: Introduction to the second special issue on the interactive system framework for dissemination and implementation. American Journal of Community Psychology, 50(3–4), 271–281. doi:10.1007/s10464-012-9545-3
  23. Foster EM, & Jones D (2006). Can a costly intervention be cost-effective? An analysis of violence prevention. Archives of General Psychiatry, 63(11), 1284–1291.
  24. Fullard E, Fowler G, & Gray M (1987). Promoting prevention in primary care: Controlled trial of low technology, low cost approach. British Medical Journal (Clinical Research ed.), 294(6579), 1080.
  25. Gold S (2001). A constructivist approach to online training for online teachers. Journal of Asynchronous Learning Networks, 5(1), 35–57.
  26. Halgunseth LC, Carmack C, Childs SS, Caldwell L, Craig A, & Smith EP (2012). Using the interactive systems framework in understanding the relation between general program capacity and implementation in afterschool settings. American Journal of Community Psychology, 50(3–4), 311–320.
  27. Hillman CH, Pontifex MB, Castelli DM, Khan NA, Raine LB, Scudder MR, … & Kamijo K (2014). Effects of the FITKids randomized controlled trial on executive control and brain function. Pediatrics, 134(4), e1063–e1071.
  28. Hew KF, & Brush T (2007). Integrating technology into K-12 teaching and learning: Current knowledge gaps and recommendations for future research. Educational Technology Research and Development, 55(3), 223–252.
  29. Hu X, Craig SD, Bargagliotti AE, Graesser AC, Okwumabua T, Anderson C, … & Sterbinsky A (2012). The effects of a traditional and technology-based after-school program on 6th grade student's mathematics skills. Journal of Computers in Mathematics and Science Teaching, 31(1), 17–38.
  30. Huang D, Gribbons B, Kim KS, Lee D, & Baker EL (2000). A decade of results: The impact of the LA's BEST after-school enrichment program on subsequent student achievement and performance. Los Angeles, CA: UCLA Center for the Study of Evaluation.
  31. Junge SK, & Manglallan SS (2001). Professional development increases afterschool staff's confidence and competence in delivering science, engineering and technology. Youth Development, 73.
  32. Kellam SG, Brown CH, Poduska JM, Ialongo NS, Wang W, Toyinbo P, … & Wilcox HC (2008). Effects of a universal classroom behavior management program in first and second grades on young adult behavioral, psychiatric, and social outcomes. Drug and Alcohol Dependence, 95, S5–S28.
  33. Kennedy E, Wilson B, Valladares S, & Bronte-Tinkew J (2007). Improving attendance and retention in out-of-school time programs. Research-to-Results Practitioner Insights. Child Trends, 17.
  34. Khashu A, & Dougherty NL (2007). Staffing practices of high-quality after-school programs. Houston, TX: Cornerstones for Kids.
  35. Kohler FW, & Strain PS (1990). Peer-assisted interventions: Early promises, notable achievements, and future aspirations. Clinical Psychology Review, 10(4), 441–452. doi:10.1016/0272-7358(90)90047-E
  36. Lauver S, Little PM, & Weiss H (2004). Moving beyond the barriers: Attracting and sustaining youth participation in out-of-school time programs. Issues and Opportunities in Out-of-School Time Evaluation, 6, 1–16.
  37. Maggin DM, Johnson AH, Chafouleas SM, Ruberto LM, & Berggren M (2012). A systematic evidence review of school-based group contingency interventions for students with challenging behavior. Journal of School Psychology, 50(5), 625–654. doi:10.1016/j.jsp.2012.06.001
  38. Maynard BR, Peters KE, Vaughn MG, & Sarteschi CM (2013). Fidelity in after-school program intervention research: A systematic review. Research on Social Work Practice, 23(6), 613–623.
  39. Newell AD, Zientek LR, Tharp BZ, Vogt GL, & Moreno NP (2015). Students' attitudes toward science as predictors of gains on student content knowledge: Benefits of an after-school program. School Science and Mathematics, 115(5), 216–225.
  40. Office of Juvenile Justice and Delinquency Prevention (2018, July 24). Statistical Briefing Book, Juvenile Offending, Time of Day. Retrieved September 9, 2018, from https://www.ojjdp.gov/ojstatbb/offenders/qa03302.asp?qaDate=2010
  41. Palmer KL, Anderson SA, & Sabatelli RM (2009). How is the afterschool field defining program quality? A review of effective program practices and definitions of program quality. Afterschool Matters, 9, 1–12.
  42. Peña-Ayala A, & Cárdenas L (2016). A revision of the literature concerned with mobile, ubiquitous, and pervasive learning: A survey. Mobile, Ubiquitous, and Pervasive Learning (pp. 55–100). doi:10.1007/978-3-319-26518-6_3
  43. Pierce KM, Bolt DM, & Vandell DL (2010). Specific features of after-school program quality: Associations with children's functioning in middle childhood. American Journal of Community Psychology, 45(3–4), 381–393. doi:10.1007/s10464-010-9304-2
  44. Posner JK, & Vandell DL (1999). After-school activities and the development of low-income urban children: A longitudinal study. Developmental Psychology, 25, 868–879.
  45. Posner JK, & Vandell DL (1994). Low-income children's after school care: Are there beneficial effects of after school programs? Child Development, 65, 440–456.
  46. Schinke SP, Schwinn TM, & Ozanian AJ (2005). Alcohol abuse prevention among high-risk youth: Computer-based intervention. Journal of Prevention & Intervention in the Community, 29(1–2), 117–130.
  47. Serrano KJ, Thai CL, Greenberg AJ, Blake KD, Moser RP, & Hesse BW (2017). Progress on broadband access to the Internet and use of mobile devices in the United States: Tracking Healthy People 2020 goals. Public Health Reports, 132(1), 27–31. doi:10.1177/0033354916679365
  48. Shadish WR, Cook TD, & Campbell DT (2002). Experimental and quasi-experimental designs for generalized causal inference. Belmont, CA: Wadsworth Cengage Learning.
  49. Sheldon J, Arbreton A, Hopkins L, & Grossman J (2010). Investing in success: Key strategies for building quality in after-school programs. American Journal of Community Psychology, 45(3–4), 394–404.
  50. Shuib L, Shamshirband S, & Ismail MH (2015). A review of mobile pervasive learning: Applications and issues. Computers in Human Behavior, 46, 239–244. doi:10.1016/j.chb.2015.01.002
  51. Smith CA, Akiva T, Sugar SA, Lo YJ, Frank KA, Peck SC, & Devaney T (2012). Continuous quality improvement in afterschool settings: Impact findings from the youth program quality intervention study. Washington, DC: The Forum for Youth Investment.
  52. Smith EP, & Bradshaw CP (2017a). Nurturing environments in afterschool settings. Clinical Child and Family Psychology Review, 20, 117–126. doi:10.1007/s10567-017-0239-0
  53. Smith EP, Osgood DW, Oh Y, & Caldwell LC (2018). Promoting afterschool quality and positive youth development: Cluster randomized trial of the PAX Good Behavior Game. Prevention Science, 19(2), 159–173. doi:10.1007/s11121-017-0820-2
  54. Smith EP, Wise E, Rosen H, Rosen A, Childs S, & McManus M (2014). Top-down, bottom-up, and around the jungle gym: A social exchange and networks approach to engaging afterschool programs in implementing evidence-based practices. American Journal of Community Psychology, 53(3–4), 491–502.
  55. Smith EP, Witherspoon DP, & Osgood DW (2017b). Positive youth development among diverse racial-ethnic children: Quality afterschool contexts as developmental assets. Child Development, 88(4), 1063–1078. doi:10.1111/cdev.12870
  56. Smyth JM, & Stone AA (2003). Ecological momentary assessment research in behavioral medicine. Journal of Happiness Studies, 4(1), 35–52.
  57. Suanpang P (2012). The integration of m-learning and social network for supporting knowledge sharing. Creative Education, 3(08), 39.
  58. Snyder HN, & Sickmund M (2006). Juvenile offenders and victims: 2006 national report (pp. 1–261). Washington, DC: U.S. Department of Justice.
  59. Tebes JK, Feinn R, Vanderploeg JJ, Chinman MJ, Shepard J, Brabham T, … & Connell C (2007). Impact of a positive youth development program in urban after-school settings on the prevention of adolescent substance use. Journal of Adolescent Health, 47(3), 239–247.
  60. Vernon-Feagans L, Kainz K, Hedrick A, Ginsberg M, & Amendum S (2013). Live webcam coaching to help early elementary classroom teachers provide effective literacy instruction for struggling readers: The targeted reading intervention. Journal of Educational Psychology, 105(4), 1175–1187.
  61. Walters ST, Wright JA, & Shegog R (2006). A review of computer and Internet-based interventions for smoking behavior. Addictive Behaviors, 31(2), 264–277. doi:10.1016/j.addbeh.2005.05.002
  62. Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, … & Saul J (2008). Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology, 41(3–4), 171–181. doi:10.1007/s10464-008-9174-z
  63. Warren JS, Bohanon-Edmonson HM, Turnbull AP, Sailor W, Wickham D, Griggs P, & Beech SE (2006). School-wide positive behavior support: Addressing behavior problems that impede student learning. Educational Psychology Review, 18(2), 187–198.
  64. Weis R, Osborne KJ, & Dean EL (2015). Effectiveness of a universal, interdependent group contingency program on children's academic achievement: A countywide evaluation. Journal of Applied School Psychology, 31(3), 199–218. doi:10.1080/15377903.2015.1025322
  65. Wright RA, & McCurdy BL (2012). Class-wide positive behavior support and group contingencies: Examining a positive variation of the good behavior game. Journal of Positive Behavior Interventions, 14(3), 173–180.
  66. Wu W, Jim Wu Y, Chen C, Kao H, Lin C, & Huang S (2012). Review of trends from mobile learning studies: A meta-analysis. Computers & Education, 59(2), 817–827. doi:10.1016/j.compedu.2012.03.016
  67. Yohalem N, & Wilson-Ahlstrom A (2010). Inside the black box: Assessing and improving quality in youth programs. American Journal of Community Psychology, 45(3–4), 350–357. doi:10.1007/s10464-010-9311-3
  68. Zarrett N, Abraczinskas M, Skiles Cook B, Wilson DK, & Ragaban F (2018). Promoting physical activity within under-resourced afterschool programs: A qualitative investigation of staff experiences and motivational strategies for engaging youth. Applied Developmental Science, 22(1), 58–73.
  69. Zhang X, Wang W, de Pablos PO, Tang J, & Yan X (2015). Mapping development of social media research through different disciplines: Collaborative learning in management and computer science. Computers in Human Behavior, 51, 1142–1153. doi:10.1016/j.chb.2015.02.034
  70. Zydney JM, & Warner Z (2016). Mobile apps for science learning: Review of research. Computers & Education, 94, 1–17. doi:10.1016/j.compedu.2015.11.001
