Health Education Research. 2008 Jun 16;23(6):976–986. doi: 10.1093/her/cyn029

Process evaluation results from a school- and community-linked intervention: the Trial of Activity for Adolescent Girls (TAAG)

D R Young 1,*, A Steckler 2, S Cohen 3, C Pratt 4, G Felton 5, S G Moe 6, J Pickrel 10, C C Johnson 7, M Grieser 8, L A Lytle 6, J -S Lee 9, B Raburn 2
PMCID: PMC2583909  PMID: 18559401

Abstract

Process evaluation is a component of intervention research that evaluates whether interventions are delivered and received as intended. Here, we describe the process evaluation results for the Trial of Activity for Adolescent Girls (TAAG) intervention. The intervention consisted of four synergistic components designed to provide supportive school- and community-linked environments to prevent the decline in physical activity in adolescent girls. Process evaluation results indicate that the intervention components were delivered by intervention staff to teachers with high fidelity (84–97%) to the protocol and with lower fidelity (range: 18–93%) from teachers to students. Physical activity programs for girls, a unique feature of the TAAG intervention, increased from a mean of 10 programs per school to means of 16 and 15 in years 1 and 2, respectively, in intervention schools, with no change in control schools. These findings suggest that a multicomponent school- and community-based physical activity intervention can be delivered with fidelity and result in a middle school environment that supports physical activity for girls.

Introduction

Health education and behavior change intervention programs are often complex. They may include multiple components that target individuals and physical and social environments, and they may be conducted in multiple locations with target populations that have unique characteristics and needs. These complexities necessitate a thorough process evaluation that assesses whether an intervention was delivered and received as intended [1, 2]. Process evaluation offers the potential to monitor and assure the quality of intervention implementation and provides information on the depth and breadth of implementation and adherence, secular trends and potential contamination of the control group. Typically, process evaluation measures dose (the amount of intervention delivered), reach (the number of intended recipients who received the intervention) and fidelity (the quality with which the intervention was delivered).

Process evaluation is particularly important in explaining the complexities of school-based interventions by documenting dose, reach and fidelity. In addition, school-based intervention studies often use a train-the-trainer model; that is, school personnel are trained by research study staff to deliver interventions in school settings [3–8]. A train-the-trainer model adds challenges to the delivery of the intervention. School personnel may be required to deliver the intervention as part of their job but may not be committed to the intervention goals. In addition, these trained school personnel have limited control over students' receptivity to the intervention, and their ability to deliver it may be compromised by school district requirements or other factors within and pertaining to the school setting. Teachers' belief in the intervention, their enthusiasm and motivation in delivering it and their ability to model the behavior of interest and to present a behavior change curriculum may all contribute to students' receptivity to the intervention [9].

We describe the process evaluation methods and results for the intervention components of the Trial of Activity for Adolescent Girls (TAAG). We specifically examined dose, fidelity and reach of the first 2 years of intervention.

Overview of TAAG

The National Heart, Lung and Blood Institute (NHLBI) sponsored TAAG, a group-randomized trial of 36 middle schools, to develop and test a school- and community-based intervention to prevent the decline in physical activity in middle school girls [10], building upon insights gained in previous school-based interventions [2–9, 11–14]. TAAG was conducted at six university-based field sites representing diverse geographic locations and populations: the Universities of Arizona, Maryland, Minnesota and South Carolina, San Diego State University and Tulane University [17]. The trial was coordinated by the Collaborative Studies Coordinating Center of the University of North Carolina at Chapel Hill in partnership with NHLBI. Outcome results of TAAG are reported elsewhere [18].

TAAG intervention framework and components

The social–ecological model [15] was the conceptual framework that guided the TAAG intervention, which consisted of four major components designed to provide supportive environments to reduce the decline in girls’ physical activity [18]. (i) TAAG physical education (PE): PE teachers attended workshops and received instructional materials and regular on-site support to conduct lessons that encouraged active participation of girls during PE classes and to promote out-of-class physical activity. (ii) TAAG health education: health education, PE, science or homeroom teachers attended workshops and received materials to teach a series of six lessons that promoted development of behavioral skills associated with physical activity. Each health education lesson included an activity challenge (i.e. homework) in which students monitored a behavior and set goals to increase their activity. (iii) TAAG physical activity programs: collaborations were created between schools, community agencies and TAAG university staff to increase girl-focused physical activity programs outside of PE classes. (iv) TAAG promotions: social marketing efforts that included posters, flyers and special activities were launched to encourage overall physical activity and promote TAAG-specific programs. Program champions (i.e. school and/or community staff who took ownership of the program) were recruited and trained during the second intervention year, and they directed the intervention to enhance its sustainability in the third year.

Intervention goals were identified to indicate optimal intervention implementation. Goals varied by component but essentially called for 100% fidelity in delivery of the intervention by TAAG staff to teachers and 80% fidelity in delivery by teachers to students. Fidelity was defined as the consistency between established protocols and implementation. Reach goals (the level of participation by the target group) were for 100% of girls in the appropriate grades to receive TAAG PE and health education, 60% of girls to participate in the health education activity challenges and attendance at TAAG physical activity programs to increase systematically by at least 5% each semester.
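Two of these goals are simple quantitative rules, so a minimal sketch can make them concrete. The functions and numbers below are illustrative only (they are not TAAG study code or data): fidelity is expressed as the share of protocol elements actually delivered, and the reach goal checks whether program attendance rose by at least 5% over the prior semester.

```python
# Illustrative sketch of two of the stated intervention goals (not TAAG study code).

def fidelity(elements_delivered: int, elements_in_protocol: int) -> float:
    """Fidelity as the fraction of protocol elements actually delivered."""
    return elements_delivered / elements_in_protocol

def meets_reach_goal(previous_attendance, current_attendance, required_increase=0.05):
    """Reach goal: attendance must rise by at least 5% over the previous semester."""
    return current_attendance >= previous_attendance * (1 + required_increase)

if __name__ == "__main__":
    print(fidelity(5, 6))                  # 5 of 6 lesson components delivered -> 0.83
    print(meets_reach_goal(100, 103))      # 3% increase -> False (goal not met)
    print(meets_reach_goal(100, 106))      # 6% increase -> True
```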

Process evaluation for TAAG

Process evaluation research for TAAG was theoretically based and designed to take a broad approach, consistent with the purposes outlined by Baranowski and Stables [19] and Steckler and Linnan [20]. In addition to evaluating dose, reach and fidelity, we also assessed environmental factors and used process evaluation for quality control purposes [19, 20]. Specifically, the objectives were to (i) evaluate the implementation, or delivery, of the intervention (i.e. dose and fidelity); (ii) evaluate the extent to which the intervention reached the intended targets and the degree to which the targets were exposed to the intervention components (i.e. reach and exposure); (iii) document environmental factors that may have an influence on intervention effectiveness (i.e. context, contamination and secular trends) and (iv) provide periodic quality control information to intervention planners so the intervention could be refined to optimize its implementation and effectiveness (e.g. enhance dose, fidelity, reach and exposure). Intervention acceptability predicts continued use of intervention strategies [21]; thus, student enjoyment and teacher acceptability also were assessed.

Measures

Quantitative and qualitative methods, including structured observations, questionnaires, semi-structured interviews and completion logs, were used to collect process evaluation data. Instruments were developed iteratively with members of intervention planning groups, similar to the method used in developing TAAG formative assessment instruments [22]. Instruments were tested during the intervention pilot, reliability and validity were determined and instruments were revised prior to the main trial intervention. Measures were constructed to determine the degree to which each intervention component’s objective was met.

TAAG PE and health education

Intervention activities targeted for process evaluation were (i) staff development workshops delivered by TAAG staff and (ii) lesson content delivered by the teachers. Staff development workshops for both PE and health education were evaluated with attendance logs (dose and reach) and workshop observations that assessed whether the workshop material was delivered as intended (fidelity). Adaptation of PE classes to meet TAAG objectives and implementation of health education lessons were assessed through structured observations by TAAG staff throughout the academic year and teacher surveys at the end of the school year (dose, fidelity and acceptability). Inter-rater reliability for each item of the lesson observation instruments ranged from kappa = 0.40 to 1.00. Teacher interviews assessed completion of the health education activity challenges (reach).
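For readers unfamiliar with how an item-level kappa is obtained, the following is a minimal sketch, assuming two raters code the same observed classes with categorical ratings (e.g. 'none'/'some'/'most'/'all'). It uses scikit-learn's standard Cohen's kappa implementation; the ratings and tooling are hypothetical and are not the instruments or software used in TAAG.

```python
# Sketch of an inter-rater reliability check for a single observation item.
# Ratings below are invented; this is not TAAG analysis code.
from sklearn.metrics import cohen_kappa_score

rater_a = ["all", "most", "some", "all", "none", "most", "all", "some"]
rater_b = ["all", "most", "most", "all", "none", "most", "all", "some"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa for this item: {kappa:.2f}")
```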

TAAG programs for physical activity

The most innovative component of the TAAG intervention was to create links between community agencies, other community members and schools to provide activity programming for girls outside of PE class. To determine the existence and nature of these relationships, interviews were conducted with principals at all schools in the spring of each year. Principals were asked whether their school partnered with other groups to provide physical activity programs and, if so, what types of programs had resulted from the partnership.

The number, type and participation of girls in school-based physical activity programs were documented from two sources. One source included both TAAG and non-TAAG programs in intervention schools, and all programs available to girls in control schools. A survey, adapted from an instrument developed for the Middle School Physical Activity and Nutrition Trial [11], was conducted each spring with sponsors of physical activity programs that were either held at the school site or held off school grounds, but sponsored by the school.

A second data source collected information specifically on TAAG programs in intervention schools. TAAG process evaluators completed forms that documented TAAG programs, including program type, duration in weeks, number of sessions per week and session duration (dose). The number of attendees was tallied by the program instructor and given to the process evaluator (reach); no names of attendees were collected. A random sample of TAAG programs was chosen each semester (n = 2 per school in Semester 1, increasing by one program per school each semester), and participants were given an anonymous survey during a session approximately midway through the program to assess program acceptability (enjoyment).
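The sampling rule above (two programs per school in Semester 1, one more each subsequent semester) can be expressed compactly. The sketch below assumes a simple random draw from each school's program list; the program names and function are hypothetical, not part of the TAAG protocol materials.

```python
# Illustrative sketch of the per-school program sampling rule described above.
import random

def sample_programs(programs, semester, seed=None):
    """Select 2 programs in Semester 1, adding one more each later semester."""
    k = min(2 + (semester - 1), len(programs))
    rng = random.Random(seed)
    return rng.sample(programs, k)

school_programs = ["dance club", "walking group", "intramural volleyball", "yoga"]
print(sample_programs(school_programs, semester=2, seed=1))  # draws 3 of 4 programs
```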

TAAG promotions

Exposure to promotional materials was assessed through the student questionnaire administered as part of the TAAG measurement protocol. In the spring semester of the second intervention year, 120 eighth grade girls randomly selected from each school were invited to participate in the TAAG main outcome measures (i.e. physical activity and body composition assessment and psychosocial questionnaires, which included questions on exposure to TAAG promotional messages) [16]. Student participation in special events and physical activity promotions was assessed through participation records (reach).

Analysis

The process evaluation data comprised observations, questionnaires, semi-structured interviews and completion logs that described the characteristics of students, teachers, classes and school or community environments. All analyses took into account the expected positive intraclass correlation among responses from students, teachers and classes in the same school and among school- or community-level responses within the same site [23]. SAS Proc Mixed [24] and SAS Proc Glimmix were used to model continuous and dichotomous response measures, respectively, with random effects for school and site to account for the correlated nature of the data. Race was included as a fixed effect in analyses of girl-level data to control for differences in the response measure by race/ethnicity. For all tests, statistical significance was set at the 0.05 level. All statistical analyses were conducted using SAS software version 9.1.3 (SAS Institute, Cary, NC, USA).
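To illustrate the modeling approach, the following is a minimal sketch of an analogous mixed model for a continuous response fit in Python with statsmodels, not the authors' SAS code. The file name and column names ('response', 'race', 'site', 'school') are assumptions for the example; the structure mirrors the text: random effects for site and for school within site, with race as a fixed effect.

```python
# Analogous girl-level mixed model in Python/statsmodels (a sketch, not TAAG's SAS code).
# Assumes a long-format file with one row per girl and hypothetical column names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("taag_process_eval.csv")  # hypothetical data file

# Random intercept for site (grouping factor) plus a variance component for
# school nested within site; race enters as a fixed effect.
model = smf.mixedlm(
    "response ~ C(race)",
    data=df,
    groups="site",
    re_formula="1",
    vc_formula={"school": "0 + C(school)"},
)
result = model.fit()
print(result.summary())

# Dichotomous responses (handled with Proc Glimmix in the paper) would require a
# generalized linear mixed model, e.g. statsmodels' BinomialBayesMixedGLM or another GLMM tool.
```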

Results

TAAG PE and health education

The first level of intervention implementation was for TAAG university staff to deliver staff development workshops to teachers at the intervention schools. As displayed in Table I, dose, reach and fidelity were high (range 84–97%) in both years. Workshop trainings were attended by >85% of teachers, with the remainder attending make-up sessions. Nearly 90% of the full-day PE and health education workshop content in years 1 and 2 was fully covered by TAAG university staff.

Table I.

Implementation of staff development workshops, years 1 and 2a

Columns (left to right): Year 1: TAAG PE Full day, TAAG PE Booster 1, TAAG PE Booster 2, TAAG health education; Year 2: TAAG PE Full day, TAAG PE Booster 1, TAAG PE Booster 2, TAAG health education.
Dose
    Percent of teachers attending entire training 86 ± 35 93 ± 26 86 ± 35 92 ± 27 86 ± 35 93 ± 25 87 ± 34 96 ± 20
    Percent of teachers attending a make-up training 14 ± 35 7 ± 26 14 ± 35 8 ± 27 14 ± 35 7 ± 25 13 ± 34 4 ± 20
Reach
    Percent attendance compared with that expected 91 ± 29 89 ± 32 80 ± 40 94 ± 24 93 ± 26 82 ± 39 87 ± 34 93 ± 25
Fidelity
    Percent of workshop components fully covered 92 ± 6 84 ± 13 88 ± 19 85 ± 26 93 ± 6 88 ± 18 93 ± 15 97 ± 7
a Intervention goal was 100% dose, reach and fidelity.

Table II displays the dose, acceptability and fidelity of teacher implementation of TAAG PE. From year 1 to year 2, acceptability of TAAG PE concepts significantly increased in one aspect: the amount of change teachers made based on TAAG. Fidelity significantly increased by the second intervention year for four of the seven PE objectives. The intervention goal of at least 80% fidelity was reached for two of the seven objectives measured in year 1 and for three objectives in year 2. Compared with control schools in year 2, intervention schools were more likely to use strategies to minimize management time (P = 0.03).

Table II.

Implementation of TAAG PE by teachers, years 1 and 2, spring semester only

Columns (left to right): Intervention schools, Year 1; Intervention schools, Year 2; Control schools, Year 2.
Dosea n = 69 n = 68
    Use of teacher’s guidebook 2.7 ± 0.2 2.7 ± 0.2 N/A
    Use of task cards 3.1 ± 0.1 3.0 ± 0.1
    Use of activity box 3.1 ± 0.1 3.1 ± 0.1
Acceptabilityb
    Teachers’ reaction to TAAG 4.1 ± 0.2 4.3 ± 0.2 N/A
    Teachers’ perception of student reaction to TAAG 3.5 ± 0.1 3.7 ± 0.1
    Amount of change teacher made based on TAAG 3.4 ± 0.2 3.7 ± 0.2c
    Teachers’ perception of ease of making change 3.6 ± 0.1 3.9 ± 0.1
    Teachers’ perception of TAAG’s benefits for students 4.0 ± 0.1 4.1 ± 0.1
Fidelityd n = 148–162 n = 146–162 n = 95–108
    Students were encouraged for out-of-PE-class physical activity (percent of classes) 18.4 ± 22.0 28.4 ± 29.8e 14.8 ± 21.3
    Teacher used strategies to minimize management time (percent of classes) 76.4 ± 32.0 84.6 ± 16.2ef 65.7 ± 33.1
    Students were provided with choices (percent of classes) 58.5 ± 34.3 48.8 ± 27.3 55.6 ± 34.8
    Students were encouraged for in-class physical activity (percent of classes) 85.4 ± 18.2 93.2 ± 12.7e 88.9 ± 17.1
    Student:equipment ratio was appropriate for activity (percent of classes) 70.0 ± 28.3 66.4 ± 22.8 57.2 ± 39.4
    Group sizes were appropriate for activity (percent of classes) 66.5 ± 28.3 70.8 ± 20.5 64.3 ± 33.0
    Girls appeared to enjoy PE (percent of classes) 86.5 ± 24.2 95.7 ± 9.4c 85.2 ± 25.5
a Data reported by PE teachers. Scale 1–4: 1 = never, 2 = rarely, 3 = sometimes and 4 = always.
b Data reported by PE teachers. Likert 1–5 scale: 1 = unfavorable/difficult and 5 = favorable/easy.
c Significantly different from year 1 (P < 0.05).
d Data assessed by observation. Implementation variable was observed 'some', 'most' or 'all' of class. Intervention goal = observation of 50% for item 1, 80% for all other items.
e Significantly different from year 1 (P < 0.01).
f Significantly different from control schools (P < 0.05).

Over 90% of the TAAG health education lessons were taught in both years at all of the schools (Table III). Observations indicated that the lesson components were partially or completely taught during 76% and 64% of observations in years 1 and 2, respectively. Sixty-two percent of the activity challenges were completed by the girls each year, meeting the intervention goal of 60%.

Table III.

Implementation of TAAG health education by teachers, years 1 and 2

Year 1, seventh grade lessons (n = 18) Year 2, eighth grade lessons (n = 18)
Dosea
    Mean percent of individual lessons taught at each school 91.7 ± 3.6 90.3 ± 3.6
Reach
    Percent of girls who were taught all health education lessonsa 90.9 ± 7.0 76.8 ± 7.0b
    Percent of girls who completed all activity challenges 43.5 ± 6.1 31.7 ± 6.3
Fidelityc
    Percent of lesson components fully or partially completed 75.7 ± 7.2 64.4 ± 7.3d
    Percent of activity challenges completed 62.3 ± 6.5 61.4 ± 6.5d
a Intervention goal = 100%.
b Different from year 1 at P < 0.05.
c Intervention goal = 80% fidelity to lesson components, 60% of girls completing activity challenges.
d Different from year 1 at P < 0.01.

TAAG physical activity programs

Based on interviews with principals, at baseline 44% of intervention schools and 44% of control schools reported community collaborations for physical activity programs (data not shown). This increased to 83% of intervention schools in intervention years 1 and 2, with no increase in control schools (Table IV). Based on surveys of physical activity program leaders at intervention and control schools, at baseline there was an average of 10.3 ± 5.4 physical activity programs in intervention schools and 10.2 ± 3.6 in control schools (data not shown). The number of physical activity programs was significantly greater in the intervention schools than in the control schools at the end of the first intervention year and approached significance in the second year (Table IV).

Table IV.

Implementation of programs for physical activity intervention component, including school–community collaborations and TAAG programs, Semesters 1–4

Columns (left to right): Intervention schools, Year 1; Control schools, Year 1; Intervention schools, Year 2; Control schools, Year 2.
Percentage of school reporting collaborations 83.3 ± 38.3 44.4 ± 51.1a 83.3 ± 38.3 27.8 ± 46.1a
Average number of physical activity programs 16.0 ± 7.7 10.7 ± 7.2a 15.2 ± 10.8 10.1 ± 4.0b
TAAG programs (intervention schools only) Semester 1 Semester 2 Semester 3 Semester 4
Dosec
    Average number of programs per school 7.5 ± 0.6 7.9 ± 0.6 7.7 ± 0.6 7.2 ± 0.6
    Number of schools that reached intervention goal 17 18 17 13
Reachd
    Average attendance at each program per school
    Sixth grade 7.5 ± 1.3 5.7 ± 1.1 6.6 ± 1.2 6.7 ± 1.2
    Seventh grade 6.6 ± 0.8 3.8 ± 0.7e 4.3 ± 0.7 3.5 ± 0.7
    Eighth grade 4.5 ± 1.0 2.3 ± 0.8f 5.0 ± 0.9f 4.0 ± 0.9
    Total 18.1 ± 2.1 11.5 ± 1.9e 16.1 ± 2.0f 13.9 ± 2.0
Acceptabilityg
    Girls’ perceptions of enjoyability of programs 4.5 ± 0.1 4.7 ± 0.1 4.7 ± 0.1 4.8 ± 0.1a
a Differs from intervention schools at P < 0.05.
b Differs from intervention schools at P < 0.08.
c Intervention goal = 2 programs per school in Semester 1, increase by one each additional semester.
d Intervention goal = 5% increase in attendance each semester.
e Differs from previous semester at P < 0.001.
f Differs from Semesters 1 and 3 at P < 0.05.
g Likert 1–5 scale: 1 = no way! 3 = it was ok and 5 = absolutely!

The average number of TAAG programs exceeded intervention goals for each semester. Ninety-four percent (17 of 18 schools) met the target number of programs in Semesters 1 and 3, while all schools met the target in Semester 2. In Semester 4, 72% met the target number of programs (13 of 18 schools) (Table IV). Average attendance at each program ranged from 11.5 to 18.1 girls. Across all years, sixth grade girls were most likely to attend. Total attendance declined between Semesters 1 and 2 and increased between Semesters 2 and 3. Based on ∼1300 surveys in year 1 and 2000 surveys in year 2, girls rated the physical activity programs as highly enjoyable.

TAAG promotions

In the first intervention year, the major promotional event was a passport challenge targeting seventh grade girls, in which girls received validation stamps in their 'passports' for participating in specific kinds of physical activities. Approximately 22% of seventh grade girls participated, which did not meet the intervention goal of 35%. A pedometer challenge was promoted for eighth grade girls in the second intervention year. About 71% of eighth grade girls participated in this event, which met the intervention target of 70%. Girls from intervention schools were significantly more likely than girls from control schools to recognize TAAG promotional messages used in posters and flyers (P < 0.0001) (Table V).

Table V.

Percent of girls reporting exposure to TAAG promotional messages in intervention and control schools at the end of the 2-year intervention

Promotional message Intervention school girls (n = 1912) (%) Control school girls (n = 1803) (%) P valuea
TAAG messages
    Real girls, real activities, real funb 56 12 <0.0001
    Get active, stay activeb 58 29 <0.0001
    Combined exposure to either of the two messages above 72 32 <0.0001
Non-TAAG messages
    Eat right, stay strong, live longer 32 28 0.0541
    Play sports: it’s good for you 15 10 <0.0001
    Combined exposure to either of the two messages above 38 31 <0.0001
a P values based on chi-square tests.
b TAAG promotional message; the others were non-TAAG messages included to divert respondents.
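The P values in Table V come from chi-square tests comparing message recognition in intervention and control girls. As a rough check, the contingency table for the first message can be reconstructed from the reported percentages and sample sizes. The sketch below is approximate (exact counts were not reported) and is not the authors' analysis code.

```python
# Approximate reconstruction of the chi-square test behind the first row of Table V.
from scipy.stats import chi2_contingency

n_int, n_ctl = 1912, 1803                 # girls surveyed in intervention and control schools
recognized_int = round(0.56 * n_int)      # "Real girls, real activities, real fun", intervention
recognized_ctl = round(0.12 * n_ctl)      # same message, control

table = [
    [recognized_int, n_int - recognized_int],
    [recognized_ctl, n_ctl - recognized_ctl],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")  # p far below 0.0001, consistent with Table V
```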

Discussion

These results provide an overview of the comprehensive process evaluation conducted to document TAAG intervention implementation. School personnel were trained by TAAG interventionists with a high level of fidelity to the protocol, and reach approached 100%, with almost all teachers attending intervention trainings. All students were exposed to TAAG PE, which was implemented with moderate to high fidelity. More than three-quarters of the targeted population were taught all TAAG health education lessons. A major thrust of the TAAG intervention was to increase the number of physical activity programs offered for girls, and intervention schools provided more programs than did the control schools. For most schools during most semesters, opportunities for girls to attend physical activity programs before school, during lunch or after school exceeded intervention goals. Eighth grade participation in the pedometer challenge met the goal. TAAG promotional messages were identified by more girls from the intervention schools than from the control schools.

Collaborations with outside agencies doubled in the intervention schools but did not change in the control schools, a clear indication of the success of the TAAG physical activity program intervention component. Unlike previous school-based trials [3–6, 11, 13, 14], TAAG was the first to link schools with communities to provide more opportunities for physical activity. Girls who attended programs overwhelmingly enjoyed them. The process evaluation results clearly indicate that the TAAG approach of providing physical activity opportunities is feasible and acceptable to girls.

Although some aspects of the intervention were implemented with high fidelity, particularly the intervention trainings for which TAAG staff were responsible, other parts were less completely implemented. TAAG intervention staff were highly motivated to deliver the intervention fully to teachers and community workers; it was one of their primary employment responsibilities. In contrast, teachers and others had competing priorities, such as completing district-required curricula, which may have hindered them from fully implementing the intervention. They also may have had limited interest in research activities [12]. Implementing some intervention components required them to change their standard teaching practices. For example, providing choice in PE could be perceived as decreasing the amount of control teachers had over students in class. These factors can reduce teachers' motivation to implement an 'extra' program, such as TAAG, to its fullest. Thus, the finding that fidelity was lower when teachers implemented the intervention was not unexpected.

Although the teacher-delivered approach is less effective for optimizing fidelity across all intervention components, it is an effective model for maximizing the acceptability and sustainability of standardized interventions. Approximately two-thirds to three-fourths of TAAG health education lesson components were completed by teachers, a percentage similar to that found by Marcoux et al. [25] for Sport, Play and Active Recreation for Kids (SPARK). Also similar to our results, teachers favorably rated the SPARK program [25]. This example underscores the usefulness of taking a comprehensive approach to process evaluation. If only intervention fidelity had been assessed, the intervention might have been judged unfavorably because the lessons were not partially or fully implemented at the predetermined goal (80%). However, we also assessed acceptability and learned that the teachers liked the lessons, which is an indicator of continued use [21]. Although fidelity was lower than we would have liked, high acceptability ratings may indicate teacher motivation to sustain intervention programs.

Dose was consistently high across intervention components. The TAAG intervention staff ensured that all school personnel attended trainings and worked closely with the teachers to ensure that TAAG PE was implemented and health education lessons were taught. They also played a major role in ensuring that physical activity programs were implemented at each school. Reach, however, was more variable. While virtually all students were exposed to PE and reach was high for the health education lessons, reach was lower for the promotional events and after-school programs. These results suggest that reaching students during the regular school day is more effective than doing so before or after school, when there are competing demands on students' time.

As measured by process evaluation data, 18 of the 56 specified intervention goals were completely met over the 2 years. Another 17 goals were within 10 percentage points of being met. Thus, 63% of goals (35 of 56) were either met or mostly met. Setting intervention target goals was a difficult process; there was little precedent in the literature to help us determine what levels of dose, reach and fidelity were needed to achieve trial goals. In the end, we chose a combination of what we thought would maximize intervention effectiveness and what seemed reasonable to achieve. For example, we thought it necessary for all PE teachers who taught girls to attend the workshops (i.e. to maximize intervention effectiveness). In contrast, we set a goal of 60% of girls completing the activity challenges (i.e. reasonable to achieve). Some targets that were not met were those that required the girls to do something outside of their regular school day. It is a continuing challenge to identify programmatic physical activity opportunities that appeal to a diverse group of girls. Future work is needed to determine the optimal dose, reach and fidelity of school-based interventions.

Process evaluation issues

Process evaluation is an emerging but important component of intervention research. In order to move the field forward, it is important for researchers to learn from the decisions that others make when designing process evaluation protocols. Foremost, the TAAG investigators struggled with determining the best methods of assessing process evaluation data. This included the issues of using observations versus self-report, information sources and the ability to measure similar ‘intervention-like’ activities in the control schools. We address these issues below.

In evaluating whether to use observations or self-report to assess dose, fidelity and reach, the TAAG investigators examined the published process evaluation literature. Resnicow et al. [26] compared the use of trained observers and teacher self-report to examine how best to measure implementation of school health curricula. In short, they found that observational data were more valid and reliable than self-reported data. For the TAAG PE and health education components, trained data collectors used structured observations during visits. An advantage is that observers are specifically trained to detect the extent to which the intervention is delivered with fidelity. Although self-reports from teachers may be adequate to assess which lessons were taught, teachers are unlikely to be able to accurately report the extent to which components were taught in accordance with protocol guidelines. Another advantage of observations is that they require researchers to be in the schools while the intervention is being implemented. This assists in understanding contextual factors that may influence program implementation and that otherwise could go unnoticed and unreported.

However, there can be problems even with using observational data. For example, one component of the health education lessons, the follow-up to the activity challenge, may have been systematically missed because activity challenges were often returned on a day when observers were not present. Combining observations with teacher self-reported data or increasing the number of observation visits may have alleviated these inadequacies. However, availability of trial resources and school burden must be considered when designing process evaluation methods [13]. In retrospect, resources might have been diverted from less productive process evaluation data collection methods and used for additional observations.

Another ongoing issue is determining the source of process evaluation data. Because TAAG was an environmental intervention with the school as the unit of analysis, individual girls were recruited only for measurement activities. Study staff did not have permission to monitor individual participation in out-of-class physical activity programs, activity challenges or promotional challenges. This inability to track participation at the student level and link individual exposure to study outcomes was a major limitation. Developing a strategy to link individual participation with outcomes without restricting program participation would be useful for future school-based programs.

Another limitation was the minimal process evaluation conducted in control schools. Annual interviews with principals, teachers and program leaders from both intervention and control schools were conducted, but it was not possible to fully characterize 'TAAG-like' programs that may have been occurring in control schools. Others have also struggled with this limitation [2]. Extensive questionnaires and observations would be needed to truly identify the extent to which similar programs occur in control schools, resources that were better used on other trial activities.

In conclusion, process evaluation results indicated that the TAAG intervention was implemented with high levels of reach and fidelity and altered the school environment. School-based interventions are complex, and the TAAG intervention represents an evolution from previous work by linking schools with community groups to effect change. Process evaluation results clearly indicate that changes were made in the intervention schools' environments that supported physical activity for girls.

Funding

National Heart, Lung and Blood Institute; National Institutes of Health (U01HL66858, U01HL66857, U01HL66845, U01HL66856, U01HL66855, U01HL66853 and U01HL66852).

Conflict of interest statement

None declared.

Acknowledgments

We acknowledge the contributions of Pamela Carr, Derek Coombs, Rachel Cope, Christine Cox, Jewel Harden, Melanie Hingle, JoAnn Kuo, Dale Murrie, Jenny Nadeau and Lakesha Stevens.

References

1. Israel BA, Cummings KM, Dignan MB, et al. Evaluation of health education programs: current assessment and future directions. Health Educ Q. 1995;22:364–89. doi: 10.1177/109019819402200308.
2. Steckler A, Ethelbah B, Martin CJ, et al. Pathways process evaluation results: a school-based prevention trial to promote healthful diet and physical activity in American Indian third, fourth, and fifth grade students. Prev Med. 2003;37:S80–90. doi: 10.1016/j.ypmed.2003.08.002.
3. Caballero B, Clay T, Davis SM, et al. Pathways: a school-based, randomized controlled trial for the prevention of obesity in American Indian schoolchildren. Am J Clin Nutr. 2003;78:1030–8. doi: 10.1093/ajcn/78.5.1030.
4. Luepker RV, Perry CL, McKinlay SM, et al. Outcomes of a field trial to improve children's dietary patterns and physical activity. The Child and Adolescent Trial for Cardiovascular Health. CATCH collaborative group. J Am Med Assoc. 1996;275:768–76. doi: 10.1001/jama.1996.03530340032026.
5. Lytle LA, Murray DM, Perry CL, et al. School-based approaches to affect adolescents' diets: results from the TEENS study. Health Educ Behav. 2004;31:270–87. doi: 10.1177/1090198103260635.
6. Pate RR, Ward DS, Saunders RP, et al. Promotion of physical activity among high-school girls: a randomized controlled trial. Am J Public Health. 2005;95:1582–7. doi: 10.2105/AJPH.2004.045807.
7. Nicklas TA, O'Neil CE. Process of conducting a 5-a-day intervention with high school students: Gimme 5 (Louisiana). Health Educ Behav. 2000;27:201–12. doi: 10.1177/109019810002700206.
8. Story M, Mays RW, Bishop DB, et al. 5-a-day Power Plus: process evaluation of a multicomponent elementary school program to increase fruit and vegetable consumption. Health Educ Behav. 2000;27:187–200. doi: 10.1177/109019810002700205.
9. Helitzer DL, Davis SM, Gittelsohn J, et al. Process evaluation in a multisite, primary obesity-prevention trial in American Indian schoolchildren. Am J Clin Nutr. 1999;69(Suppl. 4):816S–24S. doi: 10.1093/ajcn/69.4.816S.
10. Kimm SY, Glynn NW, Kriska AM, et al. Decline in physical activity in black girls and white girls during adolescence. N Engl J Med. 2002;347:709–15. doi: 10.1056/NEJMoa003277.
11. Sallis JF, McKenzie TL, Conway TL, et al. Environmental interventions for eating and physical activity: a randomized controlled trial in middle schools. Am J Prev Med. 2003;24:209–17. doi: 10.1016/s0749-3797(02)00646-3.
12. Lytle LA, Davidann BZ, Bachman K, et al. CATCH: challenges of conducting process evaluation in a multicenter trial. Health Educ Q. 1994;(Suppl. 2):S129–42. doi: 10.1177/10901981940210s109.
13. Sallis JF, McKenzie TL, Kolody B, et al. Effects of health-related physical education on academic achievement: project SPARK. Res Q Exerc Sport. 1999;70:127–34. doi: 10.1080/02701367.1999.10608030.
14. Young DR, Phillips JA, Yu T, et al. Effects of a life skills intervention for increasing physical activity in adolescent girls. Arch Pediatr Adolesc Med. 2006;160:1255–61. doi: 10.1001/archpedi.160.12.1255.
15. Stokols D. Establishing and maintaining healthy environments. Toward a social ecology of health promotion. Am Psychol. 1992;47:6–22. doi: 10.1037//0003-066x.47.1.6.
16. Elder JP, Lytle L, Sallis JF, et al. A description of the social-ecological framework used in the Trial of Activity for Adolescent Girls (TAAG). Health Educ Res. 2006. doi: 10.1093/her/cyl059.
17. Stevens J, Murray DM, Catellier DJ, et al. Design of the Trial of Activity in Adolescent Girls (TAAG). Contemp Clin Trials. 2005;26:223–33. doi: 10.1016/j.cct.2004.12.011.
18. Webber LS, Catellier DJ, Lytle LA, et al. Promoting physical activity in adolescent girls. Trial of activity for adolescent girls. Am J Prev Med. 2008;34:173–84. doi: 10.1016/j.amepre.2007.11.018.
19. Baranowski T, Stables G. Process evaluation of the 5-a-day projects. Health Educ Behav. 2000;27:157–66. doi: 10.1177/109019810002700202.
20. Steckler A, Linnan L. Process evaluation for public health interventions and research: an overview. In: Steckler A, Linnan L, editors. Process Evaluation for Public Health Interventions and Research. San Francisco, CA: John Wiley & Sons; 2002. pp. 11–21.
21. Thaker S, Steckler A, Sanchez V, et al. Program characteristics and organizational factors affecting the implementation of a school-based indicated prevention program. Health Educ Res. 2007;22:155–65. doi: 10.1093/her/cym025.
22. Young DR, Johnson CC, Steckler A, et al. Data to action: using formative research to develop intervention programs to increase physical activity in adolescent girls. Health Educ Behav. 2006;33:97–111. doi: 10.1177/1090198105282444.
23. Murray DM, Stevens J, Hannan PJ, et al. School-level intraclass correlation for physical activity in sixth grade girls. Med Sci Sports Exerc. 2006;38:926–36. doi: 10.1249/01.mss.0000218188.57274.91.
24. Littell RC, Milliken GA, Stroup WW, et al. SAS for Mixed Models. 2nd edn. Cary, NC: SAS Institute Inc.; 2006.
25. Marcoux M-F, Sallis JF, McKenzie TL, et al. Process evaluation of a physical activity self-management program for children: SPARK. Psychol Health. 1999;14:659–77.
26. Resnicow K, Davis M, Smith M, et al. How best to measure implementation of school health curricula: a comparison of three measures. Health Educ Res. 1998;13:239–50. doi: 10.1093/her/13.2.239.
