Author manuscript; available in PMC 2016 May 11.
Published in final edited form as: Am J Clin Nutr. 1999 Apr;69(4 Suppl):816S–824S. doi: 10.1093/ajcn/69.4.816S

Process evaluation in a multisite, primary obesity-prevention trial in American Indian schoolchildren

Deborah L Helitzer, Sally M Davis, Joel Gittelsohn, Scott B Going, David M Murray, Patricia Snyder, Allan B Steckler
PMCID: PMC4863494  NIHMSID: NIHMS784526  PMID: 10195608

Abstract

We describe the development, implementation, and use of the process evaluation component of a multisite, primary obesity-prevention trial for American Indian schoolchildren. We describe the development and pilot testing of the instruments, provide some examples of the criteria for instrument selection, and provide examples of how process evaluation results were used to document and refine intervention components. The theoretical and applied framework of the process evaluation was based on diffusion theory, social learning theory, and the desire for triangulation of multiple modes of data collection. The primary objectives of the process evaluation were to systematically document the training process, content, and implementation of 4 components of the intervention. The process evaluation was developed and implemented collaboratively so that it met the needs of both the evaluators and those who would be implementing the intervention components. Process evaluation results revealed that observation and structured interviews provided the most informative data; however, these methods were the most expensive and time consuming and required the highest level of skill to undertake. Although the literature is full of idealism regarding the uses of process evaluation for formative and summative purposes, in reality, many persons are sensitive to having their work evaluated in such an in-depth, context-based manner as described here. For this reason, use of structured, quantitative, highly objective tools may be more effective than qualitative methods, which appear to be more dependent on the skills and biases of the researcher and the context in which they are used.

Keywords: Process evaluation, qualitative research, intervention study, diffusion theory, social learning theory, children, American Indians

INTRODUCTION

Three types of evaluation are generally recognized in the health education literature: process, impact, and outcome. Process evaluation examines how a program was operated (1), focusing on what the intended intervention was and how it was actually implemented. Impact evaluation assesses a program’s effectiveness in achieving desired changes in targeted mediators, such as knowledge, attitudes, beliefs, and behavior of the target group. Outcome evaluation examines the effects of the program on health status, morbidity, and mortality (2). Until recently, evaluation tended to focus mainly on impacts and outcomes, but the value of process evaluation is now being increasingly recognized. One reason for this increased recognition is the comprehensive nature of the social and behavioral interventions used in contemporary health education programs. As the interventions become more complex, it is important to be able to ensure quality of implementation and exact documentation of the intervention in a given program. The overall purpose of process evaluation is to link impact and outcome data to intervention activities so as to explain any changes that occur in measurements before and after the intervention (3, 4), to describe the actual activities implemented in the intervention and the extent of participant exposure, to provide for quality assurance, to identify and describe the participants, and to elucidate the internal dynamics of program operations. By collecting information about the extent, fidelity, and quality of the intervention, answers about how and why the outcome was achieved can be obtained (5). The results of process evaluation during the feasibility phase of a project can also be used to help monitor and refine intervention components. Additionally, attribution of “no impact” to a program that was not implemented properly (type 3 error) can be avoided by including a process evaluation component (5).

Process evaluation can serve both formative and summative purposes. Formative evaluation data are used by program planners and implementers to improve the appropriateness and quality of the program. As discussed in the present article, in the feasibility phase of the Pathways study, process evaluation data were used to help document and refine the various intervention components. When a long-term, multifaceted intervention is finished, it is important to be able to document the interventions that actually occurred, which often have changed from those that were originally planned. Process data collected during a program, therefore, can also be used for summative purposes to document the interventions that were conducted and that produced the resulting impacts and outcomes.

A review and summary of the current state of the art of program evaluation (2) suggested the following as examples of typical process evaluation questions: What activities, educational materials, or services were provided to participants? What did the staff do? What did participants in the program experience? What was the nature of staff-client interactions? What were the strengths and weaknesses of the program? Which learning activities or strategies worked and which did not, and why? In addition, we suggest that process evaluations also ask, What resources (personnel and fiscal) were used to implement the program?

Three innovative applications of process evaluation that are worthy of note have been reported in the literature. The Child and Adolescent Trial for Cardiovascular Health (CATCH) was a collaborative, multicenter, randomized field trial to test the effectiveness of a multicomponent, school-based cardiovascular health promotion program for public elementary school students. The CATCH process evaluation, standardized across the 4 study centers and the 96 participating schools, included measures of 6 categories of data: external competing programs, school staff characteristics, training and support of school staff, curriculum implementation, student participation and exposure, and student characteristics. These data were successfully used to describe the implementation of the program for quality control and monitoring and to help explain program effects (impacts) (4).

The Working Well Trial is the largest work site cancer control trial in the United States. The study was conducted in 111 work sites by 4 study centers, a coordinating center, and the National Cancer Institute. The primary hypothesis of this study tested the concept of a participatory delivery strategy to address dietary change and smoking cessation (6). Based on the concept of the intervention having “senders” and “receivers,” an extensive process evaluation was developed that assessed the extent to which the intervention was delivered by the senders, ie, the project staff, and the extent to which it was received by employees in each work site. The process evaluation monitored the extent to which each of the 15 process objectives was achieved at each work site. To assess the delivery of the interventions, the proportion of process objectives achieved at each work site was averaged across all work sites. Receipt of interventions was documented through use of an employee survey that included awareness of intervention activities and measures of behavior changes.
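As a concrete illustration, this delivery summary reduces to a simple average of per-site proportions. The following minimal sketch (in Python, with a hypothetical function name and made-up counts, not code or data from the trial) shows the calculation:

# Hypothetical illustration of the Working Well delivery summary: each work
# site is scored by the proportion of its 15 process objectives achieved,
# and the per-site proportions are then averaged across sites.

def delivery_score(objectives_met_per_site, n_objectives=15):
    """Mean proportion of process objectives achieved across work sites."""
    proportions = [met / n_objectives for met in objectives_met_per_site]
    return sum(proportions) / len(proportions)

# Made-up counts for 4 work sites (the real trial had 111 sites):
print(delivery_score([12, 15, 9, 14]))  # about 0.83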

The third innovative process evaluation reported in the literature is a method initially used to conduct process evaluations of community substance abuse prevention coalitions. Goodman and Wandersman (7) reported on a participatory evaluation based on models, markers, measures, and meaning; this formative evaluation method uses data collected jointly by evaluators and implementers to monitor program quality and to correct small problems and errors before they become large ones. In this process, evaluators work with program planners and implementers to develop a conceptual model of all of the steps and activities that are supposed to occur as the program unfolds. Then, collaboratively, the evaluators, planners, and implementers develop markers and measures that indicate whether each step, process, or activity actually occurred and the extent to which it did or did not occur. In the “meaning” step, evaluators and program implementers try to determine why various activities did or did not occur and what might be done to correct problems.

The process evaluation described in this paper followed lessons learned from the literature. A conceptual model or theory of evaluation was created that ensured the participatory process both in the design of objectives and instruments and through feedback of the information collected to the intervention development staff. The process evaluation examined whether and how the intervention was implemented during the feasibility phase of the study. The purposes of this paper are to 1) describe the development and pilot testing of the process evaluation instruments, 2) provide some examples of the criteria used to select the process evaluation instruments for use in the full-scale study, and 3) provide examples of how the results of the process evaluation were used to refine the intervention components.

METHODS

Between September and December 1995, process evaluation procedures were used to systematically document activities in the intervention schools participating in the Pathways study. Implementation procedures between the different intervention sites were compared for the purpose of assessing relative fidelity to the implementation plan. The theoretical and applied framework driving design of the Pathways process evaluation was threefold: first, that each component would be implemented according to the principles of dissemination and diffusion (8); second, that the theoretical basis of the project, social learning theory (9), would apply not only to the development of the intervention but also to its implementation; and third, that triangulation of several types of data through a variety of data collection methods would be the preferred model for evaluation (10, 11).

Diffusion to the teachers, physical activity instructors, and food service staff who would implement the program occurred through training conducted by Pathways staff. These implementers were the “change agents”; however, to be effective they were also required to change their own behavior (for example, their teaching methods or how they cooked or served school meals). Therefore, each of the steps through which diffusion took place required documentation to determine whether the message was consistently delivered from step to step, whether exposure to the intervention components was occurring as expected, and whether behavior change of the change agents was taking place as anticipated.

Social learning theory suggests that 3 types of factors influence behavior: environmental, individual (personal attributes), and behavioral. Fidelity to this model at each dissemination point would therefore be necessary for successful implementation. The questions implied by this requirement are as follows: Is the environment (school and home) conducive to and supportive of a successful implementation? Are the personal attributes required for behavior change (knowledge and attitudes) being communicated consistently at all sites? Are the trainers, teachers, physical activity instructors, and food service workers acting as role models of the behaviors?

Experience shows that triangulation, or confirmation, of information from a variety of sources provides stronger evidence than the use of information from any one source on its own (10). Both qualitative and quantitative methods can be used in a comprehensive process evaluation, especially to measure all the dimensions of program implementation (3). The process evaluation working group was composed of experts in qualitative and quantitative data collection methods. With this broad range of expertise and the desire to get in-depth information from the small sample of schools in which the intervention was piloted, a synergistic set of data collection instruments was developed.

Given this conceptual framework, the primary objectives of the process evaluation were to systematically document 1) the training process and content; 2) the implementation of the curriculum, family, physical education (PE) and recess, and school food service components of the intervention, within and between sites; 3) the attitudes of school administrators, teachers, food service directors, and PE instructors toward the intervention; 4) the exposure of teachers, PE instructors, food service workers, family members, and schoolchildren to the intervention components; and 5) the role of the field coordinator and other conditions affecting the implementation of the intervention components at each site.

Instruments

To accomplish these objectives, 27 sets of data collection instruments were developed. Some of these sets included multiple instruments (for example, teacher implementation checklists were developed for each of 24 lessons). The instruments were developed by members of the process evaluation working group, which was composed of at least one member from each of the 5 intervention working groups (formative assessment, school food service, PE and recess, curriculum, and family). This collaborative process entailed describing the different pieces of each intervention component, including those from the theoretical framework described earlier; outlining the process evaluation objectives of each piece of each component; discussing the different methods that might be appropriate to collect the required information; and developing the instruments. During this process, much time and energy was devoted to building consensus on what the evaluation questions should be.

As an example, we describe the process evaluation instruments used for the PE and recess component. For this component, the different pieces of the intervention are the training of teachers to teach the PE and recess intervention, the PE classes, and the recess periods. The working group decided that the following information would be needed from the process evaluation of the training: that the training occurred, who was trained, that the training was implemented according to the training plan, and that the trainees expressed or demonstrated competency at the skills and knowledge transmitted during the training session. For the PE classes, the working group wanted documentation of the dates and times classes were held, how many students participated, the teacher’s perception of the extent of participation by students, whether the teachers followed the curriculum as outlined, whether students were participating in the class activities to the extent that they would achieve a minimum standard of activity (50% of class time in moderate-to-vigorous physical activity), and the teachers’ attitudes toward the curriculum. For the recess periods, process evaluation objectives were similar to those for the PE classes. On the basis of this list of process evaluation objectives, a set of instruments was designed. These instruments included a training attendance list, a debriefing form for trainers, a self-administered training evaluation form, individual PE lesson feedback forms for teachers, a structured interview form to be used with teachers, a structured instrument for observation of PE classes and recess periods, checklists for recess periods, a survey for student feedback, and a survey for student exposure questions.

Given that there were 5 working groups, each with an extensive set of process evaluation objectives, a vast array of process evaluation instruments emerged. A list of the different instruments and the dates each instrument was piloted are provided in Table 1. In summary, the instruments and methods used included structured interviews, observations, checklists, attendance records or counts, self-administered evaluation forms, proctor-administered exposure measures, meeting minutes, and reports, representing a high degree of methodologic triangulation.

TABLE 1.

List of process evaluation instruments that were pilot tested during the feasibility phase of the Pathways study1

Intervention component and instrument name Date of administration
Intervention environment
 Structured interviews with school personnel and field coordinators Fall 1995
 Alternative school reports Spring 1996
Curriculum
 Training attendance, evaluation, and staff debriefing Fall 1995 to Spring 1996
 Teacher implementation checklists Fall 1995 to Spring 1996
 Classroom observations and structured interviews with classroom teachers Fall 1995
 Student exposure measures Fall 1995
 Student feedback form Fall 1995
Family
 Family Fun Night attendance roster Fall 1995
 Family Fun Night adult response card Fall 1995
 Family Fun Night child response card Fall 1995
 Family Fun Night booth count sheet and evaluation Fall 1995
 Student exposure measures Fall 1995
 Snack and action pack return cards and summary form Fall 1995 to Spring 1996
 Family advisory group minutes Fall 1995 to Spring 1996
Physical education and recess
 Training attendance, evaluation, and staff debriefing Fall 1995 to Spring 1996
 Observation (SOFIT) Fall 1995
 Teacher implementation checklists for 74 lessons Fall 1995 to Spring 1996
 Recess weekly report checklists Fall 1995 to Spring 1996
 Student exposure measures Fall 1995
 Recess observation form (modified SOFIT) Fall 1995
 Teacher debriefing structured interview Fall 1995
School food service
 Training attendance, evaluation, and staff debriefing Fall 1995 to Spring 1996
 Observation of lunch Fall 1995
 Observation of food service preparation Fall 1995
 Food service personnel interview Fall 1995
1 SOFIT, system for observing fitness instruction time (15).

Data collection

Data collection during the process evaluation took 2 forms: field visits to schools made by the chairperson of the process evaluation working group and the use of structured, self-administered instruments by program participants. Between October 20 and November 15, 1995, field visits were made to 4 intervention schools (one per site) to pilot test the instruments. During these visits, structured observations of intervention activities were conducted in classrooms, during physical activity sessions (PE classes and recess), during food service preparation, and in the cafeterias; additionally, structured interviews were held with principals, other administrators, classroom teachers, PE teachers, food service workers, and field coordinators. For self-administered data collection, classroom and PE teachers were given teacher feedback forms to fill in during the semester-long implementation of the curriculum, PE, and recess components. Field coordinators were asked to collect these forms from the teachers regularly over the course of the semester. Student feedback was solicited through the use of a self-administered survey titled “Tell Us What You Think About Pathways” at the end of the first 6 wk of the third-grade curriculum; teachers administered the survey to their students.

Exposure measures were collected by Pathways staff, using the same methods used to administer the knowledge, attitudes, and behavior instrument (12). Exposure measures were collected during December 1995 in both intervention and control schools, within 3 wk of the date that marked the end of the implementation of the first half of the third-grade intervention.

The focal points of the family component to be studied were a Family Fun Night, to which parents were invited, and take-home materials (snack and action packs). At each Family Fun Night, process evaluation data were collected by the Pathways staff. For the most part, Family Fun Nights occurred in late September or early October 1995. Data on snack and action packs were collected by teachers within each of the classrooms.

Analysis

All data were sent to the coordinating center, where they were inventoried, processed, and archived. Processing involved reporting on data received and either data entry or hand tabulations of items. With the exception of the exposure data, data represented the experience at only the 4 intervention schools. Frequency distributions were prepared by hand when the sample size was limited and by SAS PROC FREQ (version 6; SAS Institute Inc, Cary, NC) when the sample size was larger. Content analysis was based on methods described by Miles and Huberman (13). For 2 sets of data (the student feedback form and exposure data), data were weighted by school.
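As an illustration of the school weighting, the sketch below (Python) computes a frequency distribution in which each student response is weighted by the inverse of the number of respondents in that student's school, so that every school contributes equally to the overall distribution; this equal-contribution interpretation is our assumption, and the data are invented, not the coordinating center's records:

from collections import Counter, defaultdict

# (school, answer) pairs; hypothetical survey responses
responses = [
    ("A", "yes"), ("A", "yes"), ("A", "no"),
    ("B", "yes"), ("B", "no"),
]

# Weight each response by 1 / (respondents in that school), so each
# school's responses sum to a total weight of 1.
school_n = Counter(school for school, _ in responses)
weighted = defaultdict(float)
for school, answer in responses:
    weighted[answer] += 1.0 / school_n[school]

total = sum(weighted.values())  # equals the number of schools
for answer, w in sorted(weighted.items()):
    print(f"{answer}: {w / total:.1%}")  # no: 41.7%, yes: 58.3%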

EXAMPLES OF RESULTS AND USES OF THE PROCESS EVALUATION DATA

The pilot testing of the process evaluation instruments was used as an opportunity to inform the development of both the intervention and the process evaluation itself. The timing of the process evaluation during the feasibility phase overlapped with the period of intervention development. Process evaluation during this phase thus served both to monitor the implementation of the intervention components and also to provide input on an ongoing basis that was used to refine the intervention components already developed. The following section provides examples of the results of the process evaluation, the ways in which the piloting of the process evaluation instruments was used to improve the process evaluation itself, and the ways in which the results of the process evaluation were used to improve the Pathways intervention. A summary of the results included in this section is provided in Table 2.

TABLE 2.

Examples of results and uses of process evaluation data to improve the intervention1

Process evaluation question: Are school officials familiar with Pathways? Are family members familiar with Pathways?
 Instruments used: structured interviews with school officials
 Outcomes: 1) prepackaged uniform introduction to Pathways for administrators and families; 2) regional training

Process evaluation question: Do field coordinators have a clear idea of their responsibilities?
 Instruments used: interviews with field coordinators
 Outcomes: 1) field coordinators’ checklist and protocol; 2) field coordinators attend trainings; 3) field coordinators document visits to schools and activities

Process evaluation question: Are schedules and implementation methods consistent across schools?
 Instruments used: structured interviews with PE instructors
 Outcomes: 1) minimum standards developed for intervention implementation; 2) work with administrators to accommodate Pathways activities

Process evaluation question: What are teachers’ reactions to the Pathways curriculum? Are they teaching the curriculum according to the training and teachers’ manual?
 Instruments used: 1) teacher implementation checklists; 2) direct observation of lessons
 Outcomes: 1) regular classroom visits by intervention staff; 2) greater emphasis in training on the need to follow the entire curriculum; 3) regional training

Process evaluation question: Who attended family events? Were family members satisfied with the family events?
 Instruments used: 1) attendance data; 2) return cards; 3) booth evaluation cards
 Outcomes: 1) special invitations to tribal and health council members; 2) thank-you postcard sent to participating family members; 3) change in booth types and numbers

Process evaluation question: Was the family advisory group working?
 Instruments used: minutes from family advisory group meetings
 Outcomes: improved efforts to form family advisory groups

Process evaluation question: What percentage of time did students spend being active? Was the PE intervention consistent across sites?
 Instruments used: SOFIT
 Outcomes: 1) improved PE training; 2) on-site PE mentoring system; 3) more effort to ensure between-site consistency in scheduling

Process evaluation question: Were food service guidelines being followed?
 Instruments used: 1) observation of food service preparation; 2) interviews with food service personnel
 Outcomes: 1) continued work with food service personnel on implementing and monitoring the guidelines; 2) monthly visits; 3) improved communication between visits

Process evaluation question: Were Pathways students exposed to the intervention? Were control students exposed to intervention components?
 Instruments used: exposure questionnaire
 Outcomes: 1) more specificity about Pathways intervention components; 2) additional effort put into the family component; 3) improvements in the recess and PE components; 4) improvements in the school food service component
1 PE, physical education; SOFIT, system for observing fitness instruction time (15).

Intervention environment

Process evaluation was used to assess the extent to which school officials were familiar with the Pathways study. Five structured interviews were conducted with principals and assistant principals at the intervention schools. In general, all administrators were aware of Pathways and provided positive feedback on the intervention components. However, some administrators were not aware of all 4 components and at some sites the principals indicated that family members did not understand the role, objective, or integration of Pathways within the school. As a result of this information, a prepackaged, uniform introduction to Pathways was developed. A one-page information handout, written for family members, was developed to be distributed to families at the start of the school year. It was also apparent that all of the administrators wanted to be kept informed about Pathways; some expressed the desire to be included in future Pathways training with their staff. This suggestion, along with others from teachers, led to the decision to hold regional trainings at which more school personnel could participate.

As a second example, field coordinators were interviewed about their activities (both with the schools and with the universities) and their perceptions of the role they filled within the project. These interviews showed that the field coordinators did not have a clear idea of their responsibilities; additionally, their descriptions of their activities varied considerably from person to person. Some made suggestions about communication between themselves and the university; both the amount and type of communication seemed to vary from site to site. Although field coordinators were largely taking on activities they were trained for or felt competent to undertake, the choice of activity was not consistent from site to site. For example, one field coordinator spent a great deal of time and attention on the food service component, another felt more comfortable with the physical activity component, and others stated that their jobs were mostly administrative in nature. This information led to the development of a field coordinators’ checklist and accompanying protocol, so that the field coordinators’ roles could be standardized across schools and sites. In addition, all field coordinators are now required to attend all trainings and to document each of their visits to the schools and the activities they undertake during each visit.

A third and final example of the use of data from the process evaluation is our examination of the schedules and implementation methods of the Pathways intervention components by school. In one school, a Pathways PE instructor was teaching PE; in another, Pathways PE and recess were being taught by Pathways-trained staff; in the 2 other schools, Pathways PE was being taught by elementary school PE teachers and staff. Scheduling of the curriculum, PE, and recess components also differed by school. This information suggested that the Pathways staff would have to work harder to get schools to incorporate the Pathways activities into their daily routines; it also suggested that the implementation of Pathways could be expected to differ across schools and that minimum standards for implementation would need to be set by the project.

Curriculum component

Process evaluation was used to get feedback from teachers on the Pathways curriculum. Most of the teachers filled out teacher implementation checklists and gave above-average marks to the 12 lessons implemented during the fall semester. There was a trend toward increased satisfaction with the lessons as the weeks advanced. A few teachers provided open-ended comments at the bottom of the forms, but many teachers did not write any comments. It was also clear that the questions on the forms did not provide sufficiently useful information for the working group to determine whether the teachers believed the lessons had met their stated objectives. As a result, the forms were changed for the second half of the year to elicit more specific information from teachers related to the teaching and content objectives and to the implementation methods suggested in the curriculum materials.

Lessons were observed directly in 10 of 11 third-grade classrooms to objectively assess the implementation of the curriculum. In general, the students actively participated in and enjoyed the lessons, retained some of the primary concepts, and enjoyed the story circle and the Pathways music. On the other hand, some teachers were not following the lesson plan entirely, were omitting parts of the lesson, were not working through the activities together with their students, were not using the story visuals, were taking longer to teach the lesson than was described in the curriculum, and sometimes displayed or expressed frustration with the group work and emphasis on activity-type learning. Some teachers also spent a large proportion of their time in management-type activities. On a scale of 0–4 (0 being no enthusiasm, 4 being extremely enthusiastic), 7 of the 10 teachers observed showed high levels of enthusiasm and 3 of 10 were moderately enthusiastic. These observations indicated that greater emphasis needed to be given during training to the need to follow the entire curriculum. The observations also pointed out the need for regular classroom visits by intervention staff members so that assistance and support for new teaching behaviors could be provided.

Interviews with teachers revealed that they enjoyed teaching the Pathways curriculum and were pleased with the content and focus on traditional values. Teachers expressed a desire for more flexibility, however, and a frustration with the need to teach the whole curriculum; also, they expressed a concern that the later lessons were less “meaty” than those earlier in the semester. All teachers said that the lessons were too long. Coordination between food service, PE, and curriculum was going well for the most part, although problems were identified in some areas. Davis et al (14) describe in more detail how the curriculum was modified in response to these data.

Teachers indicated that the family packs were being taken home and shared with a family member. Teachers expressed reservations about attending a centralized training for the second half of the curriculum. Instead, they stated a preference for local training, which could include all of the Pathways team members at their school. They felt this would improve coordination and support for the program.

After the process evaluation results were provided to the curriculum working group, the decision was made to hold localized trainings in the full-scale study. The following items were given added emphasis during the training: 1) teaching time for the curriculum, 2) consistency in teaching the complete lesson and the sequence of activities within each lesson, 3) the field coordinator’s role, 4) the proper use of visual aids, 5) the expectation of changed behavior among participating students and teachers, 6) the total scope of the Pathways intervention, and 7) the intention of the curriculum working group to include students at all levels of academic achievement.

Family component

Process evaluation data indicated good participation in the family component. In the 4 schools, attendance at the Family Fun Nights ranged from 50% to 91% of students, each accompanied by at least one family member. The return cards from adults and students showed that Family Fun Night participants did visit the required number of booths during the event. The comments on the back of the cards, collected to help determine which booths were most successful, led to the selection of the 7 most popular booths for future Family Fun Nights. The booth evaluation cards filled out by Pathways staff also provided useful suggestions for conducting future Family Fun Nights, such as which booths were most popular, the kinds of preparations that changes would require, and the comparative difficulty or ease with which the booths were operated.

Few tribal council and health authority members attended Family Fun Nights. For future Family Fun Nights, therefore, an invitation targeting these individuals was designed and will be distributed 1–2 wk before the event. To reinforce the Pathways Family Fun Night experience, a postcard thanking families for attending the event was designed and will be sent to each registered family.

Only one of the schools was able to establish a family advisory group; however, no minutes of the meetings held by this group were available. This information suggested to the family working group that efforts to form family advisory groups and to hold meetings must be increased or reconsidered.

PE and recess component

Process evaluation was also used to assess the implementation of the PE and recess component. PE classes were observed in 3 of 4 schools. The SOFIT (system for observing fitness instruction time) method (15) was used and provided 3 different summary measures: percentage of time spent by students being active, lesson content, and teacher behavior. The data on the percentage of time students were active for the 3 classes observed are shown in Figure 1. The process evaluation indicated that teachers spent too much time in management activities, that students did not reach the expected level of 50% of time spent in moderate-to-vigorous physical activity, and that teacher training was perhaps insufficient (TL McKenzie, personal communication, 1995).

FIGURE 1.

Percentage of time students were engaged in 5 different types of activity, as observed in physical education classes in 3 of the 4 schools during the feasibility phase of the Pathways study.
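
SOFIT summary measures are built from activity-level codes recorded at regular observation intervals, and the percentage of class time in moderate-to-vigorous physical activity is the share of intervals coded at the walking level or above. The sketch below (Python) uses made-up interval codes, and the code definitions are stated as an assumption rather than taken from the trial's protocol:

# Assumed SOFIT-style activity codes (1 = lying, 2 = sitting, 3 = standing,
# 4 = walking, 5 = very active); codes 4 and 5 count as moderate-to-vigorous
# physical activity (MVPA).
MVPA_CODES = {4, 5}

def percent_mvpa(interval_codes):
    """Share of observation intervals spent in MVPA, as a percentage."""
    mvpa = sum(1 for code in interval_codes if code in MVPA_CODES)
    return 100.0 * mvpa / len(interval_codes)

# One hypothetical lesson's interval codes (not data from the trial):
observed = [2, 3, 4, 4, 3, 5, 4, 2, 3, 3, 4, 5]
print(f"{percent_mvpa(observed):.0f}% of class time in MVPA")  # 50%

A class meeting the Pathways minimum standard would score at least 50% by this measure.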

Recess activities were observed in 3 schools. In each school, many but not all students participated. In one school, recess was held during a period when several students were pulled out for other activities. Students who participated in recess activities walked between trail markers and performed the activities at each marker at least once. Compared with several free-play recess activities observed at the same schools, however, the Pathways recess activities appeared to engage some of the children (those who were usually most active) in less activity than free-play recess did.

At the training, teachers were taught to teach both a health-related fitness and locomotor skills activity (type 1 activity) and a skill-related fitness activity (type 2 activity) during each PE period. The recommended schedule paired cooperative games with Frisbee, parachute with Frisbee, aerobic games with soccer, and walking, jogging, or running with field games. Not all teachers followed this schedule, however, and the level of student participation seemed to vary from site to site.

During the debriefing, teachers suggested that the training was not sufficient for creating understanding about the need to include both type 1 and type 2 lessons in each PE class; in addition, it appeared that the training, taken out of context without students, had not accomplished its goal of providing teachers with models of how to conduct their classes. All teachers suggested that having a Pathways staff member teach 1 or 2 lessons on-site would be helpful. PE teachers also commented on the new teaching styles that the SPARK (Sports Play and Active Recreation for Kids) curriculum encouraged (16). These styles require more student participation than the teachers were accustomed to and the teachers reported that this sometimes created management problems. Many of the teachers discussed the need to control their students during the lessons; as the teacher implementation checklists and the SOFIT observations suggested, many teachers used time-outs or frequently stopped class activity. Teachers suggested that Pathways staff was unaware of these management problems and so did not provide adequate training in how to handle them.

Process evaluation data revealed that the PE intervention was not being implemented in a standardized manner across sites. There was variability in who taught PE, the number of times PE was taught per week, the duration of the PE classes, and the quality of the classes as reflected in the proportion of time students participated in moderate-to-vigorous physical activity. These results have led to 1) the incorporation of a PE mentor system, in which a Pathways staff member visits the school once a month to observe the PE instructor, to model the teaching behaviors by teaching a class, and to talk with the instructor about his or her difficulties teaching the curriculum; 2) more emphasis during training on managing potential problems during classes; and 3) more effort to ensure that the duration and number of PE classes are consistent across sites. In addition, the plans to have structured recess were dropped in favor of a plan to foster more activity during free-play recess.

School food service component

Food service preparation was observed for 1 d at all 4 sites. At all sites, the Pathways food service intervention materials were visible and the food service personnel were aware of the program. The food service personnel were preparing and serving lower-fat food, and Pathways food service posters on methods for lowering the fat content of foods were displayed, although usually not in the food preparation area.

All sites were aware of the 8 food service guidelines for lowering the fat in school meals (17). All sites were observed draining and rinsing cooked ground beef or reported doing so, but one site was observed not rinsing and draining cooked ground turkey. All 4 sites offered some type of lower-fat milk and prepared poultry without added fat. Two sites were using lower-fat cheese. No fat was observed being added to vegetables during either preparation or serving at any of the schools. Butter had been removed from the serving line in 3 of 4 schools, and no students were observed taking butter in the fourth school. All schools were observed serving second helpings of the school entree, although one site was concentrating on serving second helpings of fruit, vegetables, and breads only. No sites offered choices of fruit and vegetables to the students, although a few sites provided more than one fruit or vegetable on the tray.

Interviews with food service directors or managers revealed that they were trying to implement the behavioral guidelines and were pleased with the training and monthly visits provided by the Pathways staff. Each of the directors could name 1 or 2 ways they had learned to prepare and serve food or ways they were preparing and serving food differently as a result of the training. For example, one said that butter was no longer being put on pancakes or waffles, another said that more fresh fruit and vegetables were being served, a third said that cooked ground beef was rinsed and drained, and a fourth said that only 2% milk was offered. None of the food service personnel distinguished between the training and the monthly visits, seeing both as opportunities to learn new things and get advice from the Pathways food service staff.

On the basis of the results of the process evaluation, the following recommendations were made for the food service intervention in the full-scale study: 1) the Pathways field coordinators should adhere to a culturally appropriate protocol and greet all personnel on each visit (not just the food service director), 2) all sites should continue to work with the food service personnel on implementing and monitoring the Pathways behavioral guidelines, 3) the Pathways nutrition staff should continue to visit each intervention school monthly and spend ≥3 h with the school food service personnel, 4) the Pathways nutrition staff should continue to develop and refine new visual materials, and 5) the Pathways nutrition staff should continue to communicate on a regular basis with school administrators and food service managers.

Exposure

Fifteen exposure questions were administered to students in the 8 control and intervention schools within 3 wk of the end of the intervention. Students were asked to report whether they had participated in or otherwise been exposed to activities that were supposed to occur as part of the Pathways intervention. All items were worded so as to be meaningful to students in both the control and intervention schools. One of the purposes of the questionnaire was to determine the extent to which students in control schools were exposed to “intervention-like” activities, even though these would not have been directly sponsored by Pathways. Five questions concerned concepts or activities covered in the curriculum, 4 concerned activities undertaken with family members, 5 concerned PE or recess activities, and 1 concerned food service items served at school. A total of 257 students (81.3% of those enrolled) answered the questions.

Of control students, ≥40% reported exposure to 7 of the 15 items. However, of these items, only 2 targeted a key activity that was part of the intervention program, whereas the other 5 described activities that could easily be part of any elementary school curriculum. Of intervention students, > 80% reported exposure to 7 of the 15 items. However, < 70% of the intervention students reported exposure to 5 of the 15 items.
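
The item-by-item comparison behind these figures can be sketched as follows (Python); the function, data, and output format are hypothetical, with only the idea of flagging items that ≥40% of control students endorse taken from the surrounding text:

# Hypothetical sketch: for each questionnaire item, compute the percentage
# of control and intervention students reporting exposure, and flag items
# common among controls (they may describe ordinary school activities and
# so are candidates for deletion from the questionnaire).

def exposure_summary(item_responses):
    """item_responses maps item text to {'control': [0/1, ...], 'intervention': [0/1, ...]}."""
    for item, groups in item_responses.items():
        pct = {g: 100.0 * sum(r) / len(r) for g, r in groups.items()}
        flag = " <- common among controls" if pct["control"] >= 40 else ""
        print(f"{item}: control {pct['control']:.0f}%, "
              f"intervention {pct['intervention']:.0f}%{flag}")

exposure_summary({
    "kept a record of exercise": {  # invented responses for a real item
        "control": [0, 1, 0, 0], "intervention": [1, 1, 0, 1]},
})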

The results of the exposure measures suggested that several items that described activities that could easily be part of any elementary school curriculum could be deleted from the questionnaire. Because < 40% of control school children reported exposure to 9 of 15 items, the data suggest that exposure to some of the key activities of Pathways appears limited in the absence of the Pathways intervention program. The exposure data also provided information on parts of the Pathways intervention that could benefit from improvement. Because 75% of intervention children reported being exposed to Family Fun Night and only 64% reported interviewing a parent, additional effort needs to be put into the family component of the intervention.

Important components of the PE and recess intervention could also benefit from attention. Only 36% of children reported taking hikes during recess, only 65% reported keeping a record of exercise, and 69% reported playing Frisbee on the same day that they played with a parachute. Other process evaluation data support these data: few PE teachers consistently included type 1 and type 2 activities in their classes, recess was implemented sporadically and differently in the 4 schools, and teachers and students reported that they did not understand the role of the Mount Pathways poster in their classrooms. In addition, anecdotal evidence suggests that a wording change may be required to clarify the meaning of hike because the word trail is used consistently in the Pathways intervention.

The exposure data describing access to the food service intervention also provided important information. Only 59% of intervention children reported drinking low-fat milk, whereas 68% of control students reported doing so. This is certainly a key component of the food service intervention and this result indicates that additional effort will be required in this area.

DISCUSSION

The process evaluation data were useful for elucidating areas in which the intervention was working as desired and areas in which improvement was needed. The analyses described here were provided to chairpersons of the working groups at a 2-d workshop held immediately after the first semester of intervention implementation; the information enabled the working groups to understand the strengths of the intervention components as well as to select areas for improvement. These analyses were also provided to the process evaluation working group and helped to clarify process evaluation needs; in most cases, it was clear that the process evaluation required trimming and instruments required revision.

In addition to providing information on the extent and nature of implementation, the pilot testing of the process evaluation instruments revealed the need for more precise instruments. For quality control and also to ensure consistency of implementation across sites, working groups were encouraged to delineate minimum standards of acceptability for each of the intervention components. Based on these standards, instruments could be developed to measure whether the standards were being met. For example, if parts of the curriculum were crucial and other parts were optional, observation could be used to determine whether the crucial parts were being taught. If role-modeling of certain behaviors was considered important, then these behaviors could be documented.

The data revealed that observation and structured interviews seemed to provide the most informative data; however, these methods were the most expensive and time consuming and required the highest skill level to undertake. The SOFIT observation method was found to be the most useful because it included content and quantitative measures that could be compared across sites in an objective manner. An instrument that can achieve this level of objectivity is important because of the sensitive nature of the process evaluation data.

The methods used to generate the process evaluation instruments required the working group members to achieve consensus, which proved difficult. Because the working group was composed of members of the intervention committees, these members had inherent concerns that the process evaluation would make their work look bad. Every attempt was made to develop the evaluation in a participatory manner; however, most of the work was done by one member who had no role in the development of the intervention. Although the literature is full of idealism regarding the uses of process evaluation for formative and summative purposes, in reality, many persons are sensitive to having their work evaluated in such an in-depth, context-based manner. However, as we have suggested, process evaluation can provide useful information during the intervention development phase. Furthermore, pilot testing the process evaluation instruments is critical to the success of the ultimate process evaluation conducted during the implementation phase of a large project such as Pathways. Finally, because structured, quantitative, highly objective tools can be more effective than qualitative methods, which are more dependent on the skills and biases of the researcher and the context in which they are used, the approach described here, which combines qualitative and quantitative methods of gathering information, can help to overcome some of the sensitivity to process evaluation.

Footnotes

2 Supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health (U01-HL-50869, U01-HL-50867, U01-HL-50905, U01-HL-50885, and U01-HL-50907).

References

1. Dignan MB. Measurement and evaluation of health education. 3rd ed. Springfield, IL: Charles C Thomas; 1995.
2. Israel BA, Cummings KM, Dignan MB, et al. Evaluation of health education programs: current assessment and future directions. Health Educ Q. 1995;22:364–89. doi: 10.1177/109019819402200308.
3. Windsor RA, Baranowski T, Clark N, Cutter G. Evaluation of health promotion and education programs. Mountain View, CA: Mayfield Publishing Co; 1994.
4. McGraw SA, Stone EJ, Osganian SK, et al. Design of process evaluation within the Child and Adolescent Trial for Cardiovascular Health (CATCH). Health Educ Q. 1994;21(suppl 2):S5–26. doi: 10.1177/10901981940210s103.
5. Steckler A. The use of qualitative evaluation methods to test internal validity. Eval Health Prof. 1989;12:115–33.
6. Sorenson G, Thompson B, Glanz K, et al. Work site-based cancer prevention: primary results from the Working Well Trial. Am J Public Health. 1996;86:939–47. doi: 10.2105/ajph.86.7.939.
7. Goodman RM, Wandersman A. FORECAST: a formative approach to evaluating community coalitions and community based initiatives. J Community Psychol. 1994:6–25.
8. Rogers EM. Diffusion of innovations. 4th ed. New York: Free Press; 1995.
9. Perry CL, Baranowski T, Parcel G. How individuals, environments, and health behavior interact: social learning theory. In: Glanz K, Lewis FM, Rimer B, editors. Health behavior and health education. San Francisco: Jossey-Bass; 1990. p. 161–86.
10. Steckler A, McLeroy KR, Goodman RM, Bird ST, McCormick L. Toward integrating qualitative and quantitative methods: an introduction. Health Educ Q. 1992;19:1–9. doi: 10.1177/109019819201900101.
11. Helitzer-Allen D, Kendall C. Explaining differences between qualitative and quantitative data: a study of chemoprophylaxis during pregnancy. Health Educ Q. 1992;19:41–54. doi: 10.1177/109019819201900104.
12. Stevens J, Cornell CE, Story M, et al. Development of a questionnaire to assess knowledge, attitudes, and behaviors in American Indian children. Am J Clin Nutr. 1999;69(suppl):773S–81S. doi: 10.1093/ajcn/69.4.773S.
13. Miles MB, Huberman AM. Qualitative data analysis: a sourcebook of new methods. Newbury Park, CA: Sage Publications; 1984.
14. Davis SM, Going SB, Helitzer DL, et al. Pathways: a culturally appropriate obesity-prevention program for American Indian schoolchildren. Am J Clin Nutr. 1999;69(suppl):796S–802S. doi: 10.1093/ajcn/69.4.796S.
15. McKenzie TL, Sallis JF, Nader PR. SOFIT: system for observing fitness instruction time. J Teach Phys Educ. 1991;11:195–205.
16. Sallis JF, McKenzie TL, Alcaraz JE, Kolody B, Hovell MF, Nader PR. Project SPARK: effects of physical education on adiposity in children. Ann N Y Acad Sci. 1993;699:127–36. doi: 10.1111/j.1749-6632.1993.tb18844.x.
17. Snyder P, Anliker J, Cunningham-Sabo L, et al. The Pathways study: a model for lowering the fat in school meals. Am J Clin Nutr. 1999;69(suppl):810S–5S. doi: 10.1093/ajcn/69.4.810S.
