Health Education Research. 2007 Oct 22;23(4):732–743. doi: 10.1093/her/cym060

Process evaluation of the Pool Cool Diffusion Trial for skin cancer prevention across 2 years

Cam Escoffery 1,*, Karen Glanz 1, Tom Elliott 1
PMCID: PMC2733800  PMID: 17956884

Abstract

Though process evaluation of health programs has received growing attention, few interventions have reported process evaluation over multiple years. This article describes 2 years of process evaluation (2003–04) for the Pool Cool Diffusion Trial. Pool Cool is a skin cancer prevention program designed to increase sun protection habits among children and to improve organizational and environmental supports for sun protection at swimming pools. Each year, 80 telephone interviews and 40 site visits at pools across the United States were completed to examine how fully the program was implemented and how extensively its components were used in the two study conditions. Major components of the Pool Cool program, including sun safety lessons, sun safety signs and sunscreen use, were implemented at high rates. Between the 2 years, most of the core elements were either maintained or increased in use. There were no significant differences in implementation between the basic and enhanced conditions. Reasons given for successful implementation were the provision of a toolkit, the ease of implementing the program, the enjoyment of the program by pool staff and children and the field coordinators' support. These data provide information on programmatic factors that contribute to successful program diffusion.

Introduction

The body of literature on the process evaluation of health programs has grown substantially in the past decade [1]. Process evaluation measures the frequency and extent of implementation of selected program components and related factors [2, 3]. It has become an important contributor to comprehensive program evaluation [1]: data from process evaluations can assist in understanding how and why interventions work [4, 5] and how intervention activities link to outcomes [6].

A number of large-scale community interventions [4, 7], worksite programs [8, 9] and school-based interventions [10, 11] have conducted process evaluations of their programs. However, few interventions have reported process evaluation over multiple years. This study presents process evaluation data from the same program across 2 years.

Skin cancer, the most common form of cancer in the United States, is increasing [12], and childhood exposure to the sun's ultraviolet rays increases the risk for skin cancer later in life [13]. Skin cancer prevention programs that encourage reducing sun exposure, seeking shade and using sunscreen and protective clothing can influence children, families and outdoor recreation environments [14]. The Pool Cool skin cancer prevention program is a multicomponent educational and environmental intervention that was systematically developed, evaluated in a randomized trial in two states and found to have significant positive effects on children's sun protection behaviors and sun safety environments at swimming pools [15] and to have reduced sunburns among lifeguards [16]. Pool Cool was designed for children 5–10 years of age, their parents and aquatics staff at outdoor swimming pools.

Subsequently, the Pool Cool Diffusion Trial was funded by the National Cancer Institute. This trial aims to test the effects of two strategies for diffusion of the Pool Cool program during the summer season across a 3- to 4-year period. Pools in the basic condition received a kit with the Pool Cool lessons, a Leader's Guide and supporting materials, and pools in the enhanced condition received the basic materials with additional reinforcements and sun safety environmental supports. Within the context of the diffusion trial, the aims of the Pool Cool program are to: (i) increase sun protection habits and decrease sunburns among children and (ii) improve organizational and environmental supports for sun protection at swimming pools [17]. Beginning in 2003, >480 pools, grouped in metropolitan regions, were enrolled in the trial and randomized into basic and enhanced conditions. Each year, process evaluation was conducted by an independent team (i.e. not the intervention staff) through site visits and telephone interviews at 120 participating swimming pools.

This article describes the process evaluation methods and results for the first 2 years (2003–04) of the Pool Cool Diffusion Trial. It examines the extent to which the program was implemented and the use of various program components between the study conditions. The evaluation questions for this process evaluation are:

  • To what degree were the core and supplemental intervention components implemented in the two study conditions by participating pools?

  • What program- and pool-related factors supported or hindered program implementation?

Methods

Diffusion trial design, treatment groups, and intervention components

The Pool Cool Diffusion Trial uses a multilevel nested experimental design across 3–4 years of intervention. In each metropolitan region where pools take part in the study, a field coordinator (FC) works with between 4 and 15 pools. Clusters of pools that are linked to FCs are randomized to either a basic or enhanced condition [17].

The pools in the basic condition received staff training on the program and a Tool Kit containing a Leader's Guide that describes how to implement the program. Educational and activity components of the program include eight laminated lesson cards, a Mini Big Book of colorful cartoon depictions to use interactively with the lessons and guides and materials for poolside sun protection activities. Environmental components were a large dispenser of sunscreen and an aquatics-targeted sunscreen tips sign. Other materials distributed were a Decision Maker's Guide to help pool managers make environmental changes (such as increasing shade), a Resource Guide listing organizations that provide sun safety products, a CD-ROM with program information and electronic copies of materials, and incentives to reinforce the sun safety messages. Pools in the enhanced condition received the basic condition materials, additional sun safety items (reinforcements) for distribution and environmental supports including more aluminum sun safety signs and shade structures [17].

Sample

A probability sample of eight metropolitan regions was selected to identify 40 pools for site visits, and a probability sample of 15 regions was selected to identify 80 pools for the telephone interviews. A probability sample uses a random process to ensure that each unit of a population has a specified chance of selection [18]. The sample was selected in regional clusters based on the overall trial design and to enhance the efficiency of travel for site visits. The total number of pools in these regions exceeded the number planned for the evaluation, to allow for scheduling problems or pools that were unable to participate (e.g. undergoing facility renovations, closed early). The samples for the telephone interviews and site visits were stratified to obtain equal numbers of pools in the basic and enhanced conditions. Each selected pool scheduled for either a site visit or a telephone call was mailed a letter informing the pool liaison about the process evaluation. Pools that were visited in 2003 were removed from the sample for 2004.
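To make the stratified sampling concrete, here is a minimal sketch in Python of drawing an equal-sized simple random sample of pools from each study condition. The pool records, field names and counts are hypothetical, and the regional clustering used in the actual design is omitted for brevity.

```python
# Minimal sketch of stratified random sampling of pools by study condition.
# The sampling frame below is hypothetical; the real frame was organized by
# metropolitan region, which this simplification ignores.
import random

random.seed(2003)  # reproducible draw

# Hypothetical frame: ~480 pools, half in each condition.
pools = [{"pool_id": i, "condition": "basic" if i % 2 == 0 else "enhanced"}
         for i in range(480)]

def stratified_sample(frame, stratum_key, n_per_stratum):
    """Draw a simple random sample of equal size from each stratum."""
    strata = {}
    for unit in frame:
        strata.setdefault(unit[stratum_key], []).append(unit)
    sample = []
    for units in strata.values():
        sample.extend(random.sample(units, n_per_stratum))
    return sample

# 40 basic + 40 enhanced pools for the telephone interviews.
interview_sample = stratified_sample(pools, "condition", 40)
print(len(interview_sample))  # -> 80
```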

Data collection procedures

The process evaluation for Pool Cool included several components: annual site visits and telephone interviews with a sample of participating pools; items on surveys; monitoring of FC training; and semistructured logs kept by FCs and research staff to record relevant information and contacts. This article reports on the main annual process evaluation across 2 years.

The main annual process evaluation used two data collection methods: (i) telephone interviews and (ii) site visit interviews and observations at participating pools. The Institutional Review Boards at the University of North Carolina at Chapel Hill and Emory University approved this research. Program implementation data were collected from the pool manager or supervisor at participating pools, and observations of the pool environments were conducted to document the sun protection practices of the aquatic staff and to identify sun protection features of the pool environments.

Site visits

Evaluators contacted the pool managers to arrange for the site visits. If the pool manager was not available, the FC in the region assisted in scheduling the 2-day visit. The selected pools were clustered into groups of two to three pools that were located geographically close together and could be visited in the same day. The site visits included the same questions as the telephone interview, with the addition of observations of pool and aquatic staff characteristics to validate responses from the on-site interviews. Each visit lasted ∼1–2 h. Evaluators tried to keep the site visits as brief as possible, recognizing that pool staff are busy during the summer months. After arriving at the pool, the evaluator conducted the interview after obtaining consent and then completed the observation using a structured protocol. Pools were replaced with an alternate if they did not implement the program or did not respond to eight contact attempts over a period of 2 weeks.

Telephone interviews

Evaluators called the pool contact from the sample to schedule the interviews beginning in mid- to late July. Telephone interviews were completed after receiving consent from the contact.

Training of evaluation staff

The evaluation team was led by a lead evaluator and included at least two field evaluators. The field evaluators attended a 1-day training to learn about the Pool Cool process evaluation, the interview and observation instruments and interviewing techniques. They also took part in a training site visit at a pool to practice interviews and observations.

Measures

A 57-item interview schedule was used to guide collection of information on program participation, implementation and challenges to implementation (see Table I). It contained closed-ended and open-ended questions, with the latter intended to elicit in-depth comments about how pools responded to specific Pool Cool materials and the program in general. Site visit observations were used to further document program implementation, observe the pool environment and sun safety practices of aquatic staff and validate responses about sun safety practices and program implementation from the interview. The interview asked about the Pool Cool kit of materials, training of pool staff on the program and receipt and use of the educational and environmental components. Evaluators used the observation checklist to document the availability of sunscreen, shade structures and sun safety signs at the pools. They also noted lifeguard practices related to sun safety behaviors and Pool Cool clothing (i.e. T-shirts or hats) or items around the pool.

Table I. Summary of Pool Cool implementation measures

Interview measures (percent of pools reporting each item; ‘Yes/No’ responses unless a scale is noted)
    Receipt of kit: Received Pool Cool kit
    Participation in training: Completed training
    Receipt of education components: Received Leader's Guide; Still have Leader's Guide; Received Pool Cool lessons; Received Mini Big Book; Received Decision Maker's Guide; Received Resource Guide; Received Pool Cool Disk/CD-ROM
    Receipt of environmental components: Received sun signs; Received large pump bottle
    Use of education components: Ever used Leader's Guide; Used Pool Cool lessons; Ever used Mini Big Book; Used Decision Maker's Guide; Used Resource Guide; Used Pool Cool Disk/CD-ROM
    Use of supplemental education components: Ever used Weather Watch/UV sheets; Ever used Sun Jeopardy game; Ever used Purple People Color block; Ever used Emperor's Clothes; Ever used UV exposure cards/wristbands
    Use of environmental components: Posted sun signs; Any use of large pump bottle of sunscreen
    More intensive use of program components: Made or obtained extra materials; Ordered or copied brochures; Made or obtained additional signs
    Pool characteristics: Reported presence of shade structures
    Lifeguard practices: Frequency of wearing a hat; Frequency of wearing a shirt; Frequency of applying sunscreen (four-point scale: ‘rarely’ to ‘almost always’)
    FC support: Percent reporting FC was helpful (four-point scale: ‘not’ to ‘very helpful’)

Summary implementation scores
    Primary implementation score (five subscales): Training; Lessons (6 items, score 0–6, α = 0.44); Activities (5 items, score 0–5, α = 0.51); Signage (2 items, score 0–2, α = 0.77); Sunscreen (2 items, score 0–2, α = 0.89)
    Supplementary implementation score (10 items): Leader's Guide (2 items); Resource Guide (2 items); Signage (1 item); Disk/CD-ROM (2 items); Decision Maker's Guide (2 items); α for total = 0.44

Observational measures
    Percent with signs displayed
    Percent with free sunscreen available
    Percent with shade structures available
    Percent with pool staff wearing protective clothing

Data analyses

The field evaluators recorded all data manually on paper forms during the site visits and telephone interviews, both for practical reasons (i.e. being outdoors at swimming pools) and to minimize disruption to the flow of the interview. Each form was entered into a database. A 20% random sample of the site visit and telephone interview records was reviewed for data entry errors; the proportion of potential errors across all records was <0.01% for both years.
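The quality-assurance step just described can be sketched as follows: draw a 20% random sample of entered records, compare each field against its paper form and report the error proportion. The record structure and the comparison data below are hypothetical, not the trial's database.

```python
# Sketch of the 20% data-entry audit: sample entered records, compare each
# field against the (hypothetical) paper form, and report the error rate.
import random

random.seed(1)

# Hypothetical database records and matching paper forms (field -> value).
entered = [{"record_id": i, "received_kit": "yes", "posted_signs": "yes"}
           for i in range(120)]
paper = [dict(rec) for rec in entered]
paper[5]["posted_signs"] = "no"  # one simulated data-entry error

audit = random.sample(range(len(entered)), k=len(entered) // 5)  # 20% sample
fields_checked = errors = 0
for i in audit:
    for field, value in paper[i].items():
        if field == "record_id":
            continue
        fields_checked += 1
        errors += entered[i][field] != value
print(f"error proportion: {100 * errors / fields_checked:.2f}%")
```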

The quantitative data were analyzed using SPSS [19]. Descriptive statistics were run for the major implementation items. Chi-square analyses and t-tests were conducted to test for differences in implementation of specific program elements and in summary implementation scores between the basic and enhanced conditions and between years. The level of significance was set at P < 0.05 for these analyses.
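As an illustration of these analyses, the sketch below runs a chi-square test on a 2 × 2 condition-by-implementation table and an independent-samples t-test on summary scores, using scipy in place of SPSS. All counts and scores are invented for demonstration; none are the trial's data.

```python
# Illustrative re-creation of the tests described above (scipy, not SPSS).
# All counts and scores are invented for demonstration purposes.
import numpy as np
from scipy import stats

# 2x2 table: rows = condition (basic, enhanced); cols = used lessons (yes, no).
table = np.array([[56, 5],
                  [47, 12]])
chi2, p, dof, _expected = stats.chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.1f}, P = {p:.3f}")

# Independent-samples t-test on hypothetical primary implementation scores.
rng = np.random.default_rng(0)
basic_scores = rng.normal(73, 10, 61)     # n = 61 basic pools
enhanced_scores = rng.normal(68, 10, 59)  # n = 59 enhanced pools
t, p = stats.ttest_ind(basic_scores, enhanced_scores)
print(f"t({len(basic_scores) + len(enhanced_scores) - 2}) = {t:.2f}, P = {p:.3f}")
```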

Two summary scores of overall program implementation were developed. Scores were calculated by aggregating responses to individual items to reflect the extent of implementation of the primary and supplementary Pool Cool components at each pool (Table I). The primary components were categorized into training, education and environment. The supplementary elements consisted of making additional copies of the educational materials; ordering brochures listed in the Leader's Guide; obtaining more materials for the activities; making additional signs and using the Resource Guide, CD-ROM and Decision Maker's Guide. The internal consistency (alpha) coefficients of the subscales for the primary implementation score were moderate to high, ranging from 0.44 to 0.89 (Table I). The lessons, activities and supplementary implementation scales had lower internal consistency, likely due to the variety of items comprising each scale; for example, the lessons scale included questions about both receipt and use of the Leader's Guide, lesson sheets and Mini Big Book. Principal component analysis with varimax rotation was run on the items comprising the primary implementation score; applying the eigenvalue-greater-than-1 criterion to the subscales yielded two factors: education and environment.
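A minimal sketch of how a two-item subscale score and its internal consistency could be computed from yes/no items follows. The 'sunscreen' item matrix is fabricated so that the two items usually agree, loosely mimicking the high alpha (0.89) reported for that subscale; the principal component analysis step is not reproduced here.

```python
# Sketch: aggregate yes/no items into a subscale score and compute
# Cronbach's alpha. The item responses are fabricated for illustration.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (pools x items) matrix of 0/1 responses."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(7)
# Hypothetical 2-item sunscreen subscale for 120 pools: item 2 agrees with
# item 1 ~85% of the time, so the subscale is internally consistent.
item1 = rng.integers(0, 2, size=120)
flip = rng.random(120) < 0.15
item2 = np.where(flip, 1 - item1, item1)
sunscreen = np.column_stack([item1, item2])

score = sunscreen.sum(axis=1)        # 0-2 per pool, as in Table I
pct = 100 * score.mean() / 2         # extent of implementation as a percent
print(f"alpha = {cronbach_alpha(sunscreen):.2f}, mean implementation = {pct:.1f}%")
```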

Responses from open-ended questions were compiled into a text document from the database. For the qualitative data analysis, a codebook of major themes was developed and two evaluation team members coded the responses. Coding discrepancies were discussed in meetings, and a final decision was made on how to code each discrepant comment or response.

Additional analyses were conducted to validate the responses from site visit interviews against observations. The purpose of these analyses was to assess the extent to which site interview responses on signage, shade structures and lifeguard practices corresponded to the observation data. The kappa statistic was used to test agreement between the two sources (i.e. interview and visual observation). Kappa can range from −1 to 1, with 1 indicating complete agreement between raters; a κ of 0.41–0.60 represents ‘moderate agreement’, while a κ of 0.61–0.80 is considered ‘substantial agreement’ [20].
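The agreement check can be reproduced in miniature with Cohen's kappa, as sketched below; the paired interview and observation codes are invented for illustration (1 = signs posted/observed, 0 = not).

```python
# Sketch: Cohen's kappa for interview responses vs. site observations.
# The paired yes/no codes below are invented for illustration.
from sklearn.metrics import cohen_kappa_score

interview   = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1]  # "signs posted?" answers
observation = [1, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1]  # evaluator saw signs?
kappa = cohen_kappa_score(interview, observation)
print(f"kappa = {kappa:.2f}")  # interpret with Landis & Koch benchmarks [20]
```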

Results

Response rates and duration of interviews

Each year, 80 telephone interviews and 40 site visits were completed. The response rates were 80.8% (80 completed/99 contacted) for interviews and 95.2% (40/42) for site visits in 2003, and 58.4% (80/137) and 70.2% (40/57), respectively, in 2004. The most common reasons for non-response were inability to reach the pool contact, pool closure or non-implementation of the program. The interviews lasted on average 15 min in 2003 and 22 min in 2004, and the site visit interviews lasted ∼1 h.

Implementation

Table II shows the frequency of implementation for each core implementation item by condition. Overall, there were no significant differences between the basic and enhanced conditions on core implementation items. Between the 2 years, most of the core elements of the Pool Cool program were either maintained or increased in use. The exceptions were that significantly fewer pools reported using the Resource Guide in 2004 (χ2(1) = 9.2, P < 0.01), while significantly more reported using the Disk/CD-ROM (χ2(1) = 5.4, P = 0.02).

Table II. Summary of percentages for core implementation items for 2003–04 by condition

                                             2003                          2004
Component / Item                      Basic   Enhanced  Total      Basic   Enhanced  Total
                                      (n=61)  (n=59)    (n=120)    (n=61)  (n=59)    (n=120)
Training
    Completed training                 86.9    78.0      82.5       85.2    91.5      88.3
    Found training useful/very useful  69.8    69.6      69.7       56.6    66.7      61.7
Education
  Leader's Guide
    Received guide                     90.2    86.4      88.3       98.4    94.9      96.7
    Still have guide                   98.2    96.1      97.2       91.8    86.4      89.2
    Ever used guide                    88.9    86.3      87.6       77.0    71.2      74.2
  Pool Cool lessons
    Received lessons                   95.1    86.4      90.8       98.4    94.9      96.7
    Used lessons(a)                    91.8    79.7(b)   85.8       96.7    93.2      95.0
    Used lessons >1 per week           63.1    50.0      56.4       65.6    62.7      64.2
  Mini Big Book
    Received Mini Big Book             93.3    84.7      89.1       98.4    84.7      91.7
    Ever used Mini Big Book(a)         94.6    87.8      91.4       88.5    79.7      84.2
  Pool Cool Poolside Activities (ever used)
    Weather Watch                      62.3    54.2      58.3       59.0    54.2      56.7
    Sun Jeopardy                       32.1    43.1      37.5       23.0    55.9(b)   39.2
    Purple People Color block          86.7    69.5      78.2       82.0    81.4      81.7
    Emperor's Clothes                  58.9    63.8      61.4       62.3    45.8      54.2
    UV exposure cards/patch            77.0    72.4      74.8       77.0    81.4      79.2
Environment
  Sun signs
    Received sun signs                 87.5    81.0      84.2       93.4    88.1      90.8
    Posted sun signs(a)                84.2    75.9      80.0       91.8    86.4      89.2
  Sunscreen
    Received large pump bottle         91.7    83.1      87.4       98.4    91.5      95.0
    Any use of large pump bottle(a)    91.7    81.4      86.6       98.4    94.9      97.4
Other materials
  Decision Maker's Guide
    Received guide                     83.3    74.1      79.2       70.5    57.6      64.2
    Used guide(a)                      51.2    39.0      45.2       30.2    35.3      32.5
  Resource Guide
    Received Resource Guide            87.5    81.0      84.2       86.9    74.6      80.8
    Used Resource Guide(a)             47.1    46.8      46.8       26.2    24.1      25.2(c)
  Computer Disk/CD-ROM
    Received Disk/CD-ROM               70.6    69.6      70.1       68.9    55.9      62.5
    Used disk(a)                       16.7    12.8      14.7       28.6    30.3      29.3(c)
Intensive use
    Ordered or copied brochures        18.2    13.7      16.0        8.2    10.2       9.2
    Made or obtained extra materials    8.2    13.6      10.8        9.8    22.0      15.8
    Made or obtained extra sun signs   10.2    10.2      10.2       13.1    10.2      11.7

(a) Use of items by pools that reported receiving them.
(b) P < 0.05 for chi-square tests comparing basic and enhanced conditions.
(c) P < 0.05 for chi-square tests comparing 2003 and 2004 pools.

Training

As shown in Table II, the Pool Cool training was completed by 82.5% of the pools in 2003 and 88.3% in 2004. Over 60% of the pools that held the training indicated that it was useful or very useful in both years. The pool contacts commented that the most useful aspects of the training were reviewing the Pool Cool materials, learning new information about sun exposure and skin cancer and having the training made personally relevant to the staff. They also explained that the presentation of information on the risks of UV radiation exposure and how to protect themselves was beneficial, and they noted that it was helpful to go through the materials to familiarize themselves with the program components.

Education

For the educational components, 87.6% of the pools used the Leader's Guide at least once in 2003 and 74.2% did so in 2004. Use of the Pool Cool lessons increased from 85.8% in 2003 to 95.0% in 2004. Generally, use of the Poolside Activities was higher in the second year. Comments about the activities included that they were fun, engaging, great for bad-weather days and interesting for the kids, and that they helped keep children's attention during lessons. Other feedback revealed that use of the materials varied across pools: some had parents participate in the activities, sent activities home with children, used them during public swim time or created a fun day of Pool Cool activities. Negative comments about the activities related to running out of materials, activities not working properly and lack of time to use them.

Environment

Use of the environmental components was high in both years. Over 80% of the pools had received the sun signs, and more than 80% had posted them. Most pools posted their sun signs at the pool entrance, at the office or on fences in the pool area; the majority were placed at the front entrance, where people sign in or pay and are most likely to see them. The large pump bottle of sunscreen was used often, with 86.6% of the pools reporting its use in 2003 and 97.4% in 2004.

Supplementary implementation items

Use of the supplementary implementation items was limited. In the education component, 16.0% of the pools had ordered or copied the brochures in 2003 versus 9.2% in 2004, and 10.8% had made or obtained extra program materials in 2003 versus 15.8% in 2004. Just over 10% of pools had made or obtained additional sun safety signs.

Differences between conditions

Over the 2 years, few differences existed between the study conditions. In 2003, pools in the basic condition reported using the lessons significantly more than those in the enhanced condition (χ2(1) = 3.6, P = 0.05). In 2004, pools in the enhanced condition reported using the Sun Jeopardy game significantly more than those in the basic condition (χ2(1) = 13.7, P < 0.001).

Primary and supplementary implementation scores

Table III presents descriptive statistics on the implementation scores for all pools and for the basic and enhanced conditions. Overall, the average score on the primary implementation items across pools was 70.9% in 2003 and 73.2% in 2004. Use of the large pump bottle of sunscreen (95.8%), sun safety signs (90.0%) and training (88.3%) were the most often implemented primary components. There was a significant difference in the total primary score between the basic and enhanced conditions in 2003 because pools in the basic condition had higher scores on the lessons and sunscreen subscales (t(104) = 4.67, P = 0.03). In 2004, pools reported significantly higher levels of receipt and use of sunscreen than in 2003 (χ2(1) = 24.7, P < 0.001). The supplementary implementation score decreased slightly from 34.0% in 2003 to 32.6% in 2004; there were no significant differences between the basic and enhanced conditions on the supplementary implementation score in either year.

Table III. Pool Cool implementation scores for 2003–04 by condition

                                    2003                             2004                       2003 vs 2004
Component                  Basic  Enhanced  Total    P(a)    Basic  Enhanced  Total    P(a)        P(b)
                           (n=61) (n=59)    (n=120)          (n=61) (n=59)    (n=120)
Primary score
    Training                86.9   78.0      82.5    0.20     85.2   91.5      88.3    0.29        0.41
    Lessons                 72.3   66.8      69.5    0.05     72.1   66.7      69.5    0.12        0.98
    Activities              63.6   60.0      62.0    0.46     60.7   63.7      62.2    0.53        0.98
    Signs                   81.0   78.5      80.0    0.61     92.6   87.2      90.0    0.29        0.15
    Sunscreen               90.0   82.0      86.3    0.02     98.3   93.2      95.8    0.14       <0.05
    Total primary           73.4   68.2      70.9    0.03     74.4   72.0      73.2    0.30        0.76
Supplementary score
    Lessons                 14.7    9.3      12.0    0.25      9.1   16.1      12.5    0.13        0.92
    Activities               8.2   13.6      10.8    0.45     16.4   18.6      17.5    0.75        0.18
    Signs                    9.8   10.2      10.0    0.94     13.1   10.2      11.7    0.62        0.70
    Resource Guide          59.8   58.5      59.0    0.83     56.6   49.2      52.9    0.22        0.43
    Decision Maker's Guide  54.9   50.0      52.5    0.47     45.9   39.0      42.5    0.31        0.19
    Disk                    34.3   37.3      35.9    0.65     44.3   36.4      40.4    0.24        0.53
    Total supplementary     34.6   33.4      34.0    0.72     34.1   31.0      32.6    0.35        0.84

(a) P values from t-tests comparing pools in the basic and enhanced conditions.
(b) P values from chi-square tests comparing 2003 and 2004 pools.

Comments about FCs

The most common response on how the FC supported the Pool Cool efforts was that the FCs provided training on the Pool Cool program for pool staff. Other supportive roles of FCs were providing the Pool Cool kit at the beginning of the summer, visiting pools to monitor the program, communicating with pool staff and checking in periodically to see if the pools needed materials. Many responded that the FC was enthusiastic, a good resource and always provided what they needed when they asked for it. In addition, 76% of pool contacts reported that the FC was helpful or very helpful. Only a few pool contacts remarked that their FCs were not very involved in the program and had only made contact with them once or twice.

Comments about the program

Pool staff also commented on the major program elements, what helped most to implement the program and what program elements worked best. Across the 2 years, respondents reported positive reactions to the program. Over 90% of the pools commented that the Pool Cool materials helped the most and thought the program was very easy to implement. Some pool contacts further specified which materials they used most frequently such as the Mini Big Book, sun safety signs, sunscreen, Leader's Guide or activities. Furthermore, several pool staff mentioned that they appreciated gaining new or added knowledge about sun safety and commented that the program made the staff and patrons more conscious of sun protection.

Almost half of the pools stated that the lessons were the one component of the Pool Cool program that worked the best. The laminated lessons were often mentioned as helpful because they could be brought to the poolside without fear of getting them wet. The lesson content was also praised for its simplicity, brevity and useful information. The large pump bottle of sunscreen also was frequently mentioned because its availability at the pool encouraged people to use it often. Many pools preferred the sun safety signs because they were a reminder that caught people’s attention and motivated them to ask about the availability of sunscreen.

Barriers and facilitators to program implementation

Key facilitating factors for the implementation of the Pool Cool program included the receipt of the Pool Cool kit or materials, knowledge learned about skin cancer, the ease of the program and the work of the FCs. A significant barrier mentioned was the difficulty of fitting the Pool Cool lessons and sun safety messages into the limited time allotted for swim lessons (30–45 min total). Other barriers included staff-related issues, such as motivating staff and retraining staff on the program because of turnover; working with children of varying ages and motivation levels; and completing the surveys for pool staff and parents. An external barrier was the children's parents, both in terms of securing their support and in handling complaints that the lessons took time away from their children's in-water swim instruction.

Comparison of interview responses and observations

Analyses were conducted to assess the extent to which site interview responses about signage, pool staff sun safety practices and environmental factors corresponded to items observed by the field evaluators. Overall, the kappa statistics showed moderate to strong agreement across both years, except for receipt of sun signs and staff wearing shirts. There was very high agreement in 2003 (κ = 0.96) and moderate agreement in 2004 (κ = 0.45) between responses that the program's sun signs were posted and observations of those signs around the pool. There was also high agreement between receipt of the large pump bottle of sunscreen and observations that free sunscreen was available to staff (κ = 0.64 in 2003; 1.0 in 2004), and high agreement on shade structure availability in 2004 (κ = 0.66). These data provide some evidence that interview responses about sun protective practices and environments correspond with observations of the same items during the site visit.

Discussion

Process evaluation data across 2 years of intervention indicated that the major components of the Pool Cool program were implemented at high rates by the participating pools. Implementation of the core elements of the program increased from 2003 to 2004: more pools taught the Pool Cool sun safety lessons to children, posted sun safety signs and used the pump bottle of sunscreen in the second year. Key reasons for the increased implementation in 2004 were that pools received the Pool Cool materials earlier in the summer and were more familiar with the program in the second year. These results support the notion that adoption of intervention components may increase over time [21]. Successful implementation appeared to be due to the provision of a toolkit for implementing the program, the appeal of the Pool Cool materials to children and the pools' perception that the program was easy to implement. Research also indicates that interventions that, like Pool Cool, offer relative advantage in terms of convenience and satisfaction, low complexity and high observability are more likely to diffuse successfully [22].

Over 85% of pools in 2003 and ∼95% in 2004 that received the materials used the lessons. These rates are higher than those reported for some other programs: 75% of teachers received and used the Nutrition for Life program disseminated in New York State secondary schools [23], and 31–33% of zoos implemented sun safety activities for visitors in a program disseminated to zoos across the United States [24]. In the diffusion of the Mpowerment Project to community-based organizations, core program elements were adopted by ≤50% of the agencies [25]. However, the rates of program use found for Pool Cool are consistent with a multistate obesity prevention study of American Indian elementary school children, in which >90% of the Pathways curriculum lessons were taught in third, fourth and fifth grades [26].

The primary implementation score increased from 70.9% in 2003 to 73.2% in 2004. These data suggest that the pool contacts may be implementing the Pool Cool program more fully each year. The supplementary score was similar across the 2 years. For diffusion studies, it will be important to ascertain whether these scores are maintained over time. There were few differences between the basic and enhanced conditions in implementation of specific Pool Cool components and in the overall implementation score in both years. This lack of difference may reflect the fact that the enhanced condition was not much more intensive than the basic condition, or the overall ease of program implementation.

The qualitative interviews with staff revealed facilitators of and challenges to implementing the program. Facilitators included the receipt of a Pool Cool kit, the support of the FC, the appeal of the materials for kids and the ease of use of the program. Consistent with previous research, the Pool Cool process evaluation showed that positive support from a linkage agent, in this case the FC, positively affects implementation [27, 28]. These findings emphasize the critical role that the FCs play in facilitating the implementation of Pool Cool during the diffusion trial. As described by the pool staff, the Pool Cool program exhibited key determinants of the speed and extent of diffusion: the program fits the organizational mission of the pools and parks and recreation departments; the activities can be easily observed, tried and communicated through trained aquatic staff; and the time and commitment required are minimal because the Pool Cool lessons were embedded within swim lessons [22].

Some of the barriers to implementation were time constraints (conducting the program within the time allotted for swim lessons), staff turnover and motivation to conduct the program, completion of program surveys and the range of ages and interest levels among participating children. Consistent with other research, competing demands and time limitations in the adoption setting have been identified as barriers to dissemination [22, 29].

Conducting a process evaluation over multiple years was helpful for understanding the diffusion of the Pool Cool program. To our knowledge, no other multiyear process evaluation of a national program has been reported. Several important lessons were learned. First, multiyear process evaluation provides information on the extent of program implementation across years of diffusion, revealing patterns of adoption of various intervention components over time. Second, process evaluation data can assist with timely program improvements. The first year's findings resulted in two recommendations for improving the intervention: (i) to boost the enhanced condition by providing more incentive items to those pools in 2004 and (ii) to continue to train, monitor and offer incentives to FCs to support participating pools. Third, this evaluation benefited from the use of mixed methods and triangulation of data. The mixed methods yielded in-depth reactions to various components of the program and detailed feedback about barriers and facilitators to conducting it, and they provided opportunities to triangulate and validate interview data against observations from the site visits. Finally, process evaluation can be time and resource intensive. Although the process evaluation data will help explain the main outcomes of the trial, the evaluation required funding for additional evaluation staff and time to conduct interviews and site visits beyond the data collection for the main trial.

There were several limitations to this process evaluation. Data were collected from only 25% of the pools implementing the program, so the data may not be representative of all pools; however, there was no a priori basis for computing the sample size required to generalize, and this sample size seemed likely to be sufficient. In addition, pools that participated in the evaluation may differ from pools that did not respond. Another limitation is that the comparison across the 2 years was based on two cross-sectional samples rather than a panel followed across both years. Interview data were based on reports from one staff member per pool. Finally, due to logistical constraints (i.e. travel costs), site visit observation data about each pool and its environment were collected by a single observer; multiple observers would have allowed the interrater reliability of these methods to be established. However, we established the interrater reliability of pool site observations in our earlier efficacy trial [15, 30].

Conclusions

In summary, process evaluation can be valuable in providing data about program implementation over time, especially for health promotion programs that are being diffused. It is important to conduct process evaluation throughout a program to understand how each major program element is being used, the level of implementation and the barriers and facilitators to implementation. These data provide information on important programmatic factors that contribute to successful program diffusion. More attention is needed to determine how best to triangulate the voluminous data collected from many process evaluation instruments. It will be interesting to examine the process evaluation data relative to outcome surveys of lifeguards, pool managers, parents and children [17]. Future research can also focus on the role process evaluation serves in the pathway to intervention outcomes and on the need for more valid and reliable process measures for health interventions.

Funding

National Cancer Institute (CA92505).

Conflict of interest

None declared.

Acknowledgments

We would like to thank Allan Steckler, Laura Linnan, John Rose and the 2003 and 2004 field evaluators and pool liaisons for participating in the process evaluation.

References

  • 1. Linnan L, Steckler A. Process evaluation for public health interventions and research: an overview. In: Steckler A, Linnan L, editors. Process Evaluation for Public Health Interventions and Research. San Francisco, CA: Jossey-Bass; 2002. pp. 2–24.
  • 2. Windsor R, Clark N, Boyd NR, et al. Evaluation of Health Promotion, Health Education, and Disease Prevention Programs. New York: McGraw-Hill; 2003.
  • 3. Rossi PH, Freeman HE, Lipsey MW. Evaluation: A Systematic Approach. Thousand Oaks, CA: Sage Publications; 1999.
  • 4. McGraw SA, Stone EJ, Osganian SK, et al. Design of process evaluation within the Child and Adolescent Trial for Cardiovascular Health (CATCH). Health Educ Q. 1994;21(Suppl. 2):S5–26. doi: 10.1177/10901981940210s103.
  • 5. Baranowski T, Stables G. Process evaluation of the 5-a-day projects. Health Educ Behav. 2000;27:157–66. doi: 10.1177/109019810002700202.
  • 6. Israel BA, Cummings KM, Dignan MB, et al. Evaluation of health education programs: current assessment and future directions. Health Educ Q. 1995;22:364–89. doi: 10.1177/109019819402200308.
  • 7. Flora JA, Lefebvre RC, Murray DM, et al. A community education monitoring system: methods from the Stanford Five-City Project, the Minnesota Heart Health Program and the Pawtucket Heart Health Program. Health Educ Res. 1993;8:81–95. doi: 10.1093/her/8.1.81.
  • 8. Sorensen G, Thompson B, Glanz K, et al. Work site-based cancer prevention: primary results from the Working Well Trial. Am J Public Health. 1996;86:939–47. doi: 10.2105/ajph.86.7.939.
  • 9. Tessaro I, Campbell MK, Benedict S. Health works for women: process evaluation results. In: Steckler A, Linnan L, editors. Process Evaluation for Public Health Interventions and Research. San Francisco, CA: Jossey-Bass; 2002. pp. 184–204.
  • 10. Perry CL, Sellers DE, Johnson C, et al. The Child and Adolescent Trial for Cardiovascular Health (CATCH): intervention, implementation, and feasibility for elementary schools in the United States. Health Educ Behav. 1997;24:716–35. doi: 10.1177/109019819702400607.
  • 11. Steckler A, Ethelbah B, Martin CJ, et al. Pathways process evaluation results: a school-based prevention trial to promote healthful diet and physical activity in American Indian third, fourth, and fifth grade students. Prev Med. 2003;37(Suppl. 1):S80–90. doi: 10.1016/j.ypmed.2003.08.002.
  • 12. Jemal A, Devesa SS, Fears TR, et al. Cancer surveillance series: changing patterns of cutaneous malignant melanoma mortality rates among whites in the United States. J Natl Cancer Inst. 2000;92:811–18. doi: 10.1093/jnci/92.10.811.
  • 13. Gallagher RP. Sun exposure and non-melanocytic skin cancer. In: Grob JJ, Stern RS, MacKie R, editors. Epidemiology, Causes and Prevention of Skin Diseases. New York: Blackwell Science; 1997. pp. 72–5.
  • 14. Saraiya MA, Glanz K, Briss BA, et al. Interventions to prevent skin cancer by reducing exposure to ultraviolet radiation: a systematic review. Am J Prev Med. 2004;27:422–66. doi: 10.1016/j.amepre.2004.08.009.
  • 15. Glanz K, Geller AC, Shigaki D, et al. A randomized trial of skin cancer prevention in aquatic settings: the Pool Cool Program. Health Psychol. 2002;21:579–87.
  • 16. Geller AC, Glanz K, Shigaki D, et al. Impact of skin cancer prevention on outdoor aquatics staff: the Pool Cool Program in Hawaii and Massachusetts. Prev Med. 2001;33:155–61. doi: 10.1006/pmed.2001.0870.
  • 17. Glanz K, Steffen A, Elliott T, et al. Diffusion of an effective skin cancer prevention program: design, theoretical foundations, and first-year implementation. Health Psychol. 2005;24:477–87. doi: 10.1037/0278-6133.24.5.477.
  • 18. Babbie E. The Practice of Social Research. 9th edn. Belmont, CA: Wadsworth; 2000.
  • 19. SPSS, Inc. SPSS 12.0.1 for Windows. Chicago, IL: SPSS Inc.; 2003.
  • 20. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–74.
  • 21. Steckler A, Ethelbah B, Martin CJ, et al. Lessons learned from the Pathways process evaluation. In: Steckler A, Linnan L, editors. Process Evaluation for Public Health Interventions and Research. San Francisco, CA: Jossey-Bass; 2002. pp. 268–88.
  • 22. Rogers EM. Diffusion of Innovations. 4th edn. New York: Free Press; 1995.
  • 23. Olson CM, Devine CM, Frongillo EA Jr. Dissemination and use of a school-based nutrition education program for secondary school students. J Sch Health. 1993;63:343–8. doi: 10.1111/j.1746-1561.1993.tb07150.x.
  • 24. Lewis E, Mayer JA, Slymen D, et al. Disseminating a sun safety program to zoological parks: the effects of tailoring. Health Psychol. 2005;24:456–62. doi: 10.1037/0278-6133.24.5.456.
  • 25. Rebchook GM, Kegeles SM, Huebner D, et al. Translating research into practice: the dissemination and initial implementation of an evidence-based HIV prevention program. AIDS Educ Prev. 2006;18(4 Suppl. A):119–36. doi: 10.1521/aeap.2006.18.supp.119.
  • 26. Steckler A, Ethelbah B, Martin CJ. Pathways process evaluation results: a school-based prevention trial to promote healthful diet and physical activity in American Indian third, fourth, and fifth grade students. Prev Med. 2003;37(Suppl. 1):S80–90. doi: 10.1016/j.ypmed.2003.08.002.
  • 27. O'Loughlin J, Renaud L, Richard L, et al. Correlates of the sustainability of community-based heart health promotion interventions. Prev Med. 1998;27:702–12. doi: 10.1006/pmed.1998.0348.
  • 28. Smith DW, Redican KJ, Olsen LK. The longevity of growing healthy: an analysis of the eight original sites implementing the School Health Curriculum Project. J Sch Health. 1992;62:83–7. doi: 10.1111/j.1746-1561.1992.tb06022.x.
  • 29. Glasgow RE, Lichtenstein E, Marcus AC. Why don't we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93:1261–7. doi: 10.2105/ajph.93.8.1261.
  • 30. Glanz K, Isnec MR, Geller A, et al. Process evaluation of implementation and dissemination of a sun safety program at swimming pools. In: Steckler A, Linnan L, editors. Process Evaluation for Public Health Interventions and Research. San Francisco, CA: Jossey-Bass; 2002. pp. 58–82.
