Published in final edited form as: Eval Health Prof. 2009 Jun;32(2):184–203. doi: 10.1177/0163278709333154

A Multi-Method Process Evaluation for a Skin Cancer Prevention Diffusion Trial

Cam Escoffery 1, Karen Glanz 2, Dawn Hall 3, Tom Elliott 4

Abstract

This article describes process evaluation methods for the Pool Cool Diffusion Trial across four years. Pool Cool is a skin cancer prevention program that was found to improve behaviors and environments for sun protection at swimming pools in a randomized efficacy trial, which was followed by a national Diffusion Trial. The process evaluation focus shifted from measuring program satisfaction to assessing widespread program implementation, barriers and facilitators to implementation, and program maintenance and sustainability. Data collection methods included training surveys, database tracking, field coordinator activity logs, emails, surveys of parents, lifeguards, and pool managers, and process evaluation interviews and site visits. The data revealed high levels of implementation of major program components when disseminated in the diffusion trial, including sun safety lessons, sun safety signs, and sunscreen use. This paper also describes program features and participant factors, such as linkage agents, a packaged program, and adaptations of program elements, that facilitated local implementation, maintenance, and sustainability across dispersed pools.

Keywords: process evaluation, skin cancer prevention, sun protection, child health, organizational adoption

INTRODUCTION

There is a growing interest in the dissemination of evidence-based programs into public health and community practice. Evidence-based programs are interventions that have positive outcomes, ideally from findings of multiple well-designed studies (Brownson, Baker, Leet, & Gillespie, 2003).

Diffusion research investigates the factors necessary for successful widespread adoption and institutionalization of evidence-based interventions (Rebchook, Kegeles, Huebner, & Team, 2006). However, only a few programs with evaluated diffusion and dissemination efforts in the field of health behavior have been reported (Owen, Glanz, Sallis, & Kelder, 2006; Hoelscher et al., 2001; Dowda et al., 2005; Rebchook et al., 2006). Briss and colleagues (2004) noted the need for research to understand how dissemination and adoption of evidence-based interventions occurs, focusing on issues such as assessing fidelity to science-based programs in adoption, understanding barriers and facilitators to adoption and learning how to increase implementation.

Process evaluation measures the frequency and extent of implementation of selected program components (Linnan & Steckler, 2002; Windsor, Clark, Boyd, & Goodman, 2003). It can measure recruitment (attracting implementers or participants), reach (the extent to which the program is received by the participants), context (aspects of the environment around the intervention), resources (materials or characteristics of implementers or participants to achieve program goals), implementation (the extent to which the program is conducted as designed), barriers (problems that arise in reaching participants), and exposure (the extent to which the participants received the program materials) (Baranowski & Stables, 2000; Linnan & Steckler, 2002). In addition, process evaluation can measure the extent to which intervention providers delivered the program as intended, or fidelity of implementation (Baranowski & Stables, 2000; Basen-Engquist et al., 1994), and reactions (satisfaction with program activities). Process evaluations can assist in understanding how and why interventions work (McGraw et al., 1994) and how intervention components link to outcomes (Israel et al., 1995).

Few disseminated science-based programs address the generalizability of the intervention, or external validity. Because of the limited focus on this issue, Green and Glasgow (2006) proposed that research on diffusion of evidence-based interventions also collect data to help evaluate external validity, such as reach, program implementation and adaptation, maintenance and sustainability. The collection of these types of process evaluation data will enhance the relevance of findings to potential adopters in different populations, settings, and situations. Translational research examines factors that facilitate the effective transfer of research into public health policies and programs (Sussman et al., 2006), and process evaluation data can provide valuable information across the translation continuum from intervention development and diffusion to program sustainability.

This article describes process evaluation methods and findings from a multi-year diffusion trial of an evidence-based skin cancer prevention program, called Pool Cool. The diffusion trial was conducted after an efficacy trial that found significant positive effects on children’s sun protection behaviors and sun safety environments at swimming pools (Glanz, Geller, Shigaki, Maddock, & Isnec, 2002). The Pool Cool efficacy trial was a randomized, controlled trial conducted at 28 swimming pools in Hawaii and Massachusetts in 1999. The program audience was children 5 to 10 years of age, their parents, and lifeguards and aquatics instructors. Sites in the sun protection (SP) arm (n=15 pools) received staff training; a series of sun-safety lessons; on-site interactive activities; provision of sunscreen, shade, and signage; and promotion of sun-safe environments. Sites in the attention-matched control, or injury prevention (IP), arm (n=13 pools) received a program that involved lessons and activities on bike safety, traffic safety, fire safety, and poisoning and choking prevention. At baseline, 558 SP and 452 IP parent respondents participated in the study. Children’s use of sunscreen, shade, and overall sun protection habits were significantly higher at follow-up in the SP arm than in the IP arm. The SP group also reported a 23% reduction in child sunburns compared to the preceding summer, with little or no reduction occurring in the IP group (Glanz, Geller, Shigaki, Maddock, & Isnec, 2002).

Few studies have documented factors critical to local implementation of disseminated evidence-based programs. The current article also describes participation, use of key program components, implementation processes, and sustainability indicators across many pool settings. Finally, lessons learned about process evaluation are presented.

METHODS

The Pool Cool Diffusion Trial was conducted over a four-year period from 2003 to 2006 and evaluated the effects of two strategies for diffusion of the Pool Cool program on implementation, maintenance, and sustainability; improvements in environmental supports for sun safety in swimming pools; and sun protection habits and sunburn among participating children (Glanz, Steffen, Elliott, & O’Riordan, 2005).

The Diffusion Trial used a three-level nested experimental design, with Field Coordinators, swimming pools, and children ages 5–10 in swimming lessons making up the three levels. Each Field Coordinator was responsible for a cluster of four to fifteen pools in a metropolitan region, and regions were randomly assigned to Basic and Enhanced study arms. The pools in the Basic condition received a Tool Kit containing a Leader’s Guide that describes how to implement the program, laminated lesson cards, a Mini Big Book of colorful cartoons to use interactively with the lessons, materials for poolside sun protection activities, a large dispenser of sunscreen, and an aquatics-targeted sunscreen tips sign. Pools in the Enhanced condition received the Basic condition materials plus additional sun safety items (reinforcements) for distribution; environmental supports, including aluminum sun-safety signs and shade structures; and supplementary guidance and incentives to promote maintenance and sustainability of the Pool Cool program (Glanz, Steffen, Elliott, & O’Riordan, 2005). The participating pools included public pools, YMCA pools, and military pools. The main aims of the diffusion study were to evaluate the effects of two strategies for program diffusion on: 1) program implementation, maintenance, and sustainability; 2) improvements in organizational and environmental supports for sun protection at swimming pools; and 3) sun protection habits and sunburns among children.

Pool managers completed surveys at the beginning and end of the summer to provide pool-level data, and lifeguard/aquatic instructors and parents completed surveys on the same schedule. In addition, logs and emails to and from the Field Coordinators were tracked. A comprehensive process evaluation of the program, including an annual independent process evaluation, provided additional data that were used to supplement survey data.

During the earlier efficacy trial, process evaluation focused on reach, extent of implementation, and satisfaction of participants exposed to the program (Glanz, Isnec, Geller, & Spangler, 2002). It was specifically designed to assess the extent of program implementation, the amount of time required for the program, whether environmental changes were implemented, whether lifeguards and children were exposed to intervention components, and how they rated the program components. Lifeguards and aquatic instructors completed monitoring forms to report on their use of sun safety lessons across all participating pools. Several items on the posttest survey for parents asked about parents’ and their children’s participation, which incentive items they received, and their reactions to the program. In addition, at posttest, aquatic staff members were asked how often they taught the lessons, the teaching methods they used, and the incentives they received (Glanz, Isnec, Geller, & Spangler, 2002).

In the diffusion trial, much of the process evaluation shifted to focus more on implementation across pool sites. Its aims were to understand implementation across the study groups and diverse pools, to assess maintenance and sustainability of the program in the latter years, and to identify barriers and facilitators to program implementation and external factors occurring during the study. The process evaluation also provided an opportunity to examine concordance among the various data sources, including the parent, pool manager, and lifeguard surveys, and to help interpret study outcome data.

The process evaluation of the Pool Cool Diffusion Trial involved multiple data collection instruments and strategies for pool staff, Field Coordinators, and the project/research team during the four years of the study. The process evaluation methods included: archival information from program databases, training surveys, process evaluation interviews and site visits with pool staff implementing the program, Field Coordinator emails and activity logs, and items on the pool manager, parent, and lifeguard surveys. The process evaluation focused on participation, implementation of the program at diverse sites, patterns of communication, and facilitators of and barriers to program implementation. In the latter years of the program, issues related to program maintenance and sustainability were assessed. Table 1 provides information about each individual data source, its purpose, the population or sample, key measures, and data analysis. The following section describes each process evaluation method, including data collection, instruments, and data analyses. In addition, key results are highlighted.

Table 1.

Summary of Pool Cool Diffusion Trial Process Evaluation Data Sources, Purpose and Analysis

Data Source: Training surveys
Purpose: Obtain feedback on the effectiveness of the Pool Cool Field Coordinator training and any improvements needed
Population/Sample: Field Coordinators
Measures: Mostly closed-ended questions with rating scales; one open-ended question about ways to improve the training
Analysis: Rating of training; thematic analysis of suggestions for improvement

Data Source: Archival data from program databases (Excel)
Purpose: Track reach and level of program participation, assess time Field Coordinators spent on program-related activities, and highlight problem areas
Population/Sample: Project staff and Field Coordinators
Measures: Conference call logs; Field Coordinator logs; incoming survey logs
Analysis: Information on pools participating each year; descriptive statistics on number of calls, number of Field Coordinator logs, types of activities, time spent, and surveys received; calculation of stipend/participation

Data Source: Field Coordinator logs
Purpose: Track primary program activities conducted by Field Coordinators and time spent on such activities
Population/Sample: Field Coordinators
Measures: Date, pool, time spent on activity, type of activity
Analysis: Number and frequency of types of activities; time spent on activities; patterns across years; differences between the two study groups

Data Source: Email records
Purpose: Maintain contact with Field Coordinators over the summer months to remind them about tasks, track program progress, and address problems or difficulties
Population/Sample: Project staff and Field Coordinators
Measures: Topics related to program implementation, research processes, and communication
Analysis: Number, frequency, and length of emails sent; content analysis of topics; differences between groups

Data Source: Pool manager follow-up surveys
Purpose: Measure program implementation of lessons, policy, and environment
Population/Sample: Pool managers at participating pools
Measures: Teaching of Pool Cool lessons and activities; promotion of sun protective behaviors and policies; use of sunscreen dispenser; receipt of Pool Cool items
Analysis: Descriptive statistics; computation of program implementation score

Data Source: Parent follow-up surveys
Purpose: Measure program participation and exposure to program messages and materials
Population/Sample: Parents participating in the study (answering also for their children)
Measures: Receipt of sun safety information from pool; exposure to program messages and environmental supports; receipt of Pool Cool items
Analysis: Descriptive statistics; computation of exposure scale

Data Source: Lifeguard follow-up surveys
Purpose: Measure program implementation and exposure to Pool Cool materials
Population/Sample: Lifeguards at participating pools
Measures: Teaching of Pool Cool lessons and activities; promotion of sun protective behaviors and policies; use of sunscreen dispenser; receipt of Pool Cool items
Analysis: Descriptive statistics; computation of program implementation score

Data Source: Pool site visits (interviews and observation) and telephone interviews
Purpose: Evaluate program implementation and differences in implementation between the two study groups; identify internal and external factors and barriers to program implementation
Population/Sample: Pool contacts from a sample of 120 pools each year: 80 phone interviews and 40 site visits
Measures: Interviews (telephone or in-person): open- and closed-ended questions about staff training, receipt/use of materials, and staff responses to the program. Site visits: observations of the pool environment, staff practices, and program implementation
Analysis: Descriptive statistics; differences between groups; calculation of an implementation score for each pool

Field Coordinator Training Surveys

Prior to participating in the Pool Cool program, each Field Coordinator (FC) participated in a 1–2 day training course taught by the Pool Cool research staff. This course consisted of basic sun-safety information, an introduction to the Pool Cool staff and program materials, and training on the research protocols used in the Diffusion study. Participants completed brief surveys evaluating their Pool Cool Field Coordinator training experience shortly after the training was completed. The surveys included questions on a 4-point Likert scale (not at all, somewhat, mostly, completely) regarding Field Coordinators’ understanding of the program, its materials, its purpose, and their commitment to carry out program responsibilities. An open-ended question asked for additional comments on how the training might be improved. The data from these surveys were used to ensure that FC training was well received and to plan and improve subsequent and ‘refresher’ trainings.

In the first year of training, 32 Field Coordinators reported on the training surveys that they understood the purpose of the Pool Cool study (M = 3.88 ± .34) and the responsibilities of a Pool Cool coordinator (M = 3.88 ± .34) on a scale from 1 = not at all to 4 = completely. They also reported that they mostly understood all of the program components (M = 3.72 ± .46) and the purpose of each lesson (M = 3.71 ± .59). Suggested improvements included providing the materials earlier for review, conducting the training earlier, and shortening subsequent training sessions.

Archival Data from Program Databases

The Pool Cool research staff kept records of each Field Coordinator’s activity in the program. Regularly updated databases were used to keep track of pool and coordinator participation, contact information, important dates/timelines, frequency/type of communication, submission of activity logs, and program materials and surveys shipped to and received from each pool and Field Coordinator. Contact information frequently changed, both for coordinators and pools. Staff training dates, swim lesson start/end dates, and pool facility open/close dates for each summer varied greatly across pools and regions, necessitating frequent adjustments by research staff to materials and survey shipment dates. In addition, the data helped to highlight trouble areas where pools or Field Coordinators might have fallen behind in their research and/or program requirements so that additional help could be provided where appropriate. This information was needed to keep track of field activity in such a large study.

The number of pools participating in the program each summer ranged from 262 to 469 across the four years. Field Coordinator participation showed similar variation, with 33 to 44 Field Coordinators participating from 2004 to 2006. Pools and Field Coordinators both frequently fell behind on staff training and the completion and return of surveys. Databases were used to identify these pools and then to provide additional help and assistance as needed to complete the necessary research elements.

Field Coordinator Logs

Each summer, Field Coordinators were provided with an activity log as a Word document to record their time spent on various Pool Cool program activities throughout the summer and were asked to return the logs on the 1st and 15th of each month. Field Coordinators typically returned the completed logs to Pool Cool staff by fax or as email attachments, which were then printed out and organized by year and by Field Coordinator. In each log, Field Coordinators recorded the date, pool name(s), type of activity, and approximate amount of time spent on logged activities. Response options for “type of activity” were: phone call, phone message, sent fax, sent email, deliver materials, conduct training, collect surveys, and other. Analysis of the logs included computing descriptive statistics for the number of logs submitted, the frequency of activities logged, the time spent on activities, and differences between the Basic and Enhanced groups.

Between 96 and 121 activity logs were received from Field Coordinators each summer during the Diffusion study (404 logs in total over the four years), with individual Field Coordinators submitting 0 to 9 logs per summer (median = 3). The primary activities logged included communication, management of survey data collection, and management of Pool Cool program materials. Other recorded activities were training, site visits with participating pool staff, and administrative tasks.

Emails

Email was the primary means of communication between Field Coordinators and Pool Cool program staff throughout the Diffusion study. Research staff had access to an email account set up exclusively for program use, and Field Coordinators were instructed to use this address when trying to contact program staff about study questions. Through the use of email, Field Coordinators and program staff were able to stay updated on program progress, coordinate shipment of program materials and surveys, exchange administrative documents (as email attachments), and quickly address any problems that arose. Furthermore, using a single email address allowed for easy organization and tracking of emails, which were saved, printed, and organized by year and by Field Coordinator. Email length, date, and content were later coded, allowing for analyses of the length, frequency, and content of the emails, as well as differences in communication between the Basic and Enhanced groups.

A total of 5,215 emails were sent to and from Field Coordinators over the four years, with a range of 428 to 892 emails sent to Field Coordinators each year and a range of 421 to 897 emails received from Field Coordinators each year. Emails most commonly discussed program administration, survey data collection, and program materials. Other email topics included recruiting participants, the Field Coordinator training, pool staff training, personal communication, and sustainability issues.

Lifeguard, Pool Manager, and Parent Surveys

Follow-up surveys completed by lifeguards, pool managers, and parents were a key source of information regarding program outcomes but included process evaluation data as well. Lifeguard follow-up surveys were mailed to Field Coordinators at the end of the summer for distribution. Field Coordinators were responsible for distributing the surveys before their pools closed for the season, collecting completed surveys, and mailing them back to Pool Cool program headquarters. Across the years, response rates for follow-up surveys ranged from 50.3% to 66.8% of lifeguards who completed a baseline survey. Process evaluation measures on the survey were the pool’s implementation of core program components, addition of shade structures, lifeguards’ use of the Leader’s Guide and sunscreen, conduct of poolside activities, and receipt of Pool Cool items.

A centralized survey research contractor conducted parent follow-up surveys by phone or internet. The contractor was given a spreadsheet database of the names and addresses (or e-mail addresses, if provided) that parent participants had supplied on baseline surveys and invited these parents to complete follow-up surveys by telephone or on the internet. Non-responders were also given the option of completing a mail survey. Process evaluation measures on parent follow-up surveys queried exposure and involvement in Pool Cool: receipt of sun safety messages from the pool; whether the pool taught about sun protection in the lessons, advised parents to apply sunscreen to children, and provided sunscreen and shade; and receipt of Pool Cool items.

Across the years, lifeguards reported high implementation of the Pool Cool program and policies. A Pool Cool program participation score on a 0–10 scale was calculated from 16 items covering conduct of lessons, environmental change, and instituting policies. In 2005, the mean participation score was 4.23 (SD = 2.78). Finally, a composite score for receipt of Pool Cool items was created from 7 yes/no items, with a range of 0–2; the mean score was 1.13 (SD = 0.84). Lifeguards most often reported receiving lanyards, pens, and water bottles.

For the parent surveys, follow-up response rates ranged from 55.1% to 82.0% of parents who completed a baseline survey. Between 79.0% and 83.1% of parents reported receiving sun safety information from the pools over the 4 years. An index score for parental exposure (dose) to the program was computed from 8 items related to policies, activities, and lessons received, on a scale of 0–3; in 2005, for example, the score was 1.59 (SD = 0.95). Another composite score captured receipt of the 7 Pool Cool incentive items, with a range of 0–2 (0 = no items, 1 = 1 item, 2 = receipt of 2 or more items). The mean score in 2005 was 0.79 (SD = .80), with sunscreen bottles being the most commonly received item.
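As a minimal sketch of how the receipt-of-items composite described above might be computed (the item names and data structure are hypothetical; only the 0 = no items, 1 = one item, 2 = two or more items coding rule comes from the text):

```python
# Hypothetical item names standing in for the 7 Pool Cool incentive items;
# the 0/1/2-or-more coding rule follows the description in the text.
POOL_COOL_ITEMS = ["sunscreen_sample", "water_bottle", "lanyard", "pen",
                   "keychain", "towel", "sticker"]

def receipt_composite(responses: dict) -> int:
    """Count 'yes' answers across the 7 items and cap the composite at 2."""
    n_received = sum(1 for item in POOL_COOL_ITEMS if responses.get(item) == "yes")
    return min(n_received, 2)

# Example: a parent reporting sunscreen samples and a water bottle scores 2.
print(receipt_composite({"sunscreen_sample": "yes", "water_bottle": "yes"}))
```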

Pool manager follow-up surveys assessed program implementation at the pool level with ten items asking whether the main components of the Pool Cool program were used and at what level. Four questions were on a 4-point scale and asked about the frequency of educational activities in swimming lessons, sun-safety educational programs, teaching of Pool Cool lessons, and use of the Pool Cool Leader’s Guide. These items were recoded to a 3-point scale (1 = sometimes, rarely, or never to 3 = usually or always). The other six questions were dichotomous (yes/no) items asking whether the pool used the Mini Big Book, conducted poolside activities, displayed the sunscreen tips poster, displayed the aluminum sun-safety signs, used the sunscreen provided, and added shade structures or shaded areas that summer. These yes/no items were also recoded (1 = no and 3 = yes) to have the same range as the recoded educational activity items. The 10 items were summed to create a composite implementation score with a range of 10–30. The mean implementation sum score in 2005 was 23.27 (SD = 4.19).
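The recode-and-sum logic for the pool manager implementation score could look roughly like the following sketch. The item names are hypothetical, and the assignment of "often" to the middle value is an assumption, since the text specifies only the endpoints of the recoded scale; the recoding endpoints and the 10-item sum with its 10–30 range follow the description above.

```python
# Hypothetical column names; the recoding endpoints and the 10-item sum
# (possible range 10-30) follow the text.
FREQUENCY_ITEMS = ["educational_activities", "sun_safety_program",
                   "taught_lessons", "used_leaders_guide"]          # 4-point items
YES_NO_ITEMS = ["mini_big_book", "poolside_activities", "sunscreen_poster",
                "aluminum_signs", "used_sunscreen", "added_shade"]  # dichotomous items

def recode_frequency(response: str) -> int:
    """Collapse the 4-point frequency scale to 1-3."""
    response = response.lower()
    if response in ("rarely or never", "sometimes"):
        return 1
    if response == "often":
        return 2  # assumption: "often" maps to the middle value
    return 3      # "usually or always"

def recode_yes_no(response: str) -> int:
    """Map yes/no items onto the same 1-3 range (1 = no, 3 = yes)."""
    return 3 if response.lower() == "yes" else 1

def implementation_score(survey: dict) -> int:
    """Sum the 10 recoded items; the possible range is 10-30."""
    return (sum(recode_frequency(survey[item]) for item in FREQUENCY_ITEMS)
            + sum(recode_yes_no(survey[item]) for item in YES_NO_ITEMS))
```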

Process Evaluation Site Visits, Interviews and Pool Observations

The primary independent annual process evaluation (that is, evaluation not conducted by the main program/research team) involved site visits, on-site interviews, and telephone interviews with a sample of participating pools each year. Each year a sample of 120 pools (40 for site visits and 80 for phone interviews) was stratified to ensure balanced sampling of regions from the two study groups. Selected pool contacts were interviewed using a 57-item interview guide in the middle of the summer to allow time for program implementation. The interview included closed-ended and open-ended questions to collect information on program participation, implementation, and challenges to implementation. The pool contacts reported on use of core educational and environmental program elements, commented on program materials, and answered general questions about implementing Pool Cool in their setting. During the site visits, evaluators made visual observations of program implementation, the pool environment, and the sun safety practices of aquatic staff. An implementation score was developed and calculated for each pool based on responses to interview items (Escoffery, Glanz, & Elliott, 2008).

Data analysis included computing descriptive statistics for major implementation items and running chi-square and t-tests to determine differences in implementation of specific program elements and differences in implementation scores between the Basic and Enhanced conditions and between years. The primary implementation score was computed from 16 questions: 1 on training, 6 on lessons, 5 on poolside activities, 2 on sun safety, and 2 on sunscreen (Escoffery, Glanz, & Elliott, 2008). Qualitative data analysis involved coding the open-ended responses into major themes.
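The between-condition comparisons described here might be run roughly as follows (a sketch using pandas and SciPy; the DataFrame and column names are hypothetical, and the text specifies only chi-square tests for specific program elements and t-tests for implementation scores):

```python
# Sketch of the Basic vs. Enhanced comparisons; column names are hypothetical.
import pandas as pd
from scipy import stats

def compare_conditions(df: pd.DataFrame) -> dict:
    # t-test comparing implementation scores between the two study arms
    basic = df.loc[df["condition"] == "Basic", "implementation_score"]
    enhanced = df.loc[df["condition"] == "Enhanced", "implementation_score"]
    t_stat, t_p = stats.ttest_ind(basic, enhanced)

    # chi-square test for one specific program element (e.g., signs posted, yes/no)
    table = pd.crosstab(df["condition"], df["posted_signs"])
    chi2, chi_p, dof, _expected = stats.chi2_contingency(table)

    return {"t_test": (t_stat, t_p), "chi_square": (chi2, chi_p, dof)}
```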

The process evaluation data from 2003 and 2004 indicated high implementation levels across pools in both years (Escoffery, Glanz, & Elliott, 2008). The primary implementation score for a sample of the pools ranged from 68.3% to 73.2% from 2003 to 2006. Over 70% of the pool managers reported receiving training on the Pool Cool program over the 4 years. Likewise, for the environmental components, over 75% of the pools posted the sun safety signs, and over 90% used the large pump bottle of sunscreen. Pool contacts reported lower use of Poolside Activities; these activities were not considered a core part of the Pool Cool program but were encouraged as supplemental strategies that could be used for special events or on rainy days. Finally, there were few significant differences between the Basic and Enhanced conditions in implementation of specific Pool Cool components across all years.

Information about the adaptation of Pool Cool program activities and materials was also collected in the process evaluation interviews across the years. Participating pools reported that lifeguards or pool managers made additional sun safety signs to put around the pool, tie-dyed Pool Cool shirts, held a Pool Cool carnival, photocopied brochures or information sheets from the Pool Cool CD-ROM, purchased a Jeopardy board for the Sun Jeopardy Poolside Activity, and developed new games based on the program messages, such as Lifeguard Protective Steps, in which children were asked to point out which lifeguards were wearing sun-protective items and what those items were.

Factors that facilitated and hindered Pool Cool program implementation were identified through qualitative questions in the process evaluation interviews across the years. Key facilitating factors included receipt of the Pool Cool Tool Kit or materials with the signs and sunscreen, simplicity of the lessons, knowledge gained about skin cancer, appeal of the materials for young children, and ease of implementing the program. In addition, pool contacts generally praised their Field Coordinators, with over 70% reporting that their coordinator was helpful or very helpful. The barriers mentioned as challenges were limited time to conduct the program and parents’ complaints about loss of swim lesson time. Finally, several indicators of maintenance and sustainability were assessed in the process evaluation interviews. Regarding program maintenance, in the last year of the program, pool contacts reported that the program was easy to implement. In relation to program sustainability, over 65% of pool contacts strongly agreed that the Pool Cool program was integrated into pool operations, that staff perceived the program as effective, that the program was visible in the community, and that there was strong management support for the program.

DISCUSSION

This article describes the multi-method process evaluation of a skin cancer prevention diffusion trial. It demonstrates the utility of process evaluation tracking, feedback from sites implementing a science-based program, and monitoring of communications during a multi-year diffusion trial. The process data highlighted the benefits of process evaluation for better understanding program adoption, context, maintenance, and institutionalization when an evidence-based program is diffused to many different community sites.

The data contribute to an understanding of the reach and fit of the diffused program to the context of implementing pools. Based on several data sources, the reach of the program was fairly extensive, and most pools implemented the core elements of the program, that is, the features of an intervention that are responsible for its effectiveness (Kelly et al., 2000a). The process evaluation survey and observation data demonstrate that participating pools achieved fairly high implementation of core components across setting characteristics such as community size, urban or rural location, municipal or independent operation, and region of the country. These findings present implementation data across local settings and begin to address issues of external validity and generalizability as suggested by Green and Glasgow (2006).

Many factors facilitated program implementation, including the Tool Kit, training of Field Coordinators and pool staff, provision of technical assistance, and the Field Coordinators themselves. Research notes that a significant barrier to the utilization of evidence-based programs is the lack of information on how practitioners can replicate a program with fidelity (Kelly et al., 2000b; Sussman, Valente, Rohrbach, Skara, & Pentz, 2006). Packaging program materials so that they are user-friendly and appealing may help with adoption and successful implementation (Rohrbach, Grana, Sussman, & Valente, 2006). The distribution of the Pool Cool Tool Kit was documented as a critical component of program implementation. The kit included step-by-step instructions in the Leader’s Guide on how to implement the program, along with key program materials.

Training is another critical element in the implementation and maintenance of the Pool Cool program over the years. Wandersman and colleagues (2008) emphasize the importance of supporting organizations in their efforts to adopt evidence into practice through the provision of training and technical assistance. Our finding of the need to continually provide and improve program training is consistent with other diffusion research showing that groups receiving a manual, training, and regular consultation or technical assistance are more likely to adopt an evidence-based intervention and to use the intervention more frequently (Hamdallah, Vargo, & Herrera, 2006; Kelly et al., 2000b; Mihalic, 2004). A related issue is technical assistance given during the diffusion process. The tracking of emails and logs allowed program staff to offer assistance in areas of program implementation and to identify key issues related to management of the program and distribution of program materials. This process allowed staff to quickly identify Field Coordinators who were having difficulties fulfilling their role.

These process data further confirm the importance of a linkage agent or champion in the adoption of an evidence-based program (Forsetlund, Talseth, Bradley, Nordheim, & Bjorndal, 2003; Mihalic, 2004; Rogers, 2003; Titler, 2007). We observed that Field Coordinators were critical to implementing the program at pool sites as designed. The emails and logs support the notion that the majority of their work involved administration of the program at participating pools, facilitation of data collection, and distribution of program materials. The Field Coordinators’ importance was also validated by the process evaluation interviews.

Another important factor in the adoption of evidence-based practice is adaptation of the program to fit the local context, provided that the core elements and internal logic of the intervention are not modified (McKleroy et al., 2006). For example, program implementers may add their pool name, logos, and local images or colors to program materials. The process evaluation data were helpful in documenting methods used by participating pools to modify existing Pool Cool materials or to enhance the program with new materials or ideas, such as the Pool Cool carnival. Similarly, Harshbarger and colleagues (2006) found many program adaptations among community-based organizations implementing an HIV evidence-based program.

This study also offers insight into the maintenance and sustainability of an intervention diffused nationally. In the process interviews, over three-fourths of pools reported that they would continue the program. Several pools mentioned that they would continue to teach Pool Cool lessons and implement the program, and a few reported that they would seek funding to sustain it. These are strong indicators of program sustainability for these participating pools (Pluye, 2004; Shediac-Rizkallah & Bone, 1998).

Lessons Learned

Some of the lessons learned from the diffusion trial are as follows. Most importantly, process evaluation can contribute valuable data documenting everything from program reach to sustainability in communities through the phases of program translation. In addition, the different methods helped track program implementation and troubleshoot training issues across the different sites over the four years. We received feedback about program implementation at different levels through the Field Coordinators and participating pool contacts. Having multiple process evaluation methods was helpful; however, implementing and monitoring the process evaluation data collection was labor intensive. Some of the process methods were routine communications, while the process evaluation across sites involved intensive data collection and site visits. The different methods all contributed to program staff monitoring implementation across sites, understanding the dissemination process, and offering technical assistance to participating programs with implementation issues. Finally, the process evaluation data from different sources validated each other. For instance, the process evaluation surveys confirmed what was found during the site visits, and the issues raised in email communications matched the major activities Field Coordinators reported in their activity logs.

Conclusion

The translation of evidence-based interventions into communities has great potential for improving community health. Employing multiple methods for collecting process evaluation data can contribute to understanding factors related to the generalizability of evidence-based programs across different settings and populations by measuring reach, program exposure, implementation, program adaptation, maintenance, and sustainability. Further research on factors related to the external validity of diffusion trials may build knowledge about the translation of science-based programs into real-world settings.

Table 2.

Sample Questions from Process Evaluation Surveys

Lifeguard follow-up surveys
Question: Pool Cool Activities: Did you take part in these Pool Cool Poolside Activities [5 items]: Weather Watch: UV Index; Blue & Purple People, etc.?
Response options: Yes/No
Question: Use of Sunscreen: Did you use the sunscreen in the large dispenser at the pool?
Response options: Yes/No
Question: Taught Lessons: How many times in total did you teach the Pool Cool sun safety lessons to children?
Response options: 0 = I do not teach swimming lessons or I was not able to teach the Pool Cool class; 1 = 1–4 times; 2 = 5–8 times; 3 = more than 8 times
Question: Pool Cool Items: Did you receive any of these Pool Cool items [7 items]: sunscreen samples? water bottles? lanyards, etc.?
Response options: Yes/No
Program Participation Composite Score (16 items) = Pool Cool Activities (5 items) + Pool Cool Leader’s Guide (2 items) + Use of Sunscreen (1 item) + Taught Lessons (1 item) + Pool Cool items (7 items)

Pool manager follow-up surveys
Question: Educational activities: How often did you teach the Pool Cool sun-safety lessons?
Response options: Rarely or never, sometimes, often, usually-always
Question: Use Leader’s Guide: How often did the lifeguards and/or aquatic instructors use the Leader’s Guide?
Response options: Rarely or never, sometimes, often, usually-always
Question: This summer did your pool: display the aluminum sun-safety signs? use the sunscreen provided by the Pool Cool program? add any shade structures or shaded areas?
Response options: Yes/No
Pool Manager Implementation Composite Score = Educational activities (1 item) + Sun safety program (1 item) + Times taught lessons (1 item) + Use of Leader’s Guide (1 item) + Used Mini Big Book (1 item) + Conduct of Poolside Activities (1 item) + Displayed the sunscreen tips poster (1 item) + Displayed the aluminum sun-safety signs (1 item) + Used sunscreen (1 item) + Added any shade structures (1 item)

Parent follow-up surveys
Question: Does this pool teach about sun protection in swimming lessons?
Response options: Yes/No
Question: Policies: Does this pool: make an effort to reduce children’s sun exposure? provide sunscreen for swimmers who forgot to bring it along? provide shade for pool users?
Response options: Yes/No
Question: Did you or your child receive any of the following Pool Cool items this summer: sunscreen samples? water bottles, etc.?
Response options: Yes/No
Parent Participation Composite Score = Policies and activities (7–8 items) + Teach sun protection in lessons (1 item) + Receipt of Pool Cool items (6–7 items)

Pool site visits and telephone interviews
Question: Did your lifeguards/instructors/staff complete a training for Pool Cool?
Response options: Yes/No
Question: Did your pool receive: the Pool Cool Laminated Lessons and Half Sheets? the Pool Cool Leader’s Guide? the Pool Cool Mini Big Book? any aluminum signs?
Response options: Yes/No/Don’t know
Question: How often are the Pool Cool lessons taught at your pool?
Response options: Open-ended
Question: On average, how many times per week this summer did you or your pool staff use the Weather Watch (UV Index) sheets? the Sun Jeopardy game?
Response options: ____ per day/per week/per month/per summer; ___ Don’t know
Question: Have you posted any sun safety signs at your pool? If yes: Where did you post them? If no: Why?
Response options: Yes/No; open-ended follow-up

Biographies

Cam Escoffery is an Assistant Professor in the Department of Behavioral Sciences and Health Education at the Rollins School of Public Health. Her contact information is: 1518 Clifton Road, Atlanta, GA 30322, Telephone: 404/727-4701, Fax: 404/727-1369, cescoff@sph.emory.edu

Dr. Escoffery’s areas of research include cancer prevention and control, program evaluation, technology, and health promotion. She has recently published on process evaluation of the Pool Cool program, an online epilepsy self-management program, and tobacco prevention.

Karen Glanz is the Charles Howard Candler Professor and a Georgia Cancer Coalition Distinguished Research Scholar in the Department of Behavioral Sciences and Health Education at the Rollins School of Public Health. Dr. Glanz’s research emphasizes understanding and improving healthy environments and behaviors, especially related to nutrition and obesity, skin cancer prevention, cancer screening, and tobacco control.

Dawn Hall is a Research Project Coordinator in the Department of Behavioral Sciences and Health Education at the Rollins School of Public Health. Her research focuses on skin cancer prevention, and she has published in that area.

Thomas Elliott is the Pool Cool Project Coordinator. He is in the Department of Behavioral Sciences and Health Education at the Rollins School of Public Health. His research focus is also on skin cancer prevention and he has published on the Pool Cool program.

Contributor Information

Cam Escoffery, Assistant Professor, Rollins School of Public Health, Emory University, Dept. of Behavioral Sciences and Health Education, 1518 Clifton Road, 5th Floor, Atlanta, GA 30322, Phone: 404/727-4701, Fax: 404/727-1369.

Karen Glanz, Charles Howard Candler Professor, Rollins School of Public Health, Emory University, Dept. of Behavioral Sciences and Health Education, 1518 Clifton Road, 5th Floor, Atlanta, GA 30322.

Dawn Hall, Program Coordinator, Rollins School of Public Health, Emory University, Dept. of Behavioral Sciences and Health Education, 1518 Clifton Road, 5th Floor, Atlanta, GA 30322.

Tom Elliott, Project Director, Rollins School of Public Health, Emory University, Dept. of Behavioral Sciences and Health Education, 1518 Clifton Road, 5th Floor, Atlanta, GA 30322.

References

  1. Baranowski T, Stables G. Process evaluations of the 5-a-day projects. Health Education & Behavior. 2000;27(2):157–166. doi:10.1177/109019810002700202.
  2. Basen-Engquist K, O’Hara-Tompkins N, Lovato CY, Lewis MJ, Parcel GS, Gingiss P. The effect of two types of teacher training on implementation of Smart Choices: A tobacco prevention curriculum. Journal of School Health. 1994;64(8):334–339. doi:10.1111/j.1746-1561.1994.tb03323.x.
  3. Briss PA, Brownson RC, Fielding JE, Zaza S. Developing and using the Guide to Community Preventive Services: Lessons learned about evidence-based public health. Annual Review of Public Health. 2004;25:281–302. doi:10.1146/annurev.publhealth.25.050503.153933.
  4. Brownson RC, Baker EA, Leet TL, Gillespie KN. Evidence-based public health. Oxford: Oxford University Press; 2003.
  5. Dowda M, James F, Sallis JF, McKenzie TL, Rosengard P, Kohl HW 3rd. Evaluating the sustainability of SPARK physical education: A case study of translating research into practice. Research Quarterly for Exercise & Sport. 2005;76(1):11–19. doi:10.1080/02701367.2005.10599257.
  6. Escoffery C, Glanz K, Elliott T. Process evaluation of the Pool Cool Diffusion Trial for skin cancer prevention across two years. Health Education Research. 2008. doi:10.1093/her/cym060.
  7. Forsetlund L, Talseth KO, Bradley P, Nordheim L, Bjorndal A. Many a slip between cup and lip: Process evaluation of a program to promote and support evidence-based public health practice. Evaluation Review. 2003;27(2):179–209. doi:10.1177/0193841X02250528.
  8. Glanz K, Geller AC, Shigaki D, Maddock JE, Isnec MR. A randomized trial of skin cancer prevention in aquatics settings: The Pool Cool program. Health Psychology. 2002;21(6):579–587.
  9. Glanz K, Isnec MR, Geller A, Spangler KJ. Process evaluation of implementation and dissemination of a sun safety program at swimming pools. In: Steckler A, Linnan L, editors. Process evaluation for public health interventions and research. San Francisco, CA: Jossey-Bass; 2002. pp. 58–82.
  10. Glanz K, Steffen A, Elliott T, O’Riordan D. Diffusion of an effective skin cancer prevention program: Design, theoretical foundations, and first-year implementation. Health Psychology. 2005;24(5):477–487. doi:10.1037/0278-6133.24.5.477.
  11. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: Issues in external validation and translation methodology. Evaluation & the Health Professions. 2006;29(1):126–153. doi:10.1177/0163278705284445.
  12. Hamdallah M, Vargo S, Herrera J. The VOICES/VOCES success story: Effective strategies for training, technical assistance and community-based organization implementation. AIDS Education & Prevention. 2006;18(4 Suppl A):171–183. doi:10.1521/aeap.2006.18.supp.171.
  13. Harshbarger C, Simmons G, Coelho H, Sloop K, Collins C. An empirical assessment of implementation, adaptation, and tailoring: The evaluation of CDC’s national diffusion of VOICES/VOCES. AIDS Education & Prevention. 2006;18:184–197. doi:10.1521/aeap.2006.18.supp.184.
  14. Hoelscher DM, Kelder SH, Murray N, Cribb PW, Conroy J, Parcel GS. Dissemination and adoption of the Child and Adolescent Trial for Cardiovascular Health (CATCH): A case study in Texas. Journal of Public Health Management & Practice. 2001;7(2):90–100. doi:10.1097/00124784-200107020-00012.
  15. Israel BA, Cummings KM, Dignan MB, Heaney CA, Perales DP, Simons-Morton BG, et al. Evaluation of health education programs: Current assessment and future directions. Health Education Quarterly. 1995;22(3):364–389. doi:10.1177/109019819402200308.
  16. Kelly JA, Heckman TG, Stevenson LY, Williams PN, Ertl T, Hays RB, et al. Transfer of research-based HIV prevention interventions to community service providers: Fidelity and adaptation. AIDS Education & Prevention. 2000a;12(5 Suppl):87–98.
  17. Kelly JA, Somlai AM, DiFranceisco WJ, Otto-Salaj LL, McAuliffe TL, Hackl KL, et al. Bridging the gap between the science and service of HIV prevention: Transferring effective research-based HIV prevention interventions to community AIDS service providers. American Journal of Public Health. 2000b;90(7):1082–1088. doi:10.2105/ajph.90.7.1082.
  18. Linnan L, Steckler A. Process evaluation for public health interventions and research: An overview. In: Steckler A, Linnan L, editors. Process evaluation for public health interventions and research. San Francisco, CA: Jossey-Bass; 2002. pp. 1–24.
  19. McGraw SA, Stone EJ, Osganian SK, Elder JP, Perry CL, Johnson CC, et al. Design of process evaluation within the Child and Adolescent Trial for Cardiovascular Health (CATCH). Health Education Quarterly. 1994;Suppl 2:S5–26. doi:10.1177/10901981940210s103.
  20. McKleroy VS, Galbraith JS, Cummings B, Jones P, Harshbarger C, Collins C, et al. Adapting evidence-based behavioral interventions for new settings and target populations. AIDS Education & Prevention. 2006;18:59–73. doi:10.1521/aeap.2006.18.supp.59.
  21. Mihalic S, Irwin K, Fagan A, Ballard D, Elliott D. Successful program implementation: Lessons from Blueprints. Rockville, MD: U.S. Department of Justice; 2004.
  22. Owen N, Glanz K, Sallis JF, Kelder SH. Evidence-based approaches to dissemination and diffusion of physical activity interventions. American Journal of Preventive Medicine. 2006;31(4 Suppl):S35–44. doi:10.1016/j.amepre.2006.06.008.
  23. Pluye P, Potvin L, Denis JL. Making programs last: Conceptualizing sustainability. Evaluation and Program Planning. 2004;27:121–133.
  24. Rebchook GM, Kegeles SM, Huebner D, Team TR. Translating research into practice: The dissemination and initial implementation of an evidence-based HIV prevention program. AIDS Education & Prevention. 2006;18(4 Suppl A):119–136. doi:10.1521/aeap.2006.18.supp.119.
  25. Rogers E. Diffusion of innovations. 5th ed. New York: Free Press; 2003.
  26. Rohrbach LA, Grana R, Sussman S, Valente TW. Type II translation: Transporting prevention interventions from research to real-world settings. Evaluation & the Health Professions. 2006;29(3):302–333. doi:10.1177/0163278706290408.
  27. Shediac-Rizkallah MC, Bone LR. Planning for the sustainability of community-based health programs: Conceptual frameworks and future directions for research, practice and policy. Health Education Research. 1998;13(1):87–108. doi:10.1093/her/13.1.87.
  28. Sussman S, Valente TW, Rohrbach LA, Skara S, Pentz MA. Translation in the health professions: Converting science into action. Evaluation & the Health Professions. 2006;29(1):7–32. doi:10.1177/0163278705284441.
  29. Titler M. Translating research into practice. American Journal of Nursing. 2007;107(6 Suppl):26–33. doi:10.1097/01.NAJ.0000277823.51806.10.
  30. Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, et al. Bridging the gap between prevention research and practice: The Interactive Systems Framework for Dissemination and Implementation. American Journal of Community Psychology. 2008;41(3–4):171–181. doi:10.1007/s10464-008-9174-z.
  31. Windsor R, Clark N, Boyd NR, Goodman RM. Evaluation of health promotion, health education, and disease prevention programs. 3rd ed. New York, NY: McGraw-Hill; 2003.
