Published in final edited form as: J Public Health Manag Pract. 2015 May-Jun;21(Suppl 3):S16–S26. doi: 10.1097/PHH.0000000000000233

Applying a Mixed-Methods Evaluation to Healthy Kids, Healthy Communities

Ross C Brownson 1,2, Allison L Kemner 3, Laura K Brennan 3
PMCID: PMC4746076  NIHMSID: NIHMS755645  PMID: 25828217

Abstract

From 2008 to 2014, the Healthy Kids, Healthy Communities (HKHC) national program funded 49 communities across the United States and Puerto Rico to implement healthy eating and active living policy, system, and environmental changes to support healthier communities for children and families, with special emphasis on reaching children at highest risk for obesity on the basis of race, ethnicity, income, or geographic location. Evaluators designed a mixed-methods evaluation to capture the complexity of the HKHC projects, understand implementation, and document perceived and actual impacts of these efforts.


Eight complementary evaluation methods addressed four primary aims: 1) to coordinate data collection for the evaluation through the web-based project management system (HKHC Community Dashboard) and provide training and technical assistance for use of this system; 2) to guide data collection and analysis through use of the Assessment & Evaluation Toolkit; 3) to conduct a quantitative cross-site impact evaluation among a subset of community partnership sites; and 4) to conduct a qualitative cross-site process and impact evaluation among all 49 community partnership sites.

Evaluators identified successes and challenges in relation to the following methods: the online HKHC Dashboard performance monitoring system, environmental audits, direct observations, individual and group interviews, partnership and community capacity surveys, group model building, photos and videos, and secondary data sources (surveillance data and record review). Several themes emerged, including the value of systems approaches, the need for capacity building for evaluation, the value of focusing on upstream and downstream outcomes, and the importance of practical approaches for dissemination.

The mixed-methods evaluation of HKHC advances evaluation science related to community-based efforts for addressing childhood obesity in complex community settings. The findings are likely to provide practice-relevant evidence for public health.

INTRODUCTION

Over the past four decades, obesity rates have increased dramatically among U.S. children and adolescents, making childhood obesity a key public health issue.1–3 In response to this epidemic, there has been a focus on identifying and applying effective interventions to reverse these trends. These intervention strategies include policy, systems, and environmental changes that are designed to provide opportunities, support, and cues to help people develop healthier behaviors.4–9

In conjunction with these newer intervention approaches that move beyond individual-level behavior change to approaches focused on the higher levels of the ecological framework, newer methods for evaluation are also advised. These evaluation approaches need to better take into account the complexity and inter-relatedness of interventions, focusing on core elements such as external validity,10 systems approaches,6 mixed methods (integrating quantitative and qualitative methods),11 and the value of learning collaboratives.12

Background on Healthy Kids, Healthy Communities

From 2008 to 2014, the Healthy Kids, Healthy Communities (HKHC) national program of the Robert Wood Johnson Foundation funded 49 community partnerships across the United States and Puerto Rico to implement healthy eating and active living policy, system, and environmental changes to support healthier communities for children and families, with special emphasis on reaching children at highest risk for obesity on the basis of race, ethnicity, income, or geographic location.13 HKHC used a “high touch, low dollar” approach, including four years of funding, ranging from $360,000 to $400,000 (nine leading sites), and customized technical assistance from a Project Officer of the HKHC National Program Office. Complementary initiatives funded during this time period tended to include much higher awards, such as CDC’s “Communities Putting Prevention to Work” grants, ranging from $900,000 to $16,100,000.14 Because these and many other related national, state, or local initiatives (e.g., Safe Routes to School, USDA’s Farmers’ Market Promotion Program, Y-USA’s Action Communities for Health, Innovation, and Environmental Change [ACHIEVE] or Pioneering Healthy Communities programs) occurred in the same states, regions, or communities at the same time, the resulting collaboratives and policy, system, and environmental changes often reflected a collection of influences across initiatives.

Background on the HKHC Evaluation

Given the complexity of the HKHC initiatives and their overlap with simultaneous and related initiatives, evaluators designed a mixed-methods evaluation, based on previous success with this approach,15 to increase the comprehensiveness and validity of the evaluation. The HKHC evaluation had the following aims: 1) to coordinate data collection for the evaluation through the web-based project management system (HKHC Community Dashboard) and provide training and technical assistance for use of this system; 2) to guide data collection and analysis through use of the Assessment & Evaluation Toolkit; 3) to conduct a quantitative cross-site impact evaluation among a subset of community partnership sites; and 4) to conduct a qualitative cross-site process and impact evaluation among all 49 community partnership sites. This article describes the methods used to evaluate progress toward meeting these four aims. Its purpose is to orient readers to the array of methods and tools being applied across the other papers in this issue of the Journal of Public Health Management and Practice.

Coordination, communication, and capacity building

In 2009, the evaluation team worked with the HKHC National Program Office and the HKHC leading sites to develop a logic model for the overall initiative (see sample logic model in Figure 1 and complete logic model in Online Figure 1). The layers within this model represented: 1) the individual and household level: influences within the context of family and home environments; 2) the community level: influences (direct and indirect) of local systems, policies, environments, programs, and promotions; 3) the state, national, and industry level: influences of systems, policies, and environments on communities and individuals; and 4) the macro social system level: influences permeating the other layers through social determinants of health and health inequities, as well as cultural and psychosocial influences. Based on this color-coded model, the team customized logic models for each of the 49 community partnerships (visit Community Case Reports at www.transtria.com/hkhc).

Figure 1. Sample Healthy Kids, Healthy Communities Logic Model

The evaluation team conducted a thorough content analysis of the HKHC grantee proposals to identify initial and planned strategies, assessment and evaluation capacity, technical assistance needs, and several other indicators (e.g., settings, partners, leadership, readiness, funding). With these data, evaluators collaborated with partners (i.e., HKHC community partnership representatives, Project Officers, an RWJF representative, and advisors from the national Evaluation Advisory Group) to identify priority healthy eating and active living policy, system, and environmental strategies for the cross-site evaluation. Through an online survey, partners ranked 24 strategies gleaned from the HKHC grantee proposals and rated each strategy (i.e., good, fair, or poor) on the following criteria: feasibility, evidence, impact, innovation, prevalence, and tools and resources. The final strategy recommendations included priority cross-site evaluation strategies, namely parks and play spaces, corner stores, active transportation, farmers’ markets, child care nutrition standards, and child care physical activity standards, as well as secondary strategies of interest, including joint use, Safe Routes to School, zoning, comprehensive plans, grocery stores, nutrition assistance, and gardens/greenhouses.
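To illustrate how such ratings might be aggregated, the following minimal Python sketch averages partners' good/fair/poor ratings across criteria to order strategies. The data, point values, and scoring rule are hypothetical illustrations, not the evaluation team's actual prioritization procedure.

from collections import defaultdict
from statistics import mean

RATING_POINTS = {"good": 3, "fair": 2, "poor": 1}  # hypothetical point values

def prioritize(responses):
    """responses: (strategy, criterion, rating) tuples from the online survey."""
    points = defaultdict(list)
    for strategy, _criterion, rating in responses:
        points[strategy].append(RATING_POINTS[rating])
    # Higher mean rating across criteria and respondents = higher priority.
    return sorted(points, key=lambda s: mean(points[s]), reverse=True)

responses = [
    ("farmers' markets", "feasibility", "good"),
    ("farmers' markets", "evidence", "fair"),
    ("zoning", "feasibility", "poor"),
    ("zoning", "impact", "good"),
]
print(prioritize(responses))  # ["farmers' markets", 'zoning']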

During 2009, the evaluation team visited each of the nine leading sites as part of the planning period to understand potential assessment and evaluation needs, inform development of the HKHC Community Dashboard (see Evaluation Methods), create the HKHC initiative logic model, coordinate potential methods and measures with common intervention strategies, and identify opportunities for the leading sites to serve as mentors in assessment and evaluation for the other 40 HKHC community partnerships. The evaluation team developed a set of principles to guide the evaluation approach, including: knowing the community, honoring the community’s needs and boundaries, checking assumptions, being responsive to community needs, making realistic agreements, respecting differences, and working toward a shared language. Furthermore, the evaluation team and the HKHC National Program Office intended to complement and not duplicate the overall technical assistance provided to the communities. Therefore, another set of principles for technical assistance addressed roles and responsibilities of the respective teams, communication and coordination between the teams, and communication and coordination with the communities.

From 2010 to 2014, the evaluation team worked collaboratively with HKHC community partnerships, the RWJF, the HKHC National Program Office, and the HKHC Evaluation Advisory Group to implement evaluation methods and activities as well as to develop customized community reports to support local dissemination efforts (i.e., enhanced evaluation reports, systems thinking storybooks, and community case reports). Although translation and dissemination were not central aims of this evaluation, the evaluation team worked to ensure the findings had face validity from the communities’ perspectives and to provide technical assistance (e.g., audience segmentation, delivery channels) to the communities interested in creating dissemination products. The evaluation team also supported the community partnerships in developing peer-reviewed publications for this supplement.

EVALUATION METHODS

A set of eight overlapping evaluation components supported our mixed-methods evaluation,11 which was designed to address the primary aims and to specifically assess policy, system, and environmental changes as a result of the community partnerships’ efforts to increase healthy eating and active living in order to reduce childhood obesity. All evaluation tools are available at www.transtria.com/hkhc.

1. Performance Monitoring through the HKHC Community Dashboard (www.hkhcdashboard.org)

Performance monitoring is a method used to track progress on different goals and benchmarks as well as other related indicators of interest.16 The Dashboard tracked progress (i.e., “actions”) on community partnerships’ goals, tactics, activities, and benchmarks as well as other related indicators of interest (e.g., announcements, products or photos posted). The Dashboard also supported the following functions: project communications within community partnerships, social networking within and across community partnerships, progress monitoring by Project Officers from the HKHC National Program Office (NPO), data coding and analysis by Evaluation Officers from Transtria, program communications and announcements (e.g., “shout-outs”), tool and resource sharing, and data sharing and reporting.

The actions reported by community partnerships were coded by Evaluation Officers using a taxonomy consisting of 593 codes, such as healthy eating, active living, and obesity prevention strategies. Each action had a brief description, entry date, funding sources, associated media, and nature of the partnership’s role (i.e., direct, indirect, or not attributable partnership efforts). In turn, Evaluation Officers tagged each action for the types of settings, geography populations, and partners and organizations involved. Types and counts of actions by community partnership and across community partnerships were produced. Preliminary findings were summarized in action reports distributed to community partnerships at three six-month intervals over the course of the evaluation. A total of 17,400 actions were entered by HKHC community partnerships from March 2010 to May 2014.
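As a simplified sketch of this kind of tallying (the action records and codes below are invented, not the Dashboard's actual data model), coded actions can be counted by taxonomy code and by partnership:

from collections import Counter

# Hypothetical action records as they might be exported from the Dashboard
actions = [
    {"partnership": "Buffalo, NY", "codes": ["active living", "parks and play spaces"]},
    {"partnership": "Charleston, WV", "codes": ["healthy eating", "corner stores"]},
    {"partnership": "Buffalo, NY", "codes": ["active living", "joint use"]},
]

counts_by_code = Counter(code for action in actions for code in action["codes"])
counts_by_partnership = Counter(action["partnership"] for action in actions)
print(counts_by_code.most_common(3))
print(counts_by_partnership)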

Assessment and Evaluation Toolkit (A feature on the Dashboard)

The Assessment and Evaluation Toolkit provided assessment and evaluation tools, protocols, guides, manuals, and related resources to support local data collection and analysis efforts within community partnerships. Evaluation Officers provided technical assistance to community partnerships to adapt these tools and resources for local use, supported development of new tools and resources, or recommended data collection and analysis approaches. Technical assistance providers initially posted 393 toolkit items, HKHC community partnerships posted 219 toolkit items, and the remaining 409 items were posted as part of technical assistance responses to community partnerships. [Note: This did not include enhanced evaluation tools and resources.]

Findings associated with the development, implementation, and analysis of the HKHC Dashboard are described in a complementary article in this supplement.17

2. Enhanced Evaluation

The enhanced evaluation focused on the six cross-site strategies (parks and play spaces, street design, farmers’ markets, corner stores, child care physical activity standards, and child care nutrition standards) as well as two data collection methods: environmental audits and direct observation. Participation by community partnerships was optional. Evaluation Officers trained local representatives (e.g., research assistants, AmeriCorps VISTAs, city government volunteers, community residents) with prior evaluation experience (e.g., collecting or analyzing data) to conduct the audits and/or observations for one or more of the six strategies. To enhance data quality, Evaluation Officers provided staff time for data entry, cleaning, analysis, and summary. Community partnerships received stipends to support local data collection efforts. See http://www.transtria.com/enhanced_evaluation_resources.php for tools, protocols, and training materials.

Evaluation Officers from Transtria worked with community partnerships to customize their enhanced evaluation plans and select a design, including baseline and/or follow-up data collection activities, with or without comparison sites. For instance, an environmental audit might assess factors affecting walkability before and after the addition of sidewalks or completion of a community trail, while direct observation might assess walking in these environments before and/or after construction. The training also addressed time requirements (e.g., observations at multiple times per day on multiple days per week) and other special considerations (e.g., good weather conditions). Evaluation Officers encouraged community partnerships to use multiple auditors/observers to increase inter-rater reliability. Given the limited number of trained individuals within each community, data to assess inter-rater agreement were only supplied for two strategies for the environmental audits (i.e., parks and play spaces and farmers’ markets).18
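Agreement between paired auditors can be summarized with Cohen's kappa, which corrects percent agreement for chance. A self-contained sketch follows; the ratings are illustrative, and the complementary article18 describes the measures actually used.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters coding the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# e.g., two auditors recording whether park features are present
auditor_1 = ["present", "present", "absent", "present", "absent"]
auditor_2 = ["present", "absent", "absent", "present", "absent"]
print(f"kappa = {cohens_kappa(auditor_1, auditor_2):.2f}")  # kappa = 0.62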

A total of 87 trainings were conducted by Transtria. Thirty-one HKHC community partnerships collected data resulting in a total of 41 environmental audits and 17 direct observations. After entering, cleaning, analyzing, and summarizing the data, Evaluation Officers provided a data report to each participating community partnership (visit the enhanced evaluation reports at www.transtria.com/hkhc). The enhanced evaluation is described in a complementary article in this supplement,18 and four community case examples are also reported.1922

3. Individual and Group Interviews

Key informant interviews provided an opportunity for in-depth dialogue with individuals who had expertise, experience, or perspectives related to the HKHC community partnerships’ activities, with a focus on the cross-site strategies.23 Evaluation Officers conducted phone and in-person interviews with project staff, partners, or community representatives before, during, and after site visits. General topics included: how long the community partnership had been in operation; why the partnership was established; what organizations, agencies, or coalitions served on the partnership; whether community members were involved; major strengths/challenges of the partnership; sources of funds, leveraged funding, and factors influencing resources secured; and ways to sustain the partnership. Interview tools and protocols were adapted from previous evaluation efforts.15 Individual and group interviews were recorded, transcribed, and subsequently coded by theme (e.g., partnership development, policy assessment, strategy implementation challenges, sustainability efforts).

Policy Assessment (A focus of the interviews)

Policy assessment is a method used to review policies and political processes.24 HKHC community partnerships involved their partners in these assessments to increase understanding of dynamics at organizational and community levels that influence policy-making processes and resources. The evaluation team examined key policy indicators, including: processes and means used to develop, implement, and enforce policies; roles and interests (e.g., population health, economic feasibility, environmental protection) of different partners in the policy process; relative power and influence of different groups in the process (e.g., community participation); structural factors influencing the policy process (e.g., systems, institutions); contextual factors influencing the policy process (e.g., political, economic, socio-cultural); decision-making processes (e.g., criteria for weighing policy options); and perceived or anticipated impacts on health (e.g., obesity prevention, active living, healthy eating), the environment (e.g., water quality, air quality), the economy (e.g., benefits, costs), and equity (e.g., resource distribution for racial and ethnic and lower income populations). In addition, the evaluation team captured social and cultural acceptability, practicality, and legal considerations related to policy initiatives.

Cost Assessment (A focus of the interviews)

Cost assessment is an approach to document initiative costs and sources of revenue to support those costs.25 Evaluators tracked costs and funding associated with the design, development, implementation, and enforcement of strategies. Cost elements included a wide range of expenses associated with people’s time invested in different policy development, implementation, enforcement, evaluation, or communication activities (e.g., personnel wages, value of volunteer time); assets purchased or acquired (e.g., land use value, building use value, equipment); or other resources obtained or used (e.g., materials, supplies, travel reimbursement). Revenue elements included an array of funds and resources supporting strategy efforts, including: funds from RWJF, matching funds from other sources, new revenue generated through the strategy, in-kind resources, and other sources of revenue or capital (e.g., adopted expenses into existing community or organizational budgets). Many of these revenue elements were tracked in the HKHC Dashboard and the community partnerships’ financial reports to RWJF.

Through individual and group interviews and other available information, the evaluation team created strategy-specific cost and revenue frameworks, or value frameworks, including common categories of cost measures and sources of revenue for each strategy (inputs) in conjunction with a host of economic, social, environmental, and health outcomes (outputs). The value frameworks are described in a complementary article in this supplement.26
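At its core, a value framework of this kind maps each strategy to categorized inputs (costs) and revenues. The sketch below tallies interview-reported entries per strategy; the category names and amounts are our illustration, not the published frameworks.26

from collections import defaultdict

COSTS = {"personnel", "volunteer time", "equipment", "materials"}
REVENUES = {"RWJF funds", "matching funds", "in-kind", "new revenue"}

def build_framework(entries):
    """entries: (strategy, category, dollar amount) gleaned from interviews and reports."""
    framework = defaultdict(lambda: {"costs": 0.0, "revenues": 0.0})
    for strategy, category, amount in entries:
        side = "costs" if category in COSTS else "revenues"
        framework[strategy][side] += amount
    return dict(framework)

entries = [
    ("corner stores", "personnel", 12000.0),
    ("corner stores", "matching funds", 5000.0),
    ("farmers' markets", "in-kind", 2500.0),
]
print(build_framework(entries))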

A total of 264 interviews were conducted by the evaluation team throughout the HKHC program, with a total of 659 participants from a range of different organizations (e.g., government agencies, elected and appointed officials, community-based organizations, businesses, civic organizations, and community residents) and disciplines (e.g., health care, parks and recreation, transportation, food policy, education).

4. Partnership and Community Capacity Surveys

Partnership and community capacity refers to the ability of communities to identify social and public health problems, develop collaborative approaches to address these problems, mobilize resources to intervene to create positive changes, and sustain these changes over time.27 The survey was designed to identify partnership, leadership, and community characteristics associated with the community partnerships’ work. The survey was derived from three primary sources: 1) early work from the CDC Prevention Research Centers to develop a 38-item partnership capacity survey;28 2) later work from the Prevention Research Centers to conduct reliability and validity testing on an expanded list of survey items with eight community-based initiatives as well as a national sample of both leaders and non-leaders of 291 community-based initiatives;29 and 3) lessons learned from a survey developed and administered to the 25 Active Living by Design community partnerships based on the early work of the Prevention Research Centers.30 Modeled after this earlier work, an 82-item partnership capacity survey solicited the perspectives of members of the 49 community partnerships on the structure and function of the partnership.

Questions addressed respondents’ understanding of the partnership in the following areas: partnership capacity and functioning, purpose of partnership, leadership, partnership structure, relationship with partners, partner capacity, political influence of partnership, and perceptions of community members. Participants completed the survey online and rated each item using a 4-point Likert-type scale (strongly agree to strongly disagree). Responses were used to reflect partnership structure (e.g., new partners, committees) and function (e.g., processes for decision making, leadership in the community).
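For example, such 4-point Likert responses can be converted to numeric scores and averaged within each survey domain. A minimal sketch follows; the item-to-domain mapping and responses are hypothetical, not the actual 82-item instrument.

from statistics import mean

LIKERT = {"strongly agree": 4, "agree": 3, "disagree": 2, "strongly disagree": 1}

def domain_scores(responses, item_domains):
    """Mean Likert score per domain (e.g., leadership, partnership structure)."""
    scores = {}
    for item, answer in responses.items():
        scores.setdefault(item_domains[item], []).append(LIKERT[answer])
    return {domain: mean(values) for domain, values in scores.items()}

item_domains = {"q1": "leadership", "q2": "leadership", "q3": "partnership structure"}
responses = {"q1": "agree", "q2": "strongly agree", "q3": "disagree"}
print(domain_scores(responses, item_domains))  # {'leadership': 3.5, 'partnership structure': 2}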

The survey was conducted in two phases: for the nine leading sites, the survey was open between December 2012 and April 2013; the 40 remaining sites completed the survey between September 2013 and December 2013. The survey was translated into Spanish to increase respondent participation in predominantly Hispanic/Latino communities.

A total of 603 surveys were completed by representatives of the HKHC community partnerships. The partnership and community capacity survey findings are described in a complementary article in this supplement.31

5. Group Model Building

The purpose of Group Model Building (GMB) sessions was to introduce systems thinking at the community level by identifying the essential parts of the system for each community partnership and how the system influences policy and environmental changes to promote healthy eating and active living and to prevent childhood obesity.3234 To accomplish this goal, community partners and residents at each site participated in a group model building session and related discussions. The group model building exercises actively involved a wide range of participants in modeling complex systems and provided a way for different representatives (e.g., residents, elected officials, government agencies, community-based organizations, businesses, universities) to better understand the systems (i.e., dynamics and structures) in the community. GMB sessions were facilitated by two trained evaluation staff and carried out in all 49 HKHC communities using a protocol (see the Healthy Kids, Healthy Communities Group Model Building Facilitation Handbook, www.transtria.com/hkhc).

Behavior Over Time Graphs (A group model building exercise)

The behavior over time graphs exercise was designed to generate responses to the following prompt: “things that affect or are affected by policy, system and environmental changes in your community related to healthy eating, active living, and childhood obesity”. Participants created graphs with the following key components: a title or topic, a time frame on the x-axis (e.g., days, months, years), a scale on the y-axis (numbers or descriptors, such as “low” to “high”), and a trend line reflecting the perception of how this topic, or variable, had changed over time. Graphs included a historic trend from the past to the present and a projected trend, indicating both the participant’s “hope” and “fear.” Each graph was intended to reflect only one topic, and participants were encouraged to create as many graphs as possible in the time allotted. Participants also shared with the group the stories behind each graph. The behavior over time graphs findings are described in a complementary article in this supplement.35
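The structure of these graphs is straightforward to reproduce: a historic trend line from past to present plus two projected lines for the participant's hope and fear. A plotting sketch with invented data (matplotlib assumed available; not a graph from the actual sessions):

import matplotlib.pyplot as plt

years = list(range(2000, 2015))
historic = [2, 2, 3, 3, 4, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9]  # perceived past-to-present trend
future = list(range(2014, 2021))
hope = [9, 8, 7, 6, 5, 4, 3]        # projected trend the participant hopes for
fear = [9, 10, 11, 12, 13, 14, 15]  # projected trend the participant fears

plt.plot(years, historic, "k-", label="historic")
plt.plot(future, hope, "g--", label="hope")
plt.plot(future, fear, "r--", label="fear")
plt.xlabel("Year")
plt.ylabel("Childhood obesity (low to high)")
plt.title("Behavior over time: childhood obesity in our community")
plt.legend()
plt.show()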

Causal Loop Diagrams (A group model building exercise)

From the range of variables identified in the behavior over time graphs exercise, facilitators selected approximately nine to twelve variables to use as “seed” variables for the causal loop diagrams exercise. In reference to the same purpose statement (i.e., “things that affect or are affected by policy, system and environmental changes in your community related to healthy eating, active living, and childhood obesity”), participants identified causal connections among the “seed” variables or generated new variables to be added to the diagram, indicating causal relationships. The causal loop diagrams findings are described in a complementary article in this supplement.36
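A causal loop diagram can be represented as a directed graph whose edges carry a polarity, with feedback loops classified as reinforcing (an even number of negative links) or balancing (an odd number). The sketch below uses hypothetical variables and the networkx library (assumed available); it is an illustration of the structure, not a diagram produced in the sessions.

import networkx as nx

cld = nx.DiGraph()
# (cause, effect, polarity): "+" = change in the same direction, "-" = opposite
links = [
    ("park improvements", "park use", "+"),
    ("park use", "physical activity", "+"),
    ("physical activity", "childhood obesity", "-"),
    ("park use", "perceived safety", "+"),
    ("perceived safety", "park use", "+"),
]
for cause, effect, sign in links:
    cld.add_edge(cause, effect, polarity=sign)

# Identify feedback loops and classify each by the parity of its negative links.
for cycle in nx.simple_cycles(cld):
    signs = [cld[u][v]["polarity"] for u, v in zip(cycle, cycle[1:] + cycle[:1])]
    kind = "reinforcing" if signs.count("-") % 2 == 0 else "balancing"
    print(kind, "loop:", " -> ".join(cycle))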

6. Photos and Videos

Digital photographs and videos documented the condition of facilities and environments and the impact of healthy eating or active living policy and environmental interventions. Community partnership staff provided Evaluation Officers from Transtria with tours of intervention sites. Photos and videos were used to supplement and validate findings from the qualitative data collected (e.g., images of environmental changes).

7. Surveillance Data and Record Review

With respect to surveillance data, the evaluation team examined community sociodemographics and relevant policy and environmental indicators using a systematic approach, with the intention of triangulating these data with the other quantitative and qualitative data collected as part of the evaluation methods. The Evaluation Officers also reviewed the community partnerships’ proposals and budgets, annual and final narrative reports, annual and final financial reports, and any other reports or materials shared by the HKHC NPO or the community partnerships themselves.

8. Data Triangulation

An Access database was created to integrate, store, and code the multiple sources of qualitative and quantitative data (e.g., Dashboard actions, interview and survey data, data gleaned from HKHC community partnerships’ narrative and financial reports) for analysis. Data sources included in the database were the 2010 Census, 2007–2011 American Community Survey, population data available through local health departments, workplans and action entries from the HKHC Community Dashboard, matching fund reports, annual budgets, annual narratives, products shared by the community partnerships, qualitative data from key informant interviews, enhanced evaluation participation, and technical assistance exchanges (e.g., email, phone conversations). Some of the key elements of the database, illustrated in the example schema sketch after the element descriptions below, included:

Community Partnership and Subpopulations

Each community partnership profile contained information about the sociodemographics of the community served by the partnership, including total population, racial and ethnic breakdown, and poverty rate. Within each partnership, strategies were often targeted to more specific populations, and these were entered as “subpopulations” with similar sociodemographic information documented.

Revenue

Sources of revenue generated for the partnership, both cash and in-kind, were recorded, including HKHC budgets and expenditures, matching funds leveraged, and other funding leveraged as a result of HKHC (e.g., Community Transformation Grants).

Media

All media events captured by the partnership were entered by media type (e.g., newspaper, TV, radio, website) and by strategy (e.g., corner stores, farmers’ markets, general partnership).

Assessments

Any assessments conducted by the community partnerships were documented, including the methods (e.g., environmental audits, direct observations, surveys, interviews), strategy (e.g., farmers’ markets, parks and play spaces), and youth and resident involvement in these activities.

Partners

All partners involved with the community partnership were entered in the database. Partners were categorized into core partners (i.e., organizations or individuals directly responsible for decision making and carrying out partnership activities) and network partners (i.e., organizations or individuals supporting specific strategies or activities as opposed to the local HKHC initiative as a whole). Other key information was documented about the partners, including partner type (e.g., government, foundation, business, community-based organization) and discipline (e.g., health care, agriculture, parks, transportation).

Policy, Practice, and Environmental Settings (PpE)

PpE changes occurred within a particular setting (e.g., farmers’ markets, parks), and the specific setting location was documented along with its zip code tabulation area, which was used to assign population data to the setting and the associated PpE changes.

Policy, Practice, and Environmental Changes

PpE changes were entered into the database within a setting. Key information was reported for the PpE changes, including grant year implemented, duration, type of PpE change, strategy and setting tags, and reach, implementation, and dose indicators.
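To make the relational structure concrete, the sketch below models three of the elements above in SQLite from Python. The table and column names are our illustration under stated assumptions, not the evaluation's actual Access schema.

import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory stand-in for the Access database
conn.executescript("""
CREATE TABLE partnership (
    id INTEGER PRIMARY KEY,
    name TEXT,
    total_population INTEGER,
    poverty_rate REAL
);
CREATE TABLE setting (
    id INTEGER PRIMARY KEY,
    partnership_id INTEGER REFERENCES partnership(id),
    setting_type TEXT,  -- e.g., 'farmers market', 'park'
    zcta TEXT           -- zip code tabulation area used to assign population data
);
CREATE TABLE ppe_change (
    id INTEGER PRIMARY KEY,
    setting_id INTEGER REFERENCES setting(id),
    change_type TEXT,   -- policy, practice, or environmental
    grant_year INTEGER,
    duration_months INTEGER
);
""")
conn.execute("INSERT INTO partnership VALUES (1, 'Example City', 250000, 0.28)")
conn.execute("INSERT INTO setting VALUES (1, 1, 'park', '63101')")
conn.execute("INSERT INTO ppe_change VALUES (1, 1, 'environmental', 2, 12)")

# Example query: count PpE changes per partnership
print(conn.execute("""
    SELECT p.name, COUNT(c.id) FROM partnership p
    JOIN setting s ON s.partnership_id = p.id
    JOIN ppe_change c ON c.setting_id = s.id
    GROUP BY p.name
""").fetchall())  # [('Example City', 1)]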

FINDINGS: WHAT WORKED AND WHAT CHALLENGES EMERGED

In a complex, multi-level, multi-year evaluation such as HKHC, there are numerous successes and challenges. Table 1 describes a few notable issues. Many others are covered in the other articles in this issue of the Journal of Public Health Management and Practice.

Table 1.

Findings – What Worked and What Challenges Emerged

Value of Systems Approaches for Multi-Component and Complex Initiatives

What worked:

One year of funding for planning the evaluation
  • Ensured community-based participatory evaluation approaches were aligned with the diverse strategies of the community partnerships.
  • Supported collaboration to design and develop the performance monitoring system (i.e., HKHC Community Dashboard), which served as a critical evaluation component.

Logic models
  • Represented the depth, complexity, interconnectedness, and transformational processes of the initiative and its impacts across the socioecological layers encompassing communities’ active living and healthy eating systems.

Systems thinking
  • Embraced the complexity of causal pathways, which were mapped among social determinants of health, partnership and community capacity, healthy eating policies and environments, active living policies and environments, and health behaviors and health outcomes.

Relationships and communication
  • Benefitted from strong working relationships and regular communications with the funder (RWJF) and the HKHC National Program Office.
  • Ensured the evaluation plans were responsive to the community partnerships’ needs for assessment and evaluation technical assistance as well as the funder’s desire to understand the impact of the initiative as a whole.
  • Gained insight into the proposed evaluation approaches and methods from the nine leading sites, whose funding overlapped with the evaluation planning period.

What challenges emerged:
  • Identifying designs and plans to evaluate the HKHC initiative, including cross-site evaluation components to capture the complexity and variability among the 49 funded sites.
  • Turnover and loss of institutional memory, particularly among the Project Directors and Project Coordinators for the communities and the Evaluation Officers supporting the communities.

Capacity Building

What worked:
  • Created capacity for community partnerships to continue to conduct evaluation in the future through training and technical assistance (e.g., enhanced evaluation, technical assistance during site visits, sessions at annual grantee meetings).
  • Helped community partnerships understand the bigger systems at play through creation of their own theory of change using group model building methods.
  • Gathered many participants (e.g., Project Directors/Coordinators, partners, residents, and HKHC National Program Office Project Officers) to discuss processes, resources, successes, and challenges through evaluation site visits (i.e., an opportunity for critical reflection).

What challenges emerged:
  • Responding to requests for additional technical assistance on evaluation (e.g., additional group model building sessions, additional support for environmental audit and direct observation data collection) without sufficient time or resources available to provide this level of technical assistance to the communities.

Focus on Upstream vs. Downstream Outcomes

What worked:

Enhanced evaluation
  • Provided an opportunity for community partnerships to engage in objective data collection through environmental audits and direct observations, whereas several other data collection approaches relied on perceived (often subjective) measures.
  • Provided context on health behaviors and, in some cases, demonstrated changes in behavior after policy and/or environmental changes occurred through direct observation data collection.

RE-AIM framework (i.e., reach, effectiveness, adoption, implementation, and maintenance)
  • Included both upstream and downstream outcomes and supported a comprehensive data management, reduction, and analysis approach, identifying categorical variables and indices to summarize the large volume of qualitative data collected through the mixed methods described.

What challenges emerged:

Exploratory evaluation
  • With 49 community partnerships implementing different healthy eating and active living policy and environmental changes, experimental or quasi-experimental study designs were not feasible with the resources available.

RE-AIM framework
  • Documenting effectiveness, given that the evaluation was designed to assess policy, system, and environmental changes rather than health and health behaviors (i.e., impact on nutrition, physical activity, or obesity outcomes).
  • Tracking a large volume of process and formative evaluation data (i.e., reach, adoption, implementation, maintenance/sustainability).
  • Insufficient time to assess maintenance and sustainability of communities’ initiatives, given that the community partnerships were still working to implement policy and environmental changes into the final year of the grant.

Dissemination and Practical Application of Findings

What worked:

Group model building/causal loop diagrams
  • Generated surveys to increase partners’ common understanding of their theory of change and specific causal pathways from the healthy eating or active living policy and environmental changes to behavioral and health outcomes (Columbia, MO).
  • Increased community awareness of the partnerships’ activities (e.g., an article in a local newsletter in Milwaukee, WI).
  • Identified new or complementary strategies to improve the partnerships’ impacts in communities (several community partnerships).

Enhanced evaluation (example success story)
  • Trained youth to collect environmental audit data on local parks with support from the local university. As part of the KEYS 4 Healthy Kids Youth Council in Charleston, West Virginia, youth presented the environmental audit findings to the Charleston City Council’s Parks and Recreation Committee in March 2013. The City Council asked the youth to conduct another audit in the spring and stated that improvements would be made based on the youths’ recommendations.

What challenges emerged:
  • Insufficient funds for the evaluation to support development of a full range of dissemination and communication products in addition to the evaluation reports.
  • Inadequate time to finalize and validate products with the community partnerships and the NPO (i.e., only three months at the end of the HKHC initiative).

Some of the key issues and lessons from the HKHC evaluation included:

The value of focusing on systems approaches

By including a planning period, making extensive use of logic models, using principles from systems thinking, and applying partnership principles, we were able to design and implement an evaluation that was robust, timely, and relevant.

The need to build local capacity

Given the complex nature of the evaluation and the large number of sites, a centrally focused/controlled evaluation was not feasible. Therefore, our ability to foster local capacity building allowed for tailored evaluation approaches that had a higher likelihood of being sustained. Yet, resources invested in increasing the volume of data collection efforts across all 49 community partnerships likely compromised data quality.

The importance of an upstream focus

Increasingly, it is becoming clear that upstream (environmental and policy) approaches are essential for improving public health.3739 By using RE-AIM,40 our evaluation included a significant focus on upstream determinants to increase understanding of the multi-component interventions and local context for implementation.

The value of practical dissemination strategies

By following principles of designing for dissemination,41 along with the use of non-traditional dissemination methods (e.g., causal loop diagram storybooks, HKHC Dashboard action reports, case reports with infographics), we were able to convey findings in a way that fit well with the needs of the target audiences.

CONCLUSION

Although childhood obesity rates appear to be plateauing in some regions and countries,42 the imperative to promote active living and healthy eating remains a high priority. Community-based initiatives such as HKHC provide promising approaches for addressing childhood obesity. To demonstrate internal and external validity, this article and others in this issue of the Journal of Public Health Management and Practice illustrate how mixed-methods evaluation approaches can provide practice-relevant evidence that has the potential to improve population health.

Supplementary Material

SM F1

Acknowledgments

Support for this evaluation was provided by a grant from the Robert Wood Johnson Foundation (#67099). Transtria LLC led the evaluation and dissemination activities from April 2009 to March 2014. Representatives from all 49 community partnerships actively participated in the evaluation planning, implementation, and dissemination activities. This cross-site report is a synthesis of information collected through multiple evaluation methods as part of a collaborative, community-based approach to evaluation.

We are grateful for the collaboration with and support from the Robert Wood Johnson Foundation (Laura Leviton, PhD and Tina Kauh, PhD), the Washington University Institute for Public Health, the Healthy Kids, Healthy Communities (HKHC) National Program Office (Casey Allred; Rich Bell, MCP; Phil Bors, MPH; Mark Dessauer, MA; Fay Gibson, MSW; Joanne Lee, LDN, RD, MPH; Mary Beth Powell, MPH; Tim Schwantes, MPH, MSW; Sarah Strunk, MHA; and Risa Wilkerson, MA), the HKHC Evaluation Advisory Group (Geni Eng, DrPH, MPH; Leah Ersoylu, PhD; Laura Kettel Khan, PhD; Vikki Lassiter, MS; Barbara Leonard, MPH; Amelie Ramirez, DrPH, MPH; James Sallis, PhD; and Mary Story, PhD), the Social System Design Lab at Washington University in St. Louis (Peter Hovmand, PhD), the University of Memphis (Daniel Gentry, PhD), and Innovative Graphic Services (Joseph Karolczak).

Special thanks to the many individuals who have contributed to these efforts from Transtria LLC, including past and present evaluation officers (Tammy Behlmann, MPH; Kate Donaldson, MPH; Carl Filler, MSW; Peter Holtgrave, MPH, MA; Christy Hoehner, PhD, MPH; Jessica Stachecki, MSW, MBA; Cheryl Valko, MPH), project assistants (James Bernhardt; Rebecca Bradley; Ashley Crain, MPH; Emily Herrington, MPH; Ashley Farell, MPH; Amy Krieg; Brandye Mazdra, MPH; Kathy Mora, PhD; Jason Roche, MPH; Carrie Rogers, MPH; Shaina Sowles, MPH; Muniru Sumbeida, MPH, MSW; Caroline Swift, MPH; Gauri Wadhwa, MPH; Jocelyn Wagman, MPH), additional staff (Michele Bildner, MPH, CHES; Daedra Lohr, MS; Melissa Swank, MPH), interns (Christine Beam, MPH; Skye Buckner-Petty, MPH; Maggie Fairchild, MPH; Mackenzie Ray, MPH; Lauren Spaeth, MS), transcriptionists (Sheri Joyce; Chad Lyles; Robert Morales; Vanisa Verma, MPH), and editors (Joanna Bender and Julie Claus, MPH).

REFERENCES

1. Hedley AA, Ogden CL, Johnson CL, Carroll MD, Curtin LR, Flegal KM. Prevalence of overweight and obesity among US children, adolescents, and adults, 1999–2002. JAMA. 2004 Jun 16;291(23):2847–2850. doi: 10.1001/jama.291.23.2847.
2. Ogden CL, Carroll MD, Curtin LR, McDowell MA, Tabak CJ, Flegal KM. Prevalence of overweight and obesity in the United States, 1999–2004. JAMA. 2006 Apr 5;295(13):1549–1555. doi: 10.1001/jama.295.13.1549.
3. Ogden CL, Carroll MD, Kit BK, Flegal KM. Prevalence of obesity and trends in body mass index among US children and adolescents, 1999–2010. JAMA. 2012 Jan 17. doi: 10.1001/jama.2012.40.
4. Brennan L, Castro S, Brownson RC, Claus J, Orleans CT. Accelerating evidence reviews and broadening evidence standards to identify effective, promising, and emerging policy and environmental strategies for prevention of childhood obesity. Annu Rev Public Health. 2011 Apr 21;32:199–223. doi: 10.1146/annurev-publhealth-031210-101206.
5. Brennan LK, Brownson RC, Orleans CT. Childhood obesity policy research and practice: evidence for policy and environmental strategies. Am J Prev Med. 2013 Jan;46(1):e1–e16. doi: 10.1016/j.amepre.2013.08.022.
6. Cockrell Skinner A, Foster EM. Systems science and childhood obesity: a systematic review and new directions. J Obes. 2013;2013:129193. doi: 10.1155/2013/129193.
7. Flynn MA, McNeil DA, Maloff B, et al. Reducing obesity and related chronic disease risk in children and youth: a synthesis of evidence with 'best practice' recommendations. Obes Rev. 2006 Feb;7(Suppl 1):7–66. doi: 10.1111/j.1467-789X.2006.00242.x.
8. Sallis JF, Cutter CL, Lou D, et al. Active living research: creating and using evidence to support childhood obesity prevention. Am J Prev Med. 2014 Feb;46(2):195–207. doi: 10.1016/j.amepre.2013.10.019.
9. Economos C, Blondin S. Obesity interventions in the community. Curr Obes Rep. 2014;3:199–205. doi: 10.1007/s13679-014-0102-2.
10. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval Health Prof. 2006 Mar;29(1):126–153. doi: 10.1177/0163278705284445.
11. Creswell J, Klassen A, Plano Clark V, Clegg Smith K. Best Practices for Mixed Methods Research in the Health Sciences. Bethesda, MD: Office of Behavioral and Social Sciences Research, National Institutes of Health; 2011.
12. Hargreaves MB, Honeycutt T, Orfield C, et al. The Healthy Weight Collaborative: using learning collaboratives to enhance community-based prevention initiatives addressing childhood obesity. J Health Care Poor Underserved. 2013;24(2 Suppl):103–115. doi: 10.1353/hpu.2013.0095.
13. Healthy Kids, Healthy Communities. About Healthy Kids, Healthy Communities. http://www.healthykidshealthycommunities.org/about. Accessed July 5, 2014.
14. U.S. Department of Health and Human Services. American Recovery and Reinvestment Act Prevention and Wellness Initiative: Communities Putting Prevention to Work. Atlanta, GA: Centers for Disease Control and Prevention; 2010.
15. Brownson RC, Brennan LK, Evenson KR, Leviton LC. Lessons from a mixed-methods approach to evaluating Active Living by Design. Am J Prev Med. 2012 Nov;43(5 Suppl 4):S271–S280. doi: 10.1016/j.amepre.2012.07.002.
16. Bors PA. Capturing community change: Active Living by Design's progress reporting system. Am J Prev Med. 2012 Nov;43(5 Suppl 4):S281–S289. doi: 10.1016/j.amepre.2012.07.008.
17. Bors P, Kemner A, Fulton J, Stachecki J, Brennan L. HKHC Community Dashboard: design, development, and function of a performance monitoring system. J Public Health Manag Pract. 2014. In review. doi: 10.1097/PHH.0000000000000207.
18. Kemner A, Stachecki J, Bildner M, Brennan L. Promoting community-based participatory evaluation of healthy eating and active living strategies through direct observations and environmental audits. J Public Health Manag Pract. 2014. In review. doi: 10.1097/PHH.0000000000000212.
19. Panke S, Holaly-Zembo L. Using an integrated approach to evaluate 'Where Do Flint's Children Play'. J Public Health Manag Pract. 2014. In press. doi: 10.1097/PHH.0000000000000213.
20. Patton-Lopez M, Olson B, Brown G, Munoz R, Polanco K, DeGhetto S. Recreating a neighborhood park to increase physical activity among rural youth: a community-based participatory case study. J Public Health Manag Pract. 2014. In press. doi: 10.1097/PHH.0000000000000206.
21. Raja S, Booth J, Crowell B, Norton T, Gouck J, Bonaro K. Evaluating community-based active living efforts: lessons from Buffalo, New York. J Public Health Manag Pract. 2014. In press. doi: 10.1097/PHH.0000000000000208.
22. Rifkin R, Williams L, Grode G, Johnson W. Enhanced evaluation data initiates a collaborative out-of-school time food sponsors work group. J Public Health Manag Pract. 2014. In press. doi: 10.1097/PHH.0000000000000215.
23. Kumar N, Stern L, Anderson J. Conducting interorganizational research using key informants. Academy of Management Journal. 1993;36(6):1633–1651.
24. Boothroyd P. Policy assessment. In: Vanclay F, Bronstein D, editors. Environmental and Social Impact Assessment. Chichester: Wiley; 1995. pp. 83–126.
25. Drummond M, Sculpher M, Torrance G, O'Brien B, Stoddart G. Methods for the Economic Evaluation of Health Care Programmes. 3rd ed. New York: Oxford University Press; 2005.
26. Swank M, Brennan L, Gentry D, Kemner A. Using frameworks to diagram value in complex policy and environmental interventions to prevent childhood obesity. J Public Health Manag Pract. 2014. In review. doi: 10.1097/PHH.0000000000000210.
27. Goodman RM, Speers MA, McLeroy K, et al. Identifying and defining the dimensions of community capacity to provide a basis for measurement. Health Educ Behav. 1998;25(3):258–278. doi: 10.1177/109019819802500303.
28. Baker E, Motton F. Is there a relationship between capacity and coalition activity: the road we've traveled. Paper presented at: American Public Health Association 131st Annual Meeting; 2003; San Francisco, CA.
29. Lempa M, Goodman RM, Rice J, Becker AB. Development of scales measuring the capacity of community-based initiatives. Health Educ Behav. 2008 Jun;35(3):298–315. doi: 10.1177/1090198106293525.
30. Baker EA, Wilkerson R, Brennan LK. Identifying the role of community partnerships in creating change to support active living. Am J Prev Med. 2012 Nov;43(5 Suppl 4):S290–S299. doi: 10.1016/j.amepre.2012.07.003.
31. Kemner A, Donaldson K, Brennan L. Partnership and community capacity's role in creating change. J Public Health Manag Pract. 2014. In review.
32. Andersen D, Vennix J, Richardson G, Rouwette E. Group model building: problem structuring, policy simulation and decision support. Journal of the Operational Research Society. 2007;58(5):691–694.
33. Vennix J. Group Model Building. New York: John Wiley & Sons; 1996.
34. Vennix J. Group model-building: tackling messy problems. System Dynamics Review. 1999;15(4):379–401.
35. Hoehner C, Brennan L, Sabounchi N, Hovmand P, Kemner A. Behavior-over-time graphs: assessing perceived trends in healthy eating and active living environments and behaviors across 49 communities. J Public Health Manag Pract. 2014. In review. doi: 10.1097/PHH.0000000000000211.
36. Brennan L, Sabounchi N, Kemner A, Hovmand P. Systems thinking in 49 communities related to healthy eating, active living, and childhood obesity. J Public Health Manag Pract. 2014. In review. doi: 10.1097/PHH.0000000000000248.
37. Brownson RC, Haire-Joshu D, Luke DA. Shaping the context of health: a review of environmental and policy approaches in the prevention of chronic diseases. Annu Rev Public Health. 2006;27:341–370. doi: 10.1146/annurev.publhealth.27.021405.102137.
38. Frieden TR. A framework for public health action: the health impact pyramid. Am J Public Health. 2010 Apr;100(4):590–595. doi: 10.2105/AJPH.2009.185652.
39. McKinlay JB. Paradigmatic obstacles to improving the health of populations: implications for health policy. Salud Publica Mex. 1998 Jul-Aug;40(4):369–379. doi: 10.1590/s0036-36341998000400010.
40. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999 Sep;89(9):1322–1327. doi: 10.2105/ajph.89.9.1322.
41. Brownson RC, Jacobs JA, Tabak RG, Hoehner CM, Stamatakis KA. Designing for dissemination among public health researchers: findings from a national survey in the United States. Am J Public Health. 2013 Jul 18. doi: 10.2105/AJPH.2012.301165.
42. Wabitsch M, Moss A, Kromeyer-Hauschild K. Unexpected plateauing of childhood obesity rates in developed countries. BMC Med. 2014;12:17. doi: 10.1186/1741-7015-12-17.
