Ethnicity & Disease. 2019 Jun 13;29(Suppl 2):385–392. doi: 10.18865/ed.29.S2.385

A Participatory Evaluation Framework for the Implementation of a Transdisciplinary Center for Health Disparities Research

Latrice Rollins 1,2, Tiffany Zellner Lawrence 1, Tabia Henry Akintobi 1,2, Jammie Hopkins 2,3, Ananya Banerjee 1, Mario De La Rosa 4
PMCID: PMC6604778  PMID: 31308610

Abstract

This article describes the participatory evaluation framework for the Transdisciplinary Collaborative Center for Health Disparities Research (TCC) funded by the National Institute of Minority Health and Health Disparities. In collaboration with TCC stakeholders, logic models, the McKinlay model, and process and outcome evaluation plans, including quantitative and qualitative methods, have been developed and used to document the impact of the TCC. The McKinlay model, a widely used comprehensive health model for eliminating health disparities, was also tailored to document the outcomes of the TCC. The process and outcome evaluation plans for the TCC guide continuous improvement and the achievement of its specific aims. The evaluation of the TCC occurred between 2012 and 2019 and involved key stakeholders in TCC research and programming. Several challenges exist for implementing an evaluation plan of a health equity-focused policy research center. However, we learned several lessons that will ensure progress toward specific aims and will help the TCC serve as a model for similar programs and centers.

Keywords: Evaluation, Participatory Approach, Framework, Health Disparities, Health Policy, Health Equity, Transdisciplinary

Introduction

Generally, an evaluation framework outlines the methods to be used and includes the evaluation design or model, evaluation research questions, strategies for data collection, the activities involved, stakeholders, and a timeline for completing the evaluation. Additionally, it is important to consider how data collected in the program are used in the evaluation plan to measure changes in outcomes.1 The Transdisciplinary Collaborative Center for Health Disparities Research (TCC) is an institution-wide health policy research center at Morehouse School of Medicine. The TCC supports research using a multidisciplinary supportive core infrastructure and shared resources. The TCC cores (Administrative, Research, Implementation and Dissemination, and Evaluation) work collaboratively to ensure that all TCC research, outreach, and evaluation activities are methodologically rigorous, aligned with the unifying theme of the TCC, and sufficiently supported to achieve maximum impact. The TCC uses the power of collaboration, supported by quality research, to identify and unite otherwise unrelated health policy issues under a health equity lens.

Since the research sub-projects supported by the TCC address health disparities, the process of evaluating health policy research outcomes is complex, dynamic, and non-linear. In addition to the usual challenges that evaluation of health policy research entails, several difficulties are specific to the evaluation of work that addresses health disparities. Evaluating health policy research, and the supporting infrastructure that helps inform policy change and eliminate health disparities among minority groups, is complicated by the multilayered and interactional nature of barriers, services used, mediators of care, and their anticipated outcomes.2

The evaluation of the TCC is designed to assess its administrative and health policy science functions. Because of the complex nature of the TCC’s work, a participatory approach to evaluating its outcomes, which involves TCC stakeholders as equal partners in evaluation, is necessary.3 Collaborative approaches that equitably involve stakeholders and community partners are superior because they incorporate the unique strengths that community members, organizations, and other key stakeholders bring. This participatory approach recognizes that the community has the power of knowledge, the power to act and decide, and the power of essential resources, not only to conduct the research, but also to effect change in systems, programs, and policies to improve community health. Participatory approaches emphasize an equal partnership, power sharing in decision-making, and data ownership between stakeholders and evaluators. This collaborative approach enables the creation of evaluation methods and strategies that are specifically tailored to needs, existing resources, and intended outcomes. All activities are conducted to address needs mutually identified by partners to assure that initiatives: 1) are audience-driven; 2) foster sustained ownership of evaluation processes; and 3) are central to program decision-making and sustainability.4-9

Methods

Evaluation Framework

The specific aims of the TCC evaluation are to: 1) document processes toward the establishment of a transparent and participatory governance model that shares technology and resources with TCC partners to collaboratively design, implement, evaluate, and disseminate innovative transdisciplinary programs of health policy research; 2) establish systematic interaction with TCC sub-projects to provide technical assistance and guidance in evaluation planning and implementation associated with the TCC’s approach to develop and refine, with the input of TCC partners, health policy research sub-projects that will drive and sustain health equity by addressing quality and cost reduction; and 3) develop and establish an evaluation design that monitors the implementation and dissemination of a regional model for health equity policy research that will serve as a national resource for adaptable policies on health equity. This article will focus on the third specific aim, describing the evaluation framework of the TCC. The evaluation of the TCC occurred between 2012 and 2019 and involved TCC leadership/staff, TCC core staff, subproject and pilot project principal investigators/staff, external advisory committee members, and TCC event attendees.

TCC Logic Model

A logic model is a valuable tool that aids in planning, implementation, and assessment of a program or initiative. It serves as a blueprint or road map to help illustrate the elements that work together to achieve goals and objectives.10 The TCC logic model was developed in collaboration with the Center’s administrative core and covers the inputs (resources necessary for the success of the center and subprojects), outputs (activities and participants), and outcomes (learning, actions/behaviors, or conditions that should be changed as a result of the TCC). Figure 1 illustrates the logic model and its components used for the implementation and evaluation of the TCC overall and its health equity policy research sub-projects, in particular.

Figure 1. Transdisciplinary Collaborative Center on Health Disparities Research: logic model.


McKinlay Model

Health policy research evaluations are critical to understanding impacts on population-, organizational-, systems-, community-, and individual-level behavior change. The McKinlay Model identifies three levels of change, at the individual (downstream), community/organizational (midstream), and policy (upstream) levels, that can be targeted in the elimination of health disparities.11, 12 For the TCC, these changes include ‘‘upstream’’ (eg, inform new public policies that address health disparities), ‘‘midstream’’ (eg, collaborations with policy and community organizations to implement programs that address health disparities), and ‘‘downstream’’ (eg, increased knowledge of health disparities and health equity) outcomes.13-14 Figure 2 illustrates the TCC’s adaptation of the McKinlay model to guide the implementation and evaluation of policy changes resulting from its activities and research.

Figure 2. Transdisciplinary Collaborative Center on Health Equity Policy Research and Practice McKinlay Model.


In addition to having models to plan and guide the evaluation, planning the TCC evaluation involved stakeholders and garnered input from the community, research, and health policy leaders to determine TCC process and outcome measures. The evaluation plan focuses on devising assessment measures associated with capacity building, infrastructure development, research implementation strategies, translational policy research activities for TCC, assessment of research activities including pilot projects, and the overall impact and the effectiveness of TCC research projects. Qualitative and quantitative methods are central to process and outcome evaluations to assess context-specific activities associated with overall and project-specific outcomes.

Process evaluation questions address the day-to-day processes of the TCC to determine the critical components for its successful implementation. The questions also determine the extent to which TCC health equity policy research activities have been implemented in terms of established goals, objectives, and outcomes. In addition, they track the degree to which policy implementation has been modified and measure the impact of those modifications on achieving established outcomes, goals, and objectives. Lastly, the questions look for evidence of innovation, collaboration, and communication.

Questions pertaining to outcome evaluation assess the internal and external factors associated with the differential impact of the TCC and its health equity policy research projects. They also determine whether the number of academic-community partnerships has increased since the initiation of the TCC. Finally, the questions look for evidence of patterns of policy change facilitated by the TCC. Table 1 lists the evaluation questions and data sources used to assess the TCC’s impact.

Table 1. Evaluation questions and data sources.

Process
  • What day-to-day processes of the TCC will determine its critical components for successful implementation?
    Data sources: monthly updates (monthly); meeting minutes (monthly); joint core consultations (quarterly); logic models (quarterly)
  • To what extent have TCC health policy research activities been implemented according to established outcomes, goals, and objectives?
    Data sources: progress profiles (annually); monthly updates (monthly); reports (quarterly); joint core consultations (quarterly); logic models (quarterly)
  • To what extent has TCC health policy research implementation been modified, and what is the impact on achieving established outcomes, goals, and objectives?
    Data sources: joint core consultations (quarterly); reports (quarterly); logic models (quarterly)
  • What evidence exists on innovation, collaboration and communication, and integration and synergy signaling progress toward translational policy change?
    Data sources: reports (quarterly); progress profiles (annually)

Outcome
  • What internal or external factors are associated with differential impact of the TCC health policy research projects?
    Data sources: event survey data (ongoing); reports (quarterly); External Advisory Committee (EAC) survey (once, at end of funding period)
  • Has the number of academic-community partnerships increased since initiation of the TCC?
    Data sources: reports (quarterly)
  • What evidence exists on patterns of translational policy change facilitated by the TCC?
    Data sources: reports (quarterly); partnership survey (once, at end of funding period); EAC survey (once, at end of funding period)
The implementation of standardized evaluation metrics and a centralized data repository also facilitates and enhances the quality of data collection among all cores and sub-projects. A data and information repository was developed using SharePoint to securely house process and outcome data and community partnership information.

Results

Process Evaluation

The process evaluation will determine whether the resources and efforts of the TCC and its health equity policy research projects support the intended goals, provide evidence-based explanations for the results produced, and deliver timely information for ongoing improvement by identifying the strengths and weaknesses of the projects. The TCC process evaluation data sources and strategies are outlined below.

Meeting Minutes

Examination of data obtained from administrative meetings will help identify the strengths and limitations of current programs and projects. The Administrative Core will also document best practices and outline challenges faced in implementing the programs for regular review.

Time-Phased Work Plans

Time-phased work plans were completed by subprojects and cores on an annual basis. The time-phased work plan provided a projection of what the team planned to accomplish for each quarter of the year. Subprojects and cores were asked to report planned activities across four domains: 1) research progress; 2) scholarly output; 3) dissemination; and 4) sustainability. Time-phased work plans were carefully reviewed by the evaluation and administrative cores to determine the feasibility of plans, modifications necessary, and resources that would be needed to maximize productivity. Time-phased work plans were used as a guide in quarterly consultation meetings to compare what each subproject projected against its actual progress during each quarter.

Logic Models

The Core Leaders and Pilot Project Leaders submitted project-specific annual logic models to track the key inputs, strategies, outputs and outcomes associated with projected health equity impacts. Regular review of the logic models is important due to changes that may occur in program administration, research directions, the target population or communities. The logic models are reviewed annually and updated, as necessary, to document changes made to the program over time with documented history to explain why changes were made.

Reports

The TCC cores and subprojects provide monthly updates, quarterly reporting, and annual reports. These reports are used to track the progress in implementing key strategies, outputs and outcomes associated with expected project impacts. Additionally, all TCC sub-projects, cores, and pilot project grantees are encouraged to continuously report peer-reviewed scholarly output; development and dissemination of non-peer reviewed products and activities (eg, webinars, white papers, policy briefs, public comments); training and professional development opportunities hosted and attended; and funding/sustainability opportunities pursued.

Joint Core Consultations

After the first year of TCC implementation, a joint quarterly consultation model was adopted, designed to bring together representatives from each Center core (ie, Administrative, Research, Evaluation, Implementation & Dissemination) to meet with the sub-project teams and collectively discuss their projects. These efforts minimized the number of meetings required for each sub-project and bolstered the effectiveness of the meetings by maximizing communication between cores and sub-projects toward the most efficient response to identified technical assistance or support needs. Consultations were conducted quarterly as face-to-face meetings. Each consultation had a specific focus and goal that was shared prior to the meeting. Meetings were responsive to the needs of the sub-project but generally included a report from the sub-project on its implementation progress to date, any challenges encountered, requests for technical assistance or support, and projections of what would be accomplished in the next quarter based on the reported factors. Following each meeting, the evaluation core developed a summary documenting what was discussed, next steps, technical assistance requests, and changes to be made to time-phased work plans and progress profiles as a result of the meeting.

Progress Profiles

Progress profiles provided a pictorial representation of each sub-project’s progress toward accomplishing its specific aims. The profile summarized the sub-project’s research activities, anticipated outcomes, and progress to date. The evaluation team rated each specific aim of the sub-project as “on track,” “slightly behind,” or “serious delays.” Progress profiles were reviewed quarterly with each sub-project team during joint core consultations. During this time, the evaluation core was able to expound upon the factors behind the ratings given and discuss ways in which support could be provided for the subproject to return to on-track status.
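The article does not describe any software behind the progress profiles; as an illustration only, the per-aim rating scheme could be represented with a simple data structure like the hypothetical sketch below (all names and aims are invented for the example).

```python
# Hypothetical sketch, not the TCC's actual tooling: representing a
# progress profile as per-aim ratings and tallying them for a quarterly
# summary. Rating labels come from the article; everything else is
# illustrative.
from collections import Counter

RATINGS = ("on track", "slightly behind", "serious delays")

def summarize_profile(aim_ratings):
    """Count how many specific aims received each rating."""
    counts = Counter(aim_ratings.values())
    return {rating: counts.get(rating, 0) for rating in RATINGS}

# Example quarterly profile for one (invented) sub-project
profile = {
    "Aim 1: community survey fielded": "on track",
    "Aim 2: policy brief drafted": "slightly behind",
    "Aim 3: partner agreements signed": "on track",
}
print(summarize_profile(profile))
# prints {'on track': 2, 'slightly behind': 1, 'serious delays': 0}
```

A tally like this makes it easy to flag, at a glance, which sub-projects need support during the joint core consultations.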

Outcome Evaluation

Outcome evaluation will assess whether the TCC health equity policy research projects have been effective in achieving their goals and objectives (outcomes) and are useful to the target population (impact). The TCC outcome evaluation data sources and strategies are outlined below.

Event Survey Data

To assess satisfaction with TCC events (eg, health equity seminars, community forums), surveys were administered to participants electronically after each event. Summaries of survey data were provided to TCC leadership to improve future events and provide recommendations for future topics/themes of the events.

External Advisory Committee (EAC) Survey

The purpose of this survey is to facilitate documented feedback and accountability in the EAC’s interaction with and assessment of the TCC. The survey asks committee members to rate the overall effectiveness of the TCC by specific aim, describe its strengths and weaknesses, and provide feedback or recommendations. The survey was administered at the end of the funding period to assist with sustainability and project improvement efforts.

Partnership Survey

Evaluation of collaborative activities of TCC and its subprojects will focus on assessing the effectiveness of the TCC’s grassroots approach to partnership development. Reports will be developed from the survey to gather information on health equity policy partnerships that have been expanded, developed, and sustained over time.

External Evaluation

To enhance overall program quality, the evaluation plan also proposed a formal evaluation by an external evaluator selected by TCC leadership. The external evaluation would be both formative and summative, and independent of the internal evaluation. An external evaluation provides an objective assessment of the outcomes and impact of the project, along with recommendations. This evaluation includes interviews with the leads of each TCC core and sub-project, as well as report and document review. All collected data will be analyzed and compared with the TCC proposal’s goals and objectives, areas for improvement will be identified, and corrections will be recommended.

Discussion

There are several models and frameworks for evaluating health interventions, though very few frameworks exist for evaluating research and interventions that aim to address policy and health disparities.15 The evaluation framework of the TCC is guided by the TCC logic model, the McKinlay model and participatory evaluation approaches.2,3,11,13 Although this article does not present the outcomes and impact of TCC activities, this framework has met the process and outcome evaluation needs of the TCC and can assist in planning the evaluation of other research centers focused on health policy and addressing health disparities. It includes diverse data sources and strategies such as logic models, time-phased work plans, continuous reporting, joint core consultations, progress profiles, surveys, and external evaluation, some of which have been described in the evaluation of similar centers or projects.3 A subsequent paper will disseminate information pertaining to key outcomes, insights and specific lessons learned based on the current evaluation approach.

While the evaluation is still ongoing, there have been several lessons learned in the process of implementing this evaluation. Similar to Scarinci et al,3 we found continuous communication to be important to ensure that all partners, staff, and investigators remain equitably engaged and informed of all TCC activities and collaborative opportunities. This was further facilitated by successful, mutually beneficial partnerships between the academic institution and the various stakeholders. Communication through joint core consultations and monthly meetings assisted with revisions to project plans and logic models to ensure that activities were leading to desired specific aims and downstream, midstream, and upstream outcomes.

The use of such diverse data sources across several TCC activities has been challenging without a data management system that could accept all data and generate customized reports. While it is beneficial to have specific, detailed information on each activity to assess progress and outcomes, substantial time and administrative review were required to manually integrate data from the various sources into streamlined tools, such as progress profiles, which were used to assess the project and make data-driven decisions.

Conclusion

The participatory evaluation framework employed to document the processes and outcomes of the TCC merges traditional evaluation tools with a public health transformation model adapted to guide contextual understanding of the health policy factors central to successful implementation and outcomes. Logic models were collaboratively developed to serve as the overarching blueprints for the TCC at large and for each specific research sub-project. Process and outcome measures were collaboratively discussed and agreed upon through consultations designed to identify and resolve corresponding challenges and opportunities through technical assistance and support. Beyond these well-recognized tools and approaches was the integration of the McKinlay model, adapted to the TCC context and reflecting the health policy development, adaptation, or integration considerations unique to each community, clinical, and educational research context. Merging data-driven assessment strategies while accounting for cultural relevance is well-aligned with approaches designed not only to catalog health disparities or assess implementation approaches, but to strengthen their success through an understanding of the nuances associated with policy determinants. We anticipate that this model will be useful to public health stakeholders, who can adapt it to their own evaluation contexts (eg, science, policy, practice).

Acknowledgments

Research reported in this publication was supported by the National Institute on Minority Health and Health Disparities (NIMHD) of the National Institutes of Health under award number U54MD008173; the Centers for Disease Control and Prevention (CDC) Health Promotion and Disease Prevention Research Center (grant #1U58DP005945-01); NIMHD grant #S21MD000101; and the National Center for Advancing Translational Sciences of the National Institutes of Health under award number UL1TR000454. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

References

  1. Cooper LA, Hill MN, Powe NR. Designing and evaluating interventions to eliminate racial and ethnic disparities in health care. J Gen Intern Med. 2002;17(6):477-486. doi:10.1046/j.1525-1497.2002.10633.x
  2. Henry Akintobi T, Trotter J, Evans D, Laster N, Johnson T. Community-based participatory approaches to evaluation. In: Blumenthal D, Braithwaite R, Smith S, eds. Community-Based Participatory Health Research. 2nd ed. New York, NY: Springer Publishing Company; 2013:231-262.
  3. Scarinci IC, Moore A, Benjamin R, Vickers S, Shikany J, Fouad M. A participatory evaluation framework in the establishment and implementation of transdisciplinary collaborative centers for health disparities research. Eval Program Plann. 2017;60:37-45. doi:10.1016/j.evalprogplan.2016.08.020
  4. Akintobi TH, Yancey EM, Muteteke D, Bailey J. Partnership for evaluation of the Bilingual Bicultural Service Demonstration Program: merging public health research and practice. J Interprof Care. 2004;18(4):440-441. doi:10.1080/13561820400010596
  5. Mayberry RM, Daniels P, Yancey EM, et al. Enhancing community-based organizations’ capacity for HIV/AIDS education and prevention. Eval Program Plann. 2009;32(3):213-220. doi:10.1016/j.evalprogplan.2009.01.002
  6. Mayberry RM, Daniels P, Akintobi TH, Yancey EM, Berry J, Clark N. Community-based organizations’ capacity to plan, implement, and evaluate success. J Community Health. 2008;33(5):285-292. doi:10.1007/s10900-008-9102-z
  7. Akintobi TH, Trotter JC, Evans D, et al. Applications in bridging the gap: a community-campus partnership to address sexual health disparities among African-American youth in the south. J Community Health. 2011;36(3):486-494. doi:10.1007/s10900-010-9332-8
  8. Wingfield JH, Akintobi TH, Jacobs D, Ford ME. The SUCCEED Legacy Grant program: enhancing community capacity to implement evidence-based interventions in breast and cervical cancer. J Health Care Poor Underserved. 2012;23(2)(suppl):62-76. doi:10.1353/hpu.2012.0081
  9. Akintobi TH, Yancey EM, Daniels P, Mayberry RM, Jacobs D, Berry J. Using evaluability assessment and evaluation capacity-building to strengthen community-based prevention initiatives. J Health Care Poor Underserved. 2012;23(2)(suppl):33-48. doi:10.1353/hpu.2012.0077
  10. Henry Akintobi T, Yancey E, Berry J, Daniels P, Mayberry R. Fundamentals of a Program Evaluation: A Practical Guide for Community-Based Organizations. Atlanta, GA: Morehouse School of Medicine Prevention Research Center; 2009.
  11. McKinlay JB. The new public health approach to improving physical activity and autonomy in older populations. In: Heikkinen E, ed. Preparation for Aging. New York, NY: Plenum Press; 1995:87-103. doi:10.1007/978-1-4615-1979-9_10
  12. Satcher D. Ethnic disparities in health: the public’s role in working for equality. PLoS Med. 2006;3(10):e405. doi:10.1371/journal.pmed.0030405
  13. McKinlay JB. Paradigmatic obstacles to improving the health of populations—implications for health policy. Salud Publica Mex. 1998;40(4):369-379. doi:10.1590/S0036-36341998000400010
  14. Brownson RC, Chriqui JF, Stamatakis KA. Understanding evidence-based public health policy. Am J Public Health. 2009;99(9):1576-1583. doi:10.2105/AJPH.2008.156224
  15. Jilcott S, Ammerman A, Sommers J, Glasgow RE. Applying the RE-AIM framework to assess the public health impact of policy change. Ann Behav Med. 2007;34(2):105-114. doi:10.1007/BF02872666
