Author manuscript; available in PMC: 2018 Jan 2.
Published in final edited form as: J Community Psychol. 2016 Dec 13;45(1):17–32. doi: 10.1002/jcop.21828

EVALUATING COMMUNITY ENGAGEMENT IN RESEARCH: QUANTITATIVE MEASURE DEVELOPMENT

Melody S Goodman 1, Vetta L Sanders Thompson 2, Cassandra Arroyo Johnson 3, Renee Gennarelli 4, Bettina F Drake 5, Pravleen Bajwa 6, Maranda Witherspoon 7, Deborah Bowen 8
PMCID: PMC5749252  NIHMSID: NIHMS930071  PMID: 29302128

Abstract

Although the importance of community engagement in research has been previously established, there are few evidence-based approaches for measuring the level of community engagement in research projects. A quantitative community engagement measure was developed, aligned with 11 engagement principles (EPs) previously established in the literature. The measure has 96 Likert response items; 3–5 quality items and 3–5 quantity items measure each EP. Cronbach’s alpha was used to examine the internal consistency of the items that measure each EP. Every EP item group had a Cronbach’s alpha > .85, indicating strong internal consistency for all question groups on both scales (quality and quantity). Scores from the measure indicate the level of community engagement, which can be correlated with other research outcomes.

INTRODUCTION

It is important to translate research programs and findings into practice, which requires interventions that are relevant to the lives of the target community or population. Community-engaged research (CER) has emerged as an evidence-based approach to conducting research that uses community–academic partnerships to better address the complex issues that affect the health of marginalized populations (Davis et al., 2011; Israel et al., 2010; Jagosh et al., 2012; Salimi et al., 2012). CER is an umbrella term for the many forms of research (e.g., community-based participatory research, participatory action research, patient-centered research, community–academic partnerships) that have community engagement as a core principle. Although sharing this core principle makes the multiple forms of CER similar in spirit, they are not identical in implementation because they span a spectrum of community engagement, from no or minimal engagement (e.g., outreach, community advisory boards) to fully collaborative partnerships (e.g., participatory action research, community-based participatory research).

CER requires community involvement and control and integrates research and practice, prompting researchers to partner with the communities and populations experiencing a disproportionate burden of poor health or education or other social outcomes (Campbell & Jovchelovitch, 2007; Chesler, 1991; Israel, Schulz, Parker, & Becker, 1998; Israel et al., 2008; Minkler & Wallerstein, 2010; Mohatt et al., 2004; Nelson, Ochocka, Griffin, & Lord, 1998; Rappaport, 1987; Watts & Flanagan, 2007; Zeldin, 2004). CER has become valued as an effective research strategy for improving community conditions and reducing disparities, particularly those in health (Balazs & Morello-Frosch, 2013; Bordeaux et al., 2007; Eder, Tobin, Proser, & Shin, 2012; Nguyen, Hsu, Kue, Nguyen, & Yuen, 2010; Salimi et al., 2012; Shalowitz et al., 2009; Trinh-Shevrin et al., 2007; Wallerstein & Duran, 2006; Wallerstein & Duran, 2010).

CER is defined by working collaboratively with and through groups of people affiliated by geographic proximity, special interest, or similar situations to address issues affecting the well-being of those people; it is a powerful vehicle for developing trust in community–academic partnerships to bring about changes to improve community health (Fawcett et al., 1995). Engaging community members in the research process is often the missing link to improving the quality and outcomes of health promotion activities, disease prevention initiatives, and research studies (Balazs & Morello-Frosch, 2013; Minkler & Wallerstein, 2003; Minkler, 2004). This work involves a long-term process that includes building trust between researchers and the community through a collaborative framework (Brandon, Isaac, & LaVeist, 2005; Butterfoss & Francisco, 2004; Corbie-Smith, Thomas, & St. George, 2002; Quinn, Kass, & Thomas, 2013).

Community engagement and participation in research can contribute to a more nuanced understanding of health problems, increasing the relevance of problems examined for the affected communities and improving the fit of research activities in community-based settings (Cargo & Mercer, 2008; Jagosh et al., 2011; Minkler, 2005; Wallerstein & Duran, 2010). These changes in process can increase the quality of research, leading to higher participation rates, insightful interpretation of findings, and greater reliability and validity of measures in diverse populations (Cargo & Mercer, 2008; Jagosh et al., 2011, 2012; Nueces et al., 2012).

One research question that has received little attention is the extent to which community members in these community–academic partnerships feel engaged in the research process. Measuring the extent of partner engagement is of critical importance both as partnerships are developing and as a predictor of outcomes in the larger study.

Although the utility of CER is perceived as well established in the literature (Campbell & Jovchelovitch, 2007; Israel et al., 1998; Israel, 2005; Minkler & Wallerstein, 2010; Nelson et al., 1998; Wallerstein & Duran, 2006; Wallerstein & Duran, 2010; Zeldin, 2004), measuring and evaluating community engagement in research activities (the extent to which community members are involved with the decisions and activities of the research project) have been limited and have primarily relied on qualitative approaches (Francisco, Paine, & Fawcett, 1993; Goodman et al., 1998; Khodyakov et al., 2013; Lantz, Viruell-Fuentes, Israel, Softley, & Guzman, 2001; McCloskey et al., 2012; Sanchez, Carrillo, & Wallerstein, 2011; Schulz, Israel, & Lantz, 2003). Qualitative methods are effective at assessing community engagement at a project or program level; however, they are time consuming and do not scale up easily for the evaluation of large-scale or multicommunity projects. In addition, the results cannot be easily compared across programs or institutions for the development of evidence-based best practices.

The use of CER to address health disparities has increased markedly, as has the need to engage community members in the research enterprise (Ross et al., 2010a, 2010b). The 68 National Cancer Institute Comprehensive Cancer Centers and 62 medical research institutions that are members of the Clinical and Translational Science Award consortium have been mandated to engage communities in their work and to disseminate evidence-based strategies (Clinical and Translational Science Awards Consortium Community Engagement Key Function Committee Task Force on the Principles of Community Engagement, 2011; Eder et al., 2013; Eder et al., 2012; Institute of Medicine, 2013; Wilkins et al., 2013). Thus, the measurement of community engagement in large-scale research programs and an examination of how this approach affects research, discovery, and translation of findings are integral to the evaluation and progress of CER.

Community–academic partners could use a quantitative measure that examines the quality and quantity of adherence to engagement principles (EPs) from the beginning of a partnership and could reassess community engagement longitudinally throughout the partnership. This information might determine the level of engagement across a continuum (no engagement, outreach, mobilization, organization, cooperation, collaboration, full partnership) that could be correlated with other research outcomes (e.g., recruitment rates, diversity of participants, retention rates, and other study-specific outcomes). In addition, these data might be useful in strategies for monitoring and improving these partnerships.

This article describes the development process of a quantitative measure that assesses the level of community engagement among community members, building on the limited existing quantitative measures of community engagement in public health research (Israel, 2005; Khodyakov et al., 2013; Schulz et al., 2003; Weir, D’Entremont, Stalker, Kurji, & Robinson, 2009). Once fully developed, this measure should provide scores on the overall engagement of participants, as well as differentiate the level of engagement among groups in the project. The components of the Program for the Elimination of Cancer Disparities (PECaD) at the Siteman Cancer Center (a National Cancer Institute-designated Comprehensive Cancer Center), which works with communities to reduce cancer disparities through outreach, education, and training, served as the partnership site evaluated.

Program

The PECaD was established in 2003 in response to known cancer health disparities; its goals are to create a national model for eliminating disparities in cancer through community-based partnerships, be a catalyst for change in the region by fostering healthy communities, and break down barriers to quality cancer care. The Disparities Elimination Advisory Committee (DEAC) comprises community leaders representing Federally Qualified Health Centers; private physicians; health, social service, and religious organizations; survivors; survivors’ family members; and other interested community groups. The DEAC has worked to identify and develop strategies to address barriers to cancer screening, treatment, and research participation in the region. The DEAC has guided PECaD’s engagement in health promotion and education efforts to address these barriers (Arroyo-Johnson et al., 2015; Thompson et al., 2014).

METHODS

Program Evaluation

PECaD began administering a biannual evaluation survey in 2011 (April–May) to evaluate PECaD’s implementation of community EPs. The web-based survey was sent to all individuals affiliated with PECaD, including members of the PECaD disease-specific partnerships, the DEAC, partners in PECaD programs, and PECaD-affiliated academics. Although this initial survey was informative in assessing PECaD’s adherence to the community EPs, it lacked specificity about how adherence to these principles was being achieved (Arroyo-Johnson et al., 2015). To address this issue, the DEAC and PECaD researchers formally convened a team to create a measure.

Participants

The community engagement measure was examined among participants in the Community Research Fellows Training (CRFT) program, a PECaD pilot project (Coats et al., 2015; D’Agostino McGowan, Stafford, Thompson, Johnson-Javois, & Goodman, 2015). CRFT aims to enhance the infrastructure for CER and promote the role of underserved populations in the research enterprise.

A total of 50 community members were selected to participate in the first cohort of this 15-week-long research training program, which is based on the Community Alliance for Research Empowering Social Change training (Coats et al., 2015; Goodman, Dias, & Stafford, 2010; Goodman et al., 2014; Goodman, Si, Stafford, Obasohan, & Mchunguzi, 2012). The Institutional Review Board at Washington University School of Medicine designated CRFT research as nonhuman subjects research. We reasoned that members of this group were all community activists, were engaged in multiple kinds of community projects, and were attending a training program that was itself a research project. Their extensive experience would give them a good understanding of working with academic researchers.

Procedures

Survey development

The PECaD survey development team comprised a mixture of researchers and community members: three PECaD investigators, the PECaD data manager, the PECaD program coordinator, and the DEAC community co-chair. The development team met monthly to develop the evaluation framework and the second biannual community engagement survey. Several DEAC meetings were dedicated to discussing the evaluation framework, developing consensus on the principles that should guide community participation in research, and examining how to classify projects based on the level of participation by members of the specified community or population, as well as the activities used to encourage or sustain this participation. This bidirectional communication led PECaD to adopt a community-engaged partnership framework and to seek to align projects with 11 EPs that have been previously developed in the literature (Israel et al., 1998; Israel, 2005; Israel et al., 2008; Khodyakov et al., 2011, 2013; McCloskey et al., 2012; Minkler & Wallerstein, 2010).

These EPs are based on the 11 principles of CER (Ahmed & Palermo, 2010; Burke et al., 2013; Butterfoss, Goodman, & Wandersman, 1996; Butterfoss & Francisco, 2004; Clinical and Translational Science Awards Consortium Community Engagement Key Function Committee Task Force on the Principles of Community Engagement, 2011; Israel et al., 1998; Israel et al., 2008; Khodyakov et al., 2013; Nueces et al., 2012; Report & Assessment, 2003; Wallerstein & Duran, 2006) and are as follows:

  1. Focus on local relevance and determinants of health

  2. Acknowledge the community

  3. Disseminate findings and knowledge gained to all partners

  4. Seek and use the input of community partners

  5. Involve a cyclical and iterative process in pursuit of objectives

  6. Foster co-learning, capacity building, and co-benefit for all partners

  7. Build on strengths and resources within the community

  8. Facilitate collaborative, equitable partnerships

  9. Integrate and achieve a balance of all partners

  10. Involve all partners in the dissemination process

  11. Plan for a long-term process and commitments

Initial test of the community engagement measure

The PECaD survey development team developed items aligned with the 11 EPs to assess the level of community engagement in PECaD projects and worked with DEAC in a cyclical and iterative community-engaged process.

To assess reliability, we calculated Cronbach’s alpha to measure the degree to which the items written for a single EP were correlated with one another (internal consistency). It is widely accepted that alpha should exceed 0.70 to demonstrate internal consistency in the early stages of measure development (Nunnally, 1994). Previous work determined that the items pertaining to six EPs lacked sufficient internal consistency (Cronbach’s alpha < 0.70) in the initial measure (Gennarelli & Goodman, 2013). We redesigned these EP-specific question groups with insufficient internal consistency and, to reexamine internal consistency, included the revised items in a second dissemination of the community engagement measure (Gennarelli & Goodman, 2013, 2014). Participants were administered the community engagement measure (initial and revised) on two different program evaluation surveys approximately 6 weeks apart.
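The reliability check itself is a standard calculation; the authors report using the ALPHA option of SAS PROC CORR (Gennarelli & Goodman, 2013). As a minimal illustrative sketch (not the authors' code), the following Python computes raw, covariance-based Cronbach's alpha for one hypothetical EP item group; note that Table 2 reports the standardized coefficient, which is based on inter-item correlations rather than covariances.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Raw Cronbach's alpha for a (respondents x items) matrix of Likert responses.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total score))
    """
    k = items.shape[1]                              # items in the EP group
    item_variances = items.var(axis=0, ddof=1)      # sample variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed score
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses (1 = Never ... 5 = Always) from six respondents to a
# four-item quantity group; real analyses would use the survey data.
ep1_quantity = np.array([
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(round(cronbach_alpha(ep1_quantity), 2))
```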

Measures

Test of revised community engagement measure

A total of 47 fellows were administered the revised community engagement measure, which comprised 96 items (questions), each designed to pertain to a specific EP. Half of the items measured quality (how well) and the other half measured quantity (how often) of community engagement; all of the items have Likert scale response options (see the Appendix for community engagement measure). Three to five quality items and corresponding three to five quantity items measure each EP. All 48 quality items had the following response options: 1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent. All 48 quantity questions had the following response options: 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Most of the time, 5 = Always.

Analyses

Cronbach’s alpha was calculated to examine internal consistency for each set of community engagement items that are meant to measure the same EP and are on the same scale (quantity or quality). After internal consistency of the items for each EP was established, we summarized responses into (a) EP-specific mean scores on each scale (quantity, quality) and (b) overall community engagement quantity and quality scores, calculated as the mean of the EP-specific scores on each scale.
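To make the scoring rules concrete, the sketch below computes EP-specific mean scores and an overall community engagement score on the quantity scale. It is an illustrative Python/pandas example with hypothetical column names and data; the authors' actual score calculations are documented in SAS (Gennarelli & Goodman, 2013, 2014).

```python
import pandas as pd

# Hypothetical wide-format responses: one row per respondent, columns named
# "<scale>_ep<EP number>_<item number>", values 1-5 (None/NaN for skipped items).
responses = pd.DataFrame({
    "quantity_ep1_1": [4, 5, 3], "quantity_ep1_2": [4, 4, None],
    "quantity_ep1_3": [5, 4, 3], "quantity_ep1_4": [4, 5, 3],
    "quantity_ep2_1": [3, 4, 2], "quantity_ep2_2": [4, 4, 3],
    "quantity_ep2_3": [4, 5, None], "quantity_ep2_4": [3, 4, 3],
})

def ep_score(data: pd.DataFrame, scale: str, ep: int) -> pd.Series:
    """EP-specific score: mean of a respondent's answered items for one EP on one scale."""
    cols = [c for c in data.columns if c.startswith(f"{scale}_ep{ep}_")]
    return data[cols].mean(axis=1, skipna=True)

# (a) EP-specific mean scores on the quantity scale (only EPs 1 and 2 here for brevity;
#     the full measure has EPs 1-11 on each scale).
ep_scores = pd.DataFrame({f"ep{ep}": ep_score(responses, "quantity", ep) for ep in (1, 2)})

# (b) Overall community engagement quantity score: mean of the EP-specific scores.
ce_quantity_score = ep_scores.mean(axis=1)
print(ep_scores.round(2))
print(ce_quantity_score.round(2))
```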

Because of the small sample size, we conducted a sensitivity analysis comparing results from a complete-case subsample to results from the full sample, which includes observations with missing items. Only observations with complete responses to the items measuring an EP were included in the complete-case analysis, whereas all observations satisfying the inclusion criteria were included in the full sample analysis. To be included in the analyses, an observation had to have at least three answered items for EPs that contained four to five items and at least two answered items for EPs that contained three items. Additional detail on the calculation of scores, including SAS code, is provided elsewhere (Gennarelli & Goodman, 2013, 2014). Analyses were conducted in 2014; results for the revised community engagement measure are presented here.
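The inclusion rule and the complete-case comparison can be expressed compactly. The sketch below is a hypothetical Python illustration (column names and data invented) of the thresholds stated above: at least three answered items for EPs with four to five items, and at least two for EPs with three items.

```python
import pandas as pd

def meets_inclusion(answered: int, items_in_ep: int) -> bool:
    """Inclusion rule from the text: >= 3 answered items when an EP has 4-5 items,
    >= 2 answered items when an EP has 3 items."""
    return answered >= (3 if items_in_ep >= 4 else 2)

def ep_analysis_samples(data: pd.DataFrame, ep_cols: list) -> tuple:
    """Return (full-sample, complete-case) subsets of respondents for one EP item group."""
    answered = data[ep_cols].notna().sum(axis=1)
    full_sample = data[answered.apply(lambda a: meets_inclusion(a, len(ep_cols)))]
    complete_case = data[answered == len(ep_cols)]
    return full_sample, complete_case

# Hypothetical three-item EP (e.g., EP 11): respondent 3 skipped two items and is
# excluded from the full sample; respondent 2 skipped one item and is excluded
# only from the complete-case subset.
ep11 = pd.DataFrame({
    "q1": [4, 3, None],
    "q2": [5, None, None],
    "q3": [4, 4, 3],
})
full, complete = ep_analysis_samples(ep11, ["q1", "q2", "q3"])
print(len(full), len(complete))  # 2 1
```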

RESULTS

Of the 47 fellows, 46 (98%) completed the revised measure (see Appendix). The majority of respondents were female (85%), African American/Black (87%), had obtained a graduate degree (52%), and identified themselves as a community member or as being affiliated with a community-based organization (54%). Table 1 displays the demographic characteristics of the sample. Every EP question group had a Cronbach’s alpha > .85, indicating very strong internal consistency for all question groups on both the quantity and quality scales. Cronbach’s alpha for each EP can be seen in Table 2. Internal consistency was strong for measures (across all 11 EPs) on each scale (quality α = 0.99 and quantity α = 0.98; Table 2).

Table 1.

Demographic Characteristics of Participants (N = 46)

Characteristic Category n %
Gender Female 39 85
Male 7 15
Race African American/Black 40 87
White 6 13
Education Jr. high school or some high school 1 2
High school diploma 1 2
Some college or associates degree 13 29
College degree 7 15
Graduate degree 24 52
Age 25–44 13 28
45–64 29 63
65+ 4 9
Affiliation Academic 3 6
Government 5 11
Community-based organization 12 26
Community member 13 28
Faith-based organization 4 9
Healthcare worker 9 20

Table 2.

Internal Consistency for Engagement Principles for Quantity and Quality Scales

Engagement principle / # of items / Quantity: N, standardized Cronbach’s alpha / Quality: N, standardized Cronbach’s alpha
1. Focus on local relevance and determinants of health 4 43 0.87 40 0.95
2. Acknowledge the community 4 42 0.90 38 0.94
3. Disseminate findings and knowledge gained to all partners 5 39 0.91 33 0.93
4. Seek and use the input of community partners 5 35 0.93 33 0.96
5. Involve a cyclical and iterative process in pursuit of objectives 5 34 0.90 27 0.93
6. Foster co-learning, capacity building, and co-benefit for all partners 5 38 0.91 36 0.95
7. Build on strengths and resources within the community 4 38 0.92 36 0.98
8. Facilitate collaborative, equitable partnerships 5 35 0.87 34 0.93
9. Integrate and achieve a balance of all partners 4 41 0.87 39 0.91
10. Involve all partners in the dissemination process 4 36 0.95 32 0.97
11. Plan for a long-term process and commitment 3 35 0.94 33 0.97
All 11 engagement principles 48 24 0.98 22 0.99

For the full sample, quantity scores across EPs ranged from 3.4 to 4.0 (between sometimes and most of the time) on a 5-point scale, with a mean of 3.8 (the community engagement quantity score) and a standard deviation of 0.7; quality scores ranged from 3.1 to 3.9 (between good and very good), with a mean of 3.6 (the community engagement quality score) and a standard deviation of 0.9. These results indicate that participants felt the academic partners adhered to the 11 EPs between sometimes and most of the time on the quantity scale and that the quality of the engagement was good to very good. EP 5 (involve a cyclical and iterative process in pursuit of objectives) was rated the lowest on the quantity scale. EP 11 (plan for a long-term process and commitment) was rated the lowest on the quality scale. EP 1 (focus on local relevance and social determinants of health) was rated high on both the quantity and quality scales. These results are especially useful in community-engaged program evaluation to pinpoint areas that need improvement. Sensitivity analysis on complete-case data shows results similar to those from the full sample, which includes observations with missing items (Table 3).

Table 3.

Average Scores by Engagement Principle

Engagement principle / Full sample: N, M, SD, Min, Max / Complete case: N, M, SD, Min, Max
Quantity scores
EP1: Focus on local relevance and determinants of health 46 4.0 0.6 2.8 5.0 43 4.0 0.6 2.8 5.0
EP2: Acknowledge the community 45 3.9 0.9 1.0 5.0 42 4.0 0.8 1.0 5.0
EP3: Disseminate findings and knowledge gained to all 44 3.6 0.9 1.0 5.0 39 3.6 0.9 1.0 5.0
EP4: Seek and use the input of community partners 43 3.6 0.9 1.0 5.0 35 3.5 1.0 1.0 5.0
EP5: Involve a cyclical and iterative process 41 3.4 0.8 1.6 5.0 34 3.5 0.9 1.6 5.0
EP6: Foster co-learning, capacity building, and co-benefit 42 4.0 0.7 2.4 5.0 38 3.9 0.7 2.4 5.0
EP7: Build on community strengths and resources 40 3.6 0.8 2.0 5.0 38 3.6 0.8 2.0 5.0
EP8: Facilitate collaborative, equitable partnerships 42 4.0 0.7 2.6 5.0 35 3.9 0.7 2.6 5.0
EP9: Integrate and achieve a balance of all partners 42 4.0 0.7 2.8 5.0 41 4.0 0.7 2.8 5.0
EP10: Involve all partners in the dissemination process 38 3.8 0.9 2.0 5.0 36 3.8 0.9 2.0 5.0
EP11: Plan for a long-term process and commitment 37 3.6 1.0 1.0 5.0 35 3.5 1.0 1.0 5.0
Community engagement Quantity score (M) 46 3.8 0.7 2.3 5.0 24 3.8 0.7 2.7 4.9
Quality scores
EP1: Focus on local relevance and determinants of health 43 3.9 0.9 2.0 5.0 40 3.9 0.9 2.0 5.0
EP2: Acknowledge the community 41 3.9 0.9 2.0 5.0 38 4.0 0.9 2.0 5.0
EP3: Disseminate findings and knowledge gained to all 38 3.6 0.9 1.8 5.0 33 3.5 0.9 1.8 5.0
EP4: Seek and use the input of community partners 34 3.5 1.0 1.0 5.0 33 3.5 1.0 1.0 5.0
EP5: Involve a cyclical and iterative process 36 3.3 1.1 1.0 5.0 27 3.3 1.0 1.2 4.8
EP6: Foster co-learning, capacity building, and co-benefit 42 3.6 1.0 1.3 5.0 36 3.6 1.0 1.8 5.0
EP7: Build on community strengths and resources 36 3.4 1.1 1.0 5.0 36 3.4 1.1 1.0 5.0
EP8: Facilitate collaborative, equitable partnerships 41 3.6 0.9 2.0 5.0 34 3.5 0.9 2.0 5.0
EP9: Integrate and achieve a balance of all partners 42 3.7 0.9 2.0 5.0 39 3.7 0.9 2.0 5.0
EP10: Involve all partners in the dissemination process 35 3.2 1.0 1.0 5.0 32 3.2 1.1 1.0 5.0
EP11: Plan for a long-term process and commitment 34 3.1 1.1 1.0 5.0 33 3.1 1.1 1.0 5.0
Community engagement Quality score (M) 46 3.6 0.9 1.9 5.0 22 3.6 0.9 2.0 4.9

Note. SD = standard deviation; M = mean.

DISCUSSION

These results are encouraging for early stage measurement development. The strong internal consistency for all question groups indicates that the survey is well designed to measure adherence to the 11 EPs. The scores assess the level of community engagement, with higher scores corresponding to higher quality or frequency of engagement. For example, a project that completely engages community partners in all aspects of the research would have higher scores when compared to a project with community members serving on a traditional advisory board. Depending on the information of interest, different scores can be calculated.

Although community engagement quantity and quality scores allow for scale-specific, overall measurement of community engagement, EP-specific average scores measure how well the project adhered to a specific principle on each scale (quality and quantity). The best way to examine adherence to each of the 11 EPs is with EP-specific scores because they allow for a comprehensive picture of community engagement.

Average EP scores for the CRFT program ranged from 3.1 to 4.0; these results suggest that CRFT researchers adhered to the 11 PECaD EPs between sometimes and most of the time on the quantity scale and between good and very good on the quality scale. The development of EP-specific mean scores on each scale was necessary to fully understand gaps in the implementation of community EPs in research projects and to improve upon the successes of community engagement in future projects. Although CER evaluation tends to be largely qualitative, EP-specific mean scores allow for simple, evidence-based, quantitative measurement of community engagement. A quantitative measure that can be implemented using web-based surveys is a major strength, especially in large-scale projects in which qualitative approaches can be cumbersome. As an increasing number of researchers begin to engage communities in their work, these measurements will help to evaluate and improve the quality of CER.

Limitations

Because of missing values, most alpha calculations are based on n < 50. The smallest sample size for an alpha calculation was n = 34 for the quality-based items of EP 5. However, this sample size is sufficient because Cronbach’s alpha is precise when n ≥ 30, the number of items analyzed is at least 5, and the mean intercorrelation is at least 0.50, which is the case for EP 5 (Iacobucci & Duhachek, 2003). Additionally, Cronbach’s alpha calculations are sufficiently precise for n ≥ 30 when at least 2 items are analyzed and have a mean intercorrelation of at least 0.70 (Iacobucci & Duhachek, 2003), which is true for all calculations in Table 2, making the results of this pilot study quite valuable despite the small sample size.

The small sample size of this initial effort led us to include observations with missing items in score calculations. Sensitivity analyses were conducted to compare the results of complete-case analyses to those of the full sample, which included observations with missing values. These analyses showed that the EP-specific scores are not sensitive to missing items because there were no statistically significant differences between scores calculated with complete-case data and those calculated with data including missing values. Our measure was tested on a sample that was primarily African American females with high levels of education and may not be generalizable to other populations.

Future work is necessary to examine the tool in other populations, including cognitive response testing of items, to explore participants’ reactions and thought processes when exposed to items measuring the quality and quantity of community engagement in research (e.g., comprehension or interpretation of the questions, retrieval of relevant information from memory, the formation of judgments about how to respond, and the process of deciding how much information to reveal) (Lutz & Swasy, 1977; Schwarz, 2007; Willis, Royston, & Bercini, 1991).

Additional studies should also seek to extend data on the validity of the measure, identify latent constructs, and use item response theory to create a revised, shorter version of the measure. The current 96-item measure is comprehensive; a shorter version would reduce participant burden. However, the sample size (N = 46) is not adequate for the additional analyses (e.g., item response theory) that would allow for a reduction in the number of items. Given the amount of time it takes for partnerships to develop and change, when administering this comprehensive tool longitudinally to assess change in the level of community engagement over the course of a project, we suggest biannual intervals as the minimum time between assessments.

CONCLUSION

The field of CER has matured to the point where principles have been established and are often implemented in community-based studies; a major gap in this field is the ability to rigorously evaluate the level of community engagement and its impact on research processes and outcomes. The development of a quantitative measure to assess community engagement in research makes a major contribution to community-engaged science. These measures are necessary to assess associations between community engagement and research outcomes and understand the mechanisms through which community engagement affects the development and quality of scientific discovery. Although this measure was developed and initially tested in projects addressing cancer disparities in African Americans, the engagement principles measured are generalizable to other diseases and populations. The next steps in measure development include cognitive testing, development of instructions, examination of the impact of administration strategy, psychometric properties in other populations, and examination of measure validity. This measure can be used for monitoring and improving partnerships and exploring how partnerships facilitate outcomes.

Acknowledgments

This work, the work of the Program for the Elimination of Cancer Disparities evaluation team, and the Community Research Fellows Training pilot project are funded by National Institutes of Health, National Cancer Institute (grant U54CA153460).

The work of Dr. Goodman is supported by the Barnes-Jewish Hospital Foundation, Siteman Cancer Center, National Institutes of Health, National Cancer Institute (grant U54CA153460), and Washington University School of Medicine.

APPENDIX. Quantitative Community Engagement Measure

Please rate how often* you think the academic team did each of the following.**

Engagement Principle 1: Focus on local relevance and social determinants of health
Focus on issues important to my community.
Focus on health problems that the community thinks are important.
Focus on the combined interaction of factors (e.g., personal, social, economic) that influence health status.
Focus on cultural factors that influence health behaviors.
Engagement Principle 2: Acknowledge the community
Show appreciation for community time and effort
Highlight the community’s involvement.
Give credit to community members and others for work.
Value community perspectives.
Engagement Principle 3: Disseminate findings and knowledge gained to all partners
Let community members know what is going on with the project
Help community members with problems of their own
Empower community members with knowledge gained from a joint activity
Get findings and information to community members
Help community members disseminate information using community publications
Engagement Principle 4: Seek and use the input of community partners
Ask community members for input
Use the ideas and input of community members
Change plans as a result of community input
Involve community members in making key decisions
Ask community members for help with specific tasks
Engagement Principle 5: Involve a cyclical and iterative process in pursuit of objectives
Share the results of how things turned out with the community
Seek community input and help at multiple stages of the process
Inform the community of what happened when their ideas were tried
Plan for ongoing problem solving
Involve the community in determining next steps
Engagement Principle 6: Foster co-learning, capacity building, and co-benefit for all partners
Learn from community members
Help community members gain important skills from involvement
Encourage academic partners and community members to learn from each other
Help community partners get what they need from academic partners
Help community members achieve social, educational, or economic goals
Engagement Principle 7: Build on strengths and resources within the community
Build on strengths within the community
Build on resources within the community
Help to fill gaps in community strengths and resources
Work with existing community networks
Engagement Principle 8: Facilitate collaborative, equitable partnerships
Foster collaborations in which community members are real partners
Handle disagreements fairly
Demonstrate that community members are really needed to do a good job
Demonstrate that community members’ ideas make things better
Enable community members to voice disagreements
Engagement Principle 9: Integrate and achieve a balance of all partners
Enable all people involved to voice their views
Make final decisions that reflect the ideas of everyone involved
Demonstrate that community members’ ideas are just as important as academics’ ideas
Treat community members’ ideas with openness and respect
Engagement Principle 10: Involve all partners in the dissemination process
Make sure that all partners are involved with sharing findings
Include community members in plans for sharing findings.
Involve community members in sharing health messages in community settings.
Listen to community members when planning dissemination activities.
Engagement Principle 11: Plan for a long-term process and commitment
Make plans for community-engaged activities to continue for many years.
Make commitments in communities that are long-term.
Want to work with community members for many years.
* The same items are repeated to measure the quality of engagement using the question, “Please rate how well you think the academic team did each of the following,” with Likert response options: Poor, Fair, Good, Very Good, Excellent.

** Quantity scale Likert response options: Never, Rarely, Sometimes, Most of the time, Always.

Contributor Information

Melody S. Goodman, Washington University School of Medicine.

Vetta L. Sanders Thompson, Brown School of Social Work, Washington University in St. Louis.

Cassandra Arroyo Johnson, Washington University School of Medicine.

Renee Gennarelli, Washington University School of Medicine.

Bettina F. Drake, Washington University School of Medicine.

Pravleen Bajwa, Washington University School of Medicine.

Maranda Witherspoon, Missouri Foundation for Health.

Deborah Bowen, University of Washington School of Medicine.

References

  1. Ahmed SM, Palermo AGS. Community engagement in research: Frameworks for education and peer review. American Journal of Public Health. 2010;100(8):1380–1387. doi: 10.2105/AJPH.2009.178137. http://doi.org/10.2105/AJPH.2009.178137. [DOI] [PMC free article] [PubMed] [Google Scholar]
  2. Arroyo-Johnson C, Allen ML, Colditz GA, Hurtado GA, Davey CS, Thompson VLS, … Goodman MS. A tale of two community networks program centers: Operationalizing and assessing CBPR principles and evaluating partnership outcomes. Progress in Community Health Partnerships. 2015;9(Special Issue):61–69. doi: 10.1353/cpr.2015.0026. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Balazs CL, Morello-Frosch R. The three Rs: How community-based participatory research strengthens the rigor, relevance, and reach of science. Environmental Justice. 2013;6(1):9–16. doi: 10.1089/env.2012.0017. http://doi.org/10.1089/env.2012.0017. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Bordeaux BC, Wiley C, Tandon SD, Horowitz CR, Brown PB, Bass EB. Guidelines for writing manuscripts about community-based participatory research for peer-reviewed journals. Progress in Community Health Partnerships: Research, Education, and Action. 2007;1(3):281–288. doi: 10.1353/cpr.2007.0018. http://doi.org/10.1353/cpr.2007.0018. [DOI] [PMC free article] [PubMed] [Google Scholar]
  5. Brandon DT, Isaac LA, LaVeist TA. The legacy of Tuskegee and trust in medical care: Is Tuskegee responsible for race differences in mistrust of medical care? Journal of the National Medical Association. 2005;97(7):951. [PMC free article] [PubMed] [Google Scholar]
  6. Burke JG, Hess S, Hoffmann K, Guizzetti L, Loy E, Gielen A, … Yonas M. Translating community-based participatory research principles into practice. Progress in Community Health Partnerships: Research, Education, and Action. 2013;7(2):109. doi: 10.1353/cpr.2013.0025. http://doi.org/10.1353/cpr.2013.0020. [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Butterfoss FD, Francisco VT. Evaluating community partnerships and coalitions with practitioners in mind. Health Promotion Practice. 2004;5(2):108–114. doi: 10.1177/1524839903260844. http://doi.org/10.1177/1524839903260844. [DOI] [PubMed] [Google Scholar]
  8. Butterfoss FD, Goodman RM, Wandersman A. Community coalitions for prevention and health promotion: Factors predicting satisfaction, participation, and planning. Health Education & Behavior. 1996;23(1):65–79. doi: 10.1177/109019819602300105. [DOI] [PubMed] [Google Scholar]
  9. Campbell C, Jovchelovitch S. Health, community and development: Towards a social psychology of participation. Journal of Community and Applied Social Psychology. 2007;10(4):255–270. http://doi.org/10.1002/1099-1298(200007/08)10. [Google Scholar]
  10. Cargo M, Mercer SL. The value and challenges of participatory research: Strengthening its practice. Annual Review of Public Health. 2008;29:325–350. doi: 10.1146/annurev.publhealth.29.091307.083824. http://doi.org/10.1146/annurev.publhealth.29.091307.083824. [DOI] [PubMed] [Google Scholar]
  11. Chesler MA. Participatory action research with self-help groups: An alternative paradigm for inquiry and action. American Journal of Community Psychology. 1991;19(5):757–768. doi: 10.1007/BF00938043. http://doi.org/10.1007/BF00938043. [DOI] [PubMed] [Google Scholar]
  12. Clinical and Translational Science Awards Consortium Community Engagement Key Function Committee Task Force on the Principles of Community Engagement. Principles of Community Engagement. 2011 NIH Publication No. 11–7782. Retrieved from http://www.atsdr.cdc.gov/communityengagement/
  13. Coats JV, Stafford JD, Sanders Thompson V, Johnson Javois B, Goodman MS. Increasing research literacy: The community research fellows training program. Journal of Empirical Research on Human Research Ethics. 2015;10(1):3–12. doi: 10.1177/1556264614561959. http://doi.org/10.1177/1556264614561959. [DOI] [PMC free article] [PubMed] [Google Scholar]
  14. Corbie-Smith G, Thomas SB, St George DMM. Distrust, race, and research. Archives of Internal Medicine. 2002;162(21):2458–2463. doi: 10.1001/archinte.162.21.2458. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/12437405. [DOI] [PubMed] [Google Scholar]
  15. D’Agostino McGowan L, Stafford JD, Thompson VL, Johnson-Javois B, Goodman MS. Quantitative evaluation of the community research fellows training program. Frontiers in Public Health. 2015;3(July):1–12. doi: 10.3389/fpubh.2015.00179. http://doi.org/10.3389/fpubh.2015.00179. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Davis SW, Cassel K, Moseley MA, Mesia R, De Herrera PA, Kornfeld J, Perocchia R. The Cancer Information Service: Using CBPR in building community capacity. Journal of Cancer Education: The Official Journal of the American Association for Cancer Education. 2011;26(1):51–57. doi: 10.1007/s13187-010-0159-x. [DOI] [PubMed] [Google Scholar]
  17. Eder MM, Carter-Edwards L, Hurd TC, Rumala BB, Wallerstein N. A logic model for community engagement within the clinical and translational science awards consortium: Can we measure what we model? Academic Medicine: Journal of the Association of American Medical Colleges. 2013;88(10):1430–1436. doi: 10.1097/ACM.0b013e31829b54ae. http://doi.org/10.1097/ACM.0b013e31829b54ae. [DOI] [PMC free article] [PubMed] [Google Scholar]
  18. Eder M, Tobin JN, Proser M, Shin P. Special issue introduction: Building a stronger science of community-engaged research. Progress in Community Health Partnerships: Research, Education, and Action. 2012;6(3):227–230. doi: 10.1353/cpr.2012.0040. http://doi.org/10.1353/cpr.2012.0040. [DOI] [PubMed] [Google Scholar]
  19. Fawcett SB, Paine-Andrews A, Francisco VT, Schultz JA, Richter KP, Lewis RK, … Fisher JL. Using empowerment theory in collaborative partnerships for community health and development. American Journal of Community Psychology. 1995;23(5):677–697. doi: 10.1007/BF02506987. [DOI] [PubMed] [Google Scholar]
  20. Francisco VT, Paine AL, Fawcett SB. A methodology for monitoring and evaluating community health coalitions. Health Education Research. 1993 doi: 10.1093/her/8.3.403. http://doi.org/10.1093/her/8.3.403. [DOI] [PubMed]
  21. Gennarelli R, Goodman MS. Measuring internal consistency of community engagement using the ALPHA option of PROC CORR. New England SAS Users Group; 2013. pp. 1–7. [Google Scholar]
  22. Gennarelli R, Goodman MS. SAS Global Forum. 2014. Using SAS® to examine internal consistency and to develop community engagement scores; pp. 1–10. [Google Scholar]
  23. Goodman MS, Dias JJ, Stafford JD. Increasing research literacy in minority communities: CARES fellows training program. Journal of Empirical Research on Human Research Ethics. 2010;5(4):33–41. doi: 10.1525/jer.2010.5.4.33. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Goodman MS, Gonzalez M, Gil S, Si X, Pashoukos JL, Stafford JD, … Pashoukos DA. Brentwood community health care assessment. Progress in Community Health Partnerships: Research, Education, and Action. 2014;8(1):29–39. doi: 10.1353/cpr.2014.0017. [DOI] [PMC free article] [PubMed] [Google Scholar]
  25. Goodman MS, Si X, Stafford JD, Obasohan A, Mchunguzi C. Quantitative assessment of participant knowledge and evaluation of participant satisfaction in the CARES training program. Progress in Community Health Partnerships: Research, Education, and Action. 2012;6(3):359–366. doi: 10.1353/cpr.2012.0051. [DOI] [PMC free article] [PubMed] [Google Scholar]
  26. Goodman RM, Speers MA, Mcleroy K, Fawcett S, Kegler M, Parker E, … Wallerstein N. Identifying and defining the dimensions of community capacity to provide a basis for measurement. Health Education & Behavior. 1998;25(3):258–278. doi: 10.1177/109019819802500303. http://doi.org/10.1177/109019819802500303. [DOI] [PubMed] [Google Scholar]
  27. Iacobucci D, Duhachek A. Advancing alpha: Measuring reliability with confidence. Journal of Consumer Psychology. 2003;13(4):478–487. http://doi.org/10.1207/S15327663JCP1304_14. [Google Scholar]
  28. Institute of Medicine. The CTSA program at NIH: Opportunities for advancing clinical and translational research. Washington, DC: National Academies Press; 2013. [PubMed] [Google Scholar]
  29. Israel BA. Methods in community-based participatory research for health. San Francisco: Jossey-Bass; 2005. [Google Scholar]
  30. Israel BA, Coombe CM, Cheezum RR, Schulz AJ, McGranaghan RJ, Lichtenstein R, … Burris A. Community-based participatory research: a capacity-building approach for policy advocacy aimed at eliminating health disparities. American Journal of Public Health. 2010;100(11):2094–2102. doi: 10.2105/AJPH.2009.170506. [DOI] [PMC free article] [PubMed] [Google Scholar]
  31. Israel BA, Schulz AJ, Parker EA, Becker A. Review of community-based research: Assessing partnership approaches to improve public health. Annual Review of Public Health. 1998;19:173–202. doi: 10.1146/annurev.publhealth.19.1.173. http://doi.org/10.1146/annurev.publhealth.19.1.173. [DOI] [PubMed] [Google Scholar]
  32. Israel BA, Schulz AJ, Parker EA, Becker AB, Allen AJ, Guzman JR. Critical issues in developing and following CBPR principles. Community-Based Participatory Research for Health: From Process to Outcomes. 2008:47–66. [Google Scholar]
  33. Jagosh J, Macaulay AC, Pluye P, Salsberg J, Bush PL, Henderson J, … Greenhalgh T. Uncovering the benefits of participatory research: Implications of a realist review for health research and practice. The Milbank Quarterly. 2012;90(2):311–346. doi: 10.1111/j.1468-0009.2012.00665.x. http://doi.org/10.1111/j.1468-0009.2012.00665.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  34. Jagosh J, Pluye P, Macaulay AC, Salsberg J, Henderson J, Sirett E, … Green LW. Assessing the outcomes of participatory research: Protocol for identifying, selecting, appraising and synthesizing the literature for realist review. Implementation Science: IS. 2011;6(1):24. doi: 10.1186/1748-5908-6-24. http://doi.org/10.1186/1748-5908-6-24. [DOI] [PMC free article] [PubMed] [Google Scholar]
  35. Khodyakov D, Stockdale S, Jones A, Mango J, Jones F, Lizaola E. On measuring community participation in research. Health Education & Behavior: The Official Publication of the Society for Public Health Education. 2013;40(3):346–354. doi: 10.1177/1090198112459050. http://doi.org/10.1177/1090198112459050. [DOI] [PMC free article] [PubMed] [Google Scholar]
  36. Khodyakov D, Stockdale S, Jones F, Ohito E, Jones A, Lizaola E, Mango J. An exploration of the effect of community engagement in research on perceived outcomes of partnered mental health services projects. Society and Mental Health. 2011;1(3):185–199. doi: 10.1177/2156869311431613. http://doi.org/10.1177/2156869311431613. [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Lantz PM, Viruell-Fuentes E, Israel BA, Softley D, Guzman R. Can communities and academia work together on public health research? Evaluation results from a community-based participatory research partnership in Detroit. Journal of Urban Health: Bulletin of the New York Academy of Medicine. 2001;78(3):495–507. doi: 10.1093/jurban/78.3.495. http://doi.org/10.1093/jurban/78.3.495. [DOI] [PMC free article] [PubMed] [Google Scholar]
  38. Lutz RJ, Swasy JL. Integrating cognitive structure and cognitive response approaches to monitoring communication effects. Advanced Consumer Research. 1977;4(1):363–371. [Google Scholar]
  39. McCloskey DJ, McDonald MA, Cook J, Heurtin-Roberts S, Updegrove S, Sampson D, … Eder M. Principles of Community Engagement. 2. The Centers for Disease Control and Prevention; 2012. Community engagement: Definitions and organizing concepts from the literature; pp. 3–41. [Google Scholar]
  40. Minkler M. Ethical challenges for the “outside” researcher in community-based participatory research. Health Education & Behavior. 2004;31(6):684. doi: 10.1177/1090198104269566. [DOI] [PubMed] [Google Scholar]
  41. Minkler M. Community-based research partnerships: Challenges and opportunities. Journal of Urban Health. 2005;82(2)(Supplement 2):ii3–ii12. doi: 10.1093/jurban/jti034. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Minkler ME, Wallerstein NE. Community-based participatory research for health. San Francisco: Jossey-Bass; 2003. [Google Scholar]
  43. Minkler M, Wallerstein NN. Community-based participatory research for health: From process to outcomes. San Francisco: Jossey-Bass; 2010. [Google Scholar]
  44. Mohatt GV, Hazel KL, Allen J, Stachelrodt M, Hensel C, Fath R. Unheard Alaska: Culturally anchored participatory action research on sobriety with Alaska Natives. American Journal of Community Psychology. 2004;33(3–4):263–273. doi: 10.1023/b:ajcp.0000027011.12346.70. http://doi.org/10.1023/B:AJCP.0000027011.12346.70. [DOI] [PubMed] [Google Scholar]
  45. Nelson G, Ochocka J, Griffin K, Lord J. “Nothing about me, without me”: Participatory action research with self-help/mutual aid organizations for psychiatric consumer/survivors. American Journal of Community Psychology. 1998;26(6):881–912. doi: 10.1023/a:1022298129812. http://doi.org/10.1023/a:1022298129812. [DOI] [PubMed] [Google Scholar]
  46. Nguyen G, Hsu L, Kue KN, Nguyen T, Yuen EJ. Partnering to collect health services and public health data in hard-to-reach communities: A community-based participatory research approach for collecting community health data. Progress in Community Health Partnerships: Research, Education, and Action. 2010;4(2):115–119. doi: 10.1353/cpr.0.0120. [DOI] [PubMed] [Google Scholar]
  47. De las Nueces D, Hacker K, DiGirolamo A, Hicks LS. A systematic review of community-based participatory research to enhance clinical trials in racial and ethnic minority groups. Health Services Research. 2012;47(3 Pt 2):1363–1386. doi: 10.1111/j.1475-6773.2012.01386.x. http://doi.org/10.1111/j.1475-6773.2012.01386.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  48. Nunnally JC. Psychometric theory. 2. New York: McGraw-Hill; 1994. [Google Scholar]
  49. Quinn SC, Kass NE, Thomas SB. Building trust for engagement of minorities in human subjects research: Is the glass half full, half empty, or the wrong size? American Journal of Public Health. 2013;103(12):2119–2121. doi: 10.2105/AJPH.2013.301685. http://doi.org/10.2105/ajph.2013.301685. [DOI] [PMC free article] [PubMed] [Google Scholar]
  50. Rappaport J. Terms of empowerment/exemplars of prevention: Toward a theory for community psychology. American Journal of Community Psychology. 1987;15(2):121–148. doi: 10.1007/BF00919275. http://doi.org/10.1007/BF00919275. [DOI] [PubMed] [Google Scholar]
  51. Report E, Assessment T. Journal of General Internal Medicine. 2003. Community-based participatory research: Summary; p. 99. [Google Scholar]
  52. Ross LF, Loup A, Nelson RM, Botkin JR, Kost R, Smith GR, Gehlert S. Human subjects protections in community-engaged research: A research ethics framework. Journal of Empirical Research on Human Research Ethics. 2010a;5(1):5–17. doi: 10.1525/jer.2010.5.1.5. http://doi.org/10.1525/jer.2010.5.1.5.Human. [DOI] [PMC free article] [PubMed] [Google Scholar]
  53. Ross LF, Loup A, Nelson RM, Botkin JR, Kost R, Smith GR, Gehlert S. Nine key functions for a human subjects protection program for community-engaged research: Points to consider. Journal of Empirical Research on Human Research Ethics. 2010b;5(1):33–47. doi: 10.1525/jer.2010.5.1.33. http://doi.org/10.1525/jer.2010.5.1.5.Human. [DOI] [PMC free article] [PubMed] [Google Scholar]
  54. Salimi Y, Shahandeh K, Malekafzali H, Loori N, Kheiltash A, Jamshidi E, … Majdzadeh R. Is community-based participatory research (CBPR) useful? A systematic review on papers in a decade. International Journal of Preventive Medicine. 2012;3(6):386–393. Retrieved from http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=3389435&tool=pmcen-trez&rendertype=abstract. [PMC free article] [PubMed] [Google Scholar]
  55. Sanchez V, Carrillo C, Wallerstein N. From the ground up: Building a participatory evaluation model. Progress in Community Health Partnerships: Research, Education, and Action. 2011;5(1):45–52. doi: 10.1353/cpr.2011.0007. [DOI] [PubMed] [Google Scholar]
  56. Schulz AJ, Israel BA, Lantz P. Instrument for evaluating dimensions of group dynamics within community-based participatory research partnerships. Evaluation and Program Planning. 2003;26(3):249–262. doi: 10.1016/j.evalprogplan.2018.04.014. http://doi.org/10.1016/S0149-7189(03)00029-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  57. Schwarz N. Cognitive aspects of survey methodology. Applied Cognitive Psychology. 2007;21:277–287. http://doi.org/10.1002/acp. [Google Scholar]
  58. Shalowitz MU, Isacco A, Barquin N, Clark-Kauffman E, Delger P, Nelson D, … Wagenaar KA. Community-based participatory research: A review of the literature with strategies for community engagement. Journal of Developmental and Behavioral Pediatrics. 2009;30(4):350–361. doi: 10.1097/DBP.0b013e3181b0ef14. http://doi.org/10.1097/DBP.0b013e3181b0ef14. [DOI] [PubMed] [Google Scholar]
  59. Thompson VLS, Drake B, James AS, Norfolk M, Goodman M, Ashford L, … Colditz G. A community coalition to address cancer disparities: transitions, successes and challenges. Journal of Cancer Education: The Official Journal of the American Association for Cancer Education. 2014 doi: 10.1007/s13187-014-0746-3. http://doi.org/10.1007/s13187-014-0746-3. [DOI] [PMC free article] [PubMed]
  60. Trinh-Shevrin C, Islam N, Tandon SD, Abesamis N, Hoe-Asjoe H, Rey M. Using community-based participatory research as a guiding framework for health disparities research centers. Progress in Community Health Partnerships: Research, Education, and Action. 2007;1(2):195. doi: 10.1353/cpr.2007.0007. [DOI] [PMC free article] [PubMed] [Google Scholar]
  61. Wallerstein NB, Duran B. Using community-based participatory research to address health disparities. Health Promotion Practice. 2006;7(3):312. doi: 10.1177/1524839906289376. [DOI] [PubMed] [Google Scholar]
  62. Wallerstein N, Duran B. Community-based participatory research contributions to intervention research: The intersection of science and practice to improve health equity. American Journal of Public Health. 2010;100:S40–S46. doi: 10.2105/AJPH.2009.184036. http://doi.org/10.2105/AJPH.2009.184036. [DOI] [PMC free article] [PubMed] [Google Scholar]
  63. Watts RJ, Flanagan C. Pushing the envelope on youth civic engagement: A developmental and liberation psychology perspective. Journal of Community Psychology. 2007;35(6):779–792. http://doi.org/10.1002/jcop.20178. [Google Scholar]
  64. Weir E, D’Entremont N, Stalker S, Kurji K, Robinson V. Applying the balanced scorecard to local public health performance measurement: Deliberations and decisions. BMC Public Health. 2009;9:127. doi: 10.1186/1471-2458-9-127. http://doi.org/10.1186/1471-2458-9-127. [DOI] [PMC free article] [PubMed] [Google Scholar]
  65. Wilkins CH, Spofford M, Williams N, McKeever C, Allen S, Brown J, … Strelnick AH. Community representatives’ involvement in clinical and translational science awardee activities. Clinical and Translational Science. 2013 doi: 10.1111/cts.12072. 0(0), n/a–n/a. http://doi.org/10.1111/cts.12072. [DOI] [PMC free article] [PubMed]
  66. Willis GB, Royston P, Bercini D. The use of verbal report methods in the development and testing of survey questionnaires. Applied Cognitive Psychology. 1991;5:261–267. [Google Scholar]
  67. Zeldin S. Preventing youth violence through the promotion of community engagement and membership. Journal of Community Psychology. 2004;32(5):623–641. http://doi.org/10.1002/jcop.20023. [Google Scholar]
