Injury Epidemiology
2025 Dec 10;13:7. doi: 10.1186/s40621-025-00642-5

Engaging practitioners and academic researchers in co-developing evaluation measures for community violence interventions

Meron Girma 1,2,#, Julia P Schleimer 1,2,✉,#, Andrew Hillen 3, Astrid Aveledo 4, Ayah Mustafa 1, Deepika Nehra 5, Dominique Davis 6, Elaine Gonzalez 7, Emily Westlake 8, Esprene Liddell-Quintyn 1, Kris Torset 4, Kristian Jones 1,9, Laura Johnson 5, Lina R Benson 5, Lynniah Grayson 1,2, Orlando Ames 6, Rachel Ross 1, Samantha Decker 7, Taffy Hunter 10, Tarell Harrison 5, Tier Simon-Matthews 11, Vivian Lyons 1,12, Zaheed Lynch 6, Ali Rowhani-Rahbar 1,2
PMCID: PMC12801489  PMID: 41373002

Abstract

Background

Community violence intervention (CVI) is increasingly considered an important component of comprehensive public safety infrastructure, but research and evaluation of CVI programs remain underdeveloped. There is a critical need for evaluation measures that are developed collaboratively with those most proximate to these interventions and that capture interventions' nuanced and holistic impacts. This paper presents the process and results of a practitioner-academic partnership to co-develop youth-focused CVI evaluation measures in Washington state.

Results

The process of co-developing evaluation measures involved two phases. For phase 1, we created a menu of quantitative measures (n = 60) for each outcome construct in a previously co-developed CVI theory of change by integrating existing measures, identified through a literature review, with insights and recommendations from CVI practitioners gathered during a workshop. For phase 2, we tailored and refined quantitative and qualitative measures with/for 7 CVI programs involved in the collaboration via individual meetings (n = 45 one-hour meetings over the course of 10 months, average = 6.4 meetings per program). The process of refining measures involved extensive discussion around several key considerations, including confidentiality, age appropriateness of questions, and language/jargon. After revisions, each CVI program had a customized list of quantitative and qualitative measures that fit their program and the population they served. We also created an online toolkit accompanying this paper so others may easily use, tailor, and build upon our work.

Conclusions

Our process of co-developing youth-focused CVI evaluation measures drew from existing literature while heavily prioritizing the knowledge, expertise, and capacity of CVI practitioners. This helped facilitate power sharing and responsiveness to community needs, and we believe it resulted in more appropriate and contextually relevant measures. By detailing the iterative process that we took along with the resulting evaluation measures, our intent is to encourage practitioner-academic collaboration and underscore how such partnerships can enhance the field’s understanding of CVI implementation and impacts.

Supplementary Information

The online version contains supplementary material available at 10.1186/s40621-025-00642-5.

Keywords: Community violence intervention, Firearms, Program evaluation

Background

In 2023, 22,829 people died from homicide in the United States (US), and firearms were involved in 79% of those deaths[1]. Most firearm homicides in the US take the form of community violence, or “intentional acts of interpersonal violence committed in public areas by individuals who are not intimately related to the victim” [2]. Community violence has profound and far-reaching consequences, including physical disability, mental distress, substance use, and economic harm [3]. While entire families and communities are impacted by community violence, young Black and Brown men experience a disproportionate burden of injuries and deaths [1, 4].

Community violence intervention (CVI) is increasingly considered an important component of comprehensive public safety infrastructure and a key intervention to decrease violence in communities while promoting healing, safety, and overall well-being [5]. CVI is a healing-centered approach largely developed and implemented by and for those most impacted by community violence; it has been described as “a person-focused approach… [that] … incorporates peer support, harm reduction, and resource offerings to help meet the immediate and longer-term needs of those individuals at the highest risk of violence involvement” [6].

However, due in part to decades of inadequate funding for violence prevention research and CVIs, research and evaluation of CVI programs remain underdeveloped, and existing studies have often focused narrowly on violence-related outcomes, showing promising, albeit mixed, findings [7–17]. To more comprehensively and accurately understand and improve CVI implementation and effectiveness, there is a need for novel and contextually relevant CVI evaluation measures that capture the nuanced and holistic impacts of these interventions [5]. In this paper, we build on prior work [18–21] and document a collaborative effort between CVI practitioners and academic researchers to co-develop evidence-informed, community-centered quantitative and qualitative evaluation measures for CVI programs in Washington state.

Given the historical harms of some research for Black and Brown communities, the urgent need for equitable and effective approaches to reduce community violence, and the diversity of CVI program models, contexts, and populations served, it is critical that evaluation measures are developed collaboratively with those who are deeply embedded in these efforts [22, 23]. In the context of CVI, who defines success and how success is measured are critical questions, as these decisions can have substantial consequences for CVI programs and the communities they serve. Power dynamics rooted in systems of oppression often shape whose voices are heard and which types of evidence are valued, typically resulting in the prioritization of academics’ perspectives over CVI practitioners’ and community members’ [24]. In contrast, when community members are actively involved in the design and implementation of CVI evaluations, such assessments may be more valid, comprehensive, and aligned with the real-world contexts of CVI programs. Beyond these instrumental goals of producing more rigorous and relevant evaluations, such collaboration is arguably ethically important and can help resist white supremacy culture, which “…tells us who has value, who doesn’t, what has value, what doesn’t in ways that reinforce a racial hierarchy of power and control that diseases and destroys all it touches” [25] (p3).

Here we present our approach, grounded in a commitment to genuine partnership, to co-developing CVI evaluation measures that were culturally relevant and reflect the true work of CVIs, as understood by the people working on the ground to implement these interventions. By detailing the iterative process that we took along with the resulting evaluation measures, we aim to encourage practitioner-academic collaboration and underscore how such partnerships can enhance the field’s understanding of CVI implementation and impacts. This work is rooted in principles of community-based participatory research, an increasingly valued form of community-engaged research in which community members and academic researchers work together, with equal decision-making power, to co-construct knowledge with the goals of enhancing the relevance and utility of research findings and improving health and social equity [26].

Methods

Community-academic partnership

This paper describes the process and results of co-developing CVI evaluation measures via a practitioner-academic partnership during 2023–2025, initiated via a contract with the Washington State Office of Firearm Safety and Violence Prevention (OFSVP). Academic partners were based at the University of Washington (UW) Firearm Injury & Policy Research Program. CVI practitioner partners were from 7 community- or hospital-based youth firearm violence intervention programs in Washington state (hereafter “CVI programs”) that submitted proposals to and were awarded funding by OFSVP. Of these, one was based in Thurston County (Port of Support & Pathwayz to Success), two in King County (Community Passageways and Harborview Medical Center’s hospital-based violence intervention program), one in Clark County (Community Mediation Services), one in Spokane County (Spokane Regional Domestic Violence Coalition), one in Yakima County (Walk About Yakima), and one in Grays Harbor County (Dispute Resolution Center of Grays Harbor and Pacific Counties). The overarching goal of the project was to evaluate the implementation and potential impacts of these CVI programs. CVI practitioners and academic researchers were co-leaders in the evaluation design and implementation, and processes were adapted to practitioners’ capacity and insights.

Our work was guided by a theory of change (TOC) for youth-focused CVI, described in detail elsewhere [27], that our team previously co-developed with community partners. The TOC included six key domains: (1) root causes, (2) promotive factors, (3) activities, (4) intermediate outcomes (6 months to 1 year), (5) longer-term outcomes (1–2 + years), and (6) multilevel context (youth/family, staff/organizational, community, and societal). We focused on developing self-reported measures for intermediate and longer-term outcomes (which are the changes hypothesized to result from the intervention in relation to baseline), but also included several measures related to context (which are factors external to the intervention that may affect youth, families, and CVI program staff) and activities (which are the services and supports provided by the intervention). Outcome constructs are in Table 1.

Table 1.

Intermediate and longer-term outcome constructs and corresponding existing measures reviewed in phase 1

Theory of change construct Existing measure Reference
Promotive factors & intermediate outcomes
Community-level safety, collective efficacy, partnerships, trust Save Our Streets Community Survey, Neighborhood Violence Questions [34] Picard-Fritsche S, Cerniglia L. Testing a Public Health Approach to Gun Violence: An Evaluation of Crown Heights Save Our Streets, a Replication of the Cure Violence Model. Center for Court Innovation New York, NY; 2013. https://www.innovatingjustice.org/sites/default/files/documents/SOS_Evaluation.pdf
Trust In Health Promotion Partnerships Survey [35] Jones J, Barry MM. Developing a scale to measure trust in health promotion partnerships. Health Promot Int. 2011;26(4):484–491. doi:10.1093/heapro/dar007
Resources and Linkages— Cultural Competence Self Assessment Questionnaire [36] Mason J. Cultural Competence Self-Assessment Questionnaire: A Manual for Users. Portland State University, Research and Training Center on Family Support and Children’s Mental Health; 1995. https://www.pathwaysrtc.pdx.edu/pdf/CCSAQ.pdf
Youth/family ability to meet basic needs, progress towards personal goals Material Hardship Survey [28] McLanahan L, Garfinkel I, Waldfogel J, Edin K. Scales and Concepts Documentation, Future of Families and Child Wellbeing Study. https://ffcws.princeton.edu/data-and-documentation/scales-and-concepts-documentation
Social Needs Screening [37] Health Leads. Social Needs Screening Toolkit. 2018. https://healthleadsusa.org/wp-content/uploads/2023/05/Screening_Toolkit_2018.pdf
Well-Being and Basic Needs Survey, Health and Health Insurance [38] Karpman M, Zuckerman S, Gonzalez D. The Well-Being and Basic Needs Survey: A New Data Source for Monitoring the Health and Well-Being of Individuals and Families. Urban Institute; 2018. https://www.urban.org/research/publication/well-being-and-basic-needs-survey
Youth/family behavioral health, self-efficacy Mental Health Continuum Short Form [39] Lamers SMA, Westerhof GJ, Bohlmeijer ET, ten Klooster PM, Keyes CLM. Evaluating the psychometric properties of the Mental Health Continuum-Short Form (MHC-SF). J Clin Psychol. 2011;67(1):99–110. doi:10.1002/jclp.20741
Drug Abuse Screening Test, DAST-10 [40, 41] Yudko E, Lozhkina O, Fouts A. A comprehensive review of the psychometric properties of the Drug Abuse Screening Test. J Subst Abuse Treat. 2007;32(2):189–198. doi:10.1016/j.jsat.2006.08.002; Skinner HA. The drug abuse screening test. Addict Behav. 1982;7(4):363–371.
General Self-Efficacy Scale [42] Chen G, Gully SM, Eden D. Validation of a New General Self-Efficacy Scale. Organ Res Methods. 2001;4(1):62–83. doi:10.1177/109442810141004; https://www.imperial.ac.uk/education-research/evaluation/what-can-i-evaluate/self-efficacy/tools-for-assessing-self-efficacy/general-self-efficacy-scale/
Self-Esteem—Rochester Youth Development Study [43] Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005. https://www.wcasa.org/wp-content/uploads/2020/05/Evaluation_11.-CDC_Measuring-Violence-Related-Attitudes-Behaviors-and-Influences-Among-Youths.pdf
Youth/family economic resources, employment and educational advancement Commitment To School—Rochester Youth Development Study [43] Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005. https://www.wcasa.org/wp-content/uploads/2020/05/Evaluation_11.-CDC_Measuring-Violence-Related-Attitudes-Behaviors-and-Influences-Among-Youths.pdf
Youth Employment Survey [44] Inspiration Corporation. Youth Employment Survey, Research Results. 2021. https://omsinspiration.wpenginepowered.com/wp-content/uploads/2021/09/2021_Inspiration_Corporation_YES_Report.pdf
National Longitudinal Survey of Youth, Employment [45] Bureau of Labor Statistics, U.S. Department of Labor. National Longitudinal Survey of Youth 1997 cohort, 1997–2021 (rounds 1–20). 2024. Produced and distributed by the Center for Human Resource Research (CHRR), The Ohio State University. Columbus, OH. https://www.nlsinfo.org/content/cohorts/nlsy97/topical-guide/employment
Youth/family pro-social response to conflict, peaceful conflict resolution Regulation of Emotions Questionnaire [46] Phillips KFV, Power MJ. A new self-report measure of emotion regulation in adolescents: The Regulation of Emotions Questionnaire. Clin Psychol Psychother. 2007;14(2):145–156. doi:10.1002/cpp.523
Community Resident Survey Instrument [47] Research & Evaluation Center, John Jay College of Criminal Justice. NYC-Cure Survey Instrument: Cure Violence Evaluation Study. City University of New York (JohnJayREC); 2014. https://johnjayrec.nyc/wp-content/uploads/2019/04/NYC_Cure_survey_instrument.pdf
Likelihood of Violence and Delinquency [43] Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005. https://www.wcasa.org/wp-content/uploads/2020/05/Evaluation_11.-CDC_Measuring-Violence-Related-Attitudes-Behaviors-and-Influences-Among-Youths.pdf
Conflict Resolution [43] Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005. https://www.wcasa.org/wp-content/uploads/2020/05/Evaluation_11.-CDC_Measuring-Violence-Related-Attitudes-Behaviors-and-Influences-Among-Youths.pdf
Youth/family connection, positive role models, supportive relationships Presence of Caring—Individual Protective Factors Index [43] Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005. https://www.wcasa.org/wp-content/uploads/2020/05/Evaluation_11.-CDC_Measuring-Violence-Related-Attitudes-Behaviors-and-Influences-Among-Youths.pdf
Social Capital Assessment [48] Search Institute. Social Capital Assessment + Learning for Equity (SCALE) Measures. Search Institute; 2021. https://d2pck61xhq74q6.cloudfront.net/Resources-Hub/Beyond-the-Classroom/Ref-406_SCALE_Measures_User_Guide.pdf
Social Capital Questionnaire [49] Paiva PCP, de Paiva HN, de Oliveira Filho PM, et al. Development and Validation of a Social Capital Questionnaire for Adolescent Students (SCQ-AS). PLoS ONE. 2014;9(8):e103785. doi:10.1371/journal.pone.0103785
Staff resources/capacity, well-being, competency, work quality Secondary Trauma Stress Scale [50] Bride BE, Robinson MM, Yegidis B, Figley CR. Development and Validation of the Secondary Traumatic Stress Scale. Res Soc Work Pract. 2004;14(1):27–35. doi:10.1177/1049731503254106
Cultural Competence Self-Assessment, Personal Involvement [36] Mason J. Cultural Competence Self-Assessment Questionnaire: A Manual for Users. Portland State University, Research and Training Center on Family Support and Children’s Mental Health; 1995. https://www.pathwaysrtc.pdx.edu/pdf/CCSAQ.pdf
Cultural Competence Self-Assessment, Staffing [36] Mason J. Cultural Competence Self-Assessment Questionnaire: A Manual for Users. Portland State University, Research and Training Center on Family Support and Children’s Mental Health; 1995. https://www.pathwaysrtc.pdx.edu/pdf/CCSAQ.pdf
Root causes & longer-term outcomes
Safety, including from gun violence Save Our Streets Community Survey, Questions about Safety [34] Picard-Fritsche S, Cerniglia L. Testing a Public Health Approach to Gun Violence: An Evaluation of Crown Heights Save Our Streets, a Replication of the Cure Violence Model. Center for Court Innovation New York, NY; 2013.
Children’s Exposure To Community Violence [43] Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005. https://www.wcasa.org/wp-content/uploads/2020/05/Evaluation_11.-CDC_Measuring-Violence-Related-Attitudes-Behaviors-and-Influences-Among-Youths.pdf
Personal Safety – Joyce Foundation Youth Survey [43] Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005. https://www.wcasa.org/wp-content/uploads/2020/05/Evaluation_11.-CDC_Measuring-Violence-Related-Attitudes-Behaviors-and-Influences-Among-Youths.pdf
Freedom from incarceration National Longitudinal Survey of Youth 1997, Criminal History [45] Bureau of Labor Statistics, U.S. Department of Labor. National Longitudinal Survey of Youth 1997 cohort, 1997–2021 (rounds 1–20). (2024). Produced and distributed by the Center for Human Resource Research (CHRR), The Ohio State University. Columbus, OH. https://www.nlsinfo.org/content/cohorts/nlsy97/topical-guide/crime/crime-delinquency-arrest
National Longitudinal Survey of Youth 1979, School Discipline [51] Bureau of Labor Statistics, U.S. Department of Labor. National Longitudinal Survey of Youth 1979 cohort, 1979–2022 (rounds 1–30). (2025). Produced and distributed by the Center for Human Resource Research (CHRR), The Ohio State University. Columbus, OH: https://www.nlsinfo.org/content/cohorts/nlsy79/topical-guide/education/school-discipline
Survey of Criminal Justice Experience [52] Brown S, Manning W. The Survey of Criminal Justice Experience (SCJE), 2013. Published online August 7, 2014. doi:10.3886/ICPSR35080.v1
Self-determination Future Orientation [53] Whitaker DJ, Miller KS, Clark LF. Reconceptualizing Adolescent Sexual Behavior: Beyond Did They or Didn’t They? Fam Plann Perspect. 2000;32(3):111–117. doi:10.2307/2648159
Perceived Choice and Awareness of Self Scale [54] Center for Self-Determination Theory. Perceived Choice and Awareness of Self Scale (PCASS). https://selfdeterminationtheory.org/perceived-choice-and-awareness-of-self-scale/
Goal Setting Formative Questionnaire [55] Gaumer Erickson A, Soukup J, Noonan P, McGurn. Goal Setting Formative Questionnaire Technical Report. College & Career Competency Framework; 2022. https://www.cccframework.org/wp-content/uploads/GoalSettingQuestionnaireInfo.pdf
Youth Motivation, Engagement, and Beliefs Survey [56] Naftzger N, Sniegowski S. Exploring the Relationship Between Afterschool Program Quality and Youth Development Outcomes: Findings From the Washington Quality to Youth Outcomes Study. American Institutes for Research; 2018. https://ospi.k12.wa.us/sites/default/files/2024-01/air_qualityoutofschooltime_2018.pdf
Economic stability Financial Wellbeing Scale [57] Comerton-Forde C, de New J, Salamanca N, Ribar DC, Nicastro A, Ross J. Measuring Financial Wellbeing with Self-Reported and Bank Record Data. Econ Rec. 2022;98(321):133–151. doi:10.1111/1475-4932.12664
Financial Behavior Scale [58] Dew JP, Xiao JJ. The financial management behavior scale: Development and validation. Published online 2011. https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?article=5489&context=facpub
Belonging Psychological Safety [48] Search Institute. Social Capital Assessment + Learning for Equity (SCALE) Measures. Search Institute; 2021. https://d2pck61xhq74q6.cloudfront.net/Resources-Hub/Beyond-the-Classroom/Ref-406_SCALE_Measures_User_Guide.pdf
General Belongingness Scale [59] Malone GP, Pillow DR, Osman A. The General Belongingness Scale (GBS): Assessing achieved belongingness. Personal Individ Differ. 2012;52(3):311–316. doi:10.1016/j.paid.2011.10.027
Sense of Program Community [48] Search Institute. Social Capital Assessment + Learning for Equity (SCALE) Measures. Search Institute; 2021. https://d2pck61xhq74q6.cloudfront.net/Resources-Hub/Beyond-the-Classroom/Ref-406_SCALE_Measures_User_Guide.pdf
Culture of violence/non-violence (as rooted/reflected in media, US history, etc.) Survey on Attitudes About Guns and Shootings (SAGAS) [60] Milam AJ, Furr-Holden CD, Leaf P, Webster D. Managing Conflicts in Urban Communities: Youth Attitudes Regarding Gun Violence. J Interpers Violence. 2018;33(24):3815–3828.
Peer reactions to delinquency [43] Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005. https://www.wcasa.org/wp-content/uploads/2020/05/Evaluation_11.-CDC_Measuring-Violence-Related-Attitudes-Behaviors-and-Influences-Among-Youths.pdf
Beliefs Supporting Aggression [43] Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005. https://www.wcasa.org/wp-content/uploads/2020/05/Evaluation_11.-CDC_Measuring-Violence-Related-Attitudes-Behaviors-and-Influences-Among-Youths.pdf

We did not include separate measures for the theory of change construct “positive trajectory of health/wellbeing” because we considered existing measures for this construct to be captured by other constructs. Additional sources that were not reviewed in phase one but that were drawn from as needed during phase two included: (1) Aubel AJ, Wintemute GJ, Shev AB, Kravitz-Wirtz N. Optimism Bias Among Gun Owners: Associations With Firearm Injury Prevention Practices and Policy Support. Health Education & Behavior. 2024;52(3):266–277. doi:10.1177/10901981241267212 [61], and (2) Mohatt NV, Fok CC, Burket R, Henry D, Allen J. Assessment of awareness of connectedness as a culturally-based protective factor for Alaska native youth. Cultur Divers Ethnic Minor Psychol. 2011;17(4):444–455. doi:10.1037/a0025456 [62]

Approach

The process of co-developing evaluation measures involved two phases, each of which consisted of multiple, iterative steps. The first phase involved creating a menu of quantitative measurement options for each outcome construct in the TOC. The second phase involved tailoring and refining quantitative measures for each CVI program and developing parallel qualitative measures. Figure 1 summarizes both phases and the steps of each phase.

Fig. 1.

Phases and steps of co-developing measures

Phase one: creating a menu of quantitative measurement options

Step 1: The first step of phase one involved reviewing the existing literature to identify quantitative survey measures for each outcome construct (Table 1). Two authors (JPS, MAG) conducted this search, drawing on sources known to the academic team. We prioritized measures, to the extent possible, that had been used/developed for youth and Black and Brown populations in other studies or surveys (see Table 1 for references). We did not conduct a systematic search because we did not intend to include all possible relevant measures but rather create a parsimonious list that could be used as a launching point for discussion and community input. Further, a systematic search of measures related to all 13 outcome constructs would have been extremely time intensive and exceeded our team’s resources.

Step 2: The second step of phase one involved collecting input on these existing measures from CVI practitioners via a workshop. During the workshop, practitioners reviewed existing measures and indicated on paper worksheets whether the questions were appropriate to their work/the populations they serve and suggested any changes. Each individual reviewed one theory of change construct. Then, in a gallery walk activity, CVI practitioners generated new ideas for measures by writing on large posters hung around the room. There was one poster per theory of change construct. Worksheets and posters were collected, retained, and compiled for the third step of phase one.

Step 3: In the third step, two members of the academic team (JPS, MAG) consolidated feedback and created a refined menu of measurement options, which included existing measures that CVI practitioners indicated as appropriate and new ideas generated by CVI practitioners. This process involved (1) ensuring each measure mapped to the appropriate TOC construct, (2) incorporating suggested revisions on question wording from CVI practitioners, (3) minimally rephrasing/rewording new ideas for clarity, and (4) adding measures from existing surveys that the academic team deemed relevant but had not been reviewed by CVI practitioners.

Phase two: tailoring and refining quantitative measures for each CVI program and developing parallel qualitative measures

Step 1: In the first step of phase two, each CVI program selected outcome constructs from the TOC that were relevant to their work.

Step 2: For each selected TOC construct, academic and CVI practitioner partners worked together to select specific measures from the menu of measurement options and further tailor them into individualized surveys. This involved one-on-one meetings with members of the academic team and CVI practitioners (meetings were held separately for each CVI program) to discuss each question and make necessary changes. Conversations also included discussion of measures related to CVI program activities/implementation and the broader context in which CVIs operate, though these had not been part of phase one.

Step 3: Once the quantitative measures were finalized, we collaboratively developed and refined parallel qualitative measures through additional one-on-one meetings. In this step, the academic team reviewed each CVI program’s finalized quantitative survey and brainstormed qualitative questions that could accompany and provide context for the quantitative survey questions. Members of the academic team and CVI practitioners then discussed, revised, and refined qualitative questions based on CVI practitioners’ feedback.

Results

Findings are presented below for each phase as described in the Methods section:

Phase one: menu of quantitative measurement options

Step 1: Based on a review of the literature, we compiled quantitative measures from 41 existing sources (resulting in an average of 3.2 sources per construct, Table 1).

Step 2: The feedback shared by CVI practitioners during the workshop provided key insights into the appropriateness of and improvements to existing measures. CVI practitioners expressed that many of the questions were relevant to their work and the populations they serve, though some practitioners suggested modifications to better align the questions with specific community contexts and make the language more age/culturally appropriate. For example, CVI practitioners suggested rewording “in this neighborhood” to “in my neighborhood” in the statement: “In this neighborhood, it is sometimes necessary for people to carry guns to protect themselves or their family.” CVI practitioners indicated this change would allow participants to reflect on and define the place they personally consider their neighborhood, eliciting a response that is authentic to their own experiences.

During the gallery walk activity, CVI practitioners generated new ideas for measures, offering creative suggestions and perspectives on how to better capture the nuances of their experiences and the experiences of those they serve. Figure 2 shows a practitioner as they think about and write suggested questions. Some examples of questions that CVI practitioners developed during the activity are: “What is the relationship between the community and police like in your neighborhood?”, “How confident are you in setting goals for yourself?”, and “How often are you around or exposed to guns?”.

Fig. 2.

CVI practitioners brainstorming measures during workshop

Step 3: We consolidated measures and input from CVI practitioners to create an initial menu of measurement options (n = 60 questions, see Additional File 1, initial menu of measurement options).

Phase two: tailored and refined quantitative measures for each CVI program and parallel qualitative measures

Step 1: CVI programs selected an average of 5.7 constructs (of the 13) to measure. The most commonly selected constructs were “youth/family connection, positive role models, supportive relationships” (selected by 6 programs), “safety, including from gun violence” (selected by 6 programs), and “youth/family behavioral health, self-efficacy” (selected by 5 programs). The least commonly selected constructs were “belonging” (selected by 1 program) and “youth/family pro-social response to conflict, peaceful conflict resolution” (not selected).

Step 2: While we used a similar approach with each CVI program, the time it took to develop measures varied due to capacity, interest, and communication preferences (meeting in person vs. via Zoom). Over the span of about 10 months, we had a total of 45 one-hour meetings, with an average of 6.4 meetings per CVI program (range = 4–15 meetings).

During these meetings, CVI staff members were presented with the initial menu of measures for the specific TOC constructs they chose. The process of refining measures involved extensive discussion around several key considerations, including confidentiality, age appropriateness, and language/jargon, as detailed below. After revisions, each CVI program had a customized list of quantitative measures that fit their program and the population they served.

Confidentiality

Confidentiality was a primary concern for all the CVI programs. We therefore took several actions to ensure confidentiality, including co-designing survey measures to avoid requesting sensitive information or information that might be perceived as criminalizing or overly invasive, which CVI practitioners noted might make respondents hesitant to answer the question or complete the survey (either at all or honestly). For example, because of concerns about confidentiality, we excluded or reworded questions about firearm access/carrying, criminal history, and immigration status.

For instance, an early iteration of a CVI program’s survey included the following question:

How many times have you carried a gun in the past 6 months?

  1. 0

  2. 1–3

  3. 4–6

  4. 7–10

  5. More than 10 times

Due to concerns about perceived self-incrimination, the question was revised to:

How many times have you felt the need to carry a gun in the past 6 months?

  1. 0

  2. 1–3

  3. 4–6

  4. 7–10

  5. More than 10 times

Age appropriateness of questions

Another consideration was the age appropriateness of questions. For example, for questions that asked about household bills, “you” was replaced with “you or your family” to account for the possibility that respondents may not be directly responsible for the bills, but their family might be.

Another example shows how a question that came out of the gallery walk activity was revised. The original question was: “What steps have you taken towards your employment and educational goals?”

After revisions, a CVI program changed the question to:

If you have an employment or educational goal right now, what have been the most important steps you’ve taken to reach your goals? (select all that apply)

  1. Connected with family

  2. Worked hard at school

  3. Connected with an adult mentor outside your family

  4. Saved money

  5. Used social media to gain clout or connections

  6. Worked hard at sports

  7. Other:

The added response options, based on CVI practitioners’ local knowledge, align with the most common steps participants take, making the question more specific and relevant to youth in their community. Additionally, the revision introduced the phrase “If you have an employment or educational goal right now” to acknowledge that some youth may not have specific goals at present, broadening the applicability of the question and making it more age appropriate.

Language and jargon

CVI practitioners provided insights into the literacy levels and language preferences of their participants, which led us to simplify jargon, include definitions for complex terms, and exclude questions deemed confusing. For example, one CVI program initially chose the following survey measure from the Future of Families & Child Wellbeing Study’s Material Hardship Scale [28]:

In the past 6 months, did you do any of the following because there wasn’t enough money:

  1. Go hungry because could not afford food

  2. Did not pay rent/mortgage in full

  3. Had utilities turned off because there was not enough money

  4. Stayed at a shelter, or a place not meant for housing, or couch surfed

After meeting and discussing revisions, the question was edited to:

In the past 6 months, did any of the following happen because you didn’t have enough money?

  1. You went hungry because you couldn’t afford food

  2. You or your family didn’t pay their rent/mortgage in full

  3. Your water or electricity was turned off because the bill wasn’t paid

  4. You lived at a shelter, in your car, couch surfed or were unhoused

These revisions were made to make the question more accessible and relevant to the youth in the community. For example, “utilities” was changed to “water or electricity” to replace a term that youth may not be familiar with. Additionally, asking “did any of the following happen” aligns more closely with trauma-informed language than “did you do any of the following” [29].

Step 3: To complement and add context to quantitative responses, the academic team created qualitative measures to parallel each final quantitative question. For example, one of the CVI programs used the following question on their quantitative survey:

How well are you able to manage your stress/anxiety since your injury?

  1. Very well

  2. Well

  3. Neutral

  4. Poorly

  5. Very poorly

To help contextualize this question and any others related to stress and anxiety, the academic team suggested the following question for their qualitative interviews:

How have you been managing stress or anxiety since your injury?

Follow up: What has made it harder to manage stress or anxiety?

Follow up: What strategies or practices have you been using to manage stress or anxiety?

These qualitative questions aimed to provide deeper insight into participants’ experiences with stress and anxiety, offering a narrative that contextualizes their quantitative responses.

In another example, a CVI program’s survey included the following question:

In the past 3 months, have you or has anyone in your family experienced any barriers getting treatment for substance use or mental health support?

  1. Yes

  2. No

  3. Not applicable

The academic team suggested the following qualitative question to complement this survey question and other substance use- and mental health-related measures:

How have you navigated challenges that you or your family have faced related to substance use or mental health?

This qualitative question was designed to go beyond simply identifying whether challenges exist, to understand the nature of those challenges, the gaps in support, and how youth and the community navigate these issues. It also starts from the premise that substance use or mental health challenges exist, which CVI practitioners suggested can help reduce stigma.

Then, following a process similar to that used for the quantitative measures, each CVI program edited and revised the proposed qualitative measures to meet their needs. Most revisions involved simplifying jargon and using terms more common in the communities (e.g., changing “friends” to “homies” and “neighborhood” to “hood”), as well as reordering questions by priority. We continued making revisions as needed throughout qualitative data collection based on feedback from interviewers and interviewee responses. Examples of final qualitative measures are in Additional File 1 (sample client and staff measures for selected CVI programs).

Discussion

This paper documented the process and results of a collaborative effort between CVI practitioners and academic researchers to co-develop evaluation measures for youth-focused CVI programs in Washington State. Our process drew from existing literature while heavily prioritizing the knowledge, expertise, and capacity of CVI practitioners. This helped facilitate power sharing and responsiveness to community needs, and we believe it resulted in more appropriate and contextually relevant measures. For example, CVI practitioners not only decided which constructs they wanted to measure (rather than researchers or funders deciding) but also offered critical insights into, and had control over, the development and final versions of those measures. This involved identifying the kinds of questions the young people they serve would most likely answer truthfully and creating and re-phrasing questions to elicit authentic responses. This approach—which centered the voices and expertise of primarily Black and Brown communities that have traditionally experienced research-related mistreatment, extraction, and disinvestment—addresses an important need in the field and has significant implications for practice.

It is critical for CVI evaluations to rigorously measure constructs that are relevant to the intervention and hypothesized to change because of it (guided by a theory of change). Especially in the field of community violence intervention—in which funding and investment are tenuous and issues of violence and crime prevention are often highly politicized—evaluations must be rigorous and contextually grounded [24, 30]. For example, poorly conducted evaluations that use constructs or measures inappropriate to the intervention or context can show misleading or incomplete results and thus potentially cause harm. The integrity of the research/evaluation, including but not limited to the validity of measurement tools, depends on the inclusion of community members engaged in this work and those most proximate to community violence [31].

CVI evaluations have increasingly centered community voice in recent years, but this has not always been the norm [32]. In addition to scholarly biases and structural inequities, historically limited community involvement may partly reflect the fact that community-engaged research takes time (e.g., to develop relationships, build trust, facilitate engagement, co-develop evaluation plans and tools)—longer than the duration of most traditional research/evaluation projects [31]. The process of co-developing evaluation measures with/for the 7 CVI programs documented here took approximately 10 months (though with notable variation across programs, reflecting varied preferences and capacity). This underscores the need for realistic and flexible project timelines that prioritize community involvement over quick results. It also underscores the importance of authorization to experiment with non-traditional, iterative evaluation paradigms, facilitated by external funders who see the value in this approach. This “positive deviance” relates to principles of the Problem-Driven Iterative Adaptation model, which has been successfully used to build capacity for addressing complex problems via iterative, localized solutions [33].

Notwithstanding the importance of meaningful and sustained community-academic collaborations at a local level and the diversity of CVI programs and contexts, we have created and made available a toolkit accompanying this paper (toolkit available at: https://fiprp.uw.edu/toolkit/). The toolkit is not meant to be prescriptive or to imply that the context of the 7 CVI programs included here will generalize to others; instead, it is designed to share resources and potentially jumpstart the measurement process for other CVI programs. The toolkit allows users to select constructs they would like to measure (from the abovementioned theory of change), browse quantitative and qualitative measures for those constructs (drawing from the inventory of refined/tailored measures, examples of which are presented in Additional File 1, sample client and staff measures from selected CVI programs), and create a customized set of measures for download, which can be further refined to their context. The finalized measures can then be deployed in ways appropriate to the population and organization. For example, among the CVI programs involved in this collaboration, modes of quantitative data collection included combinations of electronic (e.g., Google Forms, SurveyMonkey) and pen-and-paper forms and of self-administered and staff-administered modalities, while qualitative data collection included interviews and focus groups.

Limitations and challenges

This paper focused on primary data collection via quantitative surveys and qualitative interviews/focus groups to capture outcomes among CVI clients and staff. It did not focus on process/implementation measures or on data collection and analysis procedures, and it therefore did not address several important aspects of CVI evaluation. Furthermore, our approach did not directly engage CVI clients. Additionally, as noted above, the specifics of our work (i.e., the theory of change and resulting measures) may not generalize to other CVI programs. Future work could consider focusing on other modes of data collection beyond surveys and interviews/focus groups, co-developing measures that more directly capture process/implementation data, involving the perspectives of CVI clients, and further refining and adapting CVI theories of change and evaluation measures to different contexts. Data collection may be a particularly important area for future attention and investment (including documenting and sharing approaches): it can impose burdens on programs and participants, and CVI programs may lack robust data collection infrastructure due to historical and continued underfunding and a lack of culturally and contextually relevant training and technical assistance.

Conclusion

This paper described a collaborative effort between CVI practitioners and academic researchers to co-develop evidence-informed, community-centered quantitative and qualitative youth-focused CVI evaluation measures. Evaluations using such measures may be more valid, equitable, and impactful. We also created an online toolkit accompanying this paper so others may easily use, tailor, and build upon our work.

Supplementary Information

Supplementary material 1 (57.5KB, docx)

Acknowledgements

The authors thank all community members who provided insight during the workshop.

Abbreviations

CVI

Community Violence Intervention

TOC

Theory Of Change

Author contributions

MG and JPS conceptualized and designed the study, conducted analyses, and drafted the manuscript. MG, JPS, RR, ELD, and AM contributed to data management. ARR provided supervision and obtained funding. All authors contributed to the interpretation of data, critically revised the manuscript, and read and approved the final manuscript.

Funding

This work was supported by the Washington State Department of Commerce Office of Firearm Safety and Violence Prevention. The funding source had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Data availability

The dataset supporting the conclusions of this article is included in an online toolkit at https://fiprp.uw.edu/toolkit/.

Declarations

Ethics approval and consent to participate

This study was considered not Human Subjects Research by the University of Washington Institutional Review Board. As such, written informed consent was waived.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Meron Girma and Julia P. Schleimer are co-first authors.

References

  • 1. Centers for Disease Control and Prevention. WISQARS Fatal Injury Reports. https://www.cdc.gov/injury/wisqars/fatal/index.html. Accessed 24 July 2023.
  • 2. The National Child Traumatic Stress Network. Community violence. 8 December 2017. https://www.nctsn.org/what-is-child-trauma/trauma-types/community-violence. Accessed 28 August 2024.
  • 3. Sharkey P. The long reach of violence: a broader perspective on data, theory, and evidence on the prevalence and consequences of exposure to violence. Annu Rev Criminol. 2018;1(1):85–102. 10.1146/annurev-criminol-032317-092316.
  • 4. Barrett JT, Lee LK, Monuteaux MC, Farrell CA, Hoffmann JA, Fleegler EW. Association of county-level poverty and inequities with firearm-related mortality in US youth. JAMA Pediatr. 2022;176(2):e214822. 10.1001/jamapediatrics.2021.4822.
  • 5. Community Violence Intervention Action Plan: Mapping Transformation for the Field. CVI Action Plan. Fall 2024. https://www.cviactionplan.com. Accessed 13 September 2024.
  • 6. Buggs S. Community-based violence interruption and public safety. Arnold Ventures; 2022.
  • 7. Buggs SA, Webster DW, Crifasi CK. Using synthetic control methodology to estimate effects of a Cure Violence intervention in Baltimore, Maryland. Inj Prev. 2022;28(1):61–7. 10.1136/injuryprev-2020-044056.
  • 8. Webster DW, Richardson J, Meyerson N, Vil C, Topazian R. Research on the effects of hospital-based violence intervention programs: observations and recommendations. Ann Am Acad Pol Soc Sci. 2022;704(1):137–57. 10.1177/00027162231173323.
  • 9. Matthay EC, Farkas K, Rudolph KE, et al. Firearm and nonfirearm violence after Operation Peacemaker Fellowship in Richmond, California, 1996–2016. Am J Public Health. 2019;109(11):1605–11. 10.2105/AJPH.2019.305288.
  • 10. Bhatt MP, Heller SB, Kapustin M, Bertrand M, Blattman C. Predicting and preventing gun violence: an experimental evaluation of READI Chicago. Q J Econ. 2024;139(1):1–56. 10.1093/qje/qjad031.
  • 11. Ross MC, Ochoa EM, Papachristos AV. Evaluating the impact of a street outreach intervention on participant involvement in gun violence. Proc Natl Acad Sci USA. 2023;120(46):e2300327120. 10.1073/pnas.2300327120.
  • 12. Webster DW, Tilchin CG, Doucette ML. Estimating the effects of Safe Streets Baltimore on gun violence; 2023. https://publichealth.jhu.edu/sites/default/files/2023-03/estimating-the-effects-of-safe-streets-baltimore-on-gun-violence-march-2023.pdf. Accessed 24 July 2023.
  • 13. Roman C, Klein H, Wolff K. Quasi-experimental designs for community-level public health violence reduction interventions: a case study in the challenges of selecting the counterfactual. J Exp Criminol. 2018;14(2):155–85. 10.1007/s11292-017-9308-0.
  • 14. Wilson J, Chermak S. Community-driven violence reduction programs: examining Pittsburgh’s One Vision One Life. Criminol Public Policy. 2011;10(4):993. 10.1111/j.1745-9133.2011.00763.x.
  • 15. Roman C, Link N, Hyatt J, Bhati A, Forney M. Assessing the gang-level and community-level effects of the Philadelphia focused deterrence strategy. J Exp Criminol. 2019;15(4):499–527. 10.1007/s11292-018-9333-7.
  • 16. Braga A, Zimmerman G, Barao L, Farrell C, Brunson R, Papachristos A. Street gangs, gun violence, and focused deterrence: comparing place-based and group-based evaluation methods to estimate direct and spillover deterrent effects. J Res Crime Delinq. 2019;56(4):524–62. 10.1177/0022427818821716.
  • 17. Corburn J, Boggan D, Muttaqi K, Vaughn S. Preventing urban firearm homicides during COVID-19: preliminary results from three cities with the Advance Peace program. J Urban Health. 2022;99(4):626–34. 10.1007/s11524-022-00660-4.
  • 18. Monopoli W, Myers R, Paskewich B, Bevans K, Fein J. Generating a core set of outcomes for hospital-based violence intervention programs. J Interpers Violence. 2021;36(9–10):4771–86. 10.1177/0886260518792988.
  • 19. Hausman A, Baker C, Komaroff E, et al. Developing measures of community-relevant outcomes for violence prevention programs: a community-based participatory research approach to measurement. Am J Community Psychol. 2013;52(3–4):249–62. 10.1007/s10464-013-9590-6.
  • 20. Gonzalez J, Trickett EJ. Collaborative measurement development as a tool in CBPR: measurement development and adaptation within the cultures of communities. Am J Community Psychol. 2014;54(0):112–24. 10.1007/s10464-014-9655-1.
  • 21. Myers RK, Kapa HM, Garcia SM, Vega L, Fein JA. Development of a brief client satisfaction and quality improvement tool for hospital-based violence intervention programs: opportunities for enhancing client perspectives. J Patient Exp. 2025;12:23743735251314622. 10.1177/23743735251314622.
  • 22. Abaya R, Buggs S, Fontaine J, Hudson T, Kravitz-Wirtz N. Guiding research principles and priorities for the Black & Brown Collective for Community Solutions to Gun Violence. Black & Brown Collective for Community Solutions to Gun Violence; 2024. https://thebbcollective.org/wp-content/uploads/2024/12/Collective-ResearchAgenda.pdf
  • 23. Chicago Beyond. Why Am I Always Being Researched? A Guidebook for Community Organizations, Researchers, and Funders to Help Us Get from Insufficient Understanding to More Authentic Truth. 2018. https://chicagobeyond.org/insights/philanthropy/why-am-i-always-being-researched/
  • 24. Hudson T. Interrogating the notion of evidence-based policy in community-based violence prevention. PhD dissertation. The New School; 2022. https://www.proquest.com/docview/2715687334/abstract/2A70FE9AB166490APQ/1. Accessed 22 July 2023.
  • 25. Okun T. White Supremacy Culture – Still Here; 2021:1–32. https://socialwork.wayne.edu/events/4_-_okun_-_white_supremacy_culture_-_still_here.pdf. Accessed 4 June 2024.
  • 26. Wallerstein N, Duran B, Oetzel J, Minkler M. Community-Based Participatory Research for Health: Advancing Social and Health Equity. 3rd ed. Jossey-Bass; 2017.
  • 27. Schleimer JP, Lyons VH, Smith D, et al. Codeveloping theories of change for improved community-based violence intervention evaluation. J Trauma Acute Care Surg. 2024;97(2):278. 10.1097/TA.0000000000004277.
  • 28. McLanahan L, Garfinkel I, Waldfogel J, Edin K. Scales and Concepts Documentation, Future of Families and Child Wellbeing Study. https://ffcws.princeton.edu/data-and-documentation/scales-and-concepts-documentation. Accessed 3 June 2025.
  • 29. Hart L, Bliton JN, Castater C, Beard JH, Smith RN. Trauma-informed language as a tool for health equity. Trauma Surg Acute Care Open. 2024;9(1):e001558. 10.1136/tsaco-2024-001558.
  • 30. Pugliese K, Odér P, Hudson T, Butts JA. Community Violence Intervention at the Roots (CVI–R). Research and Evaluation Center, John Jay College of Criminal Justice, City University of New York; 2022.
  • 31. Black & Brown Collective for Community Solutions to Gun Violence. The Case for More Equitable and Community-Engaged Research to Address Firearm-Related Violence in Black and Brown Communities. 2024. https://thebbcollective.org/wp-content/uploads/2024/12/Collective-EquityReport.pdf
  • 32. Girma M, Schleimer J, Aveledo A, et al. Evaluating community violence intervention programs: a scoping review synthesizing methods and measures. Inquiry. 2025;62:00469580251361742. 10.1177/00469580251361742.
  • 33. Andrews M, Pritchett L, Woolcock M. Building State Capability: Evidence, Analysis, Action. Oxford University Press; 2017.
  • 34. Picard-Fritsche S, Cerniglia L. Testing a public health approach to gun violence: an evaluation of Crown Heights Save Our Streets, a replication of the Cure Violence model. Center for Court Innovation, New York, NY; 2013.
  • 35. Jones J, Barry MM. Developing a scale to measure trust in health promotion partnerships. Health Promot Int. 2011;26(4):484–91. 10.1093/heapro/dar007.
  • 36. Mason J. Cultural Competence Self-Assessment Questionnaire: A Manual for Users. Portland State University, Research and Training Center on Family Support and Children’s Mental Health; 1995. https://www.pathwaysrtc.pdx.edu/pdf/CCSAQ.pdf
  • 37. Health Leads. Social Needs Screening Toolkit. 2018. https://healthleadsusa.org/wp-content/uploads/2023/05/Screening_Toolkit_2018.pdf
  • 38. Karpman M, Zuckerman S, Gonzalez D. The Well-Being and Basic Needs Survey: A New Data Source for Monitoring the Health and Well-Being of Individuals and Families. Urban Institute; 2018. https://www.urban.org/research/publication/well-being-and-basic-needs-survey. Accessed 12 August 2025.
  • 39. Lamers SMA, Westerhof GJ, Bohlmeijer ET, ten Klooster PM, Keyes CLM. Evaluating the psychometric properties of the Mental Health Continuum-Short Form (MHC-SF). J Clin Psychol. 2011;67(1):99–110. 10.1002/jclp.20741.
  • 40. Yudko E, Lozhkina O, Fouts A. A comprehensive review of the psychometric properties of the drug abuse screening test. J Subst Abuse Treat. 2007;32(2):189–98. 10.1016/j.jsat.2006.08.002.
  • 41. Skinner HA. The drug abuse screening test. Addict Behav. 1982;7(4):363–71.
  • 42. Chen G, Gully SM, Eden D. Validation of a new general self-efficacy scale. Organ Res Methods. 2001;4(1):62–83. 10.1177/109442810141004.
  • 43. Dahlberg L, Toal S, Swahn M, Behrens C. Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools. Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2005.
  • 44. Inspiration Corporation. Youth Employment Survey, Research Results. 2021. https://omsinspiration.wpenginepowered.com/wp-content/uploads/2021/09/2021_Inspiration_Corporation_YES_Report.pdf
  • 45. Bureau of Labor Statistics, U.S. Department of Labor. National Longitudinal Survey of Youth 1997 cohort, 1997–2021 (rounds 1–20). 2024. https://www.nlsinfo.org/content/cohorts/nlsy97. Accessed 12 August 2025.
  • 46. Phillips KFV, Power MJ. A new self-report measure of emotion regulation in adolescents: the regulation of emotions questionnaire. Clin Psychol Psychother. 2007;14(2):145–56. 10.1002/cpp.523.
  • 47. Research & Evaluation Center, John Jay College of Criminal Justice. NYC-Cure Survey Instrument: Cure Violence Evaluation Study. City University of New York (JohnJayREC); 2014.
  • 48. Search Institute. Social Capital Assessment + Learning for Equity (SCALE) Measures. Search Institute; 2021.
  • 49. Paiva PCP, de Paiva HN, de Oliveira Filho PM, et al. Development and validation of a social capital questionnaire for adolescent students (SCQ-AS). PLoS One. 2014;9(8):e103785. 10.1371/journal.pone.0103785.
  • 50. Bride BE, Robinson MM, Yegidis B, Figley CR. Development and validation of the secondary traumatic stress scale. Res Soc Work Pract. 2004;14(1):27–35. 10.1177/1049731503254106.
  • 51. Bureau of Labor Statistics, U.S. Department of Labor. National Longitudinal Survey of Youth 1979 cohort, 1979–2022 (rounds 1–30). 2025. https://www.nlsinfo.org/content/cohorts/nlsy79
  • 52. Brown S, Manning W. The Survey of Criminal Justice Experience (SCJE), 2013. Published online 7 August 2014. 10.3886/ICPSR35080.v1
  • 53. Whitaker DJ, Miller KS, Clark LF. Reconceptualizing adolescent sexual behavior: beyond did they or didn’t they? Fam Plann Perspect. 2000;32(3):111–7. 10.2307/2648159.
  • 54. Center for Self-Determination Theory. Perceived Choice and Awareness of Self Scale (PCASS). https://selfdeterminationtheory.org/perceived-choice-and-awareness-of-self-scale/. Accessed 12 August 2025.
  • 55. Gaumer Erickson A, Soukup J, Noonan P, McGurn. Goal Setting Formative Questionnaire Technical Report. College & Career Competency Framework; 2022. https://www.cccframework.org/wp-content/uploads/GoalSettingQuestionnaireInfo.pdf
  • 56. Naftzger N, Sniegowski S. Exploring the relationship between afterschool program quality and youth development outcomes: findings from the Washington quality to youth outcomes study. American Institutes for Research; 2018. https://ospi.k12.wa.us/sites/default/files/2024-01/air_qualityoutofschooltime_2018.pdf
  • 57. Comerton-Forde C, de New J, Salamanca N, Ribar DC, Nicastro A, Ross J. Measuring financial wellbeing with self-reported and bank record data. Econ Rec. 2022;98(321):133–51. 10.1111/1475-4932.12664.
  • 58. Dew JP, Xiao JJ. The financial management behavior scale: development and validation. Published online 2011.
  • 59. Malone GP, Pillow DR, Osman A. The general belongingness scale (GBS): assessing achieved belongingness. Personal Individ Differ. 2012;52(3):311–6. 10.1016/j.paid.2011.10.027.
  • 60. Milam AJ, Furr-Holden CD, Leaf P, Webster D. Managing conflicts in urban communities: youth attitudes regarding gun violence. J Interpers Violence. 2018;33(24):3815–28.
  • 61. Aubel AJ, Wintemute GJ, Shev AB, Kravitz-Wirtz N. Optimism bias among gun owners: associations with firearm injury prevention practices and policy support. Health Educ Behav. 2025;52(3):266–77. 10.1177/10901981241267212.
  • 62. Mohatt NV, Fok CCT, Burket R, Henry D, Allen J. Assessment of awareness of connectedness as a culturally-based protective factor for Alaska Native youth. Cultur Divers Ethnic Minor Psychol. 2011;17(4):444–55. 10.1037/a0025456.


