Published in final edited form as: Eval Health Prof. 2024 Dec;47(4):437–445. doi: 10.1177/01632787241269069

Embracing Complexity: Developing a Framework for Evaluating a Multi-Faceted Training and Technical Assistance System

Ayana R Stanley 1, Calla Jamison 1, Alice Chen 1, Lindsey Barranco 1, Delaney Welsh 1, Katie Jones 1

Abstract

The benefits of training and technical assistance (TTA) have been well documented, but limited literature explores how complex systems of TTA are implemented and evaluated, particularly in the violence prevention field. The Violence Prevention Practice and Translation Branch (VPPTB) within the Centers for Disease Control and Prevention’s (CDC) Division of Violence Prevention funds multiple technical assistance providers who are tasked with building the capacity of program recipients to implement comprehensive approaches to prevent multiple forms of violence. VPPTB designed the Violence Prevention Technical Assistance Center (VPTAC) with the goal of implementing and evaluating comprehensive TTA efforts that integrate the work of multiple TTA providers to build the capacity of recipients to plan, implement, and evaluate violence prevention efforts. The VPTAC evaluation incorporates data from program recipients, TTA providers, and TTA modalities, enabling VPPTB staff to show improvement in technical knowledge, gather examples of enhanced implementation, and facilitate proactive TTA planning. An important step in evaluating VPTAC from a system-level perspective was expanding beyond the evaluation of a single TTA event, provider, or engagement; this expansion is essential to understanding how a diverse set of TTA activities and partners work together to build capacity.

Keywords: training and technical assistance, violence prevention, evaluation

Background

The Violence Prevention Practice and Translation Branch (VPPTB) within the Division of Violence Prevention (DVP) at the Centers for Disease Control and Prevention (CDC) administers programmatic funding to prevent multiple forms of violence using primary prevention strategies. The VPPTB’s funded programs cover adverse childhood experiences (ACEs) and multiple domains of violence including child abuse and neglect, youth violence, sexual violence, and intimate partner violence. These programs directly fund state and local health departments, community coalitions, and nonprofit organizations (referred to as funded recipients) to strengthen infrastructure and build capacity that enables the development, implementation, and evaluation of a comprehensive approach to violence prevention. Funded recipients are required to create and implement a state or community-level prevention plan that incorporates a range of community- and societal-level prevention efforts addressing risk and protective factors that have the potential to impact multiple forms of violence. Community- and societal-level prevention approaches include, but are not limited to, (1) improving school climate and safety through organizational policy changes and physical modifications; (2) modifying the physical and social environment of neighborhoods through place-based approaches like greening; (3) improving family-friendly workplace policies, such as parental leave; or (4) strengthening economic supports through income supports, for example, tax credits or livable wages. The prioritization of these approaches aligns with the Division of Violence Prevention’s guiding principles: advance economic, gender, and racial equity; enhance positive relationships and environments; address factors that cut across multiple forms of violence; and prioritize efforts that create societal- and community-level impact (Centers for Disease Control and Prevention, 2022a).

To be successful in achieving the goals and objectives of the funded programs, recipients must have a combination of general capacity (e.g., developing evaluation plans, building partnerships, developing program materials) and innovation-specific capacity (e.g., identifying essential elements of specific programs, making programmatic adaptations). General capacity support is intended to enhance the infrastructure, skills, and motivation of an organization. Innovation-specific capacity involves gathering information about possible innovations to put in place, choosing which innovations to use, and taking steps to implement an innovation and continue its use over time. The Interactive Systems Framework (ISF) for Dissemination and Implementation, widely used in violence prevention, distinguishes between these two areas of capacity, both of which are needed to support and deliver prevention strategies (Wandersman et al., 2008). To address this need, and as a reflection of DVP’s focus on addressing shared risk and protective factors (Wilkins et al., 2014), the VPPTB staff designed a technical assistance system – the Violence Prevention Technical Assistance Center (VPTAC) – that focuses on building the general capacity of funded recipients to conduct violence prevention activities while incorporating training and technical assistance (TTA) related to the implementation and evaluation of specific types of prevention strategies. This approach enables VPPTB to support funded recipients in accomplishing funding program goals. This includes the creation and strengthening of partnerships and infrastructure to strategically plan, select, and evaluate a comprehensive set of prevention strategies that have the potential to address shared risk and protective factors across multiple types of violence. To complement general capacity building, VPTAC provides TTA that supports implementation of evidence-based prevention programs, practices, and policies. The TTA system incorporates a wide range of expertise across specific subject matter related to types of violence, types of prevention strategies, and modalities of TTA to effectively support funded recipients across all of VPPTB’s funded programs.

The VPPTB staff has historically utilized separate technical assistance providers to provide focused TTA for each violence prevention funding program. This strategy resulted in a lack of coordination in technical assistance planning, delivery, tracking, and evaluation. Moreover, it prevented VPPTB from systematically planning and delivering cross-program TTA in an efficient and effective manner. Each technical assistance provider developed a specific technical assistance plan, had individual systems to monitor TTA, and collected separate needs assessment and evaluation data. Essentially, each TTA provider operated in a silo, which resulted in miscommunication, duplication of efforts, survey fatigue, and confusion among funded recipients. This approach largely ignored the significant amount of overlap between funding programs in terms of the capacity needed for recipients to achieve program goals and objectives. In many cases, the same recipients received funding for multiple programs within VPPTB. This siloed approach to evaluation, and in many cases to TTA, provided limited insight into the experiences of recipients who were receiving multiple types of TTA, any changes in recipient capacity, and the ability of the TTA system to further the VPPTB’s mission. Because each TTA provider was using their own system for planning, delivering, and evaluating TTA, it was challenging for CDC’s VPPTB staff to incorporate a strategic continuous quality improvement process. All of these factors made it challenging for the VPPTB staff to fully understand and evaluate the comprehensive impact of TTA being provided to recipients.

A recent scoping review of evaluations of technical assistance found that the provision of technical assistance is seldom systematically planned, implemented, or evaluated (Scott et al., 2022). This is particularly true for complicated and comprehensive TTA systems that focus on supporting planning, implementation, and evaluation of a wide range of violence prevention strategies across a diverse set of practitioners. While there are examples of evaluations of comprehensive TTA (Moreland-Russell et al., 2018; National Center for Education Evaluation and Regional Assistance, 2011; Olson et al., 2020), there is a paucity of literature on the evaluation of multi-provider TTA systems, particularly related to implementing community- and societal-level primary prevention. The purpose of this article is to highlight the approach used by VPPTB staff to develop and evaluate a comprehensive TTA system focused on general and innovation-specific capacity building. Although this specific TTA system is focused on supporting funded recipients to implement violence prevention, we believe that this approach has heuristic utility and can be applied to any comprehensive multi-provider TTA system. It is also intended to contribute to the nascent literature on approaches to evaluating comprehensive TTA systems focused on building the general and innovation-specific capacity of practitioners.

The Violence Prevention Technical Assistance Center Model

The goal of VPTAC is to coordinate implementation and evaluation of TTA across multiple TTA providers. As part of this goal, VPTAC delivers comprehensive TTA that builds the capacity of recipients to plan, implement, and evaluate comprehensive prevention efforts across multiple forms of violence. The VPTAC structure was developed in alignment with the Interactive Systems Framework (ISF) for Dissemination and Implementation, which helps address the research-to-practice translation gap (Wandersman et al., 2008). The ISF describes three interconnected systems that work together to promote the implementation and maintenance of evidence-based practices: (1) the Prevention Synthesis and Translation System, which functions to make information about innovations usable for implementation; (2) the Prevention Support System, which functions to support the implementers of innovations; and (3) the Prevention Delivery System, which functions to implement innovations (e.g., to deliver programs). The majority of VPTAC activities fall within the Prevention Support System of the ISF, while also leveraging DVP activities within the Prevention Synthesis and Translation System. The VPTAC model utilizes the expertise of VPPTB staff, CDC subject matter experts, and national TTA providers engaged through partnerships, cooperative agreements, and contracts. VPTAC is designed to build the capacity of recipients to plan, implement, and evaluate evidence-informed prevention strategies. The World Health Organization defines capacity building as:

“the development of knowledge, skills, commitment, structures, systems, and leadership to enable effective health promotion. It involves actions to improve health at three levels: the advancement of knowledge and skills among practitioners; the expansion of support and infrastructure for health promotion in organizations; and the development of cohesiveness and partnerships for health in communities” (Smith et al., 2006, p. 341).

VPTAC aims to provide TTA and resources that increase recipients’ knowledge and skills, bolster organizational infrastructure for prevention, and support development of partnerships. While evaluations of capacity building interventions are limited, a systematic review of capacity building for public health practice found that internet-based education, training, technical assistance, and communities of practice have the potential to increase knowledge, skills, and self-efficacy (DeCorby-Watson et al., 2018). Training and technical assistance was also found to be associated with changes in practices and policies. Based on a review of the capacity building literature, Brownson et al. (2018) concluded that a one-size-fits-all approach to building public health capacity is less likely to be effective. The authors of this review found that many capacity building efforts focus simply on whether or not specific evidence-based practices are used and ignore the many complexities related to selecting, adapting, and implementing effective strategies in diverse communities. Because Brownson et al. (2018) and DeCorby-Watson et al. (2018) focused on capacity building at the individual and organizational levels, there remains a need for evidence to support capacity building at the community and systems levels. These findings support VPTAC’s approach of offering a variety of capacity building options (across the individual, organizational, community, and systems levels) and tailoring individualized TTA to meet the needs of the requestor, which highlights the complexity of the VPTAC TTA structure.

When designing VPTAC, the VPPTB staff aligned the framework with DVP’s Violence Prevention in Practice tool on VetoViolence, which was developed to support state and local health agencies and other stakeholders who have a role in planning, implementing, and evaluating violence prevention efforts (Barranco et al., 2022). The seven capacity areas from Violence Prevention in Practice (planning, partnerships, policy efforts, strategies and approaches, implementation, adaptation, and evaluation) were identified as the priority foci for capacity building for violence prevention practitioners through a combination of methods: interviews with recipients, information gathering from subject matter experts, and a review of the implementation science and violence prevention literature (Barranco et al., 2022). In addition to these seven capacity areas, VPTAC focuses on an eighth capacity area, health equity, as this is a priority of DVP and VPPTB and has been consistently identified as a need by recipients. Health equity is defined as the state in which everyone has a fair and just opportunity to attain the highest level of health (Centers for Disease Control and Prevention, 2022b). Although health equity considerations and best practices are integrated throughout all capacity areas, health equity is highlighted as its own capacity area to ensure it receives direct attention. The eight capacity areas are briefly described in Table 1.

Table 1.

VPTAC Capacity Areas

Planning: Capacity related to comprehensive violence prevention planning, such as development of a shared vision, using data to guide decision making, and prioritization of risk and protective factors and their integral role in violence prevention.
Partnerships: Capacity related to engaging and sustaining multi-sector partners, such as identifying and engaging partners and community members and developing effective communication strategies.
Policy efforts: Capacity related to involvement in policy efforts, such as framing violence prevention as public health, educating policymakers, and supporting policy implementation.
Strategies and approaches: Capacity to prioritize specific approaches and select evidence-informed programs, practices, and policies. This includes capacity to understand the innovation-specific capacities required for specific programs, practices, and policies.
Implementation: General capacity to implement a coordinated violence prevention plan across multiple partners, along with innovation-specific capacity for implementing specific violence prevention programs, practices, and policy efforts.
Adaptation: Capacity to identify the essential elements of evidence-informed violence prevention approaches and to make strategic decisions about adaptation and evaluation of specific violence prevention programs, practices, and policies.
Evaluation: Capacity to develop an evaluation plan, identify data for tracking meaningful outcomes and indicators, and use data to guide programmatic improvement efforts.
Health equity: Capacity to incorporate health equity practices and principles into all aspects of prevention planning, implementation, and evaluation.

Within the ISF’s Prevention Support System (Wandersman et al., 2008), VPTAC provides TTA focused on general capacity building related to prevention infrastructure and skills as well as innovation-specific capacity. General capacities include skills, structures, and processes that help organizations implement prevention efforts overall. Innovation-specific capacities apply to implementing a specific strategy. For example, VPTAC TTA focused on general capacity may include a webinar on partnership building, whereas VPTAC TTA focused on innovation-specific capacity may include a webinar emphasizing how to effectively implement a greening program within a neighborhood. One unique aspect, which further contributes to the complexity of the VPTAC model, is that some TTA recipients function in multiple systems of the ISF. Although all of the recipients function within the Prevention Delivery System, many funded recipients are also funders themselves (e.g., state health departments funding local community-based organizations to carry out violence prevention activities). Therefore, these entities work simultaneously within the Prevention Support System and the Prevention Delivery System. Consequently, VPTAC must deliver TTA that builds the capacity of recipients to plan, implement, and evaluate primary prevention while also building the capacity of recipients to build the capacity of their own funded partners (subrecipients).

VPTAC builds the capacity of recipients through a wide range of TTA modalities (Table 2) to ensure that recipients are able to engage in multiple ways depending on their needs. Similar to dissemination and implementation science capacity building (Viglione et al., 2023), VPTAC requires a multi-pronged approach in which various types of activities, such as trainings, learning collaboratives, tools, and other resources, work together synergistically to build the capacity of VPPTB-funded violence prevention recipients around the eight capacity areas described above. A key part of designing VPTAC was the development of operational definitions for TTA types in collaboration with the various TTA partners. This ensured that all TTA providers were categorizing and reporting on their TTA in a consistent manner.

Table 2.

VPTAC TTA Modalities and Examples of Activities

Individual technical assistance: Technical assistance provided to either a single person or multiple people who are all associated with the same VPPTB-funded grant recipient. Examples: tailored support with specific activities such as developing a public education campaign to change harmful gender norms, brainstorming partnership engagement strategies, or assisting with developing a program evaluation plan, including identification of data sources and indicators to assess progress toward goals.
Group learning events: Training and technical assistance provided to multiple VPPTB-funded grant recipients. Examples: virtual or in-person general or innovation-specific trainings, webinars, or office hours.
Peer learning communities: Peer learning opportunities, facilitated by TTA providers, in which recipients have opportunities to collaborate and share lessons learned. Examples: peer learning communities and communities of practice (groups that meet regularly for the purpose of learning about a shared interest such as program evaluation or health equity).
Resource development: Any tool or resource developed by TTA providers to increase the capacity of recipients to plan, implement, and evaluate strategies. Examples: resource documents, online tools, learning modules, podcasts.

Developing a Coordinated Approach to Planning and Evaluating Technical Assistance

Implementation of the VPTAC requires a coordinated and systematic approach for planning, implementing, and evaluating the system as a whole. This includes the development of overarching plans that guide the implementation and evaluation of VPTAC activities across all TTA providers. The development of these plans was guided by a system-level logic model (see Figure 1) (Scott et al., 2022) and supported by a shared TTA tracking and reporting system.

Figure 1. VPTAC Logic Model

The VPTAC strategic plan ensures that all TTA activities – regardless of provider or modality – are aligned with the overarching priorities and goals for VPTAC. The plan, developed and refined collaboratively on an annual basis, identifies the key TTA priorities for the year. The priorities are selected based on the prior year’s evaluation and needs assessment report (described below) as well as the strategic priorities of DVP and the goals of the funding programs. Identification of strategic TTA priorities guides decision making for VPPTB staff and funded TTA providers as they plan specific TTA activities for the year. It also allows VPPTB staff to examine progress made on those specific priorities at the end of the year. The strategic plan enables VPTAC to intentionally balance the delivery of reactive TTA (provided in response to a specific request from a recipient) with proactive TTA (planned activities or resources focused on priorities from the strategic plan). Intentionally focusing on both proactive and reactive TTA ensures that technical assistance is not only provided to those recipients who have the motivation to request it but also to recipients who may be less likely to take initiative and request assistance.
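As a simple illustration of how a shared tracking system could support this kind of monitoring, the minimal sketch below tallies delivered TTA activities by strategic priority and by whether each activity was proactive or reactive. It is a hypothetical Python example; the field names (priority_area, tta_type, status) and record layout are assumptions for illustration, not VPTAC's actual data structure.

import pandas as pd

def proactive_reactive_balance(tta_log: pd.DataFrame) -> pd.DataFrame:
    """Count delivered TTA activities by strategic priority and by whether
    each activity was proactive (planned) or reactive (requested)."""
    delivered = tta_log[tta_log["status"] == "delivered"]
    return (pd.crosstab(delivered["priority_area"], delivered["tta_type"])
            .assign(total=lambda t: t.sum(axis=1)))

# Toy records standing in for entries in a shared tracking system.
log = pd.DataFrame({
    "priority_area": ["evaluation", "evaluation", "health equity", "partnerships"],
    "tta_type": ["reactive", "proactive", "proactive", "reactive"],
    "status": ["delivered", "delivered", "delivered", "planned"],
})
print(proactive_reactive_balance(log))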

The VPPTB staff used several guiding principles to shape the evaluation of TTA and the understanding of recipient TTA needs. The evaluation needed to be formative, outcome focused, comprehensive, flexible, and minimally burdensome. This included incorporating a data-to-action framework (Zakocs et al., 2015), which ensures continual use of evaluation and needs data to refine TTA topics and improve TTA delivery. The VPPTB staff also strategically combined the collection of TTA needs and evaluation data, wherever possible, in order to increase response rates, reduce recipient burden, and ensure a continuous feedback loop between planning and evaluation. The data gathered and analyzed for the VPTAC evaluation are also used to directly inform TTA planning and delivery so that it is responsive to recipients’ needs. The system-level logic model depicted in Figure 1 was developed early in the process to guide the selection of key evaluation questions (Supplemental Table 1) and identification of outcomes, indicators, and data sources. The evaluation questions informed every aspect of what data are collected and how. The VPTAC evaluation uses a mixed-methods approach that includes quantitative data (surveys) and qualitative data (listening sessions). The VPPTB staff strategized how best to answer the evaluation questions, including who should facilitate the listening sessions, when the surveys should be disseminated, and who should disseminate the surveys (all of which are part of the data collection methodology). Each question is associated with one or more data sources that will inform the answer. The questions will also inform analysis, which has not yet begun.

To allow for an overarching analysis, TTA providers use coordinated, shared measurement tools to collect needs assessment and evaluation data. Data sources for the evaluation include participant feedback forms for individual TTA, participant feedback forms for group learning events, and participant feedback forms for peer learning communities (Supplemental Table 1). Although each TTA provider is bound by separate funding mechanisms and requirements, all are required to incorporate the shared measurement tools into their individual evaluation processes. TTA providers analyze these data individually in order to ensure continuous quality improvement of TTA delivery and are required to submit evaluation summaries that allow CDC staff to ensure compliance with funding requirements. In addition, each TTA provider submits their raw data to the VPPTB staff, who aggregate and analyze the data in order to understand how VPTAC is functioning as a complete system of TTA.
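To illustrate how shared measurement tools can enable this kind of overarching analysis, the following minimal sketch combines hypothetical feedback-form exports from several providers into one dataset and summarizes shared items by provider, modality, and capacity area. The file layout, column names, and rating items are illustrative assumptions, not VPTAC's actual instruments.

import pandas as pd
from pathlib import Path

def load_provider_feedback(data_dir: str) -> pd.DataFrame:
    """Combine raw feedback-form exports submitted by each TTA provider."""
    frames = []
    for csv_path in Path(data_dir).glob("*_feedback.csv"):
        df = pd.read_csv(csv_path)
        df["provider"] = csv_path.stem.replace("_feedback", "")  # tag the source provider
        frames.append(df)
    return pd.concat(frames, ignore_index=True)

def summarize(feedback: pd.DataFrame) -> pd.DataFrame:
    """Summarize shared items (satisfaction, intent to use) by provider, modality, and capacity area."""
    return (feedback
            .groupby(["provider", "modality", "capacity_area"])
            .agg(n_responses=("satisfaction", "size"),
                 mean_satisfaction=("satisfaction", "mean"),
                 pct_intend_to_use=("intends_to_use", "mean"))
            .reset_index())

if __name__ == "__main__":
    # Assumes each provider's export sits in shared_data/ as <provider>_feedback.csv.
    print(summarize(load_provider_feedback("shared_data/")))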

Additional data sources for the evaluation include data from the TTA tracking system (which includes information about TTA requests and delivery); annual key informant interviews with TTA providers; and annual VPPTB staff surveys (Supplemental Table 1). In addition, the VPPTB staff integrated questions about the use of TTA, self-reported changes in capacity, and TTA needs into the annual reporting process for recipients in order to obtain consistent data across all funded recipients rather than relying solely on voluntary responses to TTA feedback forms. Information from all of these data sources allows VPPTB staff to answer the VPTAC evaluation questions and conduct continuous quality improvement of TTA processes, coordination, and activities. Continuous quality improvement in this project means that feedback from recipients is used to determine future TTA topics and modes (the mechanism through which TTA is delivered, e.g., webinar or podcast) and to tailor content more effectively. Because recipient needs and contexts change, TTA recipient feedback plays a critical role in ensuring that VPTAC can meet those needs efficiently and effectively. The VPPTB staff routinely analyze data on recipient satisfaction, facilitators and barriers, and recipient needs in order to plan future TTA, refine TTA delivery, and identify emerging TTA topics. This allows VPTAC to tailor TTA to specific recipients, programs, and funding project timelines and respond flexibly to help recipients overcome challenges, making for a more effective and responsive TTA system. Although VPTAC’s evaluation is in the first year of data collection and results are not yet available, it is anticipated that this approach to evaluation and continuous quality improvement will enable VPPTB staff to understand the collective impact of technical assistance across topics, modalities, dose, VPPTB programs, and TTA providers.

Assessing Changes in Capacity

An important goal of the VPTAC evaluation is moving beyond satisfaction and engagement with TTA to better understand changes in recipients’ capacity to plan, implement, and evaluate prevention efforts. The data collected are intended to gauge changes in capacity in several ways. Participants in group TTA events are asked about their intention to use what they have learned. While this is not a direct measure of capacity, responses will reveal whether recipients plan to put into practice what they have learned for planning, implementing, or evaluating their work. This will indicate whether the TTA provided has equipped recipients with an additional tool for capacity building to conduct prevention activities and support subrecipients. Recipients are also asked directly about perceived changes in capacity in their annual performance reports. Questions are both closed-ended and open-ended and ask about changes in organizational capacity as well as the capacity of their subrecipients. Because the annual performance report also asks about utilization of TTA, it will enable the VPPTB staff to examine perceived changes in capacity by relative engagement in TTA. Finally, recipients of individualized TTA are asked directly whether the TTA received increased organizational capacity to plan, implement, or evaluate their violence prevention efforts. This question is included in a survey administered six months after receipt of TTA, allowing respondents to reflect on growth in capacity and to describe application of knowledge or skills gained.
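One simple way to examine perceived capacity change by relative TTA engagement is a cross-tabulation such as the hypothetical sketch below. The column names, engagement categories, and response options are placeholders for fields derived from the annual performance reports and the TTA tracking system, not the actual survey items.

import pandas as pd

def capacity_change_by_engagement(annual_reports: pd.DataFrame) -> pd.DataFrame:
    """Proportion of recipients reporting each level of capacity change,
    within each TTA engagement level."""
    return pd.crosstab(annual_reports["engagement_level"],
                       annual_reports["capacity_change"],
                       normalize="index").round(2)

# Toy data standing in for coded annual performance report responses.
reports = pd.DataFrame({
    "engagement_level": ["high", "high", "low", "low", "none"],
    "capacity_change": ["increased", "increased", "no change", "increased", "no change"],
})
print(capacity_change_by_engagement(reports))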

The focus of the VPTAC evaluation is to better understand the implementation and impact of the TTA system in building the capacity of recipients in the eight identified capacity areas (refer to Table 1). Efforts to evaluate the impact of VPPTB funding programs, which are varied and complex across the 135 recipients and more than 800 implementation efforts, are outside the scope of VPTAC and are not described in this article. In addition, each funded TTA provider is required to review and analyze data on their own specific TTA activities in order to better understand the effectiveness of their specific TTA delivery and make adjustments and improvements. The unique value of the VPTAC evaluation approach, however, is in providing a framework for understanding how a complex, comprehensive TTA system with multiple TTA providers can work together efficiently and effectively to build the capacity of communities to plan, implement, and evaluate comprehensive prevention efforts.

Challenges and Lessons Learned

There are several areas in which VPTAC will need to continue to refine its TTA and evaluation approach. First, there is a challenge in better understanding the potential influence of motivation on participation in and impact of technical assistance. Measuring and understanding motivation is complicated by the fact that the TTA is overseen, and sometimes delivered, by the funder. VPTAC delivers both reactive and proactive TTA. For reactive TTA, motivation is built in, as the requestor is motivated to receive the TTA and build their skills or learn how to address their challenges. Conversely, proactive TTA, while based on identified TTA needs, assumes internal motivation on the part of recipients to attend, participate, and respond to the evaluation. To ensure the design and evaluation of TTA maximizes recipients’ motivation to engage in proactive TTA activities, the VPTAC team has taken several approaches. During planning, providers ensure that TTA topics and learning objectives are relevant, beneficial, and based on needs assessment data from recipients. In addition, recipients are asked about their expectations during the registration process for some TTA events to better tailor TTA to the specific questions or needs of recipients who register in advance. Finally, the evaluation uses a mixed-methods approach so that data are collected not only through voluntary surveys after TTA events but also through ongoing recipient reporting processes and regular interactions between recipients and staff. Although these approaches do not fully address the influence of motivation on participation in TTA and application of knowledge gained through participation, they attempt to increase motivation to participate in TTA and ensure data are collected from recipients with varying levels of motivation.

Another ongoing challenge in the evaluation of TTA is the wide range of engagement with and use of TTA across recipients, with some recipients requesting support multiple times annually and some recipients never requesting TTA. Recipients also engage in TTA in many different ways and at widely varying doses, frequencies, and depths. Depending on the specific need and requestor, TTA may consist of a one-time call or sharing of technical assistance resources, participation in a group learning event, or ongoing in-depth individual technical assistance. Training and technical assistance is also provided through podcasts, blogs, publications, and provision of resources. There is also diversity in the programs, practices, and policies being implemented, the settings and environments of implementation, and the capacity of implementers and their partners, all of which impacts both the frequency and complexity of TTA. Although this diversity is a strength of the VPTAC approach in that it enables VPPTB staff to meet a wide range of needs, the breadth of TTA modalities and dosage makes it challenging to understand the impact of any specific TTA effort on changes in recipient capacity. The evaluation of VPTAC efforts attempts to account for this diversity in several ways: distinct evaluation tools based on clearly defined categories of TTA (e.g., criteria for what constitutes a discrete session of individual TTA and what does not); measuring participation in group events; and categorizing levels of engagement in TTA as part of data analysis. Although these approaches do not completely solve this challenge, they will assist with accurate interpretation of evaluation results.
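The sketch below illustrates one way engagement categories could be derived from a shared tracking system while retaining recipients who never requested TTA. The field names, the boolean flag marking what counts as a discrete session, and the cut points are illustrative assumptions, not VPTAC's operational definitions.

import pandas as pd

def engagement_levels(tta_log: pd.DataFrame, roster: pd.DataFrame) -> pd.DataFrame:
    """Count records meeting the agreed definition of a discrete TTA engagement,
    by recipient and modality, and bin recipients (including those with no
    engagements) into engagement levels for use in analysis."""
    # Keep only records flagged as meeting the shared definition of a discrete session.
    discrete = tta_log[tta_log["counts_as_discrete_session"]]
    counts = (discrete.groupby(["recipient_id", "modality"]).size()
              .unstack(fill_value=0))
    modality_cols = list(counts.columns)
    # Join to the full recipient roster so zero-engagement recipients are retained.
    merged = roster.set_index("recipient_id").join(counts).fillna(0)
    merged["total_engagements"] = merged[modality_cols].sum(axis=1)
    merged["engagement_level"] = pd.cut(
        merged["total_engagements"],
        bins=[-1, 0, 4, 12, float("inf")],  # placeholder cut points, not VPTAC's
        labels=["none", "low", "moderate", "high"])
    return merged.reset_index()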

Finally, the evaluation of VPTAC as a whole does not replace the need for each individual TTA provider to evaluate their own specific TTA objectives to satisfy funding requirements and provide a means for continuous quality improvement. However, specific evaluation requirements often vary by funding mechanism, and the timing and length of funding for TTA providers vary widely. The evaluation approach adopted for VPTAC was designed to ensure the data collected can be analyzed across TTA providers or for populations of interest in order to minimize the amount of data collection required. One of the goals of the VPTAC evaluation is to understand the cumulative impact of TTA delivered by multiple TTA providers and through a variety of modalities. Therefore, the VPPTB staff have incorporated requirements into funding for TTA providers that require them to compile and analyze evaluation data on the specific activities they implemented. Although the VPTAC evaluation may not clearly identify particular challenges or strengths of specific TTA providers, the VPPTB staff have access to the evaluation reports submitted by each individual TTA provider and can review them as needed for program improvement.

Implications for the Field

The VPTAC’s comprehensive approach to planning and implementing TTA, as well as the challenges encountered, provides a number of key insights that are applicable to anyone who is funding, planning, or evaluating a complex multi-provider, multi-level technical assistance center.

Foster Collaboration and Engagement in TTA Planning and Evaluation

It is important to take the time and effort to engage all key partners in co-constructing the evaluation plan and process to ensure there is sufficient buy-in and the evaluation approach selected is feasible and meaningful for all providers. This required that the VPTAC approach move beyond general collaboration and coordination among TTA providers with separate evaluations to an overarching evaluation process and measures utilized by all providers. The VPPTB staff led multiple planning meetings to outline high-level processes and map out updated tools while including all TTA providers in the decision-making process. All of the partners jointly developed processes and procedures for communication, collaboration, strategic planning, tracking, and evaluation of TTA. The VPPTB staff and the TTA providers meet on a regular basis and communicate regularly through a SharePoint site that includes technical assistance resources, planning documents, and a shared calendar. This partnership extends beyond simple collection of common evaluation data to collaborative reviews of evaluation and needs assessment data and discussion of overarching TTA priorities to ensure all TTA providers are aligned, which helps ensure each TTA provider’s efforts are data informed and geared toward capacity building.

Regular engagement of funders is also critical to ensuring continued commitment to establishing and adhering to standardized processes and procedures for TTA delivery and evaluation. For example, the requirement that all TTA providers and funded programs use a unified tracking system for TTA requests is critical to effectively tracking and coordinating TTA and ensures the ability to measure the progress of recipients who receive support. Similarly, incorporating TTA evaluation into recipient annual performance reporting is the responsibility of the VPPTB staff, as the funder.

Clearly Define Terminology and Concepts

Another important step is the development of operational definitions for all key concepts and terms. This is a step that is often more complex than expected when designing and evaluating TTA systems. For example, clearly defining what constitutes technical assistance versus general program administration support is an ongoing challenge within VPTAC. There is often overlap between support provided to recipients related to completing required funding deliverables and technical assistance related to planning, implementing, and evaluating primary prevention strategies. While this distinction may not be critical to recipients who are receiving the support, it is critical when determining which specific activities will be included in the VPTAC evaluation. Similarly, providers may use different terminology for types and modalities of technical assistance. For example, TTA providers and program staff may use different terms for “Communities of Practice” and may have different expectations of what these would entail. It is critical that partners agree upon clear definitions to ensure consistent implementation and evaluation of the TTA system.

Streamline Data Collection

Identifying ways to align and streamline data collection reduces the burden of evaluation activities on recipients and ensures comparability across TTA providers. Creating a shared TTA tracking system to compile all TTA requests and TTA plans enables the VPPTB staff to ensure that the evaluation can capture both the process and impact of TTA while also providing a mechanism for monitoring TTA provider activities. In addition, identifying strategies for incorporating questions related to the TTA evaluation into existing data collection such as annual reporting is a strategy for ensuring data is collected from a wide range of recipients with a minimum level of effort.

Prioritize Continuous Quality Improvement

Integrating data-to-action in all of the evaluation and planning activities ensures that the TTA system is continuously refined to better meet the needs of practitioners. Ensuring that data collected for the evaluation also incorporates questions about TTA needs allows VPTAC to minimize data collection burden while also ensuring that TTA is continuously tailored to meet recipients’ needs at various stages of programs, based on previous programs and shared needs across recipients.

Strengthen General Prevention Capacity

Much of the TTA provided by VPTAC is focused on building the capacity of funded recipients to plan, select, implement, adapt, and evaluate primary prevention strategies. The TTA focuses on skills and knowledge not related to implementation of a specific program but to the general competencies and expertise required to implement sustainable comprehensive violence prevention efforts in a community. This aligns with the general approach of the VPPTB to empower recipients to establish a sustainable prevention infrastructure and to select, adapt, and implement programs, practices, and policies that meet specific community needs. This effort is in stark contrast to some funding mechanisms and TTA systems which may specify that recipients implement specific programs. Building recipients’ general capacity to plan, implement, and evaluate comprehensive violence prevention efforts will enable state and local communities to leverage a wide range of partners and resources in order to ensure their efforts are comprehensive and sustainable. Focusing on general capacity, in addition to innovation-specific capacity, will increase the likelihood that practitioners will be able to effectively select, adapt, and implement a wide range of violence prevention programs, practices, and policies that resonate with their communities.

Think Bigger

The vision for VPTAC was to develop a TTA system that was comprehensive, multi-component, and flexible. This required that the VPPTB staff approach the design and evaluation of VPTAC from a system-level perspective, expanding beyond evaluating a single TTA event or engagement to designing a comprehensive strategy to evaluate and continuously improve the system. Developing a TTA system logic model helps ensure all partners have a shared understanding of desired outcomes and illuminates how all of the partners and TTA activities are working together to achieve those outcomes. In contrast to developing logic models for specific TTA activities or providers, creating a system-wide logic model enables a strategic view of combined TTA efforts. Similarly, establishing clear priorities within the strategic TTA plan ensures that all partners are working toward the same goals. Having a system-level logic model and specific annual TTA objectives provides key information for developing the evaluation plan and measuring the impact of the system as a whole.

It is important to have comprehensively aligned evaluation data from multiple sources. Obtaining this type of data is critical for funders, operators of technical assistance centers, providers, recipients, and researchers, allowing them to draw conclusions regarding the efficacy of TTA delivery. Included in VPTAC’s standard evaluation tools are open-ended questions that ask about the implementation activities recipients are planning to conduct as a result of skills they gain through TTA. Follow-up questions that ask whether planned activities were carried out help the VPPTB staff maintain quality control and improve future assistance. Likewise, shared data sources enable alignment of results from multiple TTA sources across a comprehensive TTA system, which provides a full picture of how the combination of multiple TTA modalities (e.g., tools, trainings, individual TTA) contributes to overall recipient success. Conducting a standardized evaluation that looks across program recipients, TTA providers, and TTA modalities enables the VPPTB staff to show improvement in technical knowledge, gather examples of enhanced implementation, and, crucially, facilitate proactive TTA planning.

Supplementary Material

Supplemental table 1

Acknowledgments

We would like to thank the following TTA providers for their continuous hard work and dedication to VPTAC: American Institutes for Research (AIR), National Resource Center on Domestic Violence (NRCDV), National Sexual Violence Resource Center (NSVRC), and PreventConnect (ValorUS).

Funding

The author(s) received no financial support for the research, authorship, and/or publication of this article.

Footnotes

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Disclaimer

The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.

Supplemental Material

Supplemental material for this article is available online.

References

  1. Barranco L, Freire K, & Payne GH (2022). Moving evidence to action: A strategy to support the implementation of comprehensive violence prevention efforts. Health Promotion Practice, 23(5), 824–833. 10.1177/15248399211028156
  2. Brownson RC, Fielding JE, & Green LW (2018). Building capacity for evidence-based public health: Reconciling the pulls of practice and the push of research. Annual Review of Public Health, 39, 27–53. 10.1146/annurev-publhealth-040617-014746
  3. Centers for Disease Control and Prevention. (2022a). The division of violence prevention’s strategic vision. https://www.cdc.gov/violenceprevention/about/strategicvision.html
  4. Centers for Disease Control and Prevention. (2022b). What is health equity. https://www.cdc.gov/healthequity/whatis/index.html
  5. DeCorby-Watson K, Mensah G, Bergeron K, Abdi S, Rempel B, & Manson H (2018). Effectiveness of capacity building interventions relevant to public health practice: A systematic review. BMC Public Health, 18(1), 684. 10.1186/s12889-018-5591-6
  6. Moreland-Russell S, Adsul P, Nasir S, Fernandez ME, Walker TJ, Brandt HM, Vanderpool RC, Pilar M, Cuccaro P, Norton WE, Vinson CA, Chambers DA, & Brownson RC (2018). Evaluating centralized technical assistance as an implementation strategy to improve cancer prevention and control. Cancer Causes & Control, 29(12), 1221–1230. 10.1007/s10552-018-1108-y
  7. National Center for Education Evaluation and Regional Assistance. (2011). National evaluation of the comprehensive technical assistance centers: Final report. Executive summary (NCEE 2011–4032). U.S. Department of Education, Institute of Education Sciences. https://ies.ed.gov/ncee/pubs/20114031/pdf/20114032.pdf
  8. Olson JR, Coldiron JS, Parigoris RM, Zabel MD, Matarese M, & Bruns EJ (2020). Developing an evidence-based technical assistance model: A process evaluation of the national training and technical assistance center for child, youth, and family mental health. Journal of Behavioral Health Services & Research, 47(3), 312–330. 10.1007/s11414-020-09686-5
  9. Scott VC, Jillani Z, Malpert A, Kolodny-Goetz J, & Wandersman A (2022). A scoping review of the evaluation and effectiveness of technical assistance. Implementation Science Communications, 3(1), 70. 10.1186/s43058-022-00314-1
  10. Smith BJ, Tang KC, & Nutbeam D (2006). WHO health promotion glossary: New terms. Health Promotion International, 21(4), 340–345. 10.1093/heapro/dal033
  11. Viglione C, Stadnick NA, Birenbaum B, Fan O, Cakici JA, Aarons GA, Brookman-Frazee L, & Rabin BA (2023). A systematic review of dissemination and implementation science capacity building programs around the globe. Implementation Science Communications, 4(1), 34. 10.1186/s43058-023-00405-7
  12. Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, Blachman M, Dunville R, & Saul J (2008). Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology, 41(3–4), 171–181. 10.1007/s10464-008-9174-z
  13. Wilkins N, Tsao B, Hertz M, Davis R, & Klevens J (2014). Connecting the dots: An overview of the links among multiple forms of violence. National Center for Injury Prevention and Control, Centers for Disease Control and Prevention. https://www.cdc.gov/violenceprevention/pdf/connecting_the_dots-a.pdf
  14. Zakocs R, Hill JA, Brown P, Wheaton J, & Freire KE (2015). The data-to-action framework: A rapid program improvement process. Health Education & Behavior, 42(4), 471–479. 10.1177/1090198115595010
