Health Education Research
. 2016 Mar 3;31(2):283–294. doi: 10.1093/her/cyw012

A qualitative analysis of the concepts of fidelity and adaptation in the implementation of an evidence-based HIV prevention intervention

Jill Owczarzak 1,*, Michelle Broaddus 2, Steven Pinkerton 2
PMCID: PMC5007579  PMID: 26944867

Abstract

Continued debate about the relative value of fidelity versus adaptation, and lack of clarity about the meaning of fidelity, raise concerns about how frontline service providers resolve similar issues in their daily practice. We use SISTA (‘Sisters Informing Sisters on Topics about acquired immune deficiency syndrome’), an evidence-based human immunodeficiency virus (HIV) prevention intervention for African American women, to understand how facilitators and program directors interpret implementation fidelity and balance it with the need for adaptation in real-world program delivery. We conducted 22 in-depth, semi-structured interviews with service providers from four agencies implementing SISTA. Facilitators valued their skills as group leaders and their ability to emotionally engage participants as more critical to program effectiveness than delivering the intervention with strict fidelity. Consequently, they saw program manuals as guides rather than static texts that should never be changed and, moreover, viewed the prescriptive nature of manuals as undermining their efforts to fully engage with participants. Our findings suggest that greater consideration should be given to understanding the role of facilitators in program effectiveness over and above the question of whether they implement the program with fidelity. Moreover, training curricula should provide facilitators with transferable skills through general facilitator training rather than only program-specific or manual-specific training.

Background

Fidelity and adaptation are central concerns of implementation research within public health promotion. Since the 1970s, researchers have debated whether, and to what extent, providers in real-world settings should adapt programs to fit local contexts, as opposed to implementing them as originally designed, with strict fidelity [1–6]. Although some researchers argue that fidelity to the original intervention improves effectiveness, whereas significant modifications or deletions can diminish effectiveness [7, 8], research on ‘real-world’ program implementation (i.e. with clinicians and other professionals who deliver programs, rather than researchers) clearly indicates that providers rarely implement programs with complete fidelity. In the best cases, programs are implemented with around 60% fidelity [9]. Adaptation is often necessary to improve stakeholder buy-in, increase the program’s relevance for local populations, facilitate delivery of the intervention to the target population, and implement the program in resource-constrained environments [10, 11].

The relative value of adaptation versus fidelity in implementation is further complicated by the lack of a uniform definition of fidelity. Definitions of fidelity can include the extent to which the implemented program reflects theoretical methods, strategies and determinants; completeness, or the extent to which all intervention components are delivered; reach, or the extent to which the program reaches the intended target population; and quality of delivery [7, 12]. Fidelity can also refer to the extent to which a program is delivered according to general ‘core elements’ (discussed below) or whether specific intervention activities are delivered as described in a program manual. Furthermore, the distinction between ‘allowable’ and ‘unacceptable’ program modifications in the implementation process is not well understood and is further confused by competing terms, including adaptation, modification and tailoring [13]. In addition, the term ‘adaptation’ is itself broadly defined, as in ‘any addition, subtraction or modification to the original program model, quality of delivery or participant responsiveness’ [13, 14].

The lack of consistent terminology and definitions related to fidelity and adaptation, as well as continued disagreement about the relative benefits of fidelity versus adaptation within the research community, raises concerns about how frontline service providers resolve similar issues in their daily practice. Providers may be taught that programs should be implemented with fidelity [12, 15] but may also want to modify programs based on their clients’ needs [16], their own expertise [17] or the resources they have available [18, 19]. Providers’ views about the relationship between fidelity and adaptation, their decision-making processes about balancing fidelity and adaptation and whether they think programs are working optimally have implications for provider commitment to sustained program implementation and ultimately, to the overall effectiveness of evidence-based programs in community settings.

Evidence-based human immunodeficiency virus prevention

Issues surrounding the balance between adaptation and fidelity are particularly salient in the implementation of human immunodeficiency virus (HIV) prevention interventions. The majority of the HIV prevention programs supported and funded by entities such as the Centers for Disease Control and Prevention (CDC) and state health departments are prepackaged, evidence-based programs. They follow relatively rigid delivery formats and session structures. These programs are disseminated to providers through the Diffusion of Effective Behavioral Interventions (DEBI) program. Through the DEBI program, HIV prevention service providers are taught how to implement a particular intervention during intensive, multi-day trainings. Implementation fidelity evaluations of the evidence-based programs included in the DEBI program have primarily focused on delivery of core elements (program components integral to the intervention that are thought to be responsible for its effectiveness and that must be retained in order for HIV risk reduction to occur), rather than specific activities included in the program (i.e. measures based on implementation fidelity checklists). These evaluations illustrate that service providers rarely implement these programs with complete fidelity. Agencies reinvent interventions by incorporating them into other programs [10]; fail to implement the program with fidelity to all core elements [16]; and edit content, change the number of sessions, and substitute, add or delete activities [13].

That providers continue to adapt evidence-based interventions, despite research and training that emphasizes the importance of fidelity, raises questions about the underlying reasons why providers make these adaptations. Notably, Palinkas et al. [17] found that clinicians who implemented an evidence-based program were more likely to continue the program if they could exercise creativity in applying the material and integrating it with their own theoretical orientation and previous training. However, as Bearman et al. [20] note, much of the literature on evidence-based practice uptake and implementation focuses on providers’ attitudes rather than their experiences. Also less understood is how providers understand and describe their role in delivering programs within the context of a specifically ‘pre-packaged’ manualized curriculum, and how this affects program delivery and adaptation. To understand providers’ views on fidelity, adaptation and the role of the facilitator in program implementation, we examined real-world implementation of SISTA (‘Sisters Informing Sisters on Topics about Acquired Immune Deficiency Syndrome’), a small group, five-session HIV prevention education and skills-building program for heterosexually active African American women [21]. We focused on SISTA because it is one of the CDC’s most popular evidence-based interventions: through 2008, over 400 community-based organizations and 900 individuals had received SISTA training in the United States.

As Table I illustrates, SISTA is a highly interactive program that focuses on building racial and gender pride, relies on role-playing to teach communication skills and encourages collaboration and discussion among group members. Unlike the implementation of evidence-based programs explored in clinical settings [17, 20, 22, 23], SISTA facilitators are often specifically hired to deliver this program and therefore have little autonomy in deciding whether to adopt it. Therefore, this article focuses on experiences and perspectives on implementation fidelity and adaptation, rather than uptake and buy-in. This article explores how facilitators interpret the sometimes contradictory messages they receive regarding implementation fidelity and how they balance fidelity and adaptation in practice. We seek to understand the extent to which providers value fidelity and their motivations for changing particular elements of a program.

Table I.

Core elements and key characteristics of the SISTA program

Core elements
    Convene small-group sessions to discuss the session objectives; model skills development; role-play women’s skills acquisition; and address the challenges and joys of being an African American woman.
    Use skilled African American female facilitators to implement SISTA group sessions.
    Use culturally and gender-appropriate materials to acknowledge pride and enhance self-worth with regard to being an African American woman (e.g., use poetry by African American women).
    Teach women to communicate both verbally and non-verbally to show they care for their partner and need to protect themselves (e.g., negotiation skills, assertive communication skills).
    Instruct women on how to use condoms effectively and consistently (e.g., negotiation skills, assertive communication skills).
    Discuss culture and gender-related barriers and facilitators to using condoms (e.g., provide information on African American women’s risk of HIV infection).
    Emphasize the importance of a partner’s involvement in safer sex (e.g., enhance partner norms supportive of condom use).
Key characteristics
    SISTA can be adapted for different populations of African American women.
    SISTA must be implemented with passion.
    SISTA should be publicized as a program that was developed by African American women for African American women.
    SISTA should include HIV prevention discussions that address relationships, dating, and sexual health within the context of the African American women’s experiences.

From: Sisters Informing Sisters about Topics on AIDS (SISTA) Implementation Manual. 2008. Centers for Disease Control and Prevention.

Methods

Data collection

From 2011 to 2013, in-depth, semi-structured interviews were conducted with 22 service providers from four agencies located in four cities in three geographically distinct regions of the United States (Northeast, Midwest and South). The four agencies were recruited to participate in a larger study evaluating the effectiveness of the SISTA program as implemented by frontline service providers. Three of the participating agencies were comprehensive AIDS service organizations that offered treatment, care, support and prevention services to broad populations but that mainly served ethnic and racial minority communities. The fourth organization conducted HIV prevention programs, including SISTA, on a limited scale and also conducted HIV counseling and testing. Most of its work focused on capacity-building both domestically and internationally. The HIV prevention service providers who participated in our study received their SISTA training through official CDC-sponsored trainings promoted through its evidence-based interventions website (www.effectiveinterventions.org), in-person training conducted by expert SISTA facilitators and sponsored by state health departments or local academic institutions, or by experienced SISTA facilitators within the agency.

Agency directors provided names and contact information for staff at their organizations who were involved in SISTA programs for African American women, as well as prevention directors and other individuals involved in HIV prevention program planning and administration. Using a convenience sampling strategy, these individuals were contacted directly via phone or email and invited to participate in the interview component of the study. Directors and staff were assured that participation in the interview was voluntary and confidential, and written informed consent was obtained before the interview began. None of the 22 people contacted declined to participate. Approval for the study was obtained from the Institutional Review Board at the Medical College of Wisconsin. All interviews were conducted by the first and second authors and digitally recorded; they lasted between one and two and a half hours. Interviews were conducted in a private room at the person’s place of employment; one interview was conducted over the phone. The interviews were semi-structured, based on an interview guide with a set of topics to be covered during the interview depending on the individual’s position in the agency (i.e., director or facilitator), but with flexibility to probe participant responses and explore topics in greater depth [24]. Interviews with both directors and facilitators asked participants about their current position and history with the organization, the populations they serve, general HIV and SISTA-specific training experiences and successes and challenges implementing SISTA. Interviews also addressed changes that were made to the SISTA curriculum, including additions and deletions; overall thoughts on the importance of fidelity versus adaptation; and experience with ‘packaged’ curricula in general.

Analysis

All interview recordings were transcribed verbatim and entered into a computer-based text file. Transcripts were then transferred to the qualitative data analysis software program MAXQDA [25] to be coded and sorted. MAXQDA and similar data analysis software programs are used for systematizing, organizing and analyzing qualitative data. Such software aids the researcher in sorting material into groups, defining variables and assigning codes to text segments. It also contains tools for making comparisons across various pieces of qualitative data (interview transcripts, text segments). Qualitative data analysis software does not itself suggest interpretations, but allows the researcher to draw conclusions through the systematic coding and comparison of the text. For this project, transcripts were analyzed by the first and second authors for emergent themes using principles of grounded theory analysis [26]. Transcripts were initially examined to identify primary coding categories, as well as the range of themes present within each category. Identified coding categories and themes were organized into a formal code book through collaboration and discussion by the first and second authors. Transcripts were then formally content coded. New themes that did not appear to fit into the original code book were discussed and modifications were made when appropriate. When suggested by associations, overlap or diversions in the data, thematic categories were refined, merged or subdivided.

Transcripts were first coded by agency location, gender, ethnicity and participant’s title or role in the agency (e.g. director, paid staff and volunteer). Then, the documents were coded through an open coding process, in which transcripts are read through and codes freely generated to capture the content of the interview. The transcripts were coded with the text codes that emerged and that reflected key analytical topics, including personal history of involvement in HIV-related issues, HIV training and education experience, target populations and agency services, experiences with SISTA, program modifications, implementation support and technical assistance and views on adaptation and fidelity. We examined codes in isolation and in relation to other codes (axial coding), e.g. ‘Fidelity’ in isolation and in tandem with other concepts such as ‘Limitations’ and ‘Key Elements’, to explore the limitations of fidelity described below. Due to the small sample size, the analysis focused on the interviews as a complete set. Additional analysis was conducted to determine whether time at agency or experience facilitating SISTA affected how participants discussed fidelity; no such effects were found. Some differences did emerge between agencies and by role (director and facilitator). These differences are noted below where appropriate. The interview quotes below were selected because they were representative of the range of responses within a given theme.

Results

Participant characteristics

Of the 22 participants, four directors (one from each agency) and 18 frontline staff in non-director positions who facilitated or helped with SISTA programs completed the interview (Table II). Facilitators in the study sample were not evenly distributed across agencies due to differences in agency size and scope. Agency 1, for example, was a state-wide organization that conducted SISTA in several cities and therefore had a larger cadre of SISTA facilitators. In contrast, Agency 2 worked in only one city and had only one full-time facilitator on staff and a contractual, as-needed relationship with a second facilitator. The distribution of interview excerpts below reflects this aspect of the study sample. Of the 22 interview participants, 18 were female and four were male; three of the four agency directors were men; 17 of the 18 intervention facilitators were women; 14 identified as African American, five as Hispanic and three as White. Over half (13) had completed college in a range of disciplines, including psychology, business, education, social work and related fields. The median time interviewees had worked at the agencies was 2.5 years, but the range of time varied significantly. One person in a director-level position, for example, had been with his organization for 15 years; five interviewees had been with their agencies for less than 6 months. Two participants worked at their respective agencies for several years, took breaks, and then returned to work at the agency. The 2.5-year median tenure reflects the high rate of turnover at these agencies: over half of the original interview participants in non-director-level positions were no longer working at their agencies 2 years after the interviews were conducted.

Table II.

Participant characteristics

Agency 1 Agency 2 Agency 3 Agency 4
Coverage area Statewide Regional Regional Single city
Programs and services HIV prevention, case management and support services for people living with HIV HIV prevention, capacity-building Comprehensive AIDS service organizationa HIV prevention, case management and support services for people living with HIV
Interview participants N = 9 N = 4 N = 5 N = 4
    Gender
        Female N = 8 N = 4 N = 3 N = 3
        Male N = 1 N = 0 N = 2 N = 1
    Education level
        High school only N = 4 N = 0 N = 1 N = 0
        Some college N = 2 N = 0 N = 1 N = 1
        Completed college N = 2 N = 2 N = 3 N = 3
        Graduate training N = 1 N = 2 N = 0 N = 0
    Race and ethnicity
        African American N = 5 N = 4 N = 3 N = 3
        Hispanic N = 5 N = 0 N = 0 N = 0
        White N = 0 N = 0 N = 2 N = 1
    Time at agency
        Range 3 months to 9.5 years 6 months to 8 years 1 year to 13 years 2 to 15 years
        Median 3 years 1.25 years 3 years 4.5 years

aIn addition to HIV prevention programs, comprehensive AIDS service organizations provide medical (primary care, dental care, reproductive health) and social services (food pantry, housing support, legal aid, insurance assistance) to their clients living with HIV.

Two primary themes emerged during analysis. First, participants expressed often ambivalent opinions about the limitations of strict fidelity to the SISTA manual; these limitations included the importance of facilitators’ own lived experiences for communicating effectively with SISTA recipients, the need to adhere to the theoretical basis of the intervention, and the need to address client concerns that sometimes fall outside the scope of the manual. Second, participants described how the SISTA facilitators themselves can potentially affect SISTA participants, over and above the content of the manual itself. We elaborate on these themes below.

Fidelity, ‘tweaking’ and ‘tailoring’

The SISTA manual clearly distinguishes the program’s core elements and key characteristics (Table I). It also offers a highly regimented guide to implementing the intervention, including how much time should be devoted to each activity, what should be prepared in advance of the session, and written dialogue to facilitate discussion within and transition between activities. The facilitators we interviewed almost universally reported that they were told during training that they must implement SISTA with fidelity to core elements, and that any changes to core elements would render the intervention ‘not SISTA’. However, facilitators expressed some confusion over what fidelity meant in the context of a highly manualized and scripted intervention. For example, one agency staff person criticized her colleague for reading the SISTA manual verbatim when she delivered the program:

[W]e, as women, especially black women, we know how to communicate… That [reading the manual] is not how we do, and particularly when you are talking about concepts like infidelity, and finding your voice, and being assertive versus aggressive… This is real life stuff. We have all experienced it. So I asked [my colleague] about that, and what she said was they went to be trained in Memphis. They were told you cannot deviate from this training. You can not deviate, and so I’m sitting here thinking, ok we cannot deviate from the training. Because if you deviate, that means that it’s not this training, so what does that mean, when you cannot deviate? Does that mean really you have to read this word? Does it mean it can’t deviate from the concept? Or you can deviate from the delivery? (Agency 4, Facilitator 1)

In contrast, one agency director who had been involved in translating SISTA from the research context and manualizing it for dissemination to providers strongly emphasized the need to implement the program with fidelity to both core elements and specific intervention activities. She viewed the program holistically and argued that each element was essential to its success:

So I think that one of the challenges that we have had to experience with SISTA is training people how to implement the intervention the way it is without adapting it… Each of the points are specifically written for the session that you are facilitating and speaks to the activities and the purpose of the session versus just, “Oh I don’t like [the poem] ‘Still I Rise.’” Well, the purpose of that poem is to instill empowerment so you can’t change that for a song because it doesn’t match with the session. So I think that’s been one of the biggest challenges is to help people understand why they want to implement with fidelity so you can get the same outcomes as the research did versus “Oh, our women need substance abuse information too so we are just going to add that.” Well, you just can’t add that because it doesn’t match with the theories associated with the intervention. (Agency 2, Director)

Concerns about balancing fidelity with adaptation often arose in the context of how to respond to individual clients’ needs or group dynamics. While the SISTA program is highly structured, group discussions following many of the activities prompt participants to disclose sensitive and personal information on topics such as sexual relationships, abuse, substance use and mental health. Facilitators not only wanted to allow women to talk about these issues but also recognized that they often faced time constraints if they wanted to cover all the material in the manual.

In addition, SISTA participants often bring up traumatic personal experiences, including rape and child sexual abuse. However, the rigid structure of the intervention sessions limited facilitators’ ability to deal with these issues within the context of SISTA:

I think that a lot of things can come up, especially when you are working with the type of people that we are working with. Well, you don’t want to add counseling … But, [if you had] just a little bit more space and time to maybe kind of process what is going on. Because that is what ends up happening anyways, is that the women have so much stuff to talk about. And so you are having to cut them off, to keep it going. (Agency 2, Facilitator 1)

This facilitator expressed the common experience that the demands of following the curriculum limited their ability to address participants’ needs. Similarly, other facilitators suggested that the intervention’s effectiveness could be improved if it were more flexible:

Sometimes I do wish that I could throw extra in. I think the pre-packaged is fine because it has meaning behind it. It has a purpose. But sometimes I do feel that if maybe some[thing] could be switched or adapted. And we don’t want to do an adaptation, well of course, because of the core elements of the intervention. But just I think as time goes on maybe we could do some adaptations, because I would see … more success. (Agency 1, Facilitator 5)

The realities of service delivery prompted facilitators to question what fidelity means in practice, and the overall value of fidelity, in light of clients’ emergent needs. Concern with implementation fidelity diminished as concern with serving the needs of diverse clients grew.

Being ‘real’ and not a ‘robot’: the role of the facilitator in intervention effectiveness

The fidelity versus adaptation debate points to the content and messages of the intervention as the foundation of program effectiveness. In contrast, SISTA facilitators generally agreed that they are central to making this program work, corroborating research documenting the effect that provider characteristics can have on successful uptake and delivery of interventions [23, 27–30]. One facilitator who very closely followed the manual, including verbatim reading of suggested transitions, questions, and summarizing statements, also valued the role of a skilled facilitator in delivering the intervention:

Of course you want to do the program the way that the program is meant to be done. And you want to be able to be yourself in there too. I mean, I think the way that the program is written is very good. And, like I said, very easy to follow and everything, and even now I mean, after doing it a bunch of times, I still very much use scripts and everything. So I think that your own personality is important. You want to not come in there and be a robot with the participants, or anything, but you want to be sincere. You want to be authentic, genuine, because I’m thinking that people can read through if you are not, you know? (Agency 2, Facilitator 1)

As mentioned previously, during SISTA sessions participants often want to discuss deeply personal and sensitive information. Facilitators often viewed the SISTA program as a first step toward helping participants in a more comprehensive way. They believed their passion, skill and commitment were essential to motivate participants to access other potentially helpful services:

You know, when I first got on this I didn’t realize the importance of it. I thought you just provide information and give them skills building to keep it moving. But now that I’m really deep into it, I realize that [it’s] not just providing the education, but along with compassion, along with having the knowledge, and the skills, and the referrals. You know, really being an instrument … I’m a mentor to someone else. Even though you are a counselor, but you have become a mentor, you have become an encourager, you become so much more than just a prevention plan. And putting yourself at risk you have become that person that they begin to look to and ask questions on life skills. (Agency 1, Facilitator 4)

From her perspective, SISTA facilitators do more than convey information contained in the SISTA program manual. They view themselves as compassionate and caring professionals with the capacity to help program participants in multiple ways beyond the limits of what is addressed in the SISTA HIV prevention intervention itself.

The desire to modify the intervention also stems from facilitators’ belief that they learn as they implement programs and should be able to use their accumulated experience, expertise and judgment to modify the intervention to reflect what they believe will work with their participants based on this knowledge:

So this is going to be egotistical. I think if you have someone who gets the concepts of education, and can read objectives and goals and gets what you need to get out of that session, fidelity is less important. If I can read the gist, I know what we are trying to get through here. Fidelity is less important, but then again, going back to there aren’t very many educators at the CEO levels, who are savvy enough in their skill set to be able to change a word here and there. So I think fidelity is important particularly at the beginning, until you are comfortable with the material. Then you can choose, pick and choose. (Agency 4, Prevention Director)

In this view, the manual is a foundation to build upon as facilitators learn from working with women and implementing the intervention. SISTA facilitators use their experience delivering the intervention and understanding how participants respond to activities, topics, and materials to determine what needs to be changed. They suggest that as they become more skilled delivering the intervention and more familiar with participants, they develop a sense of decisional autonomy to change the intervention to reflect what they have learned.

These insights into the important role the facilitator plays in creating intervention effectiveness led one supervisor to advocate for both better facilitator training and recruitment of passionate and engaged facilitators:

For me, I think it’s 90% on the facilitator. And the reason why is because someone can get up there and stand, and say information all day long. But the way in which the women learn, the way in which the facilitators are able to engage with their participants, really speaks to how the women receive the information. And it also speaks to their willingness to change behavior. If someone is just standing there and giving you some information, it’s just going in one ear and out the other. But when a facilitator is able to really speak to the heart of their population, and really engage the women with the information that they’re sharing with them, I think that’s when the light starts to go on. (Agency 2, Director)

Discussion

One of the most significant themes to emerge from interviews with SISTA facilitators is that they saw both the program and themselves more broadly than simply delivering material from a manual, often due to the interactive nature of the intervention. First, they saw the narrowly focused program as creating a safe space for participants to discuss potentially traumatic life events, and the relatively rigid manual as a tool for getting women to talk about life circumstances that potentially contribute to HIV risk. That is, although the SISTA intervention was developed specifically to reflect the needs of African American women, the intervention’s narrow focus on HIV prevention and communication within sexual relationships at times could not accommodate more general discussions about other issues in participants’ lives. Moreover, they viewed the program itself as a means through which women could be linked to other services, such as counseling. Similarly, facilitators saw their role more broadly than delivering the intervention with fidelity. In the words of one facilitator, they are ‘instruments’, ‘counselors’, ‘encouragers’ and ‘mentors’ who can put resources together and provide support for clients beyond the five sessions of the SISTA program. Importantly, serving in this capacity requires passion and patience, in addition to the ability to convey information (i.e. implement SISTA with fidelity). Viewing themselves as skilled professionals who can draw on their experience to modify a program makes sense given many facilitators’ personal experience either working with or being from the target population of this intervention, and the numerous HIV, health, facilitator and other professional trainings most must complete as part of their work.
However, the extremely prescriptive and scripted nature of the manual itself may undermine facilitators’ efforts to fully engage with participants by focusing their attention on keeping time and completing each activity rather than on building rapport and addressing participants’ emergent needs. Facilitators also viewed adaptation as a way to make a program ‘more powerful’. For both reasons, they did not treat the manual as a static text that should never be changed.

While research on fidelity and adaptation of HIV prevention programs included in the DEBI program often focuses on core elements and key components, the specificity of manuals and the training on manualized activities set up the expectation that detailed adherence to manualized procedures is required for fidelity. Numerous frameworks have been proposed to guide organizations in the adoption, adaptation and implementation of evidence-based interventions, including ADAPT-ITT [31], intervention mapping [32] and RE-AIM [33]. While these frameworks help agencies select interventions that are appropriate for their target populations and reflective of the resources and expertise available within the agency, they do not address the views on adaptation and fidelity that facilitators revealed in our interviews: specifically, the importance of facilitator skills and the desire to continually modify the intervention to reflect the target population and what facilitators have learned through continued implementation. In general, facilitators neither oppose packaged curricula with core elements nor object to adapting them for new populations [34]. What they do oppose is rigidity and the devaluing of their expertise, skills and knowledge. Taking a prepackaged evidence-based intervention and adapting it to create a new program for the target population does not adequately address these concerns if the new program is equally rigid. The facilitators we interviewed wanted greater flexibility to meet the emergent needs of program participants and freedom to change the intervention to reflect their own experiences and new implementation contexts.

Some researchers have called for the development of ‘flexible adaptive programs’ that, rather than prescribing a standard set of activities that all providers must implement in the same way, offer options within or among components that are ‘equally influential on the outcome’ and from which providers can select [2]. Such a system would address the confusion that often exists among providers about what constitute ‘acceptable’ or ‘unacceptable’ changes to evidence-based programs. Van Daele et al. [35] have similarly proposed ‘empowerment implementation’ to resolve the adaptation versus fidelity debate, based on the argument that both are necessary for successful program delivery and desired public health outcomes. In this model, communities are provided with the concepts, tools and skills to identify the core components of an intervention, adapt the intervention to their context, and evaluate and sustain implementation quality.

Numerous studies indicate that provider characteristics play an important role in program uptake and sustained delivery. In a randomized clinical effectiveness trial of an evidence-based program for depression, anxiety and conduct problems among children, Chorpita et al. [36] proposed a modular program design that broke complex activities into simpler parts that could function independently. Their research showed that therapists tended to be very satisfied using the modular program because it fit their clinical practice and allowed them to select components for specific clients [17]. Modular evidence-based programs in HIV prevention are uncommon, with the exception of CLEAR (Choosing Life: Empowerment! Action! Results!) [37]. Future research should explore the possibility of more flexible program design within HIV prevention in particular.

This study’s limitations include its focus on one intervention, its small sample size and the homogeneity of participant characteristics. The study examined a single small-group behavior change intervention developed for a specific population. While this potentially limits the transferability of our conclusions, numerous other researchers have documented incomplete fidelity to similar interventions within HIV prevention programming [10, 13, 38]. This study’s findings suggest some potential reasons why fidelity is limited and why providers may be dissatisfied with the DEBI program [39]. The small sample size limits our ability to explore factors that may affect how facilitators view implementation fidelity, including length of time at the agency and level of experience (e.g. number of years) facilitating the SISTA intervention. Similarly, we did not conduct analysis based on training source (e.g. CDC trained versus agency trained). Although all trainings used a standardized manual, it is possible that trainers communicated different messages about fidelity and adaptation during the actual training. Finally, our data do not allow us to determine how implementation fidelity or adaptation affects the outcomes of this intervention.

Our findings suggest that greater consideration needs to be given to the perspective and role of facilitators in the implementation fidelity versus adaptation debate. Greater attention should be paid to providing facilitators with transferable skills through general facilitator training rather than only intervention- or manual-specific training. Better assessment tools are needed to evaluate the role of the facilitator and how facilitator skills contribute to program effectiveness. Moreover, better adaptation guidelines need to be developed that account for the fact that facilitators learn, through continued implementation and real-time problem-solving, which aspects of a program do and do not work with their target populations. As they implement the program, facilitators become experts themselves, yet this development of expertise through implementation is not accounted for in current implementation fidelity guidelines.

Our findings about facilitators’ desire to have greater flexibility in delivering interventions are supported by other process and outcome evaluations of the SISTA program. In a 2006 evaluation of a SISTA pilot project, Prather et al. [40] reported that ‘most’ of the 16 enrolled agencies implemented all of the core elements, but many expanded the intervention beyond the original target population to include adolescents, native Africans, and women living with HIV. Fuller et al. [38] reported similar modifications to the SISTA program when it was piloted for large-scale dissemination. More recently, Sapiano et al. [41] found that agencies funded to implement SISTA adapted the program in a variety of ways, including adjusting the intervention schedule, adding or eliminating activities and substituting materials that they thought were more culturally relevant. These findings, along with process evaluations from other DEBI interventions [10, 16], underscore the need to reconsider the relationship between fidelity, adaptation and facilitator skill.

While evidence-based programs are often developed with significant attention to ensuring they are culturally tailored and relevant for the target population, less attention is paid to how they will be received by those who ultimately implement them. Future research should incorporate greater consideration of the role and experiences of facilitators into intervention design. For example, evaluations could include a measure of facilitator skills based on both direct observation of facilitator–participant interactions and participant feedback. Finally, more rigorous research and program evaluation studies are needed to determine which aspects of programs truly are responsible for their effectiveness. Without a clear understanding of what is necessary for a program to produce its desired effects, core elements will continue to be vague and inconsistent, further undermining frontline providers’ ability to make informed decisions about how best to adapt a packaged, manualized curriculum to meet their needs while retaining its effectiveness.

Funding

National Institute of Mental Health (grant numbers R01-MH089828 and P30-MH52776).

Conflict of interest statement

None declared.

References

  • 1.Berman P, McLaughlin MW. Implementation of educational innovation. Educ Forum 1976; 40: 347–70. [Google Scholar]
  • 2.Bopp M, Saunders RP, Lattimore D. The tug-of-war: fidelity versus adaptation throughout the health promotion program life cycle. J Prim Prev 2013; 34: 193–207. [DOI] [PubMed] [Google Scholar]
  • 3.Boruch RR, Gomez H. Sensitivity, bias, and theory in impact evaluation. Prof Psychol 1977; 8: 411–33. [Google Scholar]
  • 4.Calsyn R, Tornatzky LG, Dittmar S. Incomplete adoption of an innovation: the case of goal attainment scaling. Evaluation 1977; 4: 128–30. [Google Scholar]
  • 5.Dusenbury L, Brannigan R, Hansen WB. et al. Quality of implementation: developing measures crucial to understanding the diffusion of preventive interventions. Health Educ Res 2005; 20: 308–13. [DOI] [PubMed] [Google Scholar]
  • 6.Hansen WB. Introduction to the special issue on adaptation and fidelity. Health Educ 2013; 113: 260–3. [Google Scholar]
  • 7.Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clin Psychol Rev 1998; 18: 23–45. [DOI] [PubMed] [Google Scholar]
  • 8.McKleroy VS, Galbraith JS, Cummings B. et al. Adapting evidence-based behavioral interventions for new settings and target populations. AIDS Educ Prev 2006; 18: 59–73. [DOI] [PubMed] [Google Scholar]
  • 9.Dusenbury L, Brannigan R, Falco M. et al. A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Educ Res 2003; 18: 237–56. [DOI] [PubMed] [Google Scholar]
  • 10.Harshbarger C, Simmons G, Coelho H. et al. An empirical assessment of implementation, adaptation, and tailoring: the evaluation of CDC’s national diffusion of VOICES/VOCES. AIDS Educ Prev 2006; 18: 184–97. [DOI] [PubMed] [Google Scholar]
  • 11.Rohrbach LA, Grana R, Sussman S. et al. Type II translation: transporting prevention interventions from research to real-world settings. Eval Health Prof 2006; 29: 302–33. [DOI] [PubMed] [Google Scholar]
  • 12.Klimes-Dougan B, August GJ, Lee CYS. et al. Practitioner and site characteristics that relate to fidelity of implementation: the early risers prevention program in a going-to-scale intervention trial. Prof Psychol Res Pract 2009; 40: 467–75. [Google Scholar]
  • 13.Cunningham SD, Card JJ. Realities of replication: implementation of evidence-based interventions for HIV prevention in real-world settings. Implement Sci 2014; 9: 5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Moore J, Bumbarger B, Cooper BR. Examining adaptations of evidence-based programs in natural contexts. J Prim Prev 2013; 34: 147–61. [DOI] [PubMed] [Google Scholar]
  • 15.Lee CYS, August GJ, Realmuto GM. et al. Fidelity at a distance: assessing implementation fidelity of the early risers prevention program in a going-to-scale intervention trial. Prev Sci 2008; 9: 215–29. [DOI] [PubMed] [Google Scholar]
  • 16.Galbraith JS, Stanton B, Boekeloo B. et al. Exploring implementation and fidelity of evidence-based behavioral interventions for HIV prevention: lessons learned from the focus on kids diffusion case study. Health Educ Behav 2008; 36: 532–49. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Palinkas LA, Schoenwald SK, Hoagwood K. et al. ; Research Network on Youth Mental Health. An ethnographic study of implementation of evidence-based treatments in child mental health: first steps. Psychiatr Serv 2008; 59: 738–46. [DOI] [PubMed] [Google Scholar]
  • 18.Lemmens LC, Kerkkamp HE, van Klei WA. et al. Implementation of outpatient preoperative evaluation clinics: facilitating and limiting factors. BJA Br J Anaesth 2008; 100: 645. [DOI] [PubMed] [Google Scholar]
  • 19.Payne AA, Eckert R. The relative importance of provider, program, school, and community predictors of the implementation quality of school-based prevention programs. Prev Sci 2010; 11: 126–41. [DOI] [PubMed] [Google Scholar]
  • 20.Bearman SK, Weisz JR, Chorpita BF. et al. More practice, less preach? the role of supervision processes and therapist characteristics in EBP implementation. Adm Policy Ment Health 2013; 40: 518–29. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.DiClemente RJ, Wingood GM. A randomized controlled social skills trial: an HIV sexual risk-reduction intervention among young adult African-American women. JAMA 1995; 274: 1271–6. [PubMed] [Google Scholar]
  • 22.Aarons GA, Fettes DL, Flores LE. et al. Evidence-based practice implementation and staff emotional exhaustion in children’s services. Behav Res Ther 2009; 47: 954–60. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Kramer TL, Burns BJ. Implementing cognitive behavioral therapy in the real world: a case study of two mental health centers. Implement Sci 2008; 3: 14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Bernard HR. Research Methods in Anthropology: Qualitative and Quantitative Approaches. 5th ed. Lanham, MD: AltaMira, 2011. [Google Scholar]
  • 25.VERBI GmbH. MAXQDA: The Art of Data Analysis. Berlin: VERBI GmbH, 2011. [Google Scholar]
  • 26.Strauss A, Corbin JM. Basics of Qualitative Research: Grounded Theory and Procedures and Techniques. Newbury Park: Sage, 1990. [Google Scholar]
  • 27.Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychol Serv 2006; 3: 61–72. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Aarons GA. Mental Health Provider Attitudes toward Adoption of Evidence-Based Practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res 2004; 6: 61–74. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Ball S, Bachrach K, DeCarlo J. et al. Characteristics, beliefs, and practices of community clinicians trained to provide manual-guided therapy for substance abusers. J Subst Abuse Treat 2002; 23: 309–18. [DOI] [PubMed] [Google Scholar]
  • 30.Bellg AJ, Borrelli B, Resnick B. et al. Enhancing treatment fidelity in health behavior change studies: best practices and recommendations from the NIH Behavior Change Consortium. Health Psychol 2004; 23: 443–51. [DOI] [PubMed] [Google Scholar]
  • 31.Wingood GM, DiClemente RJ. The ADAPT-ITT Model: a novel method of adapting evidence-based HIV interventions. J Acquir Immune Defic Syndr 2008; 47: S40–6. [DOI] [PubMed] [Google Scholar]
  • 32.Bartholomew LK, Parcel GS, Kok G. Intervention mapping: a process for developing theory and evidence-based health education programs. Health Educ Behav 1998; 25: 545–63. [DOI] [PubMed] [Google Scholar]
  • 33.Glasgow R, Vogt T, Boles S. Evaluating the public health impact of health promotion interventions: The RE-AIM Framework. Am J Public Health 1999; 89: 1322–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Owczarzak JT, Dickson-Gomez J. Providers’ perceptions of and receptivity toward evidence-based HIV prevention interventions. AIDS Educ Prev 2011; 23: 105–17. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Van Daele T, Van Audenhove C, Hermans D. et al. Empowerment implementation: enhancing fidelity and adaptation in a psycho-educational intervention. Health Promot Int 2014; 29(2): 212–22. [DOI] [PubMed] [Google Scholar]
  • 36.Chorpita BF, Daleiden EL, Weisz JR. Modularity in the design and application of therapeutic interventions. Appl Prev Psychol 2005; 11: 141–56. [Google Scholar]
  • 37.Lightfoot M, Rotheram-Borus MJ, Tevendale H. An HIV-preventive intervention for youth living with HIV. Behav Modif 2007; 31: 345–63. [DOI] [PubMed] [Google Scholar]
  • 38.Fuller TR, Brown M, King W. et al. The SISTA pilot project: understanding the training and technical assistance needs of community-based organizations implementing HIV prevention interventions for African American women–implications for a capacity building strategy. Women Health 2007; 46: 167–86. [DOI] [PubMed] [Google Scholar]
  • 39.Dworkin SL, Pinto RM, Hunter J. et al. Keeping the spirit of community partnerships alive in the scale up of HIV/AIDS prevention: critical reflections on the roll out of DEBI (Diffusion of Effective Behavioral Interventions). Am J Community Psychol 2008; 42: 51–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Prather C, Fuller TR, King W, et al. Diffusing an HIV prevention intervention for African American women: integrating Afrocentric components into the SISTA diffusion strategy. AIDS Educ Prev 2006; 18(Suppl A): 149–60. [DOI] [PubMed] [Google Scholar]
  • 41.Sapiano TN, Moore A, Kalayil EJ. et al. Evaluation of an HIV prevention intervention designed for African American Women: results from the SISTA community-based organization behavioral outcomes project. AIDS Behav 2013; 17: 1052–67. [DOI] [PMC free article] [PubMed] [Google Scholar]