ABSTRACT
Effective implementation strategies are needed to improve the adoption of evidence-based psychotherapy in primary care settings. This study provides pilot data from a test of an implementation strategy conducted as part of a multisite randomized controlled trial comparing a brief cognitive-behavioral therapy with usual care for medically ill patients in primary care, using a hybrid (type II) effectiveness/implementation design. The implementation strategy was multifaceted and included (1) modular-based online clinician training, (2) treatment fidelity auditing with expert feedback, and (3) internal and external facilitation to provide ongoing consultation and support of practice. Outcomes included descriptive and qualitative data on the feasibility and acceptability of the implementation strategy, as well as initial indicators of clinician adoption and treatment fidelity. Results suggest that a comprehensive implementation strategy to improve clinician adoption of a brief cognitive-behavioral therapy in primary care is feasible and can achieve high levels of adoption and fidelity.
KEYWORDS: Primary care, Hybrid effectiveness-implementation designs, Cognitive behavioral therapy, Veterans, Anxiety, Depression
BACKGROUND
Recent national healthcare reform (i.e., The Patient Protection and Affordable Care Act of 2010) encourages expansion of patient-centered medical homes, providing unparalleled opportunities for integrated behavioral health care [1]. The Veterans Health Administration (VA) has been a leader in this effort [2], with Primary Care-Mental Health Integration (PC-MHI) services mandated as part of its mental health treatment standards [3]. However, behavioral health providers in both VA and non-VA primary care settings are struggling to integrate evidence-based psychotherapies into their current practice patterns [4]. More work is needed to improve adoption and dissemination of evidence-based psychotherapies in the primary care arena.
With mental health treatment increasingly delivered in primary care, it is important to consider factors that affect adoption of evidence-based psychotherapies (EBPs). Traditional evidence-based approaches, such as cognitive-behavioral therapies (CBT) for depression and anxiety [5, 6], are unlikely to be adopted in primary care [4] because of treatment length (12–16 sessions) and session duration (45–50 min). Additionally, traditional evidence-based approaches are often comprehensive, including detailed assessment, case conceptualization, and treatment focused on broad mental health outcomes. To adapt these EBPs for the primary care setting, clinicians often use only selected components of traditional EBPs [4] over a briefer course of treatment (e.g., three to seven sessions) [2, 4].
To address the emerging primary care mental health practice needs, abbreviated psychotherapies such as brief CBT (bCBT) and problem-solving therapy for depression and anxiety have been developed. Three recent systematic reviews and meta-analyses found a moderate level of support for brief psychotherapies [7–9]. Emerging support also exists for bCBT targeting depression, anxiety, and physical health quality of life in medically ill primary care patients [10].
Although these brief psychotherapies for primary care settings demonstrate a moderate level of efficacy [7–9], their adoption into clinical practice has been slow [4]. This is unsurprising, given the challenges of encouraging providers and systems to adopt other evidence-based psychotherapies [11]. Clinician knowledge, motivation, and perceived consistency of the intervention practices with professional scope of practice are important adoption characteristics [12] but are likely insufficient for adoption in frontline practice. For example, the literature has shown that, even for clinicians who self-evaluate as competent in delivering a particular treatment (i.e., cognitive-behavioral therapy), additional support is often necessary to ensure fidelity of treatment delivery [13]. Therefore, effective implementation strategies that use generalizable methods, improve knowledge and skill, and seek to facilitate and increase adoption of practice changes [14] are likely necessary to prevent the unfortunate and common experience of empirically supported treatments failing to translate from clinical trials to clinical practice [15].
Implementation strategies involving multiple components targeting patient, provider, and system-level factors, tailored uniquely to an individual site, are more effective than a singular, generic strategy at increasing adoption, intervention fidelity, and sustainability [16–18]. Common components of effective implementation strategies include early and continual engagement of leadership, engagement and support of the clinicians targeted for change (e.g., psychologists, social workers, physician assistants), performance monitoring with feedback, system changes, administrative and technical support, and a conceptual model to guide the implementation process [19]. Additionally, the same clinical interventions that promote change in psychotherapy can be used as implementation strategies to promote practice change. For example, positive reinforcement, problem solving, motivational interviewing, "nonspecific factors" that promote collaborative relationships, and other strategies can be used in clinician facilitation [19–21]. Facilitation, that is, the use of strategies uniquely tailored and applied at the right time in response to a site's needs to promote practice change, has demonstrated significant importance in prior implementation efforts [14, 19, 22]. Facilitators may be persons either external or internal to the implementation site; in either case, they perform similar tasks (e.g., problem-solving support) [14] and are especially effective in combination for multisite implementation efforts [14, 23, 24]. External facilitation alone has been identified as a particularly potent implementation strategy in complex healthcare settings. External facilitators are persons affiliated with the research study team (or some other outside entity) who monitor and respond to the unique implementation needs of the site as they occur, working collaboratively with site personnel by providing real-time coaching and problem-solving support [13, 14, 19].
However, the implementation literature remains in its infancy, and more research is needed to empirically develop and guide the use of such strategies.
This report provides data from a pilot study examining the feasibility and acceptability of a multifaceted implementation strategy embedded within an ongoing, large, multisite, randomized bCBT trial for medically ill patients with symptoms of depression and/or anxiety in an integrated primary care setting. Preliminary treatment fidelity and adoption outcomes are provided as additional measures of the feasibility of this implementation strategy.
METHODS
This pilot study was conducted as part of a larger, ongoing randomized controlled trial employing a hybrid (type II) effectiveness/implementation design to compare bCBT and usual care for medically ill primary care patients with significant anxiety and/or depression at two large VA hospitals (Houston VA Medical Center [HOU] and Oklahoma City VA Medical Center [OKC]). The study was approved and monitored for compliance with ethical research practices by the Institutional Review Boards at both study sites. Outcomes include patient factors, including depression, anxiety, and physical disease quality of life, as well as implementation outcomes related to treatment engagement, adherence, and fidelity [25]. In brief, the study is recruiting 320 patients with functionally impairing chronic obstructive pulmonary disease (COPD; Medical Research Council dyspnea scale [26], cutoff score ≥3) and/or chronic heart failure (CHF; New York Heart Association functional classification [27], cutoff score ≥2) who report clinically significant symptoms of depression (Patient Health Questionnaire-9 [28], cutoff score ≥10) and/or anxiety (Beck Anxiety Inventory [29], cutoff score ≥16). Participants are excluded only for patient factors (e.g., cognitive disorders) or clinical factors (e.g., concurrent psychotherapy) that render a bCBT intervention or treatment in a primary care setting inappropriate. Eligible participants are randomly assigned to either (1) a usual care control condition, in which they receive feedback about their elevated depression and/or anxiety symptoms and relevant educational materials, or (2) a bCBT condition, namely, Adjusting to Chronic Conditions with Education, Support, and Skills (ACCESS), delivered by PC-MHI providers trained in bCBT. Study staff directly solicited clinician participation at both study sites through attendance at PC-MHI staff meetings and through individual contacts and meetings with PC-MHI leadership and staff.
Prior to participation, clinicians signed a written informed consent form that explicitly stated that information about clinicians’ performance would not be shared with supervisors or other clinic staff. Readers interested in additional details about patient-recruitment procedures for this ongoing trial are referred to Cully and colleagues [25].
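The inclusion thresholds above can be sketched as simple screening logic. This is an illustrative sketch only, not study code; the function and parameter names are hypothetical, and the exclusion criteria (e.g., cognitive disorders, concurrent psychotherapy) are not modeled.

```python
def is_eligible(mrc_dyspnea=None, nyha_class=None, phq9=None, bai=None):
    """Return True if a patient meets the ACCESS inclusion thresholds.

    Requires functionally impairing COPD (MRC dyspnea scale >= 3) and/or
    CHF (NYHA functional class >= 2), plus clinically significant
    depression (PHQ-9 >= 10) and/or anxiety (BAI >= 16).
    """
    # Medical criterion: either disease at or above its severity cutoff
    medical = (mrc_dyspnea is not None and mrc_dyspnea >= 3) or \
              (nyha_class is not None and nyha_class >= 2)
    # Symptom criterion: either measure at or above its clinical cutoff
    symptoms = (phq9 is not None and phq9 >= 10) or \
               (bai is not None and bai >= 16)
    return medical and symptoms

# A COPD patient with MRC 4 and PHQ-9 of 12 meets criteria
assert is_eligible(mrc_dyspnea=4, phq9=12)
# A CHF patient (NYHA 3) without elevated symptoms does not
assert not is_eligible(nyha_class=3, phq9=5, bai=8)
```

Note that both the medical and symptom criteria are disjunctive (COPD and/or CHF; depression and/or anxiety), but a patient must satisfy one of each.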
The ACCESS Intervention
ACCESS is a modular bCBT delivered over six 30- to 45-min sessions and two 10- to 15-min telephone booster sessions (see Cully et al. [30]). A clinician manual and companion patient workbook are used to structure and facilitate treatment. Additionally, patients receive disease-specific (COPD or CHF) educational materials. Following the first session, patients can complete subsequent sessions by telephone. Treatment begins with two core sessions intended to introduce bCBT, allow patient–provider relationship building, and tailor treatment to each patient. During sessions 3–5, patients complete three of four elective self-management modules: (A) exercise, nutrition, and managing a chronic health care condition; (B) using thoughts to improve wellness; (C) increasing pleasant events; and (D) learning how to relax. The final treatment session reviews and consolidates treatment gains.
The implementation strategy
The implementation strategy was designed to increase the acceptability, adoption, and fidelity of ACCESS as used by PC-MHI clinicians. Its development was informed by the Promoting Action on Research Implementation in Health Services (PARIHS) framework [31, 32]. The PARIHS framework describes successful implementation as a function of three facets: the type and nature of evidence, the contextual qualities of the environment, and the way in which implementation is facilitated. These facets provided the foundation for acquiring information from patients, clinicians, and clinic leaders to inform the development of the implementation strategy (for additional detail, see Cully et al. [25]). Additionally, a Study Advisory Council comprising multiple VA PC-MHI program stakeholders (i.e., a team of VA researchers, clinicians, clinical managers, and regional and national mental health leaders) provided early and ongoing feedback about the implementation strategy, which comprised three main components (see Table 1): online clinician training, treatment-fidelity auditing and feedback, and internal and external facilitation to provide ongoing consultation and support of practice.
Table 1.
| Implementation intervention | Description |
|---|---|
| ACCESS (bCBT) online clinician training (www.vaprojectaccess.org) | Comprehensive multimedia bCBT training program. Optional print-based foundational materials. Module-based organization to allow flexibility based on clinicians' self-assessed needs. Real-time feedback about comprehension of skills/techniques. |
| Audit and feedback | Treatment fidelity audit and feedback of session audio recordings from a bCBT expert. Measured using a standard rating scale (i.e., Adherence and Competency Evaluation (ACE) rating forms [34]). Provided for clinicians' first study participants and regularly thereafter. Increased audit and feedback reserved for ratings falling below adequate performance, as determined by ACE rating forms. |
| Facilitation | External facilitators (i.e., members of the project staff) regularly engaged study clinicians and clinic leadership through formal (e.g., regular group meetings) and informal (e.g., clinicians emailing study staff about concerns) methods of communication. Internal facilitators (i.e., the PC-MHI directors) addressed site-specific clinician and system concerns collaboratively with external facilitators to promote adoption. |
bCBT brief cognitive-behavioral therapy, PC-MHI Primary Care-Mental Health Integration
Online CBT training with assessment and feedback
An online ACCESS training program was made available to clinicians before and during the trial (www.vaprojectaccess.org) [33]. Soon after the current trial began, we decided to increase the flexibility of online training requirements to allow each clinician to engage in all or only targeted areas of training at her/his discretion, in light of varying degrees of experience in providing bCBT to medically ill primary care patients. Online training included a modular multimedia platform of narrated PowerPoint™ slideshows, expert modeling through audio case examples, and exit quizzes to evaluate and provide real-time feedback about comprehension of skills/techniques. Each ACCESS treatment session (i.e., session 1, session 2, the three elective self-management modules, and session 6) was available as an individual module. In total, the online training was expected to take approximately 8 h to complete. Consequently, up to eight continuing education credits were available to clinicians for time spent engaging in the online training. Optional print-based “Concept Review” materials provided foundational knowledge for less-experienced clinicians. These documents reviewed COPD, CHF, the foundations of CBT, psychotherapy and the importance of the therapeutic relationship, and ways to use a therapy manual.
Audit and feedback of treatment adherence and competence
To measure and improve clinician treatment fidelity, or the degree to which the intervention was delivered as intended, a bCBT expert clinician (external to the PC-MHI clinics) audited the session audio recordings of each clinician's first ACCESS patient. The Adherence and Competency Evaluation (ACE) rating forms (see example in Fig. 1) provided the formal rating system [34] used to evaluate session audio recordings. Written and/or verbal feedback was typically provided by the bCBT expert following the second treatment session and again after the final active treatment session. Audit and feedback then occurred regularly, with two to four randomly extracted session recordings reviewed at 4- to 6-month intervals. If ACE adherence and/or skillfulness scores fell below the acceptable standard (a rating of 4 on one or both of the 0–8 scales for overall adherence and skillfulness; see Fig. 1), the frequency of audit and feedback increased until that clinician's ACE ratings improved to 5 or better on both primary indices. To mimic real-world conditions and allow a richer understanding of practice behaviors and response to feedback, no clinicians were removed from the trial.
Internal and external facilitation to provide ongoing consultation and support of practice
A blended model of internal facilitators (PC-MHI directors or clerical staff not affiliated with the study who provided guidance to other PC-MHI clinicians) and external facilitators (study clinical staff members interacting with PC-MHI clinicians) was used to improve ACCESS adoption and treatment fidelity. External facilitation, the primary facilitation method, began with structured introductory materials for all clinicians, including a packet containing treatment materials (e.g., a patient workbook), other procedural documents, and access to the training website. External facilitation efforts were both formal and informal. While external facilitation varied slightly between the two intervention sites, the facilitator attempted to be responsive to each site's unique needs. Common methods of external facilitation across sites included regular individual or group meetings with clinicians (group facilitation was made available approximately every other week at HOU and every 2–3 months at OKC) and more informal support (i.e., one-on-one phone or in-person support), initiated by the study team or by the clinician and available as needed. At HOU, consultation on individual cases was generally provided through informal support, whereas group facilitation meetings were often used to address practice barriers and/or logistical concerns as they arose. At OKC, group facilitation meetings addressed these same issues and also often included consultation on individual cases. At both sites, group meetings were structured so that clinicians received general study updates and clinical practice recommendations identified by the audit-and-feedback expert reviewer, with most of the time spent allowing clinicians to share challenges and successes with one another.
Facilitators regularly monitored the activity of ACCESS study patients (including factors like wait times, treatment attendance, time between sessions, the focus of each treatment session per chart review, and number of study participants in each clinician’s panel), and intervened where possible to streamline and improve clinical processes. Clinicians also received feedback from external facilitators about study progress (e.g., number of study participants engaged in treatment, average number of study patients assigned to each clinician) individually and as a group to promote a collaborative relationship.
Facilitation is central to the PARIHS framework [31] and, put simply, describes efforts to make things easier for others (p. 152) [35]. Facilitation efforts were primarily external. However, internal facilitators, namely, the PC-MHI clinic directors at both sites, were engaged early on and routinely thereafter by external facilitators to collaboratively address clinician, organizational, and procedural concerns. While not directly monitored or evaluated in this pilot trial, external facilitators sought to use behaviors consistent with motivational interviewing (e.g., collaboration, acceptance, compassion, and evocation [21]) to assist PC-MHI staff in moving toward using bCBT in their daily practice.
Measurement of preliminary implementation outcomes
Measures of the adoption, fidelity, and feasibility/acceptability of ACCESS were collected and used to evaluate implementation outcomes. Adoption was operationally defined as the uptake of bCBT by sites and clinicians [36], with associated variables including the percentage of clinicians who engaged in training, completed training, and provided the protocol treatment. Descriptive data on clinician characteristics and training background were collected with a pretraining questionnaire (see Table 2), while qualitative (i.e., clinicians' open-ended feedback on a posttraining questionnaire) and quantitative (i.e., ACE treatment fidelity ratings for each clinician's first study patient and clinician responses to Likert-scale questions on a posttraining questionnaire) data were used to evaluate implementation. The posttraining questionnaire assessed clinicians' views on the feasibility and acceptability of the implementation strategy, using questions evaluating clinician engagement and satisfaction with the three components of the strategy (see Table 1) after approximately 4 months of study enrollment. Finally, clinician use of the multiple implementation components provides additional data about their benefit and feasibility.
Table 2.
| Characteristic | Category | Entire sample (N = 9) |
|---|---|---|
| Type of mental health provider | Psychologist | 4 (44.4 %) |
| | Postdoctoral psychology fellow | 3 (33.3 %) |
| | Predoctoral psychology intern | 1 (11.1 %) |
| | Physician assistant | 1 (11.1 %) |
| Time as a mental health provider | <1 year | 4 (44.4 %) |
| | 1–3 years | 2 (22.2 %) |
| | 4–5 years | 3 (33.3 %) |
| Time affiliated with PC-MHI | <1 year | 4 (44.4 %) |
| | 1–2 years | 5 (55.6 %) |
| PC-MHI hours per week | <20 h | 4 (44.4 %) |
| | 30–39 h | 1 (11.1 %) |
| | 40 h | 4 (44.4 %) |
| Advanced training in CBT | Yes | 7 (77.8 %) |
| Self-assessed adequate training and competence in CT or CBT | Yes | 8 (88.9 %) |
| Self-assessed adequate training and competence in BT or BAT | Yes | 4 (44.4 %) |
| Length of time spent conducting CBT in a staff position | <1 year or never | 5 (55.6 %) |
| | 1–3 years | 1 (11.1 %) |
| | 4–5 years | 3 (33.3 %) |
CBT cognitive-behavioral therapy, CT cognitive therapy, BT behavioral therapy, BAT behavioral activation therapy
RESULTS
Preliminary adoption outcomes
Both HOU and OKC VA primary care clinics agreed to participate in the study, with enrollment of clinicians beginning in March and September of 2011, respectively. Descriptive data from the pretraining survey for the nine clinicians who agreed to participate without delay (N = 9) are reported in Table 2. Across sites, five of the seven PC-MHI staff clinicians who typically provide psychotherapy agreed to participate and accept one or two ACCESS patients monthly. The two clinicians who declined initial participation cited current workload concerns, but indicated a willingness to participate in the future. Additionally, three postdoctoral psychology fellows and one predoctoral psychology intern consented to participate. Finally, the study also targeted PC-MHI clinicians who had a scope of practice that included psychotherapy but who were not currently providing such care. Consequently, two additional PC-MHI clinicians who did not routinely provide psychotherapy as part of their PC-MHI clinic were approached, with one additional clinician deciding to participate.
Preliminary fidelity outcomes
Each clinician had an average of four audio-recorded sessions from his/her initial patient (range = 3–6) reviewed and evaluated. On average, the bCBT expert provided feedback one to three times (M = 1.5) for each clinician's initial ACCESS patient. While two clinicians each had one session that was not scored for reasons unrelated to treatment fidelity (e.g., a partially inaudible recording), all other sessions were rated at or above the minimally acceptable standard for treatment adherence and skillfulness (see Table 3). On average, clinicians received "good" adherence (range = "moderate" to "very good/excellent") and skillfulness (range = "moderate" to "very good/excellent") ratings.
Table 3.
| | Session 1: Chronic disease and stress (n = 9) | Session 2: Personal impact and increasing control (n = 9) | Elective A: Taking control of physical health (n = 5) | Elective B: Improving thoughts (n = 3) | Elective C: Increasing activity (n = 3) | Elective D: Learning to relax (n = 3) | Session 6: Review (n = 3) | Across sessions |
|---|---|---|---|---|---|---|---|---|
| Adherence | 6.67 (0.50) | 7.00 (0.82) | 5.80 (1.64) | 6.75 (0.50) | 6.67 (1.15) | 7.33 (0.58) | 6.67 (1.53) | 6.68 (0.98) |
| Skill | 6.22 (0.67) | 6.14 (0.90) | 6.20 (0.84) | 6.25 (0.50) | 6.00 (1.00) | 6.33 (0.58) | 6.33 (2.08) | 6.21 (0.84) |
ACE Adherence and Skill ratings are scored on a 0–8 scale, with anchors on 0 (very poor), 2 (poor), 4 (moderate), 6 (good), and 8 (very good/excellent); each ACE rating provides average scores with standard deviations in parentheses
Feasibility and acceptability
Online CBT training with assessment and feedback
About half (n = 4) of the clinicians completed all training modules, and all (n = 9) completed at least one. Most reported spending between 4 and 6 h (n = 4) completing online training; however, two spent more than 10 h. Although most completed the training at work (n = 5) and felt that its flexibility allowed little disruption of their overall job responsibilities (n = 5), most critiqued the online training as too long or too detailed (n = 6). Nevertheless, except for one who felt neutral, clinicians (n = 8) thought the online training was important for their training and skill development. Although most (n = 8) indicated that they would recommend this training for clinician coworkers and/or trainees, one stated that, because it was too detailed and lengthy, she would not recommend the program “as is.”
Other critiques included frustration with technology difficulties (i.e., lengthy time for a module to load on work computers) and advanced CBT clinicians’ lower self-assessed need for training. Reported benefits of online training included its flexibility as to when and where it could be completed, adaptability to clinician needs, and tailoring of content toward bCBT for patients with COPD and/or CHF. In addition, clinicians rated “user friendliness” of the training as fair (n = 1), good (n = 6), or very good (n = 2) on a five-point Likert scale anchored by poor and excellent.
Clinicians also provided feedback about the impact of provider and patient print-based materials. All felt the manual was important for their knowledge and skill development. Similarly, most viewed the patient workbook as important (n = 7), although one did not, and another was indifferent about its impact. While three found the patient educational materials on COPD and CHF helpful for their knowledge and skill development, one did not, and a majority were indifferent about their impact (n = 5).
Ongoing consultation and support of practice
External facilitation largely focused on enhancing practice and utilization of bCBT through discussions about practice barriers and facilitators. On the basis of feedback from the clinician engagement and satisfaction survey, external facilitation also provided encouragement and appreciation of participating clinicians. Clinicians at both sites endorsed feeling supported in their efforts to apply ACCESS.
Both individual and group facilitation were provided. At HOU, all but one clinician attended at least one group facilitation meeting with study staff and other ACCESS clinicians, although scheduling conflicts were a barrier to attendance for three. HOU clinicians attended an average of 2.8 group meetings each (range = 1–5) in the 4 months following completion of online training. At OKC, all three clinicians attended one or two group facilitation meetings over the same period. Informal external facilitation was also provided individually (in person, by phone, or by email) to clinicians at both sites as needed. On the posttraining questionnaire, all clinicians viewed the ability to meet with ACCESS investigators and other ACCESS clinicians as an important part of their training and professional development, and most (n = 6) felt that the frequency of facilitation meetings was appropriate. Free-text comments from a few clinicians suggest that they experienced external facilitation as responsive, accommodating, and encouraging. One appreciated that study staff "…were exceedingly polite, cheerful, and appreciative of the work clinicians put in."
Audit and feedback of treatment adherence and competence
Most clinicians (n = 7) viewed receiving session feedback from a CBT expert as an important part of ACCESS training and professional development, although two felt neutral about it. Although most indicated that audit and feedback were positive and valuable, some advanced CBT clinicians (n = 2) viewed it as less beneficial, given their existing competence in CBT. Overall, nearly all (n = 8) viewed the expert feedback training component as “good” or better; one rated it as “fair.”
DISCUSSION
PC-MHI clinicians often struggle to incorporate evidence-based interventions into their clinical practice [4]. Pilot testing of our multicomponent implementation strategy suggests that providing clinicians with support above and beyond initial training improves their ability to do so. The implementation strategy piloted here proved acceptable and feasible and led to high levels of intervention fidelity.
Implementation successes
Our current findings further support the need for a multilevel, adaptive implementation strategy capable of responding to individual clinics and clinicians. For example, differences in scheduling procedures and number of patient encounters required tailoring scheduling procedures to each clinic. In OKC, clinicians had no difficulty contacting and scheduling newly assigned ACCESS patients and usually did so on a session-by-session basis. In contrast, at the HOU VA, high demand for services often congested clinician schedules, making weekly treatment sessions difficult. To address this problem, external facilitators worked with the PC-MHI director and clerical staff to enact a coordinated scheduling effort in which all of a patient's ACCESS treatment sessions were prescheduled upon entry into the bCBT intervention (also referred to as multibooking [37]). It is important to note that the PC-MHI director was highly invested in our study; this investment was essential to implementing this scheduling solution.
Consistent with the recommendations of Stetler and colleagues [14], external facilitation was critical to our implementation strategy. Qualitative feedback from clinicians suggests that external facilitation led to development of a partnership between clinical and study staff that fostered improved delivery and clinician investment in implementation of this bCBT in primary care. In addition, our early engagement of stakeholders forged a collaborative working relationship with leadership and clinicians at each site, provided a solid foundation that allowed us to better anticipate and respond to implementation challenges, and was essential to our current success.
Possible areas for refinement
Clinician feedback regarding satisfaction suggests the online training and the frequency of audit and feedback may need to be more flexible for advanced CBT clinicians. A screening process might help determine which online skill modules would best serve each clinician's unique training needs, in that experienced clinicians may require less exposure to foundational CBT training components. Similarly, the audit-and-feedback component would likely benefit from greater flexibility and less oversight for experienced clinicians rather than a one-size-fits-all approach to treatment fidelity. Moreover, because this approach to measuring treatment fidelity requires a substantial investment of time (approximately 3.5 h per clinician in the current trial) and resources, ongoing and comprehensive feedback may prove difficult to sustain over the longer term. Future research and implementation efforts should explore alternate fidelity procedures, possibly including time-limited and/or tailored approaches based on clinician expertise. Without a process to ensure minimum treatment fidelity, long-term sustainability of treatment implementation is questionable.
The ACCESS program was constructed with the ultimate goal of improving adoption and use of bCBT in primary care that would be feasible for wider dissemination [16]. As was seen within the trial, implementation strategies are likely to work best when they are consistent with existing clinical practices. Referral and scheduling adaptations made as part of the current trial may be unique to the clinical settings of the current trial and may also prove difficult to implement more broadly. Lastly, adoption of the ACCESS treatment among behavioral health providers within integrated primary care teams would likely benefit from process mechanisms promoting efforts to improve relational coordination and reciprocal learning among all members of patient-aligned care teams during implementation [38].
Limitations and future directions
While these results support the feasibility and acceptability of this piloted implementation strategy, future trials are needed to compare implementation strategies and to test similar implementation approaches across larger and more diverse clinician populations and settings. Several study limitations are important to note, including the small clinician sample size, the lack of blinded ratings for audit and feedback, the absence of a control/comparison group, and potentially limited generalizability to sites with different primary care mental health clinic practices and procedures. It is also important to note that almost all clinicians in our sample rated themselves as competent in CBT prior to training. Given the importance of tailoring implementation strategies to the needs of the intended stakeholder group, our sample may represent a set of highly trained clinicians, limiting the generalizability of findings to settings with clinicians who lack prior CBT training. Although prior research suggests that clinician self-evaluations of competency require confirmation [12, 13], these initial competencies may have improved the “uptake” of the implementation strategy. Even so, the ACCESS intervention was a challenging practice enhancement for almost all clinicians, given its focus on medically ill veterans, the need to use a manualized treatment, and the focused and brief nature of the weekly individual psychotherapy sessions, which were infrequently used in the primary care setting. These facets of the intervention meant that even clinicians with prior CBT experience were challenged to leverage that knowledge to implement the ACCESS intervention successfully. For clinicians new to CBT, this meant additional work up front to grasp the fundamentals of CBT while also navigating these implementation challenges.
Future studies in other primary care clinics with other types of clinicians may provide important data to identify the active ingredients of the implementation approach. Dismantling trials could also answer many questions raised during the current trial, for example: (1) Which facets of the implementation strategy were most potent and/or necessary? (2) Do certain types of PC-MHI clinicians benefit more than others from training and facilitation? (3) What is the optimal “dose” of these components for each type of provider? Additionally, to determine the generalizability of these findings, this implementation strategy would need to be tested in non-VA primary care clinic settings. Finally, future trials should test strategies aimed at maintaining clinician use and treatment fidelity following successful implementation. To this end, research is needed to examine how practices such as audit and feedback could be embedded into routine clinical practice. For example, should responsibility for fidelity ratings be assigned to supervisors and incorporated into staff performance appraisals, or are these practices best left to an external expert who is not connected with a clinician’s performance appraisal? Such distinctions are important and may determine whether clinicians view audit and feedback as a “job metric” or as a professional development opportunity.
In summary, this pilot investigation provides preliminary evidence for the success of a multifaceted implementation strategy in improving bCBT use in the primary care setting. Based on participant feedback, clinician engagement was closely tied to perceiving participation as an opportunity for professional development and enhancement. Furthermore, the consistency of the clinical intervention with the participating clinicians’ scope of professional practice was likely a significant factor in overcoming barriers and facilitating use of the treatment in primary care. Further trials are needed to determine how best to refine and disseminate implementation strategies that increase the use of evidence-based psychotherapies.
Acknowledgments
The authors would like to thank Ms. Sonora Hudson and Ms. Emma Welch for their thoughtful review and editing of this manuscript.
Funding
This material is based upon work supported by the Department of Veterans Affairs (HSR&D grant IIR 09-088). It was also partly supported by the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development, the Center for Innovations in Quality, Effectiveness and Safety (CIN 13-413), and the South Central Mental Illness, Research, Education, and Clinical Center. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs, the US government, or Baylor College of Medicine. None of these bodies played a role in study design; in the collection, analysis and interpretation of data; in the writing of the report; or in the decision to submit the article for publication.
Footnotes
Trial registration: NCT01149772 at http://www.clinicaltrials.gov/ct2/show/NCT01149772
Implications
Practice: Comprehensive implementation strategies hold the potential to improve clinician comfort and skill using standardized brief evidence-based psychotherapies delivered within the primary care setting.
Policy: Resources should be allocated to support multicomponent implementation strategies, including facilitative efforts to provide ongoing consultation and support of practice, to improve the adoption and fidelity of evidence-based psychotherapy practices in primary care settings.
Research: Research efforts should continue to test multicomponent implementation strategies to advance best practices for improving the utilization of evidence-based mental health treatments in primary care settings.
References
- 1.Nash JM, McKay KM, Vogel ME, Masters KS. Functional roles and foundational characteristics of psychologists in integrated primary care. J Clin Psychol Med Settings. 2012;19:93–104. doi: 10.1007/s10880-011-9290-z. [DOI] [PubMed] [Google Scholar]
- 2.Wray LO, Szymanski BR, Kearney LK, McCarthy JF. Implementation of primary care-mental health integration services in the Veterans Health Administration: program activity and association with engagement in specialty mental health services. J Clin Psychol Med Settings. 2012;19:105–116. doi: 10.1007/s10880-011-9285-9. [DOI] [PubMed] [Google Scholar]
- 3.Veterans Health Administration. Uniform Mental Health Services in VA Medical Centers and Clinics (VHA Handbook 1160.01). Washington: Department of Veterans Affairs; 2008.
- 4.Funderburk JS, Sugarman DE, Labbe AK, Rodrigues A, Maisto SA, Nelson B. Behavioral health interventions being implemented in a VA primary care system. J Clin Psychol Med Settings. 2011;18:22–29. doi: 10.1007/s10880-011-9230-y. [DOI] [PubMed] [Google Scholar]
- 5.National Institute for Health and Clinical Excellence (NICE). Depression. The Treatment and Management of Depression in Adults (updated edition) [CG90]. London: National Institute for Health and Clinical Excellence; 2009.
- 6.Stewart RE, Chambless DL. Cognitive behavioral therapy for adult anxiety disorders in clinical practice: a meta-analysis of effectiveness studies. J Consult Clin Psychol. 2009;77(4):595–606. doi: 10.1037/a0016032. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Cuijpers P, van Straten A, van Schaik A, Andersson G. Psychological treatment of depression in primary care: a meta-analysis. Br J Gen Pract. 2009;59:e51–e60. doi: 10.3399/bjgp09X395139. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Cape J, Whittington C, Buszewicz M, Wallace P, Underwood L. Brief psychological therapies for anxiety and depression in primary care: Meta-analysis and meta-regression. BMC Medicine. 2010;8:38. doi: 10.1186/1741-7015-8-38. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.Nieuwsma JA, Trivedi RB, McDuffie J, Kronish I, Benjamin D, Williams JW., Jr . Brief Psychotherapy for Depression in Primary Care: A Systematic Review of the Evidence (VA-ESP Project #09-010) Washington: Department of Veterans Affairs, Veterans Health Administration; 2011. [PubMed] [Google Scholar]
- 10.Cully JA, Stanley MA, Deswal A, Hanania NA, Phillips LL, Kunik ME. Cognitive-behavioral therapy for chronic cardiopulmonary conditions: preliminary outcomes from an open trial. Prim Care Companion J Clin Psychiatry. 2010;12(4). [DOI] [PMC free article] [PubMed]
- 11.McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments: a review of current efforts. Am Psychol. 2010;65:73–84. doi: 10.1037/a0018121. [DOI] [PubMed] [Google Scholar]
- 12.Cook JM, Schnurr PP, Biyanova T, Coyne JC. Apples don’t fall far from the tree: influences on psychotherapists’ adoption and sustained use of new therapies. Psychiatr Serv. 2009;60:671–676. doi: 10.1176/appi.ps.60.5.671. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13.Kauth MR, Sullivan G, Blevins D, et al. Employing external facilitation to implement cognitive behavioral therapy in VA clinics: A pilot study. Implement Sci. 2010;5:75. doi: 10.1186/1748-5908-5-75. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Stetler CB, Legro MW, Rycroft-Malone J, et al. Role of “external facilitation” in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006;1:23. doi: 10.1186/1748-5908-1-23. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Glasgow RE, Lichtenstein E, Marcus AC. Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93(8):1261–1267. doi: 10.2105/AJPH.93.8.1261. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovation in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629. doi: 10.1111/j.0887-378X.2004.00325.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Grol R, Wensing M, Eccles M, editors. Improving Patient Care: The Implementation of Change in Clinical Practice. Oxford: Elsevier; 2005. [Google Scholar]
- 18.Stetler CB. Stetler Model. In: Rycroft-Malone J, Bucknall T, editors. Evidence-Based Practice Series. Models and Frameworks for Implementing Evidence-Based Practice: Linking Evidence to Action. Oxford: Wiley-Blackwell; 2010. pp. 51–82. [Google Scholar]
- 19.Kauth MR, Sullivan G, Cully J, Blevins D. Facilitating practice changes in mental health clinics: a guide for implementation development in health care systems. Psychol Serv. 2011;8(1):36–47. doi: 10.1037/a0022250. [DOI] [Google Scholar]
- 20.Meichenbaum D, Turk DC. Facilitating Treatment Adherence: A Practitioner’s Guidebook. New York: Plenum; 1987. [Google Scholar]
- 21.Miller WR, Rollnick S. Motivational Interviewing: Preparing People for Change. 3. New York: Guilford; 2012. [Google Scholar]
- 22.Harvey G, Loftus-Hills A, Rycroft-Malone J, et al. Getting evidence into practice: the role and function of facilitation. J Adv Nurs. 2002;37(6):577–588. doi: 10.1046/j.1365-2648.2002.02126.x. [DOI] [PubMed] [Google Scholar]
- 23.Kirchner J, Edlund C, Henderson K, Daily L, Parker LE, Fortney JC. Using a multilevel approach to implement a Primary Care Mental Health (PCMH) program. Fam Syst Health. 2010;28:161–174. doi: 10.1037/a0020250. [DOI] [PubMed] [Google Scholar]
- 24.Sullivan G, Blevins D, Kauth MR. Translating clinical training into practice in complex mental health systems: toward opening the “black box” of implementation. Implement Sci. 2008;3:33. doi: 10.1186/1748-5908-3-33. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Cully JA, Armento MEA, Mott J, et al. Brief cognitive behavioral therapy in primary care: a hybrid type 2 patient randomized effectiveness-implementation design. Implement Sci. 2012;7(1):64. doi: 10.1186/1748-5908-7-64. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Fletcher CM, Elmes PC, Fairburn MB, Wood CH. The significance of respiratory symptoms and the diagnosis of chronic bronchitis in a working population. BMJ. 1959;2(5147):257–266. doi: 10.1136/bmj.2.5147.257. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.New York Heart Association. Diseases of the Heart and Blood Vessels: Nomenclature and Criteria for Diagnosis. 6. Boston: Little, Brown and Co.; 1964. [Google Scholar]
- 28.Kroenke K, Spitzer RL, Williams JB. The PHQ-9: validity of a brief depression severity measure. J Gen Intern Med. 2001;16:606–613. doi: 10.1046/j.1525-1497.2001.016009606.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Beck A, Steer R. Beck Anxiety Inventory Manual. Harcourt Brace and Company: San Antonio; 1990. [Google Scholar]
- 30.Cully JA, Paukert A, Falco J, Stanley MA. Cognitive-behavioral therapy: innovations for cardiopulmonary patients with depression and anxiety. Cogn Behav Pract. 2009;16:394–407. doi: 10.1016/j.cbpra.2009.04.004. [DOI] [Google Scholar]
- 31.Kitson A, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARIHS framework: theoretical and practical challenges. Implement Sci. 2008;3(1):1. doi: 10.1186/1748-5908-3-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Stetler CB, Damschroder LJ, Helfrich CE, Hagedorn HJ. A guide for applying a revised version of the PARIHS framework for implementation. Implement Sci. 2011;6:99. doi: 10.1186/1748-5908-6-99. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Cully JA, Curry AD, Ryan SR, et al. Development of a computer-aided training program for brief cognitive-behavioral therapy in primary care. Acad Psychiatry. 2011; in press. [DOI] [PubMed]
- 34.Cully JA, Mignogna J, Stanley MA, Malik A, Zeno D, Willcockson I. Development and pilot testing of a standardized training program for a patient-mentoring intervention to increase adherence to outpatient HIV care. AIDS Patient Care and STDs. 2012;26:165–172. doi: 10.1089/apc.2011.0248. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 35.Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7:149–158. doi: 10.1136/qshc.7.3.149. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Belza B, Toobert DJ, Glasgow RE. RE-AIM for Program Planning: Overview and Applications. Washington: Center for Healthy Aging; 2007. [Google Scholar]
- 37.Veterans Health Administration. Uniform Mental Health Services in VA Medical Centers and Clinics: Local Implementation of Evidence-based Psychotherapies for Mental and Behavioral Health Conditions (VHA Handbook 1160.05). Washington: Department of Veterans Affairs; 2008.
- 38.Noël PH, Lanham HJ, Palmer RF, Leykum LK, Parchman ML. The importance of relational coordination and reciprocal learning for chronic illness care within primary care teams. Health Care Manage Rev. 2013;38(1):20–28. doi: 10.1097/HMR.0b013e3182497262. [DOI] [PMC free article] [PubMed] [Google Scholar]