Author manuscript; available in PMC: 2022 Jan 1.
Published in final edited form as: Clin Superv. 2021 Jan 6;40(1):112–133. doi: 10.1080/07325223.2020.1870023

Strategies that Promote Therapist Engagement in Active and Experiential Learning: Micro-Level Sequential Analysis

EB Caron, Teresa A. Lind, Mary Dozier
PMCID: PMC8262085  NIHMSID: NIHMS1660072  PMID: 34248258

Abstract

Therapists’ active learning increases treatment fidelity, but research is needed on supervisory strategies to engage therapists in active learning. This study used sequential analysis to examine consultant behaviors associated with increased and decreased probability of eliciting therapists’ active learning. The study included 162 consultation sessions from 27 community therapists implementing Attachment and Biobehavioral Catch-up. Consultants’ client discussion, information provision, and modeling were associated with reduced likelihood of active learning. Consultants’ questions, engagement in active learning strategies, use of video, and silence were associated with greater likelihood of therapist active learning. These findings inform supervisors’ attempts to encourage active learning.

Keywords: Supervision, consultation, active learning, active ingredients, implementation, sequential analysis


About one in every five Americans suffers from mental illness (Institute of Medicine, 2015). Evidence-based treatments (EBTs) are seen as a solution to this mental health crisis (Institute of Medicine, 2015), because meta-analytic work has found that EBTs are more effective than treatment as usual (e.g., Watts et al., 2015; Weisz et al., 2017). However, it is challenging to bring EBTs from the research settings in which they are developed to community settings (Institute of Medicine, 2015). When EBTs are implemented in community settings, therapist fidelity to the EBT is often lower than in lab-based trials, which is thought to lead to reduced benefits for clients (Hulleman & Cordray, 2009; Weisz et al., 2013). The field of implementation science has developed to address these challenges and improve the process of integrating EBTs into community practice (Proctor et al., 2009).

Early implementation efforts focused on training therapists; however, more recent research suggests that although training workshops increase EBT knowledge and improve attitudes, there is limited evidence that they change in-session therapist behavior or fidelity to EBTs (Edmunds, Beidas et al., 2013; Herschell et al., 2010). Supervision (internal to a therapist’s agency) or consultation (externally provided by EBT trainers), beginning after training and as therapists start implementing EBTs, is critical in helping therapists achieve practice changes (Edmunds, Beidas et al., 2013). In a number of studies, researchers have found that training plus supervision/consultation, as compared with training alone, increases therapists’ fidelity to various therapeutic interventions (e.g., Henggeler et al., 2008; Schwalbe et al., 2014; Webster-Stratton et al., 2014). However, supervisory interventions are not equally effective in changing therapists’ practice. For example, specific delivery methods (Funderburk et al., 2015) and greater adherence to certain aspects of supervision (Schoenwald et al., 2009) have been linked to differences in therapist and client outcomes. These findings have led to calls to identify the “active ingredients” of supervision, the elements linked to therapists’ growth (Bailin et al., 2018). For clinical supervisors to enhance their own practice, research is needed that identifies which supervisory practices are linked to therapist outcomes, such as improved EBT fidelity, and to improved client outcomes (Nadeem et al., 2013).

Supervisory microskills are the “moment-to-moment activities” that supervisors use to support supervisee learning and therapeutic competence (James et al., 2008, p. 29). Recent research on supervisory microskills has focused on active learning, an interactive learning process that involves action and reflection (Beidas & Kendall, 2010). Initial findings regarding active learning emerged in research on therapist training, where workshops that included methods such as modeling, role play, and feedback on practice were found to be more effective than passive, didactic workshops (Beidas & Kendall, 2010). Several studies have examined these practices within the context of supervision or consultation. Bearman et al. (2013) found that supervision with modeling and role play was associated with greater therapist practice implementation than supervision that involved discussion without active learning techniques. Edmunds, Kendall et al. (2013) also found that role play was associated with therapist competence gains, though only among therapists rated as more engaged in group consultation. In addition, others have found that supervision or consultation that includes performance feedback, in which the supervisor or consultant codes fidelity based on session recording review and provides feedback, is more effective in promoting therapist fidelity or client outcomes than supervision or consultation as usual (Martino et al., 2016; Miller et al., 2004; Weck et al., 2017).

Supervisory strategies for encouraging active learning are theorized to lead to in-session therapist behavior change by engaging therapists in Kolb’s (1984) experiential learning cycle (Milne et al., 2008; Zorga, 1997). The experiential learning cycle includes four processes: experience, reflection, conceptualization, and experimentation (Kolb, 1984). Therapists’ experience occurs outside of supervision, in therapy sessions. In supervision, supervisors can encourage therapists’ reflection on and abstract conceptualization of the experience (Zorga, 1997). Role play is a way for therapists to experiment with new or revised therapeutic techniques during supervision. In line with Kolb’s model, supervisees’ reflection is widely considered critical in supporting therapists’ growth (e.g., Bennett-Levy & Thwaites, 2007; Johnston & Milne, 2012) and has been found to promote therapist skill development (McGillivray et al., 2015). One form of reflection is fidelity self-coding, in which therapists watch video recordings of their sessions and code their own fidelity (Garcia et al., 2017). Though self-coding or self-monitoring is rarely used in supervision (Milne et al., 2008), preliminary evidence suggests it may complement supervisors’ fidelity coding feedback to promote in-session behavior change (Caron & Dozier, 2019; Isenhart et al., 2014).

Review of session recordings is likely to enhance supervisors’ feedback even when it is not structured by fidelity coding and, for this reason, is recommended by clinical supervision guidelines in the fields of clinical and counseling psychology (Association for Counselor Education and Supervision, 2011; American Psychological Association, 2015). Therapists’ reports of session content may be positively self-biased (American Psychological Association, 2015), but supervisors’ use of session recordings can promote their accurate assessment of therapist performance (Caron et al., 2020). Thus, use of session recordings is another strategy that may facilitate active learning.

Further research suggests that combining these active learning methods in supervision and consultation protocols is effective. For example, supervision with modeling, role play, and performance feedback based on video review (Bearman et al., 2017) and consultation with video review, performance feedback, and self-reflection (Eiraldi et al., 2018) have been shown to improve therapists’ adherence and competence more than supervision/consultation as usual.

However, despite the evidence for these active ingredients of supervision, they are not often used in supervision as usual. For example, in observational studies of supervision in community mental health settings, Accurso et al. (2011) and Bailin et al. (2018) found that session recordings were used in 2–13% of supervision sessions, and skill or therapy checklists were used in 5–11% of sessions. Although Bailin et al. (2018) found that supervisor modeling occurred in the majority of supervision sessions (70%), corrective feedback to the supervisee was provided in only 7% of sessions, and role play occurred in only one session. Since methods to encourage supervisee active and experiential learning tend to be underutilized, research is needed to understand how supervisors and consultants can best encourage therapists’ active learning.

The current study used microanalytic coding of consultation sessions to examine specific consultant activities associated with increased or decreased likelihood of therapists’ subsequent active learning behaviors, at the moment-by-moment level (i.e., within the 3 seconds following a consultant verbalization or activity). Data came from an observational study of fidelity-focused consultation in Attachment and Biobehavioral Catch-up (Caron, 2017), which emphasized fidelity coding-based performance feedback and therapist self-coding of fidelity. Although consultants in this study were following a unique manualized protocol, different from the techniques commonly used in supervision as usual, to our knowledge this is the first study to examine moment-by-moment links between consultant and therapist behaviors in a relatively large sample. Our hope is that this study can provide initial signals about how supervisors and consultants can encourage – and avoid discouraging – therapists’ active learning.

Method

Participants

Therapists

Therapist participants included 27 clinicians from 18 agencies in 6 US states who were participating in a year of developer-led training and consultation in the Attachment and Biobehavioral Catch-up (ABC) intervention. Three additional therapists consented to the current study but were not included in analyses because of insufficient data related to maternity leave (n = 1), leaving the agency (n = 1), and lack of recorded consultation sessions (n = 1). Full demographic data were not available for one therapist because she started but did not complete the questionnaires.

All therapists but one (96%) were female. Most therapists (n =18, 67%) had Master’s degrees, 6 (22%) had Bachelor’s degrees, 1 (4%) had a Ph.D., and 1 (4%) had completed high school or less. Most therapists were White (n = 16, 59%), 6 (22%) were Black, 1 (4%) was Asian American, and 3 (11%) were more than one race. Two therapists (7%) were Hispanic. On average, therapists were 34.0 years old (SD = 7.5, range: 23–48). They reported having worked an average of 4.8 years (SD = 5.7) in their current jobs and 9.1 years (SD = 6.7) in the field.

Consultants

Twenty-one fidelity consultants participated in the current study. The majority (n = 14, 67%) were undergraduate students who had been trained to reliability in the ABC fidelity coding system, and who then received additional training and supervision on fidelity-focused consultation, which is described below. All consultants but one (95%) were female. Most consultants were White/non-Hispanic (n = 13, 62%), 6 (29%) were White/Hispanic, and 1 (5%) was Black. Average reported age was 22.1 (SD = 1.7, range: 20 – 26). The seven consultants who were not undergraduate students (33%) worked as part- or full-time staff. Of the staff members, six had completed their Bachelor’s degree and one had a Ph.D.

Procedure

Therapists were trained in a 2- to 3-day in-person training workshop between 2012 and 2016. Therapists were provided with an ABC manual and access to a training website that included videos of expert therapists’ ABC sessions. After training, they began implementing ABC with families and received weekly group clinical consultation, led by a Ph.D.-level consultant, as well as weekly individual fidelity-focused consultation. Both types of consultation were conducted remotely using videoconferencing software. Fidelity-focused consultation was intended to occur weekly for one year, and on average, therapists received 33.8 consultation sessions (SD = 7.3, range: 23 – 50) over 10.9 months (SD = 1.8, range: 7.0 – 16.2). Consultation sessions were video-recorded for training and feedback to consultants; these videos were archived and used in the current study. The study protocol was approved by the University of Delaware Institutional Review Board, and therapists and consultants provided informed consent for use of their archived materials and for completing questionnaires, including a questionnaire on demographics. Therapists were compensated with a $20 gift card for their questionnaire completion.

Changes in therapist-consultant pairings occurred when consultants left the lab temporarily (e.g., summer break) or permanently (e.g., leaving job, graduating). Excluding brief periods of substitute consulting (i.e., 4 sessions or less), most (n = 15, 56%) therapists received consultation from a single consultant during their training period. However, 9 (33%) had two consultants and 3 (11%) had three consultants during their ABC training. In the videos in the current sample, 35 unique therapist-consultant pairings were represented.

Attachment and Biobehavioral Catch-up

Attachment and Biobehavioral Catch-up is an evidence-supported, 10-session preventative intervention for infants who have experienced early adversity, including abuse and neglect. ABC focuses on increasing the parental behaviors of following the lead (i.e., contingently responding to children’s behaviors in play and conversation), nurturance (i.e., sensitively responding to children’s distress), and delight (i.e., sharing one’s enjoyment in spending time with the child). ABC also focuses on decreasing parental frightening behavior (i.e., verbally or physically overwhelming or threatening behavior). An efficacy trial that compared children whose birth parents received ABC with children whose parents received a 10-session control intervention focused on developmental education found that ABC enhanced infants’ secure attachment (Bernard et al., 2012) and regulation of the stress hormone cortisol (Bernard, Dozier, et al., 2015). Follow-up evaluations of these groups have found long-term effects of ABC on children’s stress hormones (Bernard, Hostinar, et al., 2015), attachment (Zajac et al., 2020), and executive functioning (Lind et al., 2017, 2019) up to 8 years after the intervention. ABC has been found to enhance parenting in both randomized controlled trials (e.g., Bernard, Simons, et al., 2015; Yarger et al., 2019) and in community implementation settings (e.g., Caron et al., 2016; Perrone et al., 2020). Further, changes in parenting have mediated intervention effects on children’s outcomes, including receptive vocabulary and behavioral compliance (Lind et al., 2019; Raby et al., 2019).

Therapists deliver ABC in families’ homes, using a parent coaching model called “in-the-moment commenting,” in which therapists provide feedback to parents about their intervention-targeted behaviors as they occur naturally during sessions. For example, a therapist might say, “When he said, ‘Dada,’ and you said, ‘Yeah, dada,’ right back, that was an example of following his lead.” The ABC fidelity measure is focused on assessing therapists’ frequency and quality of in-the-moment comments, which have been found to predict parent behavior change (Caron et al., 2018). The ABC fidelity measure also assesses targeted parent behaviors, as each relevant parent behavior is considered an opportunity for the therapist to make an in-the-moment comment.

Fidelity-Focused Consultation

Fidelity-focused consultation is centered around consultants’ and therapists’ use of the ABC fidelity measure. Prior to each consultation session, the consultant assigns fidelity coding from one of the therapist’s recent session videos, and the consultant and therapist independently code fidelity from the video. During consultation, the consultant uses the ABC fidelity coding to provide fidelity feedback to the therapist. In addition, the consultant provides feedback on the therapist’s fidelity self-coding, to help promote the therapist’s understanding of and ability to self-evaluate ABC fidelity. In addition to providing performance feedback, consultants are also trained to use other active learning techniques in consultation, including modeling, role-play, and “live coding,” a process in which therapists practice fidelity coding during the consultation session using a worksheet or video. Consultation sessions are scheduled for 30 minutes, and in the current sample of videos, meetings lasted an average of 26.8 minutes (SD = 7.5). Fidelity-focused consultation procedures were manualized and supervised by the first author in weekly group meetings. Further description of fidelity-focused consultation can be found in Caron and Dozier (2019).

Data Collection

Consultation sessions were recorded, and six sessions per therapist were selected for second-by-second coding. One session was selected from each of the first six months of consultation, with sessions selected so that coded sessions were spaced at regular intervals (as close to 30 days apart as possible). However, if consultation sessions from the target month could not be coded due to recording errors, video file format, or lack of ABC session review during consultation (i.e., consultation occurred but the therapist did not have any new ABC sessions to review since the previous consultation session), a session from the previous or following month was used to replace the missing video. All consultation sessions were coded by the first author, and 32 videos (20% of the sample) were double-coded by a trained undergraduate student for the purposes of establishing interrater reliability.
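One plausible way to operationalize the spacing rule above is a greedy selection over session dates. The sketch below is purely illustrative: the function name, date representation (day offsets), and greedy rule are our assumptions; the paper states only that coded sessions were spaced as close to 30 days apart as possible.

```python
def pick_spaced_sessions(session_days, n=6, target_gap=30):
    """Greedily choose up to n sessions spaced ~target_gap days apart.
    session_days: day offsets of available recorded sessions."""
    days = sorted(session_days)
    chosen = [days[0]]  # start from the earliest available session
    for _ in range(n - 1):
        want = chosen[-1] + target_gap  # ideal date for the next coded session
        later = [d for d in days if d > chosen[-1]]
        if not later:
            break  # fewer than n sessions available
        # pick the available session closest to the ideal spacing
        chosen.append(min(later, key=lambda d: abs(d - want)))
    return chosen

# Toy example: day offsets of eleven recorded consultation sessions
print(pick_spaced_sessions([0, 7, 14, 28, 35, 60, 65, 92, 120, 150, 181]))
# → [0, 28, 60, 92, 120, 150]
```

A rule like this tolerates missing or unusable target-month videos by simply selecting the nearest codeable session, consistent with the replacement procedure described above.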

Measures

Demographics

Demographics were collected from therapists and consultants at the conclusion of ABC training. Demographics included gender, race, ethnicity, age, education level, and work experience.

Consultation Process

Therapist and consultant behaviors were coded second-by-second from video recordings of the consultation sessions using Noldus Observer XT 11 software. In this coding program, as videos are played, markers can be dropped to indicate the onset and end of relevant behaviors. Multiple behaviors can be coded simultaneously (e.g., if the therapist and consultant are talking at the same time). The coding manual was developed by the first author through consultation of the literature on supervision (e.g., Edmunds, Kendall et al., 2013; Milne et al., 2008) and review of recorded consultation sessions from the sample. After initial testing and discussion of discrepancies by the first author and an undergraduate coder, the coding system was revised to its final format. Behavior codes are organized hierarchically and different modifiers can be added to denote various characteristics within behaviors. See following sections for details regarding specific codes. The full coding manual is provided as an appendix in Caron (2017). Interrater reliability is evaluated at the second-by-second level by determining the presence or absence of matching codes at each individual second; these second-by-second determinations are aggregated and summarized by kappa coefficients. Noldus Observer XT calculates kappas at the individual video level, and in the current sample, kappas for the 32 double-coded videos averaged .68 (range: .59 - .85), reflecting moderate agreement (McHugh, 2012).
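The second-by-second agreement statistic described above can be illustrated with a minimal sketch of Cohen's kappa: compare the code assigned at each second by two coders, then correct observed agreement for the agreement expected from each coder's marginal label frequencies. The labels and function below are hypothetical; Observer XT's exact computation (e.g., its handling of simultaneous codes) may differ.

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two equal-length sequences of categorical labels
    (e.g., the behavior code assigned at each second of video)."""
    assert len(coder1) == len(coder2)
    n = len(coder1)
    # Observed proportion of seconds on which the coders agree
    observed = sum(x == y for x, y in zip(coder1, coder2)) / n
    # Chance agreement from each coder's marginal label frequencies
    c1, c2 = Counter(coder1), Counter(coder2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Toy example: per-second codes from two coders over 10 seconds
a = ["feedback"] * 4 + ["question"] * 3 + ["none"] * 3
b = ["feedback"] * 3 + ["question"] * 4 + ["none"] * 3
print(round(cohens_kappa(a, b), 2))  # → 0.85
```

In this toy example the coders agree on 9 of 10 seconds (observed = .90), chance agreement is .33, and kappa = (.90 − .33)/(1 − .33) ≈ .85.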

Consultant Behaviors.
Feedback.

Consultant feedback was coded when consultants provided therapists with a neutral, positive, or negative evaluation of their ABC fidelity or ABC fidelity coding. For example, statements coded as Fidelity Feedback would include, “Your in-the-moment comments were 100% on-target in this clip,” and “You did a great job with fidelity this week.” Statements coded as Coding Feedback would include, “There were a few differences in our fidelity coding sheets,” and “You coded a lot more parent behaviors this week.”

Questions/Prompts.

Consultant questions were coded when consultants asked a question of the therapist, such as, “How do you feel the session went, in terms of fidelity?,” “How was coding your fidelity this week?” and “Does that make sense?” Consultant prompts were coded when consultants made statements intended to elicit a therapist response, such as, “When you emailed, you said that you felt more confident about coding your fidelity this week,” and “Tell me more about that.” Questions/prompts were grouped together and categorized into four topics, specifically, Fidelity Questions, Coding Questions, Client Questions, and Other Questions.

Information/Advice.

Consultant information/advice was coded when consultants provided information about ABC, fidelity, or fidelity coding, such as, “One of the ABC certification criteria is making one in-the-moment comment per minute, on average.” This category was also coded when consultants provided advice or recommendations to the therapists, such as, “One thing you can do to keep commenting frequently with this family, even without trying to address negative behaviors, is to focus on commenting on delight.” Information/Advice was further categorized as Fidelity Information, Coding Information, or Other Information, depending on the topic.

Client Discussion.

Consultant client discussion was coded when consultants engaged in discussion about therapists’ ABC cases, including case conceptualization and client progress. For example, statements coded as client discussion included, “Even though the mom is really intrusive with the baby, she does delight a lot,” and “Mom seems to be starting to understand following the lead now.”

Modeling.

Modeling was coded when consultants gave examples of language that could be used in session, typically, in-the-moment comments that therapists could make, such as “You could say, ‘Oh, I’m sorry, I’m distracting you when your baby needs you. Go ahead and give her that nurturance.’”

Live Coding.

Live fidelity coding was recorded when consultants explained fidelity coding or talked through thoughts on coding, based on a video example or worksheet viewed during consultation rather than fidelity coding completed prior to consultation.

Supporting.

Supporting was coded when consultants made non-specific encouraging statements or expressed understanding of a therapist’s experiences, such as, “Everyone feels overwhelmed in their first session!” or “Hang in there, you’re doing great.”

Use of Video.

When video was played during consultation, it was coded separately from consultant and therapist behaviors. Consultants typically used therapists’ own ABC session videos, but could also use ABC training videos to create live fidelity coding and role play activities for therapists.

Therapist Behaviors
Role Play.

Role play was coded when the therapist rehearsed, practiced, or gave an example of language that she could use in the future. This included times when therapists were reflecting on prior ABC sessions but stating what they could say differently in the future, such as, “I think I should have been more specific, like, ‘Dad, when he started to cry, you picked him up and held him.’”

Live Coding.

Live fidelity coding was recorded when the therapist made real-time statements about fidelity coding in response to a video or worksheet example, such as, “That looks like nurturance.”

Self-Feedback.

Therapist self-feedback was coded when the therapist expressed self-directed feedback about their performance. Fidelity Self-Feedback was coded when therapists made evaluative statements about their fidelity, such as, “I missed so many opportunities to make comments in this clip!” Coding Self-Feedback was coded when therapists made evaluative statements about their fidelity coding, such as, “Looking at your [the consultant’s] fidelity coding, I realized I coded a lot of random things I didn’t have to code.”

Reflection.

Reflection was coded when therapists made statements about their experience conducting ABC that were not evaluative enough to be coded as self-feedback, including expression of thoughts and emotions related to experiences with fidelity coding and implementing ABC, as well as report of general in-session observations and actions, such as, “I was really confused about that when I was coding,” “I keep getting stuck thinking about what I want to say,” and “I actually feel more nervous about session 2 than I did about session 1.”

Analyses

To examine sequential relationships between consultant behaviors and therapist active learning behaviors, the second-by-second coding conducted in Noldus Observer XT 11 was exported to ObsTxtSds 2.0 (Bakeman & Quera, 2008), which converted the data to a format compatible with Generalized Sequential Querier 5.1 (Bakeman & Quera, 1995; 2011). Generalized Sequential Querier 5.1 was used to analyze the likelihood that various consultant behaviors would be followed by therapist active learning behaviors. Specifically, it computed odds ratios reflecting the likelihood that therapist active learning would occur within 3 seconds of the end of each consultant behavior category.
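The core of this lag-sequential analysis can be sketched as a 2×2 tabulation: classify each second by whether it falls within the 3-second window after a consultant behavior ends and whether a therapist behavior begins at that second, then compute the odds ratio and a Wald confidence interval on the log odds ratio. This is a simplified illustration with hypothetical data; GSEQ's actual algorithms handle overlapping codes, zero cells, and significance testing more rigorously.

```python
import math

def lagged_odds_ratio(given_offsets, target_onsets, total_seconds, window=3):
    """Odds ratio that the target (therapist) behavior begins within
    `window` seconds after the given (consultant) behavior ends,
    versus at all other seconds. Offsets/onsets are sets of second indices."""
    # Seconds that fall in the lag window after any given-behavior offset
    exposed = set()
    for t in given_offsets:
        exposed.update(range(t + 1, t + 1 + window))
    a = b = c = d = 0
    for s in range(total_seconds):
        if s in exposed:
            if s in target_onsets: a += 1  # in window, target began
            else:                  b += 1  # in window, no target
        else:
            if s in target_onsets: c += 1  # outside window, target began
            else:                  d += 1  # outside window, no target
    orr = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # Woolf SE of log(OR)
    ci = (math.exp(math.log(orr) - 1.96 * se),
          math.exp(math.log(orr) + 1.96 * se))
    return orr, ci

# Toy 100-second session: consultant behavior ends at s = 10 and s = 50;
# therapist behavior begins at s = 12, 52 (in windows) and 80, 90 (outside)
orr, ci = lagged_odds_ratio({10, 50}, {12, 52, 80, 90}, total_seconds=100)
print(round(orr, 1))  # → 23.0
```

In the toy data the 2×2 cells are a = 2, b = 4, c = 2, d = 92, giving OR = (2 × 92)/(4 × 2) = 23.0; an OR above 1.0 indicates the therapist behavior is more likely than chance in the lag window, paralleling the interpretation of Table 2.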

Results

Table 1 presents descriptive information about consultant and therapist behaviors, including the number of occurrences of each behavior in the 162 coded consultation sessions, the percentage of consultation sessions that included at least one instance of the behavior, and the average duration the behavior was coded each time it occurred. The most common consultant behaviors included fidelity feedback, fidelity coding feedback, fidelity information, fidelity coding information, and support. The most common therapist behavior was reflection. With regard to active learning behaviors, consultant performance feedback about fidelity and fidelity coding occurred in 99% of sessions, modeling occurred in 83% of sessions, live coding occurred in 21% of sessions, and use of video occurred in 33% of sessions. In terms of therapist active and experiential learning, reflection occurred in 100% of sessions, fidelity self-feedback occurred in 61% of sessions, fidelity coding self-feedback occurred in 46% of sessions, role play occurred in 46% of sessions, and live coding occurred in 18% of sessions.

Table 1.

Descriptive Statistics for Consultant and Therapist Behaviors

Behavior; Number of Occurrences (N) in the 162 Coded Sessions; Percent of Sessions with Behavior Coded; Average Duration of Each Occurrence (Range), in Seconds
Consultant Behaviors
 Live Coding 174 21% 10.8 (1 – 47)
 Modeling 826 83% 3.9 (1 – 19)
 Coding Feedback 1404 99% 16.8 (1 – 137)
 Fidelity Feedback 1718 99% 14.4 (1 – 107)
 Client Discussion 500 77% 12.9 (1 – 59)
 Information – Coding 1838 98% 16.5 (1 – 127)
 Information – Fidelity 1216 96% 21.6 (1 – 250)
 Information – Other 414 74% 15.0 (1 – 83)
 Question – Client 242 61% 4.5 (1 – 22)
 Question – Coding 721 90% 4.2 (1 – 40)
 Question – Fidelity 436 75% 5.4 (1 – 26)
 Question – Other 458 92% 3.8 (1 – 24)
 Supporting 1348 99% 4.0 (1 – 86)
 Use of Video 342 33% 13.6 (1 – 110)
Therapist Behaviors
 Live Coding 69 18% 7.1 (1 – 30)
 Role Play 232 46% 5.0 (1 – 18)
 Coding Self-Feedback 136 46% 10.2 (1 – 46)
 Fidelity Self-Feedback 177 61% 11.5 (1 – 45)
 Reflection 1940 100% 14.9 (1 – 108)

Next we examined the contingent relationships between consultant behaviors and subsequent therapist behaviors. As shown in Table 2, all but 1 of the 90 odds ratios (ORs) were significant at the p ≤ .01 level, indicating that nearly all therapist behaviors were either significantly more likely (ORs > 1.0) or less likely than chance (ORs < 1.0) to follow various consultant behaviors. Because most results were significant, we highlight broad patterns and consider the size of the odds ratios in interpreting results. For example, although therapists’ fidelity self-feedback was somewhat more likely than chance to follow consultant support (OR = 1.13), therapists’ fidelity self-feedback was even more likely to follow consultant silence (i.e., no code; OR = 1.35), as reflected by non-overlapping 95% confidence intervals. Overall, there were 20 odds ratios larger than 1.0, and 69 odds ratios smaller than 1.0, reflecting that, in general, consultants’ vocalizations were less likely than chance to lead to therapist active learning. This broad result likely reflects that consultants spent more overall time talking than therapists did.

Table 2.

Odds Ratios Reflecting Likelihood for Therapist Active Learning Behaviors to Follow Consultant Behaviors

Consultant Behavior (rows); Therapist Behavior (columns): Live Coding; Role Play; Coding Self-Feedback; Fidelity Self-Feedback; Reflection
Live Coding 32.78 (31.54, 34.07) 0.00 (n/a) 0.00 (n/a) 0.00 (n/a) 0.41 (0.39, 0.42)
Modeling 0.00 (n/a) 0.18 (0.16, 0.21) 0.00 (n/a) 0.05 (0.04, 0.06) 0.07 (0.07, 0.07)
Coding Feedback 0.74 (0.69, 0.81) 0.00 (n/a) 1.92 (1.87, 1.98) 0.32 (0.30, 0.34) 0.73 (0.72, 0.73)
Fidelity Feedback 0.00 (n/a) 0.09 (0.08, 0.10) 0.25 (0.23, 0.27) 1.31 (1.27, 1.34) 0.51 (0.50, 0.52)
Client Discussion 0.00 (n/a) 0.00 (n/a) 0.00 (n/a) 0.84 (0.79, 0.89) 0.37 (0.36, 0.38)
Information – Coding 0.09 (0.08, 0.11) 0.04 (0.03, 0.05) 1.21 (1.18, 1.26) 0.20 (0.19, 0.21) 0.76 (0.75, 0.77)
Information – Fidelity 0.00 (n/a) 0.73 (0.69, 0.78) 0.16 (0.15, 0.18) 0.95 (0.91, 0.98) 0.73 (0.72, 0.74)
Information – Other 0.00 (n/a) 0.00 (n/a) 0.00 (n/a) 0.41 (0.37, 0.45) 0.63 (0.61, 0.64)
Question – Client 0.00 (n/a) 0.00 (n/a) 0.00 (n/a) 0.00 (n/a) 0.71 (0.69, 0.73)
Question – Coding 5.39 (5.15, 5.64) 0.00 (n/a) 2.31 (2.22, 2.41) 0.93 (0.88, 0.98) 1.87 (1.85, 1.89)
Question – Fidelity 3.07 (2.85, 3.31) 5.31 (5.11, 5.52) 0.00 (n/a) 2.98 (2.87, 3.09) 1.82 (1.79, 1.84)
Question – Other 0.00 (n/a) 0.00 (n/a) 0.97 (0.90, 1.05)* 1.43 (1.36, 1.50) 1.43 (1.41, 1.45)
Supporting 0.82 (0.76, 0.89) 0.52 (0.49, 0.55) 0.82 (0.78, 0.86) 1.13 (1.09, 1.17) 0.94 (0.93, 0.95)
Use of Video 34.61 (33.63, 35.62) 0.00 (n/a) 1.12 (1.03, 1.21) 0.38 (0.34, 0.42) 0.37 (0.36, 0.38)
No Code (Silence) 0.45 (0.44, 0.46) 3.18 (3.09, 3.27) 1.32 (1.29, 1.34) 1.35 (1.33, 1.37) 1.37 (1.36, 1.38)

Note. 95% Confidence Intervals are in parentheses. Odds ratios that represent a sequence that never occurred are 0.00, and do not have an associated confidence interval.

* All p-values, with the exception of the starred odds ratio, are significant at the p ≤ .01 level.

Several consultant behaviors were more likely than chance to lead to parallel therapist behaviors. Specifically, consultant feedback about fidelity was likely to be followed by therapist self-feedback about fidelity (OR = 1.31), consultant feedback about fidelity coding was likely to be followed by therapist self-feedback about fidelity coding (OR = 1.92), and consultant live fidelity coding was very likely to be followed by therapist live fidelity coding (OR = 32.78). In contrast, consultant modeling was less likely than chance to be followed by therapist role play (OR = 0.18) or other therapist active learning behaviors (ORs 0.00 to 0.07). In the current sample, consultant use of video was associated with high rates of subsequent therapist live fidelity coding (OR = 34.61) as well as increased likelihood of therapist self-feedback about fidelity coding (OR = 1.12) but not other active learning behaviors (ORs 0.00 to 0.38).

Consultants’ questions and prompts were an effective strategy for eliciting therapist active learning. Consultants’ questions about fidelity coding were more often than chance followed by therapist live fidelity coding (OR = 5.39), self-feedback about fidelity coding (OR = 2.31), and reflection (OR = 1.87). Consultants’ questions about therapists’ fidelity were more often than chance followed by therapist role play (OR = 5.31), self-feedback about fidelity (OR = 2.98), live fidelity coding (OR = 3.07), and reflection (OR = 1.82). Questions that did not fit in any other categories, often because they were general/global (e.g., “What do you think went well this week?”), also tended to elicit therapist reflection (OR = 1.43) and self-feedback about fidelity (OR = 1.43). Finally, consultant silence (represented by “no code”) was associated with increased odds of subsequent therapist active learning for four of the five behaviors (ORs 1.32 to 3.18), likely reflecting that, in addition to asking questions, sitting with silence or using a “pregnant pause” can also be a strategy to encourage therapist engagement in active learning.

Several consultant behaviors were associated with reduced likelihood of therapist active learning. Specifically, consultant questions about clients (ORs 0.00 to 0.71) and discussion of clients (ORs 0.00 to 0.84) were less likely than chance to be followed by therapist active learning behaviors. In addition, with two exceptions of small magnitude (ORs = 1.13 and 1.21), consultant provision of information and supportive statements were generally associated with reduced likelihood of therapist active learning (ORs of 0.00 to 0.95 for 18 of the 20 comparisons).

Discussion

In this study, we explored consultant behaviors that were associated with increased or decreased likelihood of eliciting therapist active learning behaviors. Several consultant behaviors, including providing information, discussion and questions about clients, and supportive statements, were generally associated with reduced likelihood of subsequent therapist active learning. These results suggest that consultants and supervisors must encourage active learning intentionally through their verbal behaviors.

In this vein, several consultant behaviors selectively increased the likelihood of subsequent parallel behaviors in therapists, but did not increase the likelihood of other active learning behaviors. Specifically, consultant feedback increased the likelihood of therapist self-feedback, and consultant live coding increased the likelihood of subsequent therapist live coding. These findings are somewhat similar to work by Milne et al. (2003), who suggested in a qualitative case study that supervisor change strategies in supervision (e.g., agenda setting, modeling, role play) led the therapist to demonstrate increased use of those same change strategies in their cognitive behavioral therapy sessions. Thus, it appears that consultants and supervisors may be able to engage therapists in active learning by using parallel strategies themselves, thereby modeling the process and setting the tone of the conversation.

However, consultant modeling did not encourage therapists’ own active learning processes in the current study, an important exception to the pattern of consultants’ behaviors encouraging parallel therapist active learning. In fact, therapist active learning behaviors were less likely than chance to follow consultant modeling. This result aligns with Bailin et al.’s (2018) finding that, in routine supervision for youth mental health care, modeling was used frequently (coded in 70% of recorded sessions), but role play was very infrequent (coded in 2% of sessions). Further, modeling tends to be used frequently early in consultation, when therapists are learning to implement a new intervention, and then to be phased out over time (Becker et al., 2013). In the current study, some of the lack of association between modeling and role play may be due to consultants using modeling early in the consultation relationship, and gradually transitioning to greater use of role play over time. Overall, the findings suggest that although modeling and role play may superficially seem like similar strategies, modeling does not encourage role play, and supervisors need to use other strategies to encourage and prompt therapists to engage in role play in supervision.

Questions and prompts were one of the most effective strategies for eliciting therapist active learning, accounting for nearly half of the odds ratios greater than 1.0 in the current study (i.e., 9 of 20). This finding is consistent with James et al.’s (2008) assertion that questions “move the learning forward” in supervision (p. 34). However, the findings also reflect the importance of supervisor specificity with questions. For example, consultants’ questions about clients were associated with reduced likelihood of therapist active learning, and consultants’ questions about fidelity coding increased the likelihood of therapist self-feedback about fidelity coding but not about fidelity, whereas questions about fidelity showed the opposite pattern. Thus, although therapists do sometimes engage in active learning after more open-ended questions (“Other” questions in the current study), the current results suggest that the most effective way to encourage therapist active learning is with targeted questions. Supervisors should be trained to ask specific, targeted questions to encourage therapist active learning. For example, supervisors can ask therapists, “What would you say if…?” to encourage role play. Overall, the current analyses broadly combined a diverse range of supervisor questions in terms of topic, specificity, and complexity, and more fine-grained work on the most effective types of questions in supervision is needed (James et al., 2008).

Silence was also an effective strategy for encouraging therapist active learning, and was more likely than chance to be followed by therapist role play, self-feedback, and reflection. Silence appeared particularly effective in encouraging therapist role play: the likelihood of role play following consultant silence (OR = 3.18) was higher than that of any other active learning behavior following silence. This may reflect the fact that role play can be particularly challenging for therapists, who may need time to think before they begin speaking. After asking a question that prompts role play (the most effective strategy for eliciting role play, according to odds ratio sizes), we encourage supervisors to recognize the helpfulness of silence. Therapists report mixed feelings about the comfort of role play in supervision (Beidas et al., 2013), which may lead to brief delays in responding. The current results encourage supervisors not to jump in and rescue therapists from perceived discomfort or lack of knowledge by answering the question or modeling the therapeutic technique themselves, but instead to use silence strategically to facilitate role play.

Use of session recordings in supervision is viewed as important to promote supervisors’ accurate feedback to supervisees (American Psychological Association, 2015; Hurlburt et al., 2010). In terms of encouraging supervisees’ engagement in active learning, in the current study, use of video was associated with increased likelihood of subsequent therapist live coding and self-feedback about fidelity coding, two behaviors that are rather unique to the model of fidelity-focused consultation in ABC. Thus, it is difficult to generalize these results to supervision more broadly, and the role of video in encouraging therapists’ engagement in active learning remains an important question.

A primary limitation of the current study is that the consultation sessions sampled came from a specific model of consultation that emphasized therapists’ fidelity and included therapist fidelity self-coding. Given that use of fidelity coding or skill checklists is rare in supervision as usual (Accurso et al., 2011; Bailin et al., 2018), the content of consultation in the current study – particularly with regard to fidelity feedback, fidelity coding feedback, self-feedback about fidelity and fidelity coding, and live fidelity coding during consultation – was very different from most community-based supervision. Further, consultants were primarily undergraduate students who were experts in ABC fidelity coding but who did not have clinical experience implementing ABC or other interventions, and who received a significant amount of training and ongoing supervision. Thus, it is unknown whether the current results would generalize to other models of consultation/supervision, or to supervision as usual with more typical personnel in the consultant/supervisor role. On the other hand, if supervision-as-usual sessions were coded and analyzed as in this study, the statistical infrequency of certain active learning behaviors, like role play, might make it difficult to identify any supervisor behaviors that made therapist active learning more likely to occur (Bailin et al., 2018).

The sample size, while somewhat small, was larger than those of other studies that have examined contingent relationships between supervisor and supervisee behaviors (e.g., N = 1 in Milne et al., 2003 and James et al., 2008). In addition, the larger session-level sample size and the fact that each session was coded from beginning to end created a total of over 4300 minutes of micro-level coding, which meant that the study was well-powered for sequential analysis; indeed, 89 of the 90 tests resulted in odds ratios that were significantly different from 1 (i.e., from chance). The study’s largest strength is its novel, micro-level observational and sequential analysis statistical approach. Replication and extension of this work with supervision sessions from usual practice settings, as well as with sessions of other consultation models, is an important next step.
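Whether an observed transition odds ratio differs reliably from 1 (the chance value) can be checked with a standard Wald test on the log odds ratio. The sketch below is a generic illustration of that test with hypothetical cell counts; it is not the study's actual analytic procedure or data.

```python
import math

def log_or_z(a, b, c, d):
    """z statistic for log(OR) against the null hypothesis OR = 1,
    using the standard Wald approximation for a 2x2 table."""
    log_or = math.log((a * d) / (b * c))
    # Standard error of log(OR): sqrt of summed reciprocal cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return log_or / se

# Hypothetical transition counts (not from the study): OR = (40*75)/(60*25) = 2.0
z = log_or_z(40, 60, 25, 75)
significant = abs(z) > 1.96  # two-tailed test at alpha = .05
```

Because the null value of an odds ratio is 1 rather than 0, the test is conducted on the log scale, where the null value becomes log(1) = 0.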

Conclusions

A growing literature points to the importance of ongoing supervision or consultation in the implementation of evidence-based practices (Lau et al., 2020). Despite this, little has been done to investigate which specific supervisor/consultant and therapist behaviors may promote therapist learning of evidence-based practices. The current study used micro-level coding and sequential analysis of consultation sessions to examine the consultant behaviors that were linked to subsequent therapist active learning behaviors. Active learning behaviors, including role play, reflection, and self-feedback, have been shown to improve implementation of new evidence-based practices (e.g., Bearman et al., 2013). Results showed that targeted questions and parallel active learning strategies (e.g., giving feedback to the therapist) increased likelihood of therapist active learning. In addition, consultants’ use of silence and video were strategies that increased likelihood of subsequent therapist active learning. Although replication and extension of the current study are needed, the results provide clear suggestions for how supervisors can try to increase therapists’ active learning in supervision. These strategies may help to improve supervision and consultation practice, which have been shown to be critical in implementing evidence-based practices in community settings.

Practical Implications and Recommendations for Supervisors

Supervisees’ engagement in active and experiential learning leads to in-session behavior change, including improved fidelity to evidence-based treatments. To encourage supervisees’ active and experiential learning, supervisors should consider integrating the following strategies into their supervisory practice:

  1. Limit the amount of time spent on lengthy client discussion, didactic provision of information about therapeutic strategies, modeling of therapeutic strategies, or support for the supervisee.

  2. Use session recordings and model the active learning processes you would like the supervisee to engage in. For example, give performance feedback to the supervisee to encourage supervisees’ self-feedback; when available, try using session checklists or other fidelity measures to facilitate this feedback process.

  3. Ask targeted questions to encourage specific active learning processes (e.g., “What do you think went well and what do you think could have gone better, in terms of your goals for the session?” to encourage self-feedback; “What would you say to the client if…?” to encourage role play).

  4. Follow questions with silence to allow supervisees time to think, particularly when asking supervisees to engage in complex or challenging active learning processes like role play.

Acknowledgments

This work was supported by the National Institutes of Health under Grants R01 MH052135, R01 MH074374, and R01 MH084135, and a Dissertation Award from the University of Delaware Department of Psychological and Brain Sciences. We gratefully acknowledge the parent coaches and consultants who participated in this study. We also would like to thank Victoria Kager for her assistance with coding consultation session videos.

Biographies

Author Note

EB Caron, Ph.D., is an Assistant Professor of Psychological Science at Fitchburg State University in Massachusetts. She received her B.A. from Stanford University and her Ph.D. in Clinical Science from University of Delaware. While at the University of Delaware, Dr. Caron developed and validated a measure of fidelity, as well as fidelity-focused consultation procedures, for the Attachment and Biobehavioral Catch-up (ABC) intervention, a home-based preventive parenting intervention for high-risk infants and toddlers. Dr. Caron completed her internship at Terry Children’s Center in Delaware and her postdoctoral training at UConn Health. She was a 2019 NIMH Child Intervention, Prevention, and Services (CHIPS) fellow. Her research focuses on implementation of evidence-based practices for children.

Teresa Lind, Ph.D., is an Assistant Professor of Child and Family Development at San Diego State University. She received her A.B. from Harvard College and her Ph.D. in Clinical Science from University of Delaware. She completed her predoctoral psychology internship at the University of Arkansas for Medical Sciences and her postdoctoral training at the University of California, San Diego and the Child and Adolescent Services Research Center (CASRC) in San Diego, CA. She was a 2018 NIMH Child Intervention, Prevention, and Services (CHIPS) fellow. Her research focuses on the effects of early adversity on the development of emotional, physiological, and behavioral regulation capabilities in young children. In addition, she is interested in improving mental health services for these at-risk populations through the implementation of evidence-based practices (EBPs).

Mary Dozier is Professor of Psychological and Brain Sciences at the University of Delaware. She obtained her Ph.D. from Duke University in 1983. She was named the Amy E. DuPont Chair in Child Development in 2007, and in 2016 was named the Francis Alison Professor, the university’s highest faculty honor. Over the last 25 years, she has studied the development of young children in foster care and young children living with neglecting birth parents, examining challenges in attachment and regulatory capabilities. Along with her graduate students and research team, she developed an intervention, Attachment and Biobehavioral Catch-up, that targets specific issues that have been identified as problematic for young children who have experienced adversity. This intervention has been shown to enhance children’s ability to form secure attachments, and to regulate physiology and behavior normatively, among other things.

Footnotes

Preliminary descriptive data from the current study were published as a dissertation (Caron, 2017), but have not previously been published in a peer-reviewed journal. Sequential analyses are novel and have never before been published.

Declaration of Interest Statement

The authors declare that they have no conflicts of interest.

References

  1. Accurso EC, Taylor RM, & Garland AF (2011). Evidence-based practices addressed in community-based children’s mental health clinical supervision. Training and Education in Professional Psychology, 5(2), 88–96.
  2. American Psychological Association. (2015). Guidelines for clinical supervision in health service psychology. American Psychologist, 70(1), 33–46.
  3. Association for Counselor Education and Supervision Taskforce on Best Practices in Clinical Supervision. (2011). Best practices in clinical supervision. Association for Counselor Education and Supervision. https://acesonline.net/wp-content/uploads/2018/11/ACES-Best-Practices-in-Clinical-Supervision-2011.pdf
  4. Bailin A, Bearman SK, & Sale R (2018). Clinical supervision of mental health professionals serving youth: Format and microskills. Administration and Policy in Mental Health and Mental Health Services Research, 45(5), 800–812.
  5. Bakeman R, & Quera V (1995). Analyzing interaction: Sequential analysis with SDIS & GSEQ. Cambridge University Press.
  6. Bakeman R, & Quera V (2008). ActSds and OdfSds: Programs for converting INTERACT and The Observer data files into SDIS timed-event sequential data files. Behavior Research Methods, 40(3), 869–872.
  7. Bakeman R, & Quera V (2011). Sequential analysis and observational methods for the behavioral sciences. Cambridge University Press.
  8. Bearman SK, Schneiderman RL, & Zoloth E (2017). Building an evidence base for effective supervision practices: An analogue experiment of supervision to increase EBT fidelity. Administration and Policy in Mental Health and Mental Health Services Research, 44(2), 293–307.
  9. Bearman SK, Weisz JR, Chorpita BF, Hoagwood K, Ward A, Ugueto AM, Bernstein A & The Research Network on Youth Mental Health. (2013). More practice, less preach? The role of supervision processes and therapist characteristics in EBP implementation. Administration and Policy in Mental Health and Mental Health Services Research, 40(6), 518–529.
  10. Becker KD, Bradshaw CP, Domitrovich C, & Ialongo NS (2013). Coaching teachers to improve implementation of the good behavior game. Administration and Policy in Mental Health and Mental Health Services Research, 40(6), 482–493.
  11. Beidas RS, Edmunds JM, Cannuscio CC, Gallagher M, Downey MM, & Kendall PC (2013). Therapists perspectives on the effective elements of consultation following training. Administration and Policy in Mental Health and Mental Health Services Research, 40(6), 507–517.
  12. Beidas RS, & Kendall PC (2010). Training therapists in evidence‐based practice: A critical review of studies from a systems‐contextual perspective. Clinical Psychology: Science and Practice, 17(1), 1–30.
  13. Bennett-Levy J, & Thwaites R (2007). Self and self-reflection in the therapeutic relationship. In Gilbert P & Leahy RL (Eds.), The therapeutic relationship in the cognitive behavioral psychotherapies (pp. 255–281). Routledge.
  14. Bernard K, Dozier M, Bick J, & Gordon MK (2015). Intervening to enhance cortisol regulation among children at risk for neglect: results of a randomized clinical trial. Development and Psychopathology, 27(3), 829–841.
  15. Bernard K, Dozier M, Bick J, Lewis‐Morrarty E, Lindhiem O, & Carlson E (2012). Enhancing attachment organization among maltreated children: Results of a randomized clinical trial. Child Development, 83(2), 623–636.
  16. Bernard K, Hostinar CE, & Dozier M (2015). Intervention effects on diurnal cortisol rhythms of child protective services–referred infants in early childhood: Preschool follow-up results of a randomized clinical trial. JAMA Pediatrics, 169(2), 112–119.
  17. Bernard K, Simons R, & Dozier M (2015). Effects of an attachment‐based intervention on child protective services–referred mothers’ event‐related potentials to children’s emotions. Child Development, 86(6), 1673–1684.
  18. Caron E (2017). Effects and processes of fidelity-focused consultation (Doctoral dissertation, University of Delaware). UDSpace Digital Archive. https://udspace.udel.edu/bitstream/handle/19716/23096/Caron_udel_0060D_13001.pdf?sequence=1
  19. Caron E, Bernard K, & Dozier M (2018). In vivo feedback predicts parent behavior change in the Attachment and Biobehavioral Catch-up intervention. Journal of Clinical Child & Adolescent Psychology, 47(sup1), S35–S46.
  20. Caron E, & Dozier M (2019). Effects of fidelity-focused consultation on therapists’ implementation: An exploratory multiple baseline design. Administration and Policy in Mental Health and Mental Health Services Research, 46, 445–457.
  21. Caron E, Muggeo MA, Souer HR, Pella JE, & Ginsburg GS (2020). Concordance between therapist, supervisor and observer ratings of therapeutic competence in CBT and treatment as usual: Does therapist competence or supervisor session observation improve agreement? Behavioural and Cognitive Psychotherapy, 48(3), 350–363.
  22. Caron E, Weston-Lee P, Haggerty D, & Dozier M (2016). Community implementation outcomes of Attachment and Biobehavioral Catch-up. Child Abuse & Neglect, 53, 128–137.
  23. Edmunds JM, Beidas RS, & Kendall PC (2013). Dissemination and implementation of evidence–based practices: Training and consultation as implementation strategies. Clinical Psychology: Science and Practice, 20(2), 152–165.
  24. Edmunds JM, Kendall PC, Ringle VA, Read KL, Brodman DM, Pimentel SS, & Beidas RS (2013). An examination of behavioral rehearsal during consultation as a predictor of training outcomes. Administration and Policy in Mental Health and Mental Health Services Research, 40(6), 456–466.
  25. Eiraldi R, Mautone JA, Khanna MS, Power TJ, Orapallo A, Cacia J, Schwartz BS, McCurdy B, Keiffer J, Paidipati C, Kanine R, Abraham M, Tulio S, Swift L, Bressler SN, Cabello B, & Jawad AF (2018). Group CBT for externalizing disorders in urban schools: Effect of training strategy on treatment fidelity and child outcomes. Behavior Therapy, 49(4), 538–550.
  26. Funderburk B, Chaffin M, Bard E, Shanley J, Bard D, & Berliner L (2015). Comparing client outcomes for two evidence-based treatment consultation strategies. Journal of Clinical Child & Adolescent Psychology, 44(5), 730–741.
  27. Garcia I, James RW, Bischof P, & Baroffio A (2017). Self-observation and peer feedback as a faculty development approach for problem-based learning tutors: A program evaluation. Teaching and Learning in Medicine, 29(3), 313–325.
  28. Henggeler SW, Sheidow AJ, Cunningham PB, Donohue BC, & Ford JD (2008). Promoting the implementation of an evidence-based intervention for adolescent marijuana abuse in community settings: Testing the use of intensive quality assurance. Journal of Clinical Child & Adolescent Psychology, 37(3), 682–689.
  29. Herschell AD, Kolko DJ, Baumann BL, & Davis AC (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30(4), 448–466.
  30. Hulleman CS, & Cordray DS (2009). Moving from the lab to the field: The role of fidelity and achieved relative intervention strength. Journal of Research on Educational Effectiveness, 2(1), 88–110.
  31. Hurlburt MS, Garland AF, Nguyen K, & Brookman-Frazee L (2010). Child and family therapy process: Concordance of therapist and observational perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 37(3), 230–244.
  32. Institute of Medicine (2015). Psychosocial interventions for mental and substance use disorders. National Academies Press.
  33. Isenhart C, Dieperink E, Thuras P, Fuller B, Stull L, Koets N, & Lenox R (2014). Training and maintaining motivational interviewing skills in a clinical trial. Journal of Substance Use, 19(1–2), 164–170.
  34. James IA, Milne D, & Morse R (2008). Microskills of clinical supervision: Scaffolding skills. Journal of Cognitive Psychotherapy, 22(1), 29–36.
  35. Johnston LH, & Milne DL (2012). How do supervisee’s learn during supervision? A grounded theory study of the perceived developmental process. The Cognitive Behaviour Therapist, 5(1), 1–23.
  36. Kolb DA (1984). Experiential learning: Experience as the source of learning and development. Prentice-Hall.
  37. Lau AS, Lind T, Crawley M, Rodriguez A, Smith A, & Brookman-Frazee L (2020). When do therapists stop using evidence-based practices? Findings from a mixed method study on system-driven implementation of multiple EBPs for children. Administration and Policy in Mental Health and Mental Health Services Research, 47(2), 323–337.
  38. Lind T, Bernard K, Yarger HA, & Dozier M (2019). Promoting compliance in children referred to child protective services: a randomized clinical trial. Child Development, 91(2), 563–576.
  39. Lind T, Raby KL, Caron E, Roben CK, & Dozier M (2017). Enhancing executive functioning among toddlers in foster care with an attachment-based intervention. Development and Psychopathology, 29(2), 575–586.
  40. Martino S, Paris M Jr, Añez L, Nich C, Canning-Ball M, Hunkele K, Olmstead TA & Carroll KM (2016). The effectiveness and cost of clinical supervision for motivational interviewing: A randomized controlled trial. Journal of Substance Abuse Treatment, 68, 11–23.
  41. McGillivray J, Gurtman C, Boganin C, & Sheen J (2015). Self‐practice and self‐reflection in training of psychological interventions and therapist skills development: A qualitative meta‐synthesis review. Australian Psychologist, 50(6), 434–444.
  42. McHugh ML (2012). Interrater reliability: the kappa statistic. Biochemia Medica, 22(3), 276–282.
  43. Miller WR, Yahne CE, Moyers TB, Martinez J, & Pirritano M (2004). A randomized trial of methods to help therapists learn motivational interviewing. Journal of Consulting and Clinical Psychology, 72(6), 1050–1062.
  44. Milne DL, Aylott H, Fitzpatrick H, & Ellis MV (2008). How does clinical supervision work? Using a Best Evidence Synthesis approach to construct a basic model of supervision. The Clinical Supervisor, 27(2), 170–190.
  45. Milne DL, Pilkington J, Gracie J, & James I (2003). Transferring skills from supervision to therapy: A qualitative and quantitative N = 1 analysis. Behavioural and Cognitive Psychotherapy, 31(2), 193–202.
  46. Nadeem E, Gleacher A, & Beidas RS (2013). Consultation as an implementation strategy for evidence-based practices across multiple contexts: Unpacking the black box. Administration and Policy in Mental Health and Mental Health Services Research, 40(6), 439–450.
  47. Perrone L, Imrisek SD, Dash A, Rodriguez M, Monticciolo E, & Bernard K (2020). Changing parental depression and sensitivity: Randomized clinical trial of ABC’s effectiveness in the community. Development and Psychopathology. Advance online publication.
  48. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, & Mittman B (2009). Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research, 36, 24–34.
  49. Raby KL, Freedman E, Yarger HA, Lind T, & Dozier M (2019). Enhancing the language development of toddlers in foster care by promoting foster parents’ sensitivity: Results from a randomized controlled trial. Developmental Science, 22(2), e12753.
  50. Schoenwald SK, Sheidow AJ, & Chapman JE (2009). Clinical supervision in treatment transport: Effects on adherence and outcomes. Journal of Consulting and Clinical Psychology, 77(3), 410–421.
  51. Schwalbe CS, Oh HY, & Zweben A (2014). Sustaining motivational interviewing: A meta‐analysis of training studies. Addiction, 109(8), 1287–1294.
  52. Watts SE, Turnell A, Kladnitski N, Newby JM, & Andrews G (2015). Treatment-as-usual (TAU) is anything but usual: A meta-analysis of CBT versus TAU for anxiety and depression. Journal of Affective Disorders, 175, 152–167.
  53. Webster-Stratton CH, Reid MJ, & Marsenich L (2014). Improving therapist fidelity during implementation of evidence-based practices: Incredible years program. Psychiatric Services, 65(6), 789–795.
  54. Weck F, Kaufmann YM, & Höfling V (2017). Competence feedback improves CBT competence in trainee therapists: A randomized controlled pilot study. Psychotherapy Research, 27(4), 501–509.
  55. Weisz JR, Kuppens S, Eckshtain D, Ugueto AM, Hawley KM, & Jensen-Doss A (2013). Performance of evidence-based youth psychotherapies compared with usual clinical care: A multilevel meta-analysis. JAMA Psychiatry, 70(7), 750–761.
  56. Weisz JR, Kuppens S, Ng MY, Eckshtain D, Ugueto AM, Vaughn-Coaxum R, Jensen-Doss A, Hawley KM, Krumholz Marchette LS, Chu BC, Weersing VR, & Fordwood SR (2017). What five decades of research tells us about the effects of youth psychological therapy: A multilevel meta-analysis and implications for science and practice. American Psychologist, 72(2), 79–117.
  57. Yarger HA, Bernard K, Caron E, Wallin A, & Dozier M (2019). Enhancing parenting quality for young children adopted internationally: Results of a randomized controlled trial. Journal of Clinical Child & Adolescent Psychology, 49(3), 378–390.
  58. Zajac L, Raby KL, & Dozier M (2020). Sustained effects on attachment security in middle childhood: Results from a randomized clinical trial of the Attachment and Biobehavioral Catch‐up (ABC) intervention. Journal of Child Psychology and Psychiatry, 61(4), 417–424.
  59. Zorga S (1997). Supervision process seen as a process of experiential learning. The Clinical Supervisor, 16(1), 145–161.