Abstract
Using youth mentors to deliver evidence‐based psychosocial services has been proposed to increase the reach of treatments, in part given the affordability and ubiquity of mentors in youth settings. Further, tests of mentor‐delivered motivational interviewing (MI) have shown increases in youth mentees' academic performance and wellbeing. Yet, traditional methods of training mentors to use MI can be costly and time‐consuming. Previous work has suggested the value of asynchronous, brief, just‐in‐time training (JITT) to help offset these challenges; however, MI JITT for mentors has not yet been formally evaluated. As such, here, we report on a preliminary study of MI JITT videos for youth mentors. Mentors in the program were randomly assigned to training‐as‐usual or training‐as‐usual plus JITT. MI attitudes, knowledge, and skills were measured via self‐report pre‐ and post‐intervention. Results indicate that assignment to the JITT video condition was associated with significantly improved reflection skills. Effect size analyses also suggest moderate improvements in understanding MI mechanisms and theory, and in other MI skills (e.g., asking open‐ended questions). Mentors found the JITT videos acceptable and usable and reported understanding their content. The article concludes with a discussion of considerations for future research and implementation.
Keywords: evidence‐based practices, middle school, mentoring, school‐based mentoring, motivational interviewing, training
Highlights
Task‐shifting redistributes professional duties to individuals with less training
Combining task‐shifting with just‐in‐time training (JITT) may advance child services in communities
Youth mentors in instrumental programs may particularly benefit from JITT for helping skills
Preliminary data suggest mentors' helping skills improve following random assignment to JITTs
Additional research should explore ways to leverage mentors in school systems and through training
INTRODUCTION
Schools in the United States are increasingly responsible for detecting and treating psychosocial concerns that interfere with children and adolescents' (herein, youths') functioning. Yet, at the same time, these types of psychosocial concerns are growing in both incidence and prevalence (Centers for Disease Control and Prevention, 2021), and school systems are restricted by finite resources and workforce shortages (García & Weiss, 2019; Guerra et al., 2019). Research suggests that the need for youth mental health services will continue to grow, and while students with severe or acute needs will likely be triaged to appropriate services, current models of school‐based prevention and intervention efforts may not have the capacity to serve youth with subclinical needs (Maggin et al., 2016). Expanding the youth mental health workforce—such as through incorporating paraprofessionals in service provision models—has been suggested as one way to combat these limitations, in part due to the success of using paraprofessionals in community settings (Hart et al., 2024; McQuillin et al., 2019).
Expanding the workforce may include task‐shifting, which the World Health Organization (World Health Organization, 2008) defines as “a process whereby specific tasks are moved, where appropriate, to health workers with shorter training and fewer qualifications” (p. 7). Through expanding specialized workforces, task‐shifting is associated with increases in access to care (Schneeberger & Mathai, 2015). Task‐shifting is also thought to enhance the role of community in service delivery (i.e., through collaboration and participation; Riemer et al., 2020), and to improve how efficiently resources and services are distributed (World Health Organization, 2008; Zachariah et al., 2009). Though developed and studied in traditional health contexts, task‐shifting has broadened to other fields, such as mental health and education. Indeed, task‐shifting is almost ubiquitous in schools, where it has become increasingly common for teacher aides or assistants (i.e., individuals without a teaching certificate or license) to provide academic remediation or behavioral support to select students (Webster & De Boer, 2019).
Task‐shifting to youth mentors
Youth mentoring is another way youth services can extend beyond professional providers (Christensen et al., 2020; Hart et al., 2023b) and benefit children in community or school settings (Wheeler et al., 2010). Moreover, youth mentoring is aligned with community psychology values as it emphasizes meaningful engagement from both mentees and mentors toward goals of collective (i.e., community) wellbeing (Riemer et al., 2020). Though youth mentors often lack professional credentials in mental health and/or youth development, they are well‐positioned to positively affect the youth with whom they work (Cavell et al., 2021; Garringer et al., 2017). Large‐scale studies show that participation in youth mentoring programs is associated with decreased behavioral, peer, and emotional problems (e.g., anxiety, depression; DeWit et al., 2016; Herrera et al., 2023). Moreover, mentored youth report higher coping skills when faced with stressors (DeWit et al., 2016). These benefits extend to youth of diverse backgrounds and identities (Nabors et al., 2022), and have led to significant federal investment in youth mentoring programs (Fernandes‐Alcantara, 2019).
The focus and mission of youth mentoring programs include academic enabling skills, general wellness, and positive development (Garringer et al., 2017), and therefore likely contribute to mentees' school success. Further, mentors are often welcomed in community and school settings (Bruce & Bridgeland, 2014), suggesting that formally integrating mentors into school service provision models would be an acceptable approach and require only modest infrastructure enhancements (Hart et al., 2024). School‐based mentoring has, in fact, been the fastest‐growing type of youth mentoring in recent decades (Garringer et al., 2017; Herrera et al., 2011; Wheeler et al., 2010). In one study of the Student Check‐Up, undergraduate volunteers delivered a semi‐structured single‐session intervention based on motivational interviewing (MI) to middle school students (Strait et al., 2017; Strait, 2018). Following the Student Check‐Up, students reported increased academic effort and self‐efficacy (Strait et al., 2017).
However, the extent and nature of youth mentoring benefits seem largely determined by the goals and structure of the program (Cavell et al., 2018). Historically, mentoring was believed to benefit youth mentees by providing them with a relationship that serves as a developmental asset, and the mentee‐mentor relationship may be considered the mechanism of change (Rhodes, 2005). This type of mentoring is called developmental, informal, or relational (Cavell et al., 2021). While proponents of this model may worry that prescribing mentee‐mentor activities may damage their valuable relationship (Li & Julian, 2012), recent work by Christensen et al. (2020) shows that formal, instrumental, or targeted programs—or those that require mentors to use specific practices and skills designed to address mentees' presenting concerns—produce effect sizes that are twice as large as those produced by nonspecific, relational practices. These findings suggest that training mentors to facilitate instrumental programs (i.e., to deliver curriculum‐ and evidence‐based programming that meets mentees' needs) is worthwhile. Additional literature on instrumental mentoring (e.g., Lyons et al., 2019; Schenk et al., 2021) has echoed the value of training mentors to address areas of need among mentees they aim to serve.
The promise of pairing motivational interviewing with youth mentoring
MI is a counseling approach that prompts behavior change when individuals feel ambivalent (i.e., have contradictory or mixed feelings towards changing their behavior; Miller & Rollnick, 2002), and it has shown promise when combined with a youth mentoring approach. Clinicians who use MI encourage and support clients to generate their own plans and reasons for change (i.e., “change talk”), and then strengthen clients' resolve to enact these changes. Clinicians are taught to foster a collaborative relationship with clients through open‐ended questions, affirmations, reflections, and summaries—referred to as “OARS” in the MI literature. Considerable research has sought to identify best practices in training professional providers to use MI in various settings (e.g., Madson et al., 2009). Frey et al. (2021) emphasize that learning to employ MI requires an intentional integration of didactic training, applied practice, and ongoing support.
MI service delivery has been combined with youth mentoring and studied in schools beyond the aforementioned Student Check‐Up intervention, and has shown positive associations with academic achievement, behavior, engagement, and mental health (Rollnick et al., 2016). For example, Simon and Ward (2014) developed and evaluated an MI training administered to 17 part‐time paraprofessional academic advisors whose role was to provide academic enrichment and support services to urban youth. Findings from their study show that learning MI resulted in increased MI knowledge and use in both applied and simulated sessions. Similarly, Snape and Atkinson (2015) trained paraprofessional school staff to use MI with disaffected pupils and found that staff rated the training program highly, indicating they endorsed the benefits of using MI. However, these studies are preliminary and have some methodological limitations (i.e., no control groups or randomization, small sample sizes; Hart et al., 2023a).
Another project that has capitalized on the joint promise of MI and youth mentoring is the Academic Mentoring Program for Education and Development (AMPED; McQuillin et al., 2013). AMPED mentors—typically undergraduate students—receive initial training in MI, and then guide middle school mentees through goal setting and skill‐building sessions using a program manual (McQuillin et al., 2015; Strait et al., 2020). AMPED sessions occur for 45 min weekly for approximately one semester. Each meeting has a prescribed curriculum, largely drawn from Cognitive Behavioral Therapy (Beck, 2011) and/or the Homework, Organization, Planning, and Skills program (Langberg et al., 2012). AMPED mentees show higher academic achievement, improved academic skills (e.g., organization), and increased life satisfaction relative to control groups (McQuillin et al., 2015; Strait et al., 2020).
However, common methods for training AMPED mentors also carry limitations and stray from best practices in MI and youth mentoring training. Often, initial, prerequisite trainings occur in‐person and span up to 3 h (Strait et al., 2020). This design arguably emphasizes training to competency but overlooks the importance of ongoing practice, review, and training in MI skill maintenance (Frey et al., 2021). Moreover, research in youth mentoring has found that ongoing support and training for mentors (a) better predict program outcomes than initial training and (b) can double the effect sizes of mentee outcomes (DuBois et al., 2002).
Just‐in‐time trainings (JITTs)
Just‐in‐time training (JITT) is described as “only the training necessary, when it is necessary, to produce competent service provision” (McQuillin et al., 2019, p. 3‐4) and can address some of the limitations inherent to standard training models. In other words, JITT may serve as “booster sessions” within some MI training programs (Schwalbe et al., 2014). In providing a proof‐of‐concept supporting the combination of JITT and task‐shifting, McQuillin et al. (2019) discuss the expansion of school mental health services to students who typically do not qualify for assistance from professional providers. Specifically, employing JITT in the form of asynchronous videos may help to maintain mentors' MI skills and offset training challenges characteristic of most youth mentoring programs. This approach is aligned with Ebbinghaus's formative work in memory retention and the “forgetting curve” (Murre & Dros, 2015) as it provides strategically spaced repetition of content. In one study, medical paraprofessionals who completed JITT before service delivery implemented newly‐obtained skills with more confidence and fidelity than paraprofessionals in a control condition (Kent, 2010). However, JITT has not yet been formally evaluated where MI services are being task‐shifted to youth mentors in a school setting.
Rationale for the current study
The purpose of the present study is to provide a preliminary examination of asynchronous, brief, JITT videos designed specifically to reinforce MI content to school‐based youth mentors. The first aim of this study was to determine the efficacy of using MI JITTs with youth mentors in AMPED. We hypothesized that mentors who receive MI JITTs will demonstrate improved MI knowledge and skills relative to mentors who do not receive MI JITTs. The second aim of this study was to determine the usability of MI JITTs with youth mentors in AMPED, to help identify potential barriers and facilitators to using MI JITTs as part of MI training programs.
METHOD
Participants
All youth mentors were undergraduate students at a large, predominantly white, public research university in the southeastern United States. Researchers recruited mentors during the Fall 2021 semester by visiting undergraduate psychology classes and contacting student organizations. Undergraduates who were interested in being youth mentors submitted an online application that included a background check that the partnering school district required of on‐campus volunteers. Mentors were screened via a follow‐up phone interview, which was standardized and facilitated by a trained research assistant. Screening questions included prospective mentors' interest in and reasons for being a mentor, as well as potential barriers to mentoring/participation (e.g., transportation to an off‐campus site, scheduling restrictions). Regarding demographics, approximately 55% of participants were psychology majors, and 86% of participants were women. Participants (i.e., mentors) were allowed to withdraw from the study at any time without negative consequences.
Study setting
Data collection occurred at a public middle school in the southeastern United States over the Spring 2022 semester. School‐wide demographic data indicate that approximately 56% of students are African American/Black, and 30% of students are white; 100% of students are considered economically disadvantaged. The middle school determined an internal student referral/selection process for the program and sought to identify students with subclinical academic or behavioral concerns. Middle school students' identifying information was not collected for this study, but basic demographic data (e.g., age, gender, grade level, race) and a brief reason for students' referral to the program were obtained and shared with program coordinators to enable better participation (e.g., matching with a preferred mentor; scheduling meetings during a preferred time). Students referred for the mentoring program were considered representative of the broader student body. All study procedures were approved by an Institutional Review Board before study activities commenced. The mentoring program, AMPED, has been active for over 10 years and requires parents to provide written permission for their children to be assigned a mentor.
Assessment and study procedures
Table 1 provides a brief, visual overview of procedural differences between the control and treatment groups in this study. These procedures are described in narrative form and in more detail below.
Table 1.
Delineation of MI training and support provided to control (i.e., training‐as‐usual) and treatment (i.e., training‐as‐usual, plus JITT) groups in the present study.
| MI training and support provided | Control group (i.e., training‐as‐usual) | Treatment group (i.e., training‐as‐usual plus JITT) |
|---|---|---|
| Initial, workshop‐style training | X | X |
| Access to program manual, including MI materials, to guide program administration | X | X |
| Weekly emails to provide relevant updates and reminders regarding programming | X | X |
| Links to JITT videos for two sessions embedded in respective weekly emails | | X |
| On‐site supervisors check mentors' understanding of session content before each meeting with mentee | X | X |
| On‐site supervisors confirm the completion of JITT videos before meeting with mentee | | X |
Note: Mentors in control and treatment groups were intermixed during the initial, workshop‐style MI training.
Before being assigned a mentee, mentors were randomized into the control or treatment groups. All mentors provided informed consent and submitted pre‐training measures of MI aptitude as part of the onboarding process. The MI inventory was shared with participants via an emailed link to an online Qualtrics survey. The same email invited mentors to required pre‐match trainings, which consisted of AMPED trainings‐as‐usual (McQuillin & McDaniel, 2021; Strait et al., 2020). As such, all mentors, regardless of JITT condition assignment, were required to attend two pre‐match trainings. Mentors were not separated into control and treatment groups for these trainings (i.e., mentors in both groups attended trainings together), and the order in which mentors attended these trainings was not prescribed. One training provided a workshop‐style “Introduction to MI,” while the other provided an overview of the AMPED protocol and requirements.
Completion of the MI pre‐test was confirmed or enforced as mentors arrived for these trainings (i.e., mentors could not sit for initial training until the research team received their completed pre‐test). All randomized mentors completed the pre‐test, though in some instances mentors were randomized before they completed it. The research team made this decision to safeguard against a situation in which administrative delays from the mentoring program (e.g., randomizing mentors to the control or treatment group) delayed mentee‐mentor pairing and meeting. Upon completion of both trainings, mentors received a copy of the AMPED manual and curriculum from research staff via email to reinforce content from both trainings and to guide subsequent program delivery.
Mentee‐mentor meetings; treatment
Mentee‐mentor pairs were formed on a rolling basis (i.e., as middle school students were referred to AMPED) during the Spring 2022 semester with attention to shared interests as indicated by mentors' applications and middle school staff's referrals. All mentors were notified upon being matched with a mentee via email from the primary researcher and program coordinator. Emails sent to mentors in the treatment group had embedded links to JITT videos that reinforced the tenets of MI and provided suggestions for using MI as a mentor. Emails sent to mentors in the control group omitted additional video trainings but were otherwise equivalent to those for the treatment group. All mentors were also reminded to follow the program manual and curriculum in facilitating meetings with their mentees.
For the duration of their participation in the semester‐long mentoring program, all mentors received weekly emails from the program coordinator with a combination of administrative updates (e.g., an adjusted school schedule due to standardized testing) and reminders about facilitating AMPED. Moreover, study supervisors (individuals with advanced MI and/or program experience) were stationed on‐site to check that all mentors had adequately planned and prepared for all sessions. Supervisors were aware of mentors' assignment to the control or treatment group and asked those in the treatment group fidelity questions specific to the assigned JITTs. Pre‐meeting check‐ins with supervisors also provided opportunities for mentors to ask questions or clarify session content as needed.
Only two sessions of the eight‐session manualized program were assigned corresponding MI JITTs, as the researchers viewed this as a preliminary study of the efficacy and usability of MI JITTs. These sessions were strategically selected to occur early in the program to safeguard against attrition threats and to align with opportunities to employ MI tenets (e.g., collaborating with mentees and using OARS) within the AMPED sessions. Embedding links to MI JITT videos in the weekly emails sent to mentors in the treatment group ensured that the “just in time” aspect of the training was enforced (i.e., the study would detect time‐limited effects of training). Email distribution also allowed for staggering the JITT content to coincide with specific activities outlined in the program manual (i.e., mentors did not receive access to all JITTs at once).
JITT treatment 1
Mentors in the JITT condition watched two videos before their first session with mentees, both of which were included in their match assignment email. One JITT focused on identifying and preventing the “righting reflex” (i.e., practices that are inconsistent with MI), which had been emphasized in initial training, and the second JITT provided a demonstration and overview of a value card sort activity that was included in the AMPED manual. The first session for all mentee‐mentor pairs (i.e., those in both conditions) consisted of completing a value card sort activity (Miller et al., 2001) and filling out a mentee‐mentor contract. The contract prompted pairs to identify mentees' strengths (i.e., “What are you good at? How do you know you are good at these things?”) and general goals.
JITT treatment 2
The third session of AMPED also employed MI JITTs for the treatment group. As with the first session, MI JITT video links were sent to mentors in their weekly email. Skills emphasized in these videos included recognizing and responding to mentees' change talk, guiding mentees through a behavior self‐assessment, and facilitating a goal‐setting discussion. The third session for all mentee‐mentor pairs began with a brief, mentor‐led lesson on goal setting, then introduced a structured goal‐setting worksheet. This sheet prompted pairs to: propose a weekly goal for the mentee; identify environmental factors or personal characteristics that support the mentee reaching their goal; generate a plan to help the mentee reach the remainder of their goal; identify potential obstacles to reaching the goal, and brainstorm corresponding solutions; and, indicate a way to measure whether the plan is working.
Mentors in both groups completed posttest measures of MI JITT efficacy following the conclusion of the third session, which was the second and final session with corresponding MI JITTs (rather than following the conclusion of the full mentoring program).
Measures of motivational interviewing
Two measures of MI were used in this study. The Helpful Responses Questionnaire (HRQ) and the Motivational Interviewing Knowledge and Attitudes Test (MIKAT) were adapted as relevant to reflect a youth mentoring context. Eight additional multiple‐choice questions were created to measure mentors' understanding and use of OARS. The use of measures that focus on attitudes, knowledge, and skills, such as those used in this study, aligns with how the American Psychological Association defines competencies, as well as with previous MI training literature (Madson et al., 2009; Schwalbe et al., 2014).
The helpful responses questionnaire (HRQ)
The HRQ (Miller et al., 1991) is a tool for measuring therapeutic empathy. The HRQ requires respondents to provide a one‐ or two‐sentence open‐ended response to a prompt/vignette, and participants were instructed to “write the next thing that you would say if you wanted to help.” The measure includes guidelines for evaluating and quantifying participants' responses on a scale from 1 to 5, with higher scores indicating more appropriate and richer (i.e., more skilled) reflections.
A rating of 1 is the lowest possible score, which is assigned when the response does not contain a reflection and contains a verbal behavior that is inconsistent with MI (i.e., something that counteracts or disrupts collaboration). A rating of 2 is neutral; it may be assigned when a response contains no reflection nor a comment inconsistent with MI, or when a response contains both a reflection and a comment that is inconsistent with MI. A rating of 3 indicates a basic reflection (e.g., restating). A rating of 4 represents a more advanced reflection, adding inferred or paraphrased meaning that seems believable. A rating of 5 indicates the highest level of reflection, meeting the criteria for a rating of 4 and including a reflection of feeling that fits the original statement. When multiple responses are made, the highest level of reflection is scored unless one or more components of the response is inconsistent with MI (in which case, the rating is 2).
HRQ scores were assigned by a team of three researchers, including the first author/program coordinator and two trained, undergraduate research assistants. In instances where codes were discrepant or divergent, coders met to discuss their reasoning and come to an agreed‐upon code to be used in analyses. Participants' scores of 1–5 on each of the six questions were averaged to provide their overall score on the HRQ.
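The rating rules above can be expressed as a small decision rule. The sketch below is an illustrative formalization only; in the study, ratings were assigned and reconciled by trained human coders, and the function and argument names are hypothetical:

```python
def hrq_rating(reflection_level: int, mi_inconsistent: bool) -> int:
    """Assign an HRQ rating (1-5) to a single response.

    reflection_level: 0 = no reflection, 3 = basic restatement,
                      4 = adds plausible inferred meaning,
                      5 = also reflects a fitting feeling.
    mi_inconsistent: True if any component of the response counteracts
                     or disrupts collaboration.
    For multi-part responses, pass the highest reflection level present;
    an MI-inconsistent component caps the rating at 2 (or 1 when there
    is no reflection at all).
    """
    if mi_inconsistent:
        return 1 if reflection_level == 0 else 2
    return 2 if reflection_level == 0 else reflection_level


def hrq_overall(ratings: list[int]) -> float:
    """Average the six item ratings into an overall HRQ score."""
    return sum(ratings) / len(ratings)
```

For example, a bare restatement with no MI‐inconsistent content would score 3, while the same restatement paired with unsolicited advice would be capped at 2.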
The motivational interviewing knowledge and attitudes test (MIKAT)
The MIKAT (Leffingwell, 2006) is an integrated measure of respondents' beliefs about problem behaviors and corresponding treatments (i.e., attitudes and knowledge). All questions are answered in a true/false format. In the current study, the MIKAT was condensed (i.e., cut from 14 questions to eight questions) and customized to reflect school‐based youth mentoring, in contrast with its original development for substance use treatment. For example, where the original MIKAT states “External pressure and consequences are the only way to make substance abusers change,” the present protocol used “External pressure and consequences are often the only way to make youth change.” Because mentors' responses to the MIKAT were scored as “correct” or “incorrect,” scores on this measure ranged from 0/8 to 8/8.
OARS questions
An additional eight multiple‐choice questions were developed to assess respondents' abilities and tendencies to use OARS, and to avoid using the righting reflex. Each question allowed respondents to select from four possible answers, two of which were questions and two of which were statements. Each question had at least one “correct” answer as indicated by its potential to respond to mentees in a collaborative and validating manner. For example, one prompt stated, “Your mentee greets you seeming very upset, telling you that they were given lunch detention for the next 2 weeks. You:,” and then offers the following four response choices: “Say, ‘It's only been 2 weeks, it will be over faster than you think;’” “Say, ‘You think this is an unfair punishment;’” “Ask, 'What did you think was going to happen when you were acting out?;’” and, “Ask, ‘Do you want me to call the school for you?,’” in which the second option was scored as the correct answer. Because mentors' responses to the OARS Questions were “correct” or “incorrect,” scores on this measure ranged from 0/8 to 8/8.
Measures of JITT usability
The usage rating profile‐intervention revised (URP‐IR)
The URP‐IR (Chafouleas et al., 2011) is designed to assess barriers and facilitators to intervention implementation. It assesses and conceptualizes “usability” through subscales of Acceptability, Understanding, Home–school Collaboration, Feasibility, System Climate, and System Support. For the present study, questions were tailored to assess the usability of MI JITTs in youth mentoring programs. The URP‐IR was administered to the treatment group only, as part of their posttest battery. A total of five items from the URP‐IR were omitted due to their incongruence with the study (e.g., one item queried about record‐keeping associated with the intervention, which was not relevant to the use of JITTs by mentors in the treatment group). Similarly, the “Home–school Collaboration” subscale, which consists of three questions, was omitted because communication and involvement with youths' caregivers were not considered in the current study. The URP‐IR has respondents indicate answers on a six‐point Likert‐type scale ranging from 1 (strongly disagree) to 6 (strongly agree). These scores are then grouped by subscale (Briesch et al., 2013; Chafouleas et al., 2011), and descriptive statistics are used to provide overall subscale ratings. In general, higher ratings indicate positive experiences using the intervention; however, some items are reverse‐coded (Briesch et al., 2013). Use of the URP‐IR in this study reflects its use in similar studies (e.g., training mentors to implement AMPED with middle school students with elevated disruptive behavior; McQuillin & McDaniel, 2021).
Regarding the subscales that were administered, acceptability refers to how well an intervention is received by the population intended to deliver or receive it; considering the acceptability of behavioral and social interventions is paramount in their implementation (Ayala & Elder, 2011). In the present study, acceptability refers to mentors' feelings towards the MI JITT videos, including their subjective evaluations of the videos' utility and their likelihood of employing the videos as tools outside of the program (in which their use was required). Understanding refers to how well those who received the intervention grasped its contents. Questions within the feasibility cluster sought to gauge how easy or possible it would be for mentors to implement the MI skills and strategies modeled and/or taught during training in real‐life mentoring practice with their mentees. Questions within the system climate cluster sought to query whether mentors felt their use of MI JITTs was accepted and supported by the broader systems in which they were working. System support refers to the additional resources (e.g., professional development) mentors reported they would need to use MI JITTs. As such, higher ratings on system support items indicate a need for more support in implementing MI.
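The URP‐IR scoring described above (reverse‐coding select items, then averaging within subscales) can be sketched as follows. This is an illustrative sketch only; the function names and the choice of which items are reversed are hypothetical:

```python
def reverse_code(score: int, scale_max: int = 6) -> int:
    """Reverse-code a Likert item on a 1..scale_max scale (e.g., 1 -> 6)."""
    return scale_max + 1 - score


def subscale_mean(scores: list[int], reversed_items: frozenset = frozenset()) -> float:
    """Mean rating for one subscale.

    reversed_items holds the indices of reverse-coded items, so that
    higher subscale means consistently indicate more positive experiences.
    """
    adjusted = [reverse_code(s) if i in reversed_items else s
                for i, s in enumerate(scores)]
    return sum(adjusted) / len(adjusted)
```

For instance, a hypothetical three‐item subscale with ratings [6, 6, 2], where the third item is reverse‐coded, would yield a mean of (6 + 6 + 5) / 3.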
Statistical analyses
All data were analyzed using the R statistical computing environment, version 4.2.1. Data were checked to ensure they met the statistical assumptions required for further inferential tests. An a priori significance level (p‐value) of 0.05 was set for all analyses. Mentors were randomly assigned (0 = control, 1 = treatment) to watch JITT videos, the effects of which were tested in regression models in which pre‐test scores were modeled as covariates.
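The form of the analytic model can be sketched as follows. The original analyses were run in R (version 4.2.1); the sketch below uses Python with simulated, hypothetical data solely to illustrate the model's structure (posttest score regressed on treatment assignment, with the pre‐test score as a covariate). The data values and the effect‐size formula are assumptions for illustration, not the study's data or reported computations:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 38  # posttest sample size reported in the study

# Simulated, hypothetical data (not the study's data)
treatment = (np.arange(n) % 2).astype(float)   # 0 = control, 1 = treatment
pretest = rng.normal(1.76, 0.43, n)            # e.g., HRQ pre-test scores
posttest = 1.0 + 0.5 * pretest + 0.33 * treatment + rng.normal(0, 0.4, n)

# OLS: posttest ~ intercept + treatment + pretest
X = np.column_stack([np.ones(n), treatment, pretest])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)

# Standard errors and t statistics for each coefficient
resid = posttest - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
t_stats = beta / se

# A standardized effect size for the treatment coefficient; dividing by
# the residual SD is one common choice (the article does not specify how
# its Cohen's d values were computed)
d = beta[1] / np.sqrt(sigma2)
```

Here `beta[1]` corresponds to the treatment coefficient reported in Table 2, interpreted as the adjusted posttest difference between groups holding pre‐test scores constant.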
RESULTS
Participants
Completed pre‐test measures were obtained from 63 individuals, and completed posttest measures were obtained from 38 mentors. Throughout the study, mentors were majority female psychology majors. Participant attrition can be attributed to a variety of factors. Some mentors failed to complete both required trainings, while others removed themselves from the program due to complications in being paired with a mentee (i.e., the academic calendars of the partnering middle school and the university differed, and mentors were ready to participate in the program before the school had completed mentees' referrals). These delays similarly caused difficulties completing the AMPED curriculum once pairs were formed, and posttest data from these mentors were not gathered because the JITT intervention had not been administered to them in full. Lastly, a small subset of mentors failed to submit their post‐program measures, despite up to three reminders. These patterns in mentor attrition left the study underpowered to detect effects that were not large (d > 0.9; Raudenbush & Liu, 2000). Given that this was a pilot randomized trial, emphasis should be placed on the estimated effect sizes. Participant/mentor enrollment at each stage of the study is depicted in the CONSORT diagram (Figure 1).
Figure 1. Mentor/participant CONSORT diagram.
Findings of MI JITT efficacy
Results of analyses are presented by measure in Table 2.
Table 2.
Outcomes.
| Outcome | Coefficient | SE | Cohen's d | t | p |
|---|---|---|---|---|---|
| HRQ | | | | | |
| Intercept | 1.66 | 0.34 | — | — | — |
| Treatment | 0.33 | 0.15 | 0.62 | 2.14 | 0.04* |
| MIKAT | | | | | |
| Intercept | 0.51 | 0.11 | — | — | — |
| Treatment | 0.09 | 0.05 | 0.58 | 1.87 | 0.07 |
| OARS Questions | | | | | |
| Intercept | 0.49 | 0.09 | — | — | — |
| Treatment | 0.08 | 0.04 | 0.59 | 2.02 | 0.05 |
*p < 0.05.
The HRQ
The pre‐test average score was 1.76 (SD = 0.43), indicating that responses tended to contain MI‐inconsistent behaviors and lacked a reflection. Scores ranged from 1.17 to 3.17. The posttest average score was 2.73 (SD = 0.53), indicating the presence of some basic reflections. Scores ranged from 1.50 to 3.83. When mentors' pre‐ and post‐scores on the HRQ were examined via regression analysis, assignment to the treatment group was statistically significant, B = 0.33(0.15), t = 2.14, p = .04, d = 0.62. In other words, assignment to the treatment group was associated with a 0.33 increase in HRQ scores at posttest, controlling for pre‐test scores. The effect size for this estimate is considered moderate in strength. These findings were consistent with our hypothesis that assignment to watch the JITT videos immediately before mentoring sessions would be associated with improved MI skills.
The MIKAT
The average score on the pretest was 0.68 (SD = 0.19), which represents between five and six questions correct. The average score on the posttest was 0.77 (SD = 0.16), which represents over six questions correct. When mentors' pre‐ and post‐scores on the MIKAT were examined via regression analysis, assignment to the treatment group was not statistically significant, B = 0.09(0.05), t = 1.87, p = .07, d = 0.58. However, this parameter was of moderate effect size and in the hypothesized direction.
OARS questions
The pre‐test mean was 0.71 (SD = 0.16), which represents between five and six questions correct. Responses ranged from three out of eight questions correct to all eight questions correct. The posttest mean was 0.82 (SD = 0.13), which represents between six and seven questions correct. When mentors' pre‐ and post‐scores on OARS Questions were examined via regression analysis, assignment to the treatment group was not statistically significant, B = 0.08(0.04), t = 2.02, p = .05, d = 0.59. However, the parameter again was of moderate effect size and in the hypothesized direction.
Findings of MI JITT usability
The URP‐IR (Chafouleas et al., 2011) was completed and returned by 20 participants who received MI JITTs (i.e., only mentors in the treatment group provided feedback regarding intervention usability).
Acceptability
Overall, users rated the JITT protocol as acceptable, with a mean rating of 5.45/6 (SD = 0.73). Mentors' ratings of the MI JITT videos' acceptability were very positive, with almost uniform "agreement" and modal responses of "strong agreement" across statements that favored the use of MI JITTs (e.g., "This video series provides information that is useful to my work as a mentor").
Understanding
In this study, the understanding cluster measured the degree to which mentors believed they comprehended ways to use MI with their mentees as a function of the training they received. Overall, users rated the JITT protocol as understandable, with a mean rating of 5.3/6 (SD = 0.62). As such, mentors reported that JITT videos were effective in teaching them how to use MI skills to engage and interact with their mentees.
Feasibility
Overall, users rated the MI JITT protocol as feasible, with a mean rating of 4.9/6 (SD = 1.24). Most mentors agreed that the total time required to implement MI per the JITT videos was manageable, and that material resources needed for using MI per the JITT videos were reasonable.
System climate
Overall, JITT users rated this domain positively, with a mean rating of 5.35/6 (SD = 0.88). Mentors indicated high alignment between their use of MI JITTs and their work with youth; for example, all mentors agreed that their administrator or organization would be supportive of their use of MI JITTs and that their use of MI JITTs would be consistent with the mission of their organization.
System support
These questions convey information about the potential sustainability of teaching mentors to use MI via JITTs. Responses represent mixed beliefs about how much additional support mentors would require to implement MI, reflected by a mean rating of 2.89/6 (SD = 1.52).
DISCUSSION
The purpose of this study was to conduct a preliminary evaluation of asynchronous, brief JITT videos developed to reinforce and review the teaching of MI skills to school‐based youth mentors in the AMPED program. Overall, findings from this study suggest that MI JITTs were efficacious. Mentors who received MI JITTs significantly outperformed mentors who did not on the HRQ, which measured mentors' empathic responding (i.e., the depth and quality of reflections in open‐ended responses to vignettes). Findings were not significant for assignment to the JITT condition as measured by the MIKAT or OARS Questions, which queried participants' understanding of the underlying mechanisms and beliefs of MI and assessed mentors' effective use of basic MI skills, respectively. However, Cohen's d effect sizes across all three measures were moderate and in the hypothesized direction (d = 0.62 on the HRQ, d = 0.58 on the MIKAT, d = 0.59 on OARS Questions), suggesting that MI JITTs may be an efficacious way to maintain a variety of MI skills in instrumental mentoring programs. The second main finding of this study supported the usability of MI JITTs. These findings occurred despite some issues with attrition and power. As shown in the CONSORT diagram, the study went from an initial sample of 78 participants to 18 in the control arm and 20 in the treatment arm, due to incomplete screening, scheduling/transportation issues, and program noncompletion. In turn, this attrition reduced the study's power.
It is possible that mentors improved specifically in their reflection‐making in this study because reflections were emphasized, modeled, and reviewed throughout the JITT video sequence. In other words, reflection‐making may have improved as a function of repetition across the JITT program, whereas information about the underlying tenets of MI, for example (such as was queried by the MIKAT), was not reviewed to the same extent. It is characteristic of MI trainings to emphasize the importance of reflections throughout all stages of training (Madson et al., 2009), and the finding of improved reflections specifically is consistent with previous literature on training paraprofessionals to use MI (e.g., Newman‐Casey et al., 2018). There may also be a partially measurement‐based explanation for mentors' performance on the HRQ in this study, as it was the only qualitative, open‐ended measure used. This format perhaps allowed for more nuanced assessments of MI skills than the other measures, which were multiple‐choice and true‐false. This finding may highlight opportunities for future research to develop or use measures that are strong fits with paraprofessionals in school settings.
Regarding the usability of MI JITT for mentors, participants in the treatment group indicated they had enthusiastic and positive attitudes about using MI as shown in the JITTs they received, and that the asynchronous training videos provided useful information to their work as mentors. They also endorsed that using MI was a good way to help their mentees make positive behavior changes. These items collectively signal high to very high usability of MI JITTs for mentors (Briesch et al., 2013). These findings are consistent with previous studies of MI trainings, as training recipients often self‐endorse improved skills and a subjective favorable response to training (Madson et al., 2009).
Implications of findings and future directions
Present findings replicate and expand on previous findings that training mentors to use MI is a promising approach to expanding the school mental health workforce (e.g., Strait et al., 2020). This approach is aligned with community psychology tenets as mentors often come from youths' local communities, and work to promote mental health and social wellbeing through community engagement (Garringer et al., 2017; Riemer et al., 2020). Current findings can be translated to school mental health initiatives in efforts to increase access to evidence‐based services for students. For example, students with elevated but subclinical needs could benefit from lower‐intensity psychosocial support delivered by youth mentors, while students with clinical needs could be best served by professionals who traditionally provide school‐based services (e.g., school counselors, school psychologists). JITT models may help paraprofessional providers receive ongoing, high‐quality instruction to maintain competent service provision.
It will be helpful for future research to further integrate what is known about best practices in MI training (e.g., Schwalbe et al., 2014) and paraprofessional workforces. For example, research may examine the utility of combining JITT with formal coaching or supervision, so that preparing for sessions and strategizing about the most effective way to use MI with youth is overseen by a professional. This setup would also allow for more individualized coaching and interactive, live practice with feedback, which is recognized as an important step to becoming proficient in MI (Frey et al., 2021). Literature in community psychology emphasizes the importance of including systems‐level factors in these initiatives, such as the availability of training and supervision resources, and institutional readiness to support mentors using MI JITTs (McQuillin et al., 2019). Formally integrating JITTs with a supervision option into future trials may be a step toward maximizing training efficacy and usability. Similarly, future work may consider and evaluate JITTs at other and/or additional timepoints throughout an intervention, as the present study paired only two sessions near the beginning of the program with JITT videos. This pursuit may allow future research to examine aspects of effective JITTs, such as timing and sequencing, and the effects of variations in these and other parameters (e.g., video quality/satisfaction) on provider retention.
Additional future directions might examine the value of a more targeted recruitment and/or selection process for mentors. Conducting role‐plays with potential providers (e.g., during the interview process, as seen in Yahne et al., 2014) may demonstrate how screening and selection can optimize training protocols. Such screening could improve the effectiveness of task‐shifting MI on a larger scale, as requiring paraprofessional providers to demonstrate a baseline level of competency and/or understanding of program rules (as done by Strait et al., 2020) may be a worthwhile investment in training.
Another valuable future direction in this area would be to study the effects of receiving MI JITTs on mentors' applied practice (i.e., effectiveness), versus only studying changes in their MI attitudes, knowledge, and skills in a contrived or controlled environment (i.e., efficacy). In other words, mentors' application of MI skills may differ from their responses to the survey measures used herein, and understanding the degree and nature of these possible differences would be critical to leveraging MI JITTs. Audio recording and coding mentee‐mentor sessions, for example, could provide information about mentors' verbal behaviors and consistency with MI in sessions with youth. Granted, these future directions are most relevant to instrumental mentoring contexts, as developmental/relational youth mentoring models may suffer unintended consequences linked to this type of over‐prescription.
Limitations
One limitation of the study is generalizability. Our participants were demographically homogeneous: mostly female undergraduate psychology majors. Further, this study occurred in one geographical location where key stakeholders (e.g., research staff and school administrators) had established rapport, thereby taking for granted some initial necessary steps in piloting new school mental health initiatives.
The role of research staff may also be considered a limitation in the current study, as mentors were instructed and reminded to watch JITTs to ensure fidelity. Understanding mentors' autonomous, “real‐world” use of JITTs will be helpful to maximizing the potential value of MI JITTs in the future. Another methodological limitation of the current study relates to the pre‐ and post‐measures used in assessing the efficacy of JITTs, as all measures were adapted or customized to some degree.
Participant attrition resulting in a small posttest sample size was also a notable limitation of the current study, as the data set was underpowered to detect the marginal benefits of receiving JITT videos. Current effect sizes may better capture the promise of MI JITTs than inferential statistics. Nonetheless, the information garnered from effect sizes will be helpful for future, larger trials.
Conducting this study in the wake of the COVID‐19 pandemic likely contributed to this limitation, as school safety protocols arguably compromised aspects of the execution of the broader youth mentoring program. For example, school staff were slow to recruit/refer mentees to the program, which led to substantial delays and hurdles in data collection in the current study. Other school safety protocols, such as requiring all mentee‐mentor meetings to occur outside, further delayed pairings and program execution (e.g., delaying meetings due to weather concerns). These difficulties do not reflect the efficacy and usability of youth mentors using MI JITTs but are important to note as contextual variables in interpreting findings of the current study. Previous studies of AMPED show little to no mentor attrition (e.g., McQuillin & McDaniel, 2021), further suggesting that issues affecting the current study may be attributed to environmental factors more than to issues with programming and MI JITTs. Similarly, the composition of our sample remained relatively constant throughout the present study, suggesting that attrition was not a result of bias. Still, attrition should be noted as a major limitation.
CONCLUSION
Expanding youth psychosocial service delivery to paraprofessionals such as youth mentors may allow more youth to access evidence‐based support, such as programs grounded in MI. The present study shows that training undergraduate youth mentors with MI JITTs may be a promising way to expand the reach of services, as participants showed improvements in MI knowledge and skills, and indicated that the JITT program had high usability. However, the present study also reveals many research questions to explore, including the best way for JITTs to alter, complement, or enhance existing training practices.
ETHICS STATEMENT
Authors have complied with APA ethical principles in our treatment of individuals participating in the research, program, or policy described in the manuscript. The research received approval from our organizational unit responsible for the protection of human subjects, the University of South Carolina's Institutional Review Board.
ACKNOWLEDGMENTS
This project was funded by a grant from Genentech in partnership with MENTOR: The National Mentoring Partnership. The authors also would like to thank Mike Garringer from MENTOR for his guidance and support.
Hart, M. J. , McQuillin, S. D. , Iachini, A. , Cooper, D. K. , & Weist, M. D. (2025). The efficacy and usability of motivational interviewing just‐in‐time trainings for youth mentors. American Journal of Community Psychology, 76, 121–132. 10.1002/ajcp.12804
REFERENCES
- Ayala, G. X., & Elder, J. P. (2011). Qualitative methods to ensure acceptability of behavioral and social interventions to the target population. Journal of Public Health Dentistry, 71(Suppl 1), 69–79. 10.1111/j.1752-7325.2011.00241.x
- Beck, J. S. (2011). Cognitive behavior therapy: Basics and beyond (2nd ed.). Guilford Press.
- Briesch, A. M., Chafouleas, S. M., Neugebauer, S. R., & Riley‐Tillman, T. C. (2013). Assessing influences on intervention implementation: Revision of the Usage Rating Profile–Intervention. Journal of School Psychology, 51, 81–96. 10.1016/j.jsp.2012.08.006
- Bruce, M., & Bridgeland, J. (2014). The mentoring effect: Young people's perspectives on the outcomes and availability of mentoring. Civic Enterprises with Hart Research Associates for MENTOR: The National Mentoring Partnership.
- Cavell, T. A., Gregus, S. J., Craig, J. T., Pastrana, F. A., & Hernandez Rodriguez, J. (2018). Program‐specific practices and outcomes for high school mentors and their mentees. Children and Youth Services Review, 89, 309–318. 10.1016/j.childyouth.2018.04.045
- Cavell, T. A., Spencer, R., & McQuillin, S. D. (2021). Back to the future: Mentoring as means and end in promoting child mental health. Journal of Clinical Child & Adolescent Psychology, 50(2), 281–299. 10.1080/15374416.2021.1875327
- Centers for Disease Control and Prevention. (2021). Data and statistics on children's mental health. Author.
- Chafouleas, S. M., Briesch, A. M., Neugebauer, S. R., & Riley‐Tillman, T. C. (2011). Usage Rating Profile—Intervention (Revised). University of Connecticut.
- Christensen, K. M., Hagler, M. A., Stams, G. J., Raposa, E. B., Burton, S., & Rhodes, J. E. (2020). Non‐specific versus targeted approaches to youth mentoring: A follow‐up meta‐analysis. Journal of Youth and Adolescence, 49, 959–972. 10.1007/s10964-020-01233-x
- DeWit, D. J., DuBois, D., Erdem, G., Larose, S., & Lipman, E. L. (2016). The role of program‐supported mentoring relationships in promoting youth mental health, behavioral and developmental outcomes. Prevention Science, 17(5), 646–657. 10.1007/s11121-016-0663-2
- DuBois, D. L., Holloway, B. E., Valentine, J. C., & Cooper, H. (2002). Effectiveness of mentoring programs for youth: A meta‐analytic review. American Journal of Community Psychology, 30(2), 157–197. 10.1023/A:1014628810714
- Fernandes‐Alcantara, A. (2019). Vulnerable youth: Federal mentoring programs and issues. Congressional Research Service.
- Frey, A. J., Lee, J., Small, J. W., Sibley, M., Owens, J. S., Skidmore, B., Johnson, L., Bradshaw, C. P., & Moyers, T. B. (2021). Mechanisms of motivational interviewing: A conceptual framework to guide practice and research. Prevention Science, 22, 689–700. 10.1007/s11121-020-01139-x
- García, E., & Weiss, E. (2019). The teacher shortage is real, large and growing, and worse than we thought. Economic Policy Institute.
- Garringer, M., McQuillin, S., & McDaniel, H. (2017). Examining youth mentoring services across America: Findings from the 2016 National Mentoring Program Survey. MENTOR: The National Mentoring Partnership.
- Guerra, L. A., Rajan, S., & Roberts, K. J. (2019). The implementation of mental health policies and practices in schools: An examination of school and state factors. Journal of School Health, 89(4), 328–338. 10.1111/josh.12738
- Hart, M. J., Flitner, A. M., Kornbluh, M. E., Thompson, D. C., Davis, A. L., Lanza‐Gregory, J., McQuillin, S. D., Gonzalez, J. E., & Strait, G. G. (2024). Combining MTSS and community‐based mentoring programs. School Psychology Review, 53(2), 185–199. 10.1080/2372966x.2021.1922937
- Hart, M. J., McQuillin, S. D., Iachini, A., Weist, M., Hills, K., & Cooper, D. (2023a). Expanding school‐based motivational interviewing through delivery by paraprofessional providers: A preliminary scoping review. School Mental Health, 25, 1–19. 10.1007/s12310-023-09580-3
- Hart, M. J., Sung, J. Y., McQuillin, S. D., & Schleider, J. L. (2023b). Expanding the reach of psychosocial services for youth: Untapped potential of mentors and single session interventions. Journal of Community Psychology, 51(3), 1255–1272. 10.1002/jcop.22927
- Herrera, C., DuBois, D. L., Heubach, J., & Grossman, J. B. (2023). Effects of the Big Brothers Big Sisters of America community‐based mentoring program on social‐emotional, behavioral, and academic outcomes of participating youth: A randomized controlled trial. Children and Youth Services Review, 144, 106742. 10.1016/j.childyouth.2022.106742
- Herrera, C., Grossman, J. B., Kauh, T. J., & McMaken, J. (2011). Mentoring in schools: An impact study of Big Brothers Big Sisters school‐based mentoring. Child Development, 82(1), 346–361. 10.1111/j.1467-8624.2010.01559.x
- Kent, D. J. (2010). Effects of a just‐in‐time educational intervention placed on wound dressing packages: A multicenter randomized controlled trial. Journal of Wound, Ostomy & Continence Nursing, 37(6), 609–614. 10.1097/WON.0b013e3181f1826b
- Langberg, J. M., Epstein, J. N., Becker, S. P., Girio‐Herrera, E., & Vaughn, A. J. (2012). Evaluation of the Homework, Organization, and Planning Skills (HOPS) intervention for middle school students with attention deficit hyperactivity disorder as implemented by school mental health providers. School Psychology Review, 41(3), 342–364. 10.1080/02796015.2012.12087514
- Leffingwell, T. R. (2006). Motivational interviewing knowledge and attitudes test (MIKAT) for evaluation of training outcomes. MINUET, 13, 10–11.
- Li, J., & Julian, M. M. (2012). Developmental relationships as the active ingredient: A unifying working hypothesis of "what works" across intervention settings. American Journal of Orthopsychiatry, 82(2), 157–166.
- Lyons, M. D., McQuillin, S. D., & Henderson, L. J. (2019). Finding the sweet spot: Investigating the effects of relationship closeness and instrumental activities in school‐based mentoring. American Journal of Community Psychology, 63(1–2), 88–98. 10.1002/ajcp.12283
- Madson, M. B., Loignon, A. C., & Lane, C. (2009). Training in motivational interviewing: A systematic review. Journal of Substance Abuse Treatment, 36(1), 101–109. 10.1016/j.jsat.2008.05.005
- Maggin, D. M., Wehby, J. H., Farmer, T. W., & Brooks, D. S. (2016). Intensive interventions for students with emotional and behavioral disorders: Issues, theory, and future directions. Journal of Emotional and Behavioral Disorders, 24(3), 127–137. 10.1177/1063426616661498
- McQuillin, S., McLelland, B., & Smith, B. (2013). University of Houston Student Mentoring Manual (Version 3). http://faculty.coe.uh.edu/smcquillin/UHSM.pdf
- McQuillin, S., Strait, G., Smith, B., & Ingram, A. (2015). Brief instrumental school‐based mentoring for first and second year middle school students: A randomized evaluation. Journal of Community Psychology, 43(7), 885–899. 10.1002/jcop.21719
- McQuillin, S. D., Lyons, M. D., Becker, K. D., Hart, M. J., & Cohen, K. (2019). Strengthening and expanding child services in low resource communities: The role of task‐shifting and just‐in‐time training. American Journal of Community Psychology, 63(3–4), 355–365. 10.1002/ajcp.12314
- McQuillin, S. D., & McDaniel, H. L. (2021). Pilot randomized trial of brief school‐based mentoring for middle school students with elevated disruptive behavior. Annals of the New York Academy of Sciences, 1483(1), 127–141. 10.1111/nyas.14334
- Miller, W. R., C'de Baca, J., Matthews, D. B., & Wilbourne, P. L. (2001). Personal values card sort. University of New Mexico.
- Miller, W. R., Hedrick, K. E., & Orlofsky, D. R. (1991). The helpful responses questionnaire: A procedure for measuring therapeutic empathy. Journal of Clinical Psychology, 47, 444–448.
- Miller, W. R., & Rollnick, S. (2002). Motivational interviewing: Preparing people for change (2nd ed.). The Guilford Press.
- Murre, J. M. J., & Dros, J. (2015). Replication and analysis of Ebbinghaus' forgetting curve. PLoS One, 10(7), e0120644. 10.1371/journal.pone.0120644
- Nabors, L. A., Stanton‐Chapman, T. L., & Toledano‐Toledano, F. (2022). A university and community‐based partnership: After‐school mentoring activities to support positive mental health for children who are refugees. International Journal of Environmental Research and Public Health, 19(10), 6328. 10.3390/ijerph19106328
- Newman‐Casey, P. A., Killeen, O., Miller, S., MacKenzie, C., Niziol, L. M., Resnicow, K., Creswell, J. W., Cook, P., & Heisler, M. (2018). A glaucoma‐specific brief motivational interviewing training program for ophthalmology para‐professionals: Assessment of feasibility and initial patient impact. Health Communication, 35(2), 233–241. 10.1080/10410236.2018.1557357
- Raudenbush, S. W., & Liu, X. (2000). Statistical power and optimal design for multisite randomized trials. Psychological Methods, 5(2), 199–213. 10.1037/1082-989X.5.2.199
- Rhodes, J. E. (2005). A theoretical model of youth mentoring. In DuBois, D. L., & Karcher, M. J. (Eds.), Handbook of youth mentoring (pp. 30–43). Sage Press.
- Riemer, M., Reich, S. M., Evans, S. D., Nelson, G., & Prilleltensky, I. (Eds.). (2020). Community psychology: In pursuit of liberation and wellbeing. Bloomsbury Publishing.
- Rollnick, S., Kaplan, S. G., & Rutschman, R. (2016). Motivational interviewing in schools: Conversations to improve behavior and learning. The Guilford Press.
- Schenk, L., Sentse, M., Lenkens, M., Nagelhout, G. E., Engbersen, G., & Severiens, S. (2021). Instrumental mentoring for young adults: A multi‐method study. Journal of Adolescent Research, 36(4), 398–424. 10.1177/0743558420979123
- Schneeberger, C., & Mathai, M. (2015). Emergency obstetric care: Making the impossible possible through task shifting. International Journal of Gynaecology and Obstetrics, 131(Suppl 1), 6–9. 10.1016/j.ijgo.2015.02.004
- Schwalbe, C. S., Oh, H. Y., & Zweben, A. (2014). Sustaining motivational interviewing: A meta‐analysis of training studies. Addiction, 109(8), 1287–1294. 10.1111/add.12558
- Simon, P., & Ward, N. L. (2014). An evaluation of training for lay providers in the use of motivational interviewing to promote academic achievement among urban youth. Advances in School Mental Health Promotion, 7(4), 255–270. 10.1080/1754730X.2014.949062
- Snape, L., & Atkinson, C. (2015). Exploring and challenging pupil disaffection: An evaluation of a motivational interviewing‐based intervention delivered by paraprofessionals. Pastoral Care in Education, 33(2), 69–82. 10.1080/02643944.2015.1022207
- Strait, G. (2018). The Student Checkup: A school‐based motivational interview for students (Manual Version 2.0).
- Strait, G. G., Lee, E. R., McQuillin, S., Terry, J., Cebada, M., & Strait, J. E. (2017). The student check‐up: Effects of paraprofessional‐delivered motivational interviewing on academic outcomes. Advances in School Mental Health Promotion, 10(4), 250–264. 10.1080/1754730X.2017.1333915
- Strait, G. G., Turner, J., Stinson, D., Harrison, S., Bagheri, R., Perez, T., Smith, B. H., Gonzalez, J., Anderson, J. R., Simpson, J., & McQuillin, S. D. (2020). Paraprofessionals' use of group school‐based instrumental mentoring: Examining process and preliminary outcomes. Psychology in the Schools, 57(9), 1492–1505. 10.1002/pits.22417
- Webster, R., & De Boer, A. (2019). Teaching assistants: Their role in the inclusion, education and achievement of pupils with special educational needs. European Journal of Special Needs Education, 34(3), 404–407. 10.1080/08856257.2019.1615746
- Wheeler, M. E., Keller, T. E., & DuBois, D. L. (2010). Review of three recent randomized trials of school‐based mentoring: Making sense of mixed findings. Social Policy Report, 24(3). Society for Research in Child Development.
- World Health Organization. (2008). Task‐shifting: Global recommendations and guidelines. Author.
- Yahne, C., Jackson, S., & Tollestrup, K. (2014). Training teen mothers as motivational interviewers: A feasibility study. Motivational Interviewing: Training, Research, Implementation, Practice, 1(3), 25–30. 10.5195/mitrip.2014.40
- Zachariah, R., Ford, N., Philips, M., Lynch, S., Massaquoi, M., Janssens, V., & Harries, A. D. (2009). Task shifting in HIV/AIDS: Opportunities, challenges and proposed actions for sub‐Saharan Africa. Transactions of the Royal Society of Tropical Medicine and Hygiene, 103(6), 549–558. 10.1016/j.trstmh.2008.09.019
