Published in final edited form as: J Subst Abuse Treat. 2020 Jan 20;111:67–72. doi: 10.1016/j.jsat.2020.01.007

Adolescent SBIRT Implementation: Generalist vs. Specialist Models of Service Delivery in Primary Care

Shannon Gwin Mitchell a, Jan Gryczynski a, Robert P Schwartz a, Arethusa S Kirk b, Kristi Dusek a, Marla Oros c, Colleen Hosler d, Kevin E O’Grady e, Barry S Brown f

Abstract

Background:

Drug, alcohol, and tobacco use among adolescents poses significant short- and long-term health consequences and is associated with more severe use in adulthood. Screening, brief intervention, and referral to treatment (SBIRT) in primary care settings has the potential to deliver preventive interventions to a diverse range of adolescents, but the optimal approach to implementing these services has yet to be determined. The purpose of this study was to compare the implementation of two different SBIRT service delivery models in primary care settings.

Methods:

This cluster-randomized trial assigned 7 primary care clinics of a federally qualified health center to implement brief interventions (BIs) for adolescent patients ages 12–17 years using either a Generalist model (4 sites), in which BIs were delivered by the primary care provider (PCP), or a Specialist model (3 sites), in which BIs were delivered by a behavioral health counselor (BHC). Implementation was tracked through the clinic’s electronic health record, spanning 9,639 clinic visits over 20 months. Multilevel logistic regression modeling was used to compare the Generalist and Specialist strategies on penetration of BIs for patients scoring ≥2 on the CRAFFT substance use screen, with BIs delivered by the PCP at Generalist sites and via warm hand-off to a BHC at Specialist sites.

Results:

Approximately 62% of adolescent patient visits were screened with the CRAFFT (with <4% screening positive with a CRAFFT score ≥2). The Generalist Condition had significantly higher provider-documented penetration of BI delivery than the Specialist Condition (38% vs. 8%; Adjusted Odds Ratio = 6.53; p = .005).

Discussion:

Despite the presence of co-located behavioral health services at all sites, the Specialist approach to providing BIs was implemented less effectively than the Generalist approach in this FQHC. BIs delivered by PCPs rather than by hand-off to a BHC may ensure greater penetration of these services in primary care settings. Both implementation models provided a framework for identifying and intervening with adolescent primary care patients whose substance use might have otherwise gone undetected.

Keywords: SBIRT, adolescent, implementation, primary care

1. Introduction

In recognition of the unique role pediatricians can play in meeting the challenge of adolescent substance use, the American Academy of Pediatrics has recommended its members become familiar with the screening, brief intervention and referral to treatment (SBIRT) model and its potential to be integrated into medical practice (Levy & Williams, 2016). Because most adolescents see their primary care provider at least annually, incorporating SBIRT into practice represents a logical and potentially powerful strategy for delivering prevention messages and early interventions to a broad range of adolescents in a developmentally appropriate manner (Newacheck, Brindis, Cart, Marchi, & Irwin, 1999; Ozechowski, Becker, & Hogue, 2016; Windle et al., 2008).

The science of implementation examines factors influencing the adoption of evidence-based health care into routine practice, focusing on measures such as penetration of services rather than patient-level outcomes. There is a need to develop and test models to successfully implement SBIRT for adolescents in primary care settings (Sterling et al., 2015). Within extant models, the specific roles and responsibilities of staff members must be delineated when SBIRT is being introduced into the workflow. It has been asserted that having brief interventions (BIs) delivered by primary care providers builds upon the trusting, established relationship between youth and their primary care providers (Werner, Joffe, & Graham, 1999). However, given the busy nature of practice and the push towards integrated models of health care delivery, critical questions remain regarding how best to integrate adolescent SBIRT into standard practice (Ader et al., 2015; Mitchell et al., 2013).

Sterling and colleagues (Sterling et al., 2015) conducted the first study to compare different models of adolescent SBIRT implementation. In their cluster randomized trial, conducted in a large pediatric practice at Kaiser Permanente Northern California (KPNC), providers were randomly assigned to one of three Conditions: (1) pediatrician-delivered SBIRT for substance use and/or mental health; (2) SBIRT delivered by behavioral health staff integrated into clinic practice; or, (3) usual care. Both intervention Conditions were found to deliver more BIs than usual care and, overall, the behavioral health staff demonstrated better penetration, delivering more interventions than in the pediatrician-delivered model. However, among the BIs delivered, patients in the pediatrician-delivered Condition were more likely to receive a BI containing substance use content and less likely to receive mental health content compared to the behavioral health staff-delivered Condition. These findings may not generalize to pediatric practices in which behavioral health staff are not already integrated within the medical team and general clinical practice operations.

The present study is a cluster randomized implementation trial that sought to build upon existing evidence by comparing the Generalist SBIRT model (BIs delivered by adolescents’ primary care providers; PCPs) with the Specialist approach (BIs delivered by onsite behavioral health counselors; BHCs) (Bower & Gilbody, 2005; Pincus, 1980, 1987). It was conducted within 7 clinic sites of an urban Federally Qualified Health Center (FQHC) in Baltimore, MD that offered co-located behavioral health services at all clinics (i.e., delivering medical and behavioral health services at the same facility with opportunities for consultation between medical and behavioral health providers; Collins, Hewson, Munger, & Wade, 2010) rather than integrated care (i.e., medical and behavioral health providers working as a team and using a single treatment plan) (Mitchell et al., 2016). Intervention data were collected continuously from May 1, 2013 through December 31, 2014. We hypothesized that the Generalist Condition, compared to the Specialist Condition, would have higher rates of delivering BIs and referral to treatment for patients with an indicated need for these services. The hypothesis was premised on the belief that it would be logistically simpler for PCPs to discuss substance use with their patients than to effect a referral to an onsite BHC. As a secondary focus, we examined implementation of SBIRT services that followed a uniform protocol across all sites, including the administration of screening for all adolescent patients and delivery of brief advice for lower-risk patients. Finally, we also consider challenges and opportunities inherent in different models of adolescent SBIRT delivery and how they may impact awareness of and interventions for behavioral health issues in primary care settings, as well as how our findings contribute to the emerging evidence for providing adolescent SBIRT services.

2. Methods

2.1. Study Sites

The participating organization was a large, urban FQHC, which provided adolescent medicine to approximately 3,600 patients at its 7 sites throughout Baltimore City. The FQHC served a predominantly African American patient population. All sites had pre-existing co-located behavioral health services with dedicated primary care and behavioral health patient caseloads. Separate medical and behavioral health records and separate appointment systems were used.

2.2. Randomization

The 7 sites were randomly assigned to implement either the Generalist or the Specialist SBIRT model with their 12- to 17-year-old patients. A stratified randomization approach was used at the clinic level, with clinics grouped into pairs by the size of their adolescent patient populations. The single largest clinic was grouped with two smaller sites. Each matched grouping was then randomized, with one clinic in each grouping assigned to the Specialist Condition and the other(s) assigned to the Generalist Condition, resulting in 3 Specialist sites (including the largest clinic) and 4 Generalist sites (Mitchell et al., 2016). Patients were not considered human subjects, as each site implemented a wholesale change to its standard clinic practice. Data collection for this aspect of the study was limited to clinical service delivery as documented in the electronic health record. The electronic health records were reviewed by the research team, de-identified, and aggregated for each site by month. The study was approved by the Friends Research Institute’s Institutional Review Board.
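As an illustration, the minimal sketch below shows one way such a stratified, size-matched clinic allocation could be carried out. The clinic labels, caseload figures, and the specific rule of grouping the largest clinic with the two smallest sites are all hypothetical assumptions for the sketch and are not drawn from the study.

```python
import random

# Hypothetical clinic labels and adolescent caseload sizes (illustrative only,
# not the study's actual figures).
clinics = {"A": 900, "B": 650, "C": 600, "D": 450, "E": 400, "F": 350, "G": 250}

# Order clinics by caseload. Per the text, clinics were grouped by size, with the
# single largest clinic grouped with two smaller sites; here we assume the largest
# is matched with the two smallest so that group totals roughly balance.
ordered = sorted(clinics, key=clinics.get, reverse=True)
strata = [
    [ordered[0], ordered[-2], ordered[-1]],  # largest clinic + two smaller sites
    [ordered[1], ordered[2]],
    [ordered[3], ordered[4]],
]

random.seed(2013)  # arbitrary seed so the sketch is reproducible
assignment = {}
for stratum in strata:
    # One clinic per stratum is randomly assigned to the Specialist Condition;
    # the remaining clinic(s) in the stratum receive the Generalist Condition.
    specialist = random.choice(stratum)
    for clinic in stratum:
        assignment[clinic] = "Specialist" if clinic == specialist else "Generalist"

print(assignment)  # yields 3 Specialist and 4 Generalist sites
```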

2.3. Intervention Implementation

The protocol for adolescent SBIRT delivery in the Generalist and Specialist Conditions was implemented using The Mosaic Group’s implementation approach, which combines strategies targeting multiple system levels within each clinic, including integrated team development of service delivery protocols, electronic health record (EHR) modifications, regular performance feedback and supervision, and manual development and training modifications (Mitchell et al., 2016). The specific intervention implementation for this study was developed collaboratively by key primary care and behavioral health personnel working in conjunction with study leads and implementation experts to tailor the intervention (Generalist or Specialist) to each clinic’s flow and layout. The EHR was modified to include screening results as well as a provider checklist indicating what services were provided in response to the screening results (e.g., counseled to continue abstinence, delivered brief advice, referred to BHC). All primary care staff at each clinic received a one-hour training orienting them to the project, the screening process, the appropriate responses to screening results (depending on Condition), and the proper documentation of activities in the EHR. PCPs and BHCs received an additional one-hour training on delivering brief interventions using principles of motivational interviewing. Data regarding adolescent SBIRT services were extracted from the EHR on a bimonthly basis, and written feedback was given to all participating PCPs, focusing specifically on their adherence to the implementation model over the past 60 days. The one-page feedback summary letters were signed by the Medical Director, who served as the project’s Organizational Champion. In addition, EHR data were analyzed at the clinic level and used to provide targeted feedback at quarterly booster trainings, where technical assistance was delivered by the implementation specialists for staff at each clinic. Quarterly boosters involved all pediatric staff and, at Specialist sites, BHCs.

2.3.1. Screening

Personnel at all clinic sites, irrespective of their assigned Condition, were trained to screen for drug and alcohol use in a uniform manner. Medical Assistants (MAs) administered the CRAFFT (Knight, Sherritt, Harris, Gates, & Chang, 2003; Knight, Sherritt, Shrier, Harris, & Chang, 2002) as part of the patient triage/intake procedures at every visit attended by a 12–17-year-old patient. The MA then entered the patient’s responses into the EHR, scored the CRAFFT, indicated whether or not the patient’s parent was present during the screening, and opened the Provider Intervention tab in the EHR for the PCP to complete the remainder of the documentation. If the parent was present during the initial screening, the PCP was trained to re-screen the patient with the parent out of the exam room and update documentation in the EHR.
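For illustration, the sketch below shows how CRAFFT responses recorded at triage could be scored. The ScreeningRecord structure and its field names are hypothetical and do not reflect the clinic’s actual EHR schema; the scoring rule simply follows the standard CRAFFT convention of one point per endorsed Part B item.

```python
from dataclasses import dataclass

# The six CRAFFT Part B items (Car, Relax, Alone, Forget, Friends, Trouble);
# the total score is the count of "yes" responses.
CRAFFT_ITEMS = ["car", "relax", "alone", "forget", "friends", "trouble"]


@dataclass
class ScreeningRecord:
    """Hypothetical shape of what the MA documents in the EHR at triage."""
    responses: dict        # item name -> bool (True = "yes")
    parent_present: bool   # whether a parent was in the room during screening

    @property
    def crafft_score(self) -> int:
        return sum(1 for item in CRAFFT_ITEMS if self.responses.get(item, False))


# Example: a patient endorsing "car" and "friends" scores 2, i.e., a positive screen.
record = ScreeningRecord(
    responses={"car": True, "relax": False, "alone": False,
               "forget": False, "friends": True, "trouble": False},
    parent_present=True,
)
print(record.crafft_score)  # -> 2
```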

2.3.2. Anticipatory Guidance and Brief Advice

PCPs at all sites were trained to review the CRAFFT results with patients during their regular patient encounter after asking the parent, if present, to leave the exam room. For patients who denied past year drug or alcohol use (CRAFFT score of 0), the PCP offered positive feedback and anticipatory guidance. For patients scoring just below the positive threshold on the CRAFFT (score of 1), PCPs were instructed to deliver 1–2 minutes of brief advice. Brief advice included encouragement to make specific behavior changes (e.g., reduce use or reduce risk behaviors), but did not involve a change plan or other elements present in BIs.

2.3.3. Brief Intervention and Referral to Treatment

For patients who scored ≥2 on the CRAFFT, the PCPs were trained to respond based upon their assigned Condition.

2.4. Generalist Condition.

At the four sites randomly assigned to deliver the Generalist approach, the PCP discussed the risky behaviors endorsed by the adolescent on the CRAFFT. The PCPs were trained to conduct BIs of approximately five to ten minutes’ duration, using motivational interviewing techniques focused on reducing or discontinuing the patient’s substance use, and to develop a plan with the patient to reach the specified goal. Patients who scored a 2 or more on the CRAFFT were also asked about risky sexual behaviors by the PCP, and such issues were included in the BI discussion when appropriate.

Upon concluding the session, the PCPs, at their discretion based upon the severity of the patient’s substance use and the patient’s willingness to engage in further intervention, either made a follow-up appointment for further discussion with the patient, planned to continue the conversation at the patient’s next regularly scheduled visit, or made a specialty substance use service referral.

2.5. Specialist Condition.

At the 3 sites randomly assigned to deliver the Specialist approach, patients who scored 2 or more on the CRAFFT received brief advice from the PCP and were encouraged to accept a “warm hand-off” (i.e., an immediate referral) to the BHC. The provider alerted the MA or nurse to notify the BHC, who, if available, either saw the patient in the exam room or took the patient to their office (located in the same building) to conduct the BI. BIs conducted by BHCs contained the same content as those provided by PCPs but, because additional time was sometimes required to establish rapport and gather background information, lasted approximately 15 minutes. The BHC then developed a plan with the patient for reduction of use, as appropriate, encouraged the patient to accept a referral for a substance use treatment evaluation, and/or scheduled a follow-up visit to assess the patient’s progress. If the BHC was not available to conduct the BI, the nurse or MA notified the PCP and scheduled an appointment for the patient to meet with the BHC within one week.
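The protocol logic described in Sections 2.3.2 through 2.5 can be summarized in a short sketch. The function and its return strings are purely illustrative assumptions and do not correspond to anything implemented in the study’s EHR.

```python
def protocol_response(crafft_score: int, condition: str, bhc_available: bool = True) -> str:
    """Return the protocol-indicated response to a CRAFFT result.

    `condition` is "Generalist" or "Specialist"; `bhc_available` matters only at
    Specialist sites, where an unavailable BHC triggers a scheduled appointment
    within one week.
    """
    if crafft_score == 0:
        return "PCP: positive feedback and anticipatory guidance"
    if crafft_score == 1:
        return "PCP: 1-2 minutes of brief advice"
    # Score >= 2: positive screen, brief intervention indicated.
    if condition == "Generalist":
        return "PCP: 5-10 minute brief intervention using motivational interviewing"
    if bhc_available:
        return "PCP: brief advice, then warm hand-off to BHC for ~15 minute BI"
    return "PCP notified; BHC appointment scheduled within one week"


print(protocol_response(2, "Specialist", bhc_available=False))
```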

2.6. De-identified Electronic Health Record (EHR) Data

De-identified patient-level data drawn from the EHR included: screening results, the provider’s response to the screening, whether a BHC referral was made, whether the BHC conducted a BI, and whether a referral to treatment was made.

2.7. Outcomes Assessed

Consistent with the Proctor implementation model (Proctor et al., 2009; Proctor et al., 2011), study outcomes were determined by the penetration of SBIRT services, that is, the extent of integration of SBIRT into the service setting. We examined penetration of screening, brief advice, brief intervention, and referral to treatment, defined as the proportion of patient visits at which each of these four services was received among those visits at which patients should have received such service(s) according to the implementation protocol. The definitions for the components of SBIRT documented in the EHR were as follows: (1) screening (CRAFFT administration among all adolescent patient visits); (2) brief advice (delivery of brief advice by the PCP among all visits in which patients reported past-year alcohol and/or drug use but had a CRAFFT score of 1); (3) BI (delivery of a BI by the PCP for Generalist sites or by the BHC for Specialist sites among all patient visits with a CRAFFT score ≥2); and (4) referral to treatment by the PCP for Generalist sites or by the BHC for Specialist sites, when determined clinically appropriate.
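A minimal sketch of how these penetration definitions translate into a calculation over a visit-level EHR extract appears below; the column names and toy values are assumptions, not the study’s actual data.

```python
import pandas as pd

# Toy visit-level extract mirroring the fields described in Section 2.6;
# column names and values are assumptions, not the study's actual data.
visits = pd.DataFrame({
    "condition":    ["Generalist", "Generalist", "Specialist", "Specialist", "Specialist"],
    "crafft_score": [0, 2, 1, 3, None],        # None = visit not screened
    "bi_delivered": [False, True, False, False, False],
})

# Penetration of screening: screened visits / all adolescent visits.
screening_penetration = visits["crafft_score"].notna().mean()

# Penetration of BI: among visits where a BI was indicated (CRAFFT >= 2),
# the proportion with a documented BI, overall and by Condition.
bi_indicated = visits[visits["crafft_score"] >= 2]
bi_overall = bi_indicated["bi_delivered"].mean()
bi_by_condition = bi_indicated.groupby("condition")["bi_delivered"].mean()

print(f"screening penetration: {screening_penetration:.0%}")
print(f"BI penetration overall: {bi_overall:.0%}")
print(bi_by_condition)
```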

We expected increased penetration of screening and brief advice over time, but no differences by Condition, because all sites used the same protocol for these services. We hypothesized that Generalist sites would have higher penetration of BI and Referral to Treatment compared to Specialist sites, due to their ability to more seamlessly integrate these interventions into the medical visit without additional steps or personnel.

2.8. Statistical Analyses

Inferential analyses were conducted using mixed effects logistic regression, distilling information abstracted from the clinical EHR on services needed and services delivered in each month. The implementation period was 20 months; however, in the sixth month the clinics transitioned to a new EHR system that allowed for better clinical documentation of BIs by BHCs. Thus, data for the primary outcome of interest (appropriate receipt of a BI) were restricted to the 14-month period after adoption of the new EHR system. Data were characterized as binomial successes-per-trial for receipt of a BI when indicated by the protocol. At Generalist sites, cases were classified as successes if the PCP documented delivery of a BI or referral to treatment. At Specialist sites, cases were classified as successes if the BHC documented delivering a BI or referral to treatment after a PCP hand-off. We also used this approach to model (for the full 20-month implementation period) receipt of (a) screening and (b) brief advice. The mixed effects logistic regression models included independent variables for Condition (Generalist vs. Specialist; the effect of interest) and a categorical variable for Implementation Period, grouped into 6-month increments except for the final 2-month period, to gauge the possible impact of coarse temporal trends. The shorter length of the final period category is not problematic because Implementation Period is interpreted as a categorical variable and penetration variables were defined as binomial successes-per-trial. Each model also included a random effect for clinic to account for clustering of service delivery outcomes within site.
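Because the paper does not specify the statistical software used, the following is only an illustrative sketch: it simulates visit-level data and fits a mixed-effects logistic model with fixed effects for Condition and Implementation Period and a random intercept for clinic, using statsmodels’ Bayesian mixed GLM as a stand-in for the frequentist model the authors describe. All variable names and values are hypothetical.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)

# Simulated visit-level data standing in for the EHR extract: one row per visit
# in which a BI was indicated (CRAFFT >= 2). Variable names are assumptions.
n = 165
df = pd.DataFrame({
    "bi_delivered": rng.integers(0, 2, n),              # 1 = BI documented as delivered
    "generalist":   rng.integers(0, 2, n),              # 1 = Generalist site
    "period":       rng.choice(["p2", "p3", "p4"], n),  # categorical implementation periods
    "clinic":       rng.choice([f"c{i}" for i in range(7)], n),
})

# Fixed effects for Condition and Implementation Period; random intercept for clinic.
model = BinomialBayesMixedGLM.from_formula(
    "bi_delivered ~ generalist + C(period)",
    {"clinic": "0 + C(clinic)"},
    df,
)
result = model.fit_vb()
print(result.summary())
```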

3. Results

3.1. Patient and Provider Characteristics

Adolescent patient characteristics were similar across the Generalist and Specialist Conditions, with a mean (SD) age of 14.4 (1.7) and 14.2 (1.7) years at Generalist and Specialist sites, respectively. The sex distribution was likewise similar across Generalist and Specialist sites (56.5% and 54.6% female, respectively).

Provider characteristics were also highly similar across Generalist and Specialist sites. Of the 27 providers participating in the study (12 at Generalist sites and 15 at Specialist sites), 70.4% were female, with a mean (SD) age of 41.8 (7.8) years; 58.3% were African American, 76% were MDs, 36% held pediatric certifications, and 64% were family medicine physicians.

3.2. Penetration of SBIRT Services

Table 1 shows the cumulative number of adolescent patient visits for which key SBIRT services were needed and the number at which such services were delivered. As expected, Generalist and Specialist sites had similar rates of screening and brief advice, because the implementation protocol was the same for these aspects of the model in both study Conditions. Overall, screening occurred at 62% of adolescent patient visits. Of the visits at which screening took place, 5.9% had a report of substance use but a CRAFFT score of 1, indicating need for brief advice only, while 3.9% (165/4,279, post-EHR transition) had a CRAFFT score ≥2, indicating need for brief intervention. Overall, brief advice was delivered at 29% of adolescent patient visits when it was indicated, while brief intervention was delivered at 22% of adolescent patient visits when it was indicated (38% at Generalist sites, 8% at Specialist sites). Among adolescents for whom a BI was indicated, approximately one quarter of those seen at Specialist sites declined to see the BHC (24.4%), while only 3.8% of those seen at Generalist sites declined a BI from the PCP. Referral to treatment at an outside agency was rare, with such referrals provided in only 4 instances (1 at a Generalist site and 3 at Specialist sites).

Table 1.

Screening, brief advice, and brief intervention penetration over the implementation period.

                                                      Generalist   Specialist   Combined
                                                           Sites        Sites
Screening [delivered by Medical Assistants at each visit]
  Total adolescent patient visits, ages 12–17               4233         5406       9639
  Screening delivered                                       2504         3467       5971
  % of adolescent patient visits that received screening     59%          64%        62%
Brief Advice (BA) [always PCP-delivered]
  BA indicated (substance use with CRAFFT = 1)               191          161        352
  BA delivered                                                54           49        103
  Patient declined BA                                          3            9         12
  % of visits that received BA (if needed)                   28%          30%        29%
Brief Intervention (BI) [by PCP at Generalist sites or BHC at Specialist sites]
  BI indicated (CRAFFT ≥ 2)                                   79           86        165
  BI delivered                                                30            7         37
  Patient declined BI                                          3           21         24
  % of visits that received BI (if needed)                   38%           8%        22%
Referral to Treatment
  Referral to substance use treatment provided                 1            3          4

Note: Data were available for a 20-month implementation period for screening and brief advice, and a 14-month period for brief intervention (data on BHC-delivered BIs were not available prior to the transition to a new EHR in Month 6). All numbers and percentages refer to visits, not unique patients.

Table 2 shows key contrasts from the mixed effects logistic regression models examining penetration of screening, brief advice, and brief intervention. Significant period effects were found only for penetration of screening, which increased from the first 6-month period to subsequent periods. As expected, there were no significant differences by Condition in penetration of screening (p=.52) or brief advice (p=.77). Generalist sites were more likely than Specialist sites to deliver brief intervention per the implementation protocol (Adjusted Odds Ratio= 6.53; 95% Confidence Interval= 1.79, 23.90; p=.005).

Table 2.

Results for Mixed-effects Logistic Regression Analysis of penetration of screening, brief advice, and brief intervention.

                                        Test Statistic χ2(df)   p        Parameter Estimate
Screening (N = 9,639 visits)
  Condition
    Generalist (vs. Specialist)         χ2(1) = 0.41            .52      AOR = 0.77 (95% CI 0.35–1.69)
  Implementation Period                 χ2(3) = 80.19           < .001
    Period 1 (months 1–6)                                                ref
    Period 2 (months 7–12)                                      < .001   AOR = 1.36 (95% CI 1.22–1.52)
    Period 3 (months 13–18)                                     < .001   AOR = 1.60 (95% CI 1.44–1.78)
    Period 4 (months 19–20)                                     < .001   AOR = 1.40 (95% CI 1.20–1.64)
  Site intraclass correlation                                            0.07 (95% CI 0.03–0.19)
Brief Advice (N = 352 visits)
  Condition
    Generalist (vs. Specialist)         χ2(1) = 0.08            .77      AOR = 1.19 (95% CI 0.37–3.85)
  Implementation Period                 χ2(3) = 1.93            .59
    Period 1 (months 1–6)                                                ref
    Period 2 (months 7–12)                                      .34      AOR = 0.73 (95% CI 0.39–1.39)
    Period 3 (months 13–18)                                     .18      AOR = 0.65 (95% CI 0.35–1.22)
    Period 4 (months 19–20)                                     .42      AOR = 0.72 (95% CI 0.32–1.60)
  Site intraclass correlation                                            0.12 (95% CI 0.03–0.39)
Brief Intervention (N = 165 visits)
  Condition
    Generalist (vs. Specialist)         χ2(1) = 8.05            .005     AOR = 6.53 (95% CI 1.79–23.90)
  Implementation Period                 χ2(2) = 0.09            .96
    Period 1 (months 1–6)                                                (omitted; see Note 1)
    Period 2 (months 7–12)                                               ref
    Period 3 (months 13–18)                                     .80      AOR = 1.13 (95% CI 0.45–2.81)
    Period 4 (months 19–20)                                     1.0      AOR = 1.00 (95% CI 0.30–3.33)
  Site intraclass correlation                                            0.08 (95% CI 0.006–0.55)

Notes: df= degrees of freedom; AOR= Adjusted Odds Ratio; CI= Confidence Interval; ref= Reference category.

1. Data were available for a 20-month implementation period for penetration of screening and brief advice, and a 14-month implementation period for brief intervention, because data on BHC-delivered BIs were not available prior to the transition to a new electronic health record in Month 6.

Secondary analyses revealed that, of the 86 patients scoring a 2 or higher on the CRAFFT screening in the Specialist Condition, PCPs indicated a hand-off to the BHC in only 22 cases, of which 7 were recorded as successfully completed by the BHC. In 21 cases the youth declined a BI, in 8 cases the PCP delivered a BI, and in 35 cases no response was recorded in the EHR.

4. Discussion

The current study examined two approaches to delivering adolescent SBIRT for substance use within an urban FQHC with co-located mental health services. Implementation of brief intervention was less effectively achieved when the usual primary care team’s workflow was modified to coordinate with a BHC to deliver BIs. This finding contrasts with those of Sterling and colleagues (Sterling et al., 2015), who also included behavioral health screening as a trigger for SBIRT delivery and found higher BI penetration rates for BHCs than for PCPs, albeit with greater focus on behavioral health screens than substance use screens. Their secondary findings demonstrated that providers in the pediatrician-only arm were more likely to deliver BIs for substance use (though not mental health) than providers in either the behavioral health or usual care Conditions, similar to the penetration patterns found in our study. An additional point on which our findings agree with Sterling’s involves the relatively low rates of response to positive screens under either adolescent SBIRT service delivery approach in primary care settings, a finding also noted when SBIRT services have been implemented in adult primary care (Mertens et al., 2015). A recent interrupted time-series analysis (O’Donnell et al., 2019) found that financial incentives had a minimal positive impact on alcohol screening and BI delivery rates in primary care in England when they were introduced, but that the withdrawal of the incentives several years later resulted in rapid reductions in service delivery. Clearly, more research is needed regarding mechanisms for further improving service delivery.

Our findings reflected two points at which the hand-off process broke down in the Specialist Condition: first, when PCPs failed to hand off eligible patients, which occurred in the majority of cases, and second, when BHCs were not available to accept the warm hand-off. Some of these difficulties may reflect the functioning of the co-located medical and behavioral health systems at the clinics. It is possible that a Specialist model would function both differently and more effectively in an integrated system.

The low hand-off rate to BHCs for adolescents who screened positive in the Specialist Condition may reflect not only PCPs’ willingness to make the hand-off or BHCs’ availability, but also the perceived willingness of adolescents to continue a conversation about their drug or alcohol use with a new, possibly unfamiliar provider. This “opt-out” aspect of the Specialist Condition may not have been present in the Generalist Condition, where the BI would have flowed naturally from the discussion of screening results between the provider and the patient. It is also possible that adolescents felt the information they exchanged with their PCP was sufficient for addressing their substance use and that no further intervention from the BHC was necessary, or that they were concerned about alerting their parents to a problem if their visit ran uncharacteristically long in order to meet with the BHC. Patient feedback regarding BI content and satisfaction with the intervention was intended to be collected at the end of each visit, but few patients elected to provide this voluntary feedback. Clinics with better-integrated medical and behavioral health services may be able to eliminate some of these barriers, particularly if the BHC is also the staff member reviewing the screening results with adolescents in the course of the normal clinic visit.

More broadly, our findings highlight both the promise and the challenges inherent in efforts to implement SBIRT widely across multiple clinics in an FQHC. The implementation effort was substantial, and included the use of external experts, a committed Medical Director who championed the project, buy-in from organization- and site-level leadership, dissemination of project-branded materials, regular trainings with clinic staff at all levels, written performance feedback for providers, and access to free technical assistance. While these efforts resulted in significant adoption of evidence-based screening practices overall, the positive screening rates were surprisingly low when compared with similar populations in other studies (Knight et al., 2003; Levy & Williams, 2016; Windle et al., 2008) and the rates of service delivery in response to positive screens were disappointing, from an organizational perspective, in both Conditions.

Our screening protocol emphasized screening at every visit, in recognition that substance use behaviors often change rapidly during adolescence (Monico et al., 2019). Although uptake of substance use screening increased markedly during the first 6 months of the project and stabilized thereafter, screenings were not conducted, or were left undocumented, in a sizable minority of adolescent patient visits. Reasons for not screening may include foregoing re-screening if the patient had been seen recently, or de-prioritization of screening based on the day’s patient flow or the medical urgency of the visit (e.g., well check-ups vs. acute complaints). The rate of positive screens (<4% with a CRAFFT score ≥2) was also relatively low compared to a substance use screening study conducted in the same organization under “research confidential” conditions (Kelly et al., 2014; Gryczynski et al., 2019). Likewise, penetration of brief advice and BI was low; only a minority of patients who qualified for these services received them in either Condition.

Ultimately, however, one must consider that hundreds of adolescent patients who received even an abbreviated brief intervention in response to their CRAFFT screening would not have received any tailored response in prior years. Our study, along with the growing evidence from Sterling (Sterling et al., 2015, 2018, 2019), Harris and Knight (D’Souza-Li & Harris, 2016; Harris, Louis-Jacques, & Knight, 2014; Knight et al., 2018), D’Amico (D’Amico et al., 2018), and others (Mello et al., 2018; Ozechowski et al., 2016; Stanhope, Manuel, Jessell, & Halliday, 2018), points to the potential of utilizing a range of adolescent SBIRT service delivery approaches (e.g., primary care providers, behavioral health providers, computer interventions) and of delivering SBIRT in diverse settings serving youth (e.g., pediatric practices, school-based health centers, pediatric trauma centers, community mental health organizations) to positively impact associated health outcomes (e.g., depression and anxiety, risk behaviors, substance use, and psychiatric service utilization). The long-term benefits of successfully implementing these services for adolescents have not yet been realized.

4.1. Limitations

A limitation of this study is the relatively small number of clinic sites. Randomization occurred at the clinic level, as we determined that implementing both the Generalist and Specialist models at the same site would have been impractical and would likely have led to contamination. We assumed limited variability between clinics, as all sites operated under the same parent FQHC and served a similar population. However, there was more variability than anticipated, thereby reducing power to detect differences between the implementation models. Nevertheless, even with this unforeseen challenge, this study was able to document a significant advantage for the Generalist model in ensuring delivery of brief intervention. While receipt of a BI was documented when it occurred, the EHR data did not include details regarding the nature of the intervention, so we cannot speak to whether a BI in response to a CRAFFT screen elicited a discussion of substance use only or delved into the broader behavioral health issues that may have been underlying and driving that use, nor whether the content of BIs differed based on the type of provider delivering the intervention. The study also did not collect patient outcome data to determine whether interventions provided by PCPs or BHCs were more likely to produce changes in drug, alcohol, or tobacco consumption patterns. Attempts to collect patient satisfaction information following BI delivery were also unsuccessful. This study took place at a single FQHC, which could limit generalizability to other clinics and systems that may be structured differently with respect to their level of medical and behavioral health integration. Finally, during the course of this study considerable changes occurred throughout the organization, including adoption of a new EHR and the re-structuring of leadership and clinical teams, likely muting the impact of the implementation effort.

4.2. Conclusion

Our findings, taken together with those of other investigators, make clear the significant role to be played by health care providers, and pediatricians particularly, in identifying adolescents at risk for substance use and providing them with brief interventions during the course of primary care visits. Having established that medical practitioners can be trained to deliver SBIRT services to this critical population around substance use issues, the field now faces the essential task of refining and evaluating behavior change strategies that can be integrated into typical clinical practice and of better tailoring the implementation model to the specific characteristics of the practice (e.g., staffing, layout, technology). Given the low intervention rates even when positive screens were identified, more research is needed to improve adolescent SBIRT service delivery in primary care settings, as well as to improve and extend the delivery of SBIRT in other settings serving youth.

Highlights.

  • Implementation of brief intervention was less effectively achieved when the usual primary care team’s workflow was modified to coordinate with a behavioral health counselor to deliver brief interventions.

  • Screening occurred at 62% of adolescent patient visits using a universal screening approach, but rates of positive screens were surprisingly low in this population.

Acknowledgement

We thank the staff of Total Health Care for their collaboration on this implementation. We also thank Drs. Tisha Wiley and Lori Ducharme for their guidance.

Funding Source: All phases of this study were supported by National Institute on Drug Abuse Grant 1R01DA034258 (Mitchell, Gryczynski, O’Grady, & Schwartz). NIDA had no role in the design and conduct of the study; data acquisition, management, analysis, or interpretation; or the preparation, review, or approval of the manuscript.

Footnotes

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

Financial Disclosures: The authors have no financial relationships to disclose relevant to this article.

Conflicts of Interest: The authors have no conflicts of interest relevant to this article to disclose.

Clinical Trials Registration: Clinicaltrials.gov/

References

  1. Ader J, Stille CJ, Keller D, Miller BF, Barr MS, & Perrin JM (2015). The Medical Home and Integrated Behavioral Health: Advancing the Policy Agenda. Pediatrics. doi: 10.1542/peds.2014-3941
  2. Bower P, & Gilbody S (2005). Managing common mental health disorders in primary care: conceptual models and evidence base. BMJ, 330(7495), 839–842. doi: 10.1136/bmj.330.7495.839
  3. Collins C, Hewson DL, Munger R, & Wade T (2010). Evolving models of behavioral health integration in primary care. New York, NY: Milbank Memorial Fund.
  4. D’Souza-Li L, & Harris SK (2016). The future of screening, brief intervention and referral to treatment in adolescent primary care: Research directions and dissemination challenges. Current Opinion in Pediatrics, 28(4), 434–440.
  5. D’Amico EJ, Parast L, Shadel WG, Meredith LS, Seelam R, & Stein BD (2018). Brief motivational interviewing intervention to reduce alcohol and marijuana use for at-risk adolescents in primary care. Journal of Consulting and Clinical Psychology, 86(9), 775–786.
  6. Gryczynski J, Mitchell SG, Schwartz RP, Kelly SM, Dusek K, Monico L, … Hosler C (2019). Disclosure of adolescent substance use in primary care: Comparison of routine clinical screening and anonymous research interviews. Journal of Adolescent Health, 64(4), 541–543.
  7. Harris SK, Louis-Jacques J, & Knight JR (2014). Screening and brief intervention for alcohol and other abuse. Adolescent Medicine: State of the Art Reviews, 25(1), 126–156.
  8. Kelly SM, Gryczynski J, Mitchell SG, Kirk A, O’Grady KE, & Schwartz RP (2014). Validity of brief screening instrument for adolescent tobacco, alcohol, and drug use. Pediatrics, 133(5), 819–826. doi: 10.1542/peds.2013-2346
  9. Knight JR, Csemy L, Sherritt L, Starostova O, Van Hook S, Bacic J, … Harris SK (2018). Screening and brief advice to reduce adolescents’ risk of riding with substance-using drivers. Journal of Studies on Alcohol and Drugs, 79(4), 611–616.
  10. Knight JR, Sherritt L, Harris SK, Gates EC, & Chang G (2003). Validity of brief alcohol screening tests among adolescents: a comparison of the AUDIT, POSIT, CAGE, and CRAFFT. Alcohol Clin Exp Res, 27(1), 67–73. doi: 10.1097/01.ALC.0000046598.59317.3A
  11. Knight JR, Sherritt L, Shrier LA, Harris SK, & Chang G (2002). Validity of the CRAFFT substance abuse screening test among adolescent clinic patients. Arch Pediatr Adolesc Med, 156(6), 607–614. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/12038895
  12. Levy SJL, & Williams JF (2016). Substance Use Screening, Brief Intervention, and Referral to Treatment. Pediatrics, 138(1). doi: 10.1542/peds.2016-1211
  13. Mello MJ, Becker SJ, Bromberg J, Baird J, Zonfrillo MR, & Spirito A (2018). Implementing Alcohol Misuse SBIRT in a National Cohort of Pediatric Trauma Centers—a type III hybrid effectiveness-implementation trial. Implementation Science, 13(35).
  14. Mertens JR, Chi FW, Weisner CM, Satre DD, Ross TB, Allen S, … Sterling SA (2015). Physician versus non-physician delivery of alcohol screening, brief intervention and referral to treatment in adult primary care: the ADVISe cluster randomized controlled implementation trial. Addiction Science & Clinical Practice, 10(1), 26. doi: 10.1186/s13722-015-0047-0
  15. Mitchell SG, Gryczynski J, O’Grady KE, & Schwartz RP (2013). SBIRT for adolescent drug and alcohol use: current status and future directions. J Subst Abuse Treat, 44(5), 463–472. doi: 10.1016/j.jsat.2012.11.005
  16. Mitchell SG, Schwartz RP, Kirk AS, Dusek K, Oros M, Hosler C, … Brown BS (2016). SBIRT Implementation for Adolescents in Urban Federally Qualified Health Centers. J Subst Abuse Treat, 60, 81–90. doi: 10.1016/j.jsat.2015.06.011
  17. Monico LB, Mitchell SG, Dusek K, Gryczynski J, Schwartz RP, Oros M, … Brown BS (2019). A comparison of screening practices for adolescents in primary care after implementation of screening, brief intervention, and referral to treatment. Journal of Adolescent Health, 65(1), 46–50.
  18. Newacheck PW, Brindis CD, Cart CU, Marchi K, & Irwin CE (1999). Adolescent health insurance coverage: recent changes and access to care. Pediatrics, 104(2 Pt 1), 195–202. Retrieved from http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&dopt=Citation&list_uids=10428994
  19. O’Donnell A, Angus C, Hanratty B, Hamilton FL, Petersen I, & Kaner E (2019). Impact of the introduction and withdrawal of financial incentives on the delivery of alcohol screening and brief advice in English primary health care: an interrupted time-series analysis. Addiction.
  20. Ozechowski TJ, Becker SJ, & Hogue A (2016). SBIRT-A: Adapting SBIRT to Maximize Developmental Fit for Adolescents in Primary Care. J Subst Abuse Treat, 62, 28–37. doi: 10.1016/j.jsat.2015.10.006
  21. Pincus HA (1980). Linking general health and mental health systems of care: conceptual models of implementation. Am J Psychiatry, 137(3), 315–320. doi: 10.1176/ajp.137.3.315
  22. Pincus HA (1987). Patient-oriented models for linking primary care and mental health care. Gen Hosp Psychiatry, 9(2), 95–101. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/3569892
  23. Proctor E, Landsverk J, Aarons G, Chambers D, Glisson C, & Mittman B (2009). Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health, 36(1), 24–34. doi: 10.1007/s10488-008-0197-4
  24. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, … Hensley M (2011). Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health, 38(2), 65–76. doi: 10.1007/s10488-010-0319-7
  25. Stanhope V, Manuel JI, Jessell L, & Halliday TM (2018). Implementing SBIRT for adolescents within community mental health organizations: A mixed methods study. Journal of Substance Abuse Treatment, 90, 38–46.
  26. Sterling S, Kline-Simon AH, Jones A, Hartman L, Saba K, Weisner C, & Parthasarathy S (2019). Health care use over 3 years after adolescent SBIRT. Pediatrics, 143(5).
  27. Sterling S, Kline-Simon AH, Satre DD, Jones A, Mertens J, Wong A, & Weisner C (2015). Implementation of Screening, Brief Intervention, and Referral to Treatment for Adolescents in Pediatric Primary Care: A Cluster Randomized Trial. JAMA Pediatr, 169(11), e153145. doi: 10.1001/jamapediatrics.2015.3145
  28. Sterling S, Kline-Simon AH, Weisner C, Jones A, & Satre DD (2018). Pediatrician and behavioral clinician-delivered screening, brief intervention and referral to treatment: Substance use and depression outcomes. Journal of Adolescent Health, 62(4), 390–396.
  29. Werner MJ, Joffe A, & Graham AV (1999). Screening, early identification, and office-based intervention with children and youth living in substance-abusing families. Pediatrics, 103(5 Pt 2), 1099–1112. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/10224197
  30. Windle M, Spear LP, Fuligni AJ, Angold A, Brown JD, Pine D, … Dahl RE (2008). Transitions into underage and problem drinking: developmental processes and mechanisms between 10 and 15 years of age. Pediatrics, 121(Suppl 4), S273–289. doi: 10.1542/peds.2007-2243C
