Author manuscript; available in PMC: 2016 Jan 31.
Published in final edited form as: Cogn Behav Pract. 2014 Apr 24;22(1):74–86. doi: 10.1016/j.cbpra.2014.03.007

Monitoring Client Progress and Feedback in School-Based Mental Health

Cameo Borntrager 1, Aaron R Lyon 2
PMCID: PMC4524776  NIHMSID: NIHMS688017  PMID: 26257508

Abstract

Research in children's mental health has suggested that emotional and behavioral problems are inextricably tied to academic difficulties. However, evidence-based programs implemented in school-based mental health tend to focus primarily on treatment practices, with less explicit emphasis on components of evidence-based assessment (EBA), such as progress monitoring and feedback. The current paper describes two studies that incorporated standardized assessment and progress monitoring/feedback into school-based mental health programs. Barriers to implementation are identified, recommendations for clinicians implementing EBA in the school setting are provided, and examples of mental health and academic indicators are discussed.


Emotional and behavioral problems represent significant barriers to student academic success (Adelman & Taylor, 2000; Shriver & Kramer, 1997). Unfortunately, the majority of youth experiencing mental health problems do not receive indicated interventions (Merikangas et al., 2010). Given that children and adolescents spend more time in school than in any other setting outside of the home (Hofferth & Sandberg, 2001), providing mental health care in the education sector has the potential to enhance the likelihood that students will receive services (Lyon, Ludwig, Vander Stoep, Gudmundsen, & McCauley, 2013). This stands in contrast to other service sectors, such as community mental health settings, where access is largely parent-mediated and a variety of barriers to care have been identified, particularly for youth from historically underserved ethnic and economic minority groups (Cauce et al., 2002; Yeh, McCabe, Hough, et al., 2003). Indeed, of the youth who receive mental health services, 70-80% receive them in the school context (Farmer et al., 2003), and research has documented that youth from ethnic and cultural minority backgrounds are just as likely to access school services as their Caucasian counterparts (Kataoka, Stein, Nadeem, & Wong, 2007; Lyon et al., 2013c). Beyond their accessibility, school-based mental health (SBMH) programs allow for early screening, assessment, and intervention, as well as more opportunities for direct behavioral observation than traditional clinic settings (Owens & Murphy, 2004). It is for many of these reasons that the national emphasis on SBMH has continued to grow (Mental Health in Schools Act, 2013; Protect our Children and our Communities by Reducing Gun Violence, 2013).

Nevertheless, academic goals and mental health services lack a common language or unified system for tracking and communicating meaningful student progress across teachers, administrators, and service providers, resulting in inadequate alignment between the two (Center for Mental Health in Schools, 2011). Indeed, recent research has suggested that mental health-school integration may be enhanced through the implementation of data-driven processes in which outcomes relevant to emotional, behavioral, and academic functioning are routinely monitored (Lyon, Borntrager, Nakamura, & Higa-McMillan, 2013; Prodente, Sander, & Weist, 2002). The recent, growing emphasis on evidence-based assessment (EBA) tools and processes in mental health provides an opportunity to improve mental health/school integration.

EBA can be defined as “assessment methods and processes that are based on empirical evidence in terms of both their reliability and validity as well as their clinical usefulness for prescribed populations and purposes” (Mash & Hunsley, 2005, p. 364). Indeed, there is increasing evidence to suggest that components of EBA – such as monitoring and feedback – may represent stand-alone and worthwhile quality improvement targets for youth mental health services (Bickman et al., 2011). Nevertheless, in schools, EBA-relevant data remain underutilized, in part because the infrastructure for supporting their collection and use is underdeveloped (Lyon et al., 2013a; Weist & Paternite, 2006). In particular, when combined with practice monitoring – recording interventions in tandem with progress indicators – progress monitoring and feedback provide an opportunity for evaluating real-time response to intervention and making as-needed adjustments. Unfortunately, few approaches to accomplishing these goals in SBMH have been articulated.

EBA Principles and Evidence

Notably, the definition of EBA provided above includes both methods and processes for care. When referencing methods, EBA includes (a) standardized assessment tools, which have empirical support for their reliability, validity, and clinical utility (Jensen-Doss & Hawley, 2010), and (b) idiographic assessment approaches, defined as quantitative variables that have been individually selected or tailored to maximize their relevance for a particular individual (Haynes et al., 2009). Idiographic targets may include approaches to goal-based outcome assessment, including Goal Attainment Scaling (Cytrynbaum, Ginath, Birdwell, & Brandt, 1979; Michalak & Holtforth, 2007) and, more recently, “top problems” assessments (Weisz et al., 2011). In contrast, EBA processes may include (a) initial assessment for the purposes of problem identification/diagnosis and treatment planning, (b) progress monitoring (a.k.a. routine outcomes monitoring; Carlier et al., 2010) over the course of intervention, and/or (c) feedback to clinicians or clients about the results of initial or ongoing assessments (e.g., reporting on progress that has been achieved). Feedback to clinicians is a central component of measurement-based care, while client feedback supports alignment and shared decision making with service recipients. Figure 1 provides an overview of and organizing structure for the method and process components of EBA. Although progress monitoring and feedback are included as discrete processes, it should be noted that monitoring without feedback is unlikely to lead to service quality improvements (Lambert et al., 2003). The primary focus of the current paper is on describing school-based EBA processes, particularly approaches to progress monitoring and feedback over the course of an intervention in SBMH, but key EBA methods for use in schools are also addressed in the context of monitoring. Although the constructs discussed have broad applicability across populations, they are specifically relevant to mental health service delivery in the education sector.

Figure 1. Overview of Evidence-Based Assessment Methods and Processes.


Progress monitoring is typically conceptualized as influencing client outcomes through feedback and its impact on clinician behavior. Feedback Intervention Theory (FIT; Kluger & DeNisi, 1996) posits that behavior is regulated by comparisons of feedback to hierarchically-organized goals. Feedback loops to clinicians have the effect of refocusing attention on new or different goals and levels of the goal hierarchy, thereby producing cognitive dissonance and behavior change among professionals (Riemer, Rosof-Williams, & Bickman, 2005). As clinicians receive information about client symptoms or functioning (e.g., high distress) that is inconsistent with their goal states (i.e., recovery from a mental health problem), FIT suggests that their dissonance will motivate them to change their behavior in some way to better facilitate client improvement (e.g., applying a new or different intervention technique or engaging in additional information gathering). Use of repeated standardized assessment tools to track mental health outcomes and provide feedback to providers has been associated with youth and adult client improvements and reductions in premature service discontinuation (e.g., Bickman et al., 2011; Lambert et al., 2003, 2011), and may enhance communication between therapists and clients (Carlier et al., 2012). Nevertheless, despite these benefits, less is known about idiographic progress indicators and their influence on clinician behavior. In addition, research has consistently found that community-based clinicians are relatively unlikely to use EBA tools, and even less likely to engage in EBA processes such as incorporating them into their treatment decisions (Garland et al., 2003; Hatfield & Ogles, 2004; Palmiter, 2004).

EBA in School-Based Mental Health

Within a SBMH framework, EBA is an important element of effective service delivery, the principles and characteristics of which are consistent with leading models of educational interventions. For instance, EBA – and in particular progress monitoring – is highly compatible with the increasingly popular Response to Intervention (RtI; Bradley, Danielson, & Doolittle, 2007) frameworks in schools. RtI is a model for best practice in the education field, which incorporates data collection and evidence-based interventions in a step-wise fashion. Specifically, data related to student academic success (e.g., scores on brief measures of reading fluency) are used explicitly to drive decision making about student progress and determine whether there is a need to adapt, maintain, increase, or discontinue elements of an educational intervention (Hawken et al., 2008).
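Purely as a toy illustration of this style of data-driven decision making (the rule, thresholds, and scores below are hypothetical and not drawn from either project or from any published RtI protocol), a student's progress trajectory can be compared against a goal to suggest whether an intervention should be maintained, adapted, or faded:

```python
def rti_decision(scores, goal, min_points=4):
    """Toy RtI-style decision rule (illustrative only): compare the recent
    trend of brief progress-measure scores against a goal and suggest a
    next step for the intervention."""
    if len(scores) < min_points:
        return "keep collecting data"
    recent = scores[-min_points:]
    trend = recent[-1] - recent[0]  # crude slope proxy over recent sessions
    if recent[-1] >= goal:
        return "goal met: fade or discontinue support"
    if trend > 0:
        return "improving: maintain intervention"
    return "not responding: adapt or intensify intervention"

# e.g., weekly scores on a brief reading-fluency probe, goal of 60
print(rti_decision([22, 25, 29, 34], goal=60))  # improving: maintain intervention
```

In real RtI practice these decisions rest on validated benchmarks and visual analysis of graphed data, not a single hard-coded threshold; the sketch only shows the feedback-loop shape of the process.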

In light of the growing emphasis on RtI within education, progress monitoring and feedback in SBMH have the potential to demonstrate a high level of contextual appropriateness – a key variable in the uptake and sustained use of new practices (Proctor et al., 2009). Indeed, this is one reason why EBA has been identified as a particularly malleable quality improvement target for school-based service delivery (Lyon, Charlesworth-Attie, Vander Stoep, & McCauley, 2011). Many SBMH providers also endorse regularly collecting a variety of academically-relevant information sources to measure the effectiveness of their practice, including teacher and student self-report, observation, and school data (e.g., attendance, disciplinary reports; Kelly & Lueck, 2011). Progress monitoring data in schools may therefore require a broader conceptualization than in other service delivery settings if data are to be meaningful to both clinical progress and academic success. Emerging frameworks suggest that these data should include idiographic indicators, such as school (e.g., attendance) and academic (e.g., homework completion) outcomes, alongside more traditional measures of mental health symptoms (Lyon et al., 2013a), and should be integrated in user-friendly formats to be used in feedback and clinical decision-making.

Recently, Lyon and colleagues (2013a) articulated how academic and school data can be emphasized to create more contextually-appropriate services in the education sector. Drawing from Daleiden and Chorpita's (2005) evidence-based service system model, they differentiated four separate evidence bases – encompassing different facets of EBA – which can inform interventions and serve as sources of information for use in clinical care (each is described below). The utility of EBA to develop a feedback loop surrounding treatment decisions should be just as applicable to SBMH as the community-based settings in which it is more commonly discussed.

The first evidence base, general services research evidence, includes information systematically mined from the existing empirical literature through research articles and treatment protocols. Inherently, this evidence base includes EBA tools and processes because many evidence-based treatment protocols also include routine, standardized outcome evaluation; at least for the purpose of establishing an intervention's efficacy. Although the services research evidence base is relatively well developed, it is not always accessible or easily integrated into practice, thus underscoring the utility of training in a finite number of standardized assessment instruments. The case history evidence base includes information drawn from individualized, case-specific data, such as clinical interactions with clients and historical information relative to treatment success and progress. The case history evidence base can be utilized to inform idiographic progress monitoring measures based on a youth's unique presentation. The local aggregate evidence base (also referred to as “practice based” evidence by Daleiden & Chorpita, 2005) uses case-specific data (i.e., case history evidence), aggregated across cases into larger meaningful units (e.g., therapists, provider agencies, or regions) for program evaluation and administration purposes. This practice-based evidence can be used to make individualized treatment decisions using assessment and progress monitoring benchmarks for a particular client's local aggregate reference group (e.g., Higa-McMillan et al., 2011). Finally, causal mechanism evidence refers to a more general and comprehensive understanding of etiological and treatment processes, including tacit knowledge and collective wisdom contained within the intervention team or drawn from theoretical models of therapeutic change. 
Among the four evidence bases, causal mechanism evidence is arguably the least standardized and is highly dependent upon provider factors such as theoretical orientation. According to Daleiden and Chorpita (2005), due to their individual limitations, all of the evidence bases should be integrated to inform treatment planning and clinical decision-making, including decisions relevant to EBA.

Aims of the Current Paper

Given the underutilization of EBA processes and tools in SBMH settings, the aims of the current paper are to (a) provide an overview of two projects implementing progress monitoring and feedback in schools within the context of modular psychotherapy (described below); (b) describe the principles of progress monitoring that informed those projects and relevant data about the EBA processes, and provide recommendations for monitoring and feedback in schools; and (c) describe barriers that were encountered and the strategies used to overcome them. The overarching goal is to provide examples of real applications of progress monitoring within a school context, as well as ‘how to’ lessons for clinicians to make use of assessment-based feedback, minimize barriers to EBA, and maximize opportunities for positive client outcomes.

Overview of Projects

Behavioral Education Systems Training (B.E.S.T.)

The overarching purpose of the B.E.S.T. project was to develop and provide a continuum of emotional and behavioral supports and interventions for children by building a unified network of mental health and school professionals trained to utilize evidence-based practices (EBPs). Given the emphasis on EBPs, EBA tools and processes of EBA were introduced throughout training and consultation. In addition, at the initiation of the project, schools within the participating district were at varying stages of implementation of the national Positive Behavioral Interventions and Supports initiative (PBIS; www.pbis.org), a facet of the Montana Behavioral Initiative (MBI) that combines PBIS and RtI models. MBI emphasizes the collection and use of assessment data in schools to inform behavior plans, Individualized Education Plans, and early intervention strategies.

Although a number of services were developed and provided in the B.E.S.T. project, the focus of the current description is on the implementation of training and ongoing consultation in EBA, particularly progress monitoring and feedback, for SBMH clinicians trained in a modular psychotherapy model and the clinical dashboard tool. Modular psychotherapy emphasizes ‘common elements’ of existing evidence-based treatments. Specifically, this approach is rooted in the perspective that most evidence-based treatment protocols can be subdivided into meaningful components, which can then be implemented independently or in combination to bring about a specific treatment outcome (Chorpita, Daleiden, & Weisz, 2005). This type of intervention was recently compared to usual care and “standard-arranged” manualized treatments in a multi-site randomized controlled trial for youth with anxiety, depression, and/or conduct problems (MATCH-ADC; Weisz et al., 2011). The modular arrangement of EBPs outperformed both usual care and standard manualized treatments in a mixture of school and community mental health settings.

Because clinical decisions guiding modular psychotherapy are informed by EBA data, Chorpita and colleagues (2008) created an electronic tool for tracking client progress and provider treatment practices called the ‘clinical dashboard.’ The clinical dashboard provides a platform for collecting real-time data on provider treatment practices and client progress to map the relationship between the two, provide feedback to clinicians, and inform clinical decision-making. Further, the clinical dashboard presents a snapshot of the most relevant treatment information in a meaningful, user-friendly format (e.g., graphical, chronological presentation of data; Chorpita, Bernstein, Daleiden, & The Research Network on Youth Mental Health, 2008).
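The dashboard itself is an existing electronic tool whose internals are not described here. Purely as an illustrative sketch of the idea, a per-client record of this kind can be modeled as a chronological log pairing the practices delivered in each session with progress scores, from which a feedback trajectory can be read off (all names and values below are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """One session: practices delivered plus progress scores recorded."""
    date: str
    practices: list  # e.g., ["problem solving", "self-monitoring"]
    scores: dict     # measure name -> score, e.g., {"S-MFQ": 14}

@dataclass
class Dashboard:
    """Minimal per-client record: chronological practice + progress data."""
    client_id: str
    sessions: list = field(default_factory=list)

    def add_session(self, session):
        self.sessions.append(session)

    def trajectory(self, measure):
        """Chronological scores for one measure, for feedback or graphing."""
        return [(s.date, s.scores[measure])
                for s in self.sessions if measure in s.scores]

db = Dashboard("client-001")
db.add_session(Session("2012-01-10", ["psychoeducation"], {"S-MFQ": 18}))
db.add_session(Session("2012-01-24", ["cognitive restructuring"], {"S-MFQ": 14}))
print(db.trajectory("S-MFQ"))  # [('2012-01-10', 18), ('2012-01-24', 14)]
```

The key design point mirrored here is that practices and outcomes live in the same record, so a clinician can see at a glance whether the interventions delivered coincide with movement in the progress indicators.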

In the state of Montana, the majority of SBMH clinicians work in Comprehensive School and Community Treatment (CSCT) teams, which consist of both a therapist (typically a Masters-level social worker or licensed professional counselor) and a behavioral specialist (an individual with agency-provided training in behavior management, often with a Bachelor's-level education in psychology, social work, or a related field). For the current project, CSCT teams from four schools participated: three elementary schools and one middle school. Over the course of two years, 19 CSCT clinicians and 3 supervisors were trained in modular psychotherapy, associated EBA tools and processes, and the clinical dashboard tool, which they used to collect data on their subsequent cases.

In B.E.S.T., modular psychotherapy trainings consisted of 5 days (40 hours) of didactic and experiential coverage of modular EBPs for youth with a variety of mental health difficulties, as well as emphasis on and behavioral rehearsal with EBA and the clinical dashboard tool. CSCT teams were trained in administering and scoring relevant standardized measures for progress monitoring. Training also involved identifying and role-playing the collection of both mental health and academic idiographic indicators keyed to target problem areas. Exercises regarding the use of progress monitoring feedback data to make practice and intervention decisions were also introduced. Due to the availability of funding, trainings were rolled out gradually over the course of two years. Five-day trainings occurred in August 2011, February 2012, and August 2012. Training groups were chosen based on openings in schedules. Introduction to EBA and progress monitoring, described above, as well as the clinical dashboard tracking tool, was provided during each of the 5 days of the modular psychotherapy trainings, as well as continually throughout the consultation period that followed the training events. In order to maximize efficiency in consultation and allow trainees to benefit from the learning experiences of their colleagues, each new group of CSCT teams joined the ongoing consultation group in their respective school after being trained. Thus, following the August 2012 modular psychotherapy training, consultation groups ranged in size from six to ten clinicians, and consultation meetings were held approximately every 2 weeks, with fewer meetings held during the summer months. During the consultation meetings, CSCT teams reviewed dashboards for their cases, and a number of other process-oriented topics were covered (e.g., adapting practice based on diversity issues, selecting and arranging treatment modules, selecting appropriate assessment measures). Cases were presented for a variety of reasons, but often they were nominated for the agenda based on poor progress, deterioration, or a need to discuss crisis management.

Quantitative data were aggregated across the available clinical dashboards from the two-year project period (dashboards were shared with the first author throughout the project). ‘Social skills’ and ‘problem solving’ were the most frequently endorsed practice elements. Disruptive behavior was the most commonly reported primary focus of treatment (33% of cases; n = 83; two youth did not have problem area data reported) and was also the most commonly reported interference/secondary problem area (54% of cases; n = 54; thirty-one youth did not have interference problems reported).
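As an illustration of this kind of cross-dashboard aggregation (the case records below are invented, not project data), endorsed practice elements can be tallied across cases to surface the most frequent ones:

```python
from collections import Counter

# Hypothetical per-case dashboard exports: each case lists the practice
# elements its clinician endorsed over the project period.
cases = [
    {"id": 1, "practices": ["social skills", "problem solving"]},
    {"id": 2, "practices": ["social skills", "self-monitoring"]},
    {"id": 3, "practices": ["problem solving", "social skills"]},
]

# Tally every endorsed practice element across all cases.
counts = Counter(p for case in cases for p in case["practices"])
for practice, n in counts.most_common():
    print(practice, n)
```

The same tally over real dashboards yields local aggregate evidence: group-level summaries of what clinicians actually delivered, usable for program evaluation alongside individual progress data.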

School-Based Health Center (SBHC) Mental Health Excellence Project (Excellence)

A separate modular psychotherapy pilot was initiated in the context of an existing partnership between academic researchers, the public school district, the local department of public health, and a variety of community health service organizations in an urban public school district in the Pacific Northwest. University-based consultants had been providing training and support to school-based health center (SBHC) therapists for seven years at the time of the pilot. Although this existing relationship may have facilitated participation or predisposed some clinicians to the concepts presented, previous trainings had not focused explicitly on assessment. Furthermore, findings from the original study indicated that the participants did not differ notably from national norming samples on two established measures of EBP attitudes and awareness at baseline (the Evidence-Based Practice Attitudes Scale and the Knowledge of Evidence-Based Services Questionnaire; Lyon et al., 2011).

To fit within the existing consultation structure and the constraints of the school mental health context (e.g., limited time for training; Lyon et al., 2013b), components of a modular psychotherapy were adapted for implementation. Adaptations included the selection of depression and anxiety modules only, based on previous research about the most commonly-treated conditions in SBHCs (Walker, Kerns, Lyon, Bruns, & Cosgrove, 2010) and pre-implementation data collection. The narrower diagnostic focus limited the number of relevant practice modules and enhanced feasibility.

Rather than in a single introductory 5-day (i.e., 40-hour) training, modules were introduced gradually in an effort to maximize fit with the pre-existing consultation structure. Initial training occurred over three separate half-day sessions at sites accessible to SBHC providers. Clinical dashboards, principles of EBA and progress monitoring, and a subset of modules were introduced in the first session. In the second session, additional modules were introduced and providers were coached as they interacted with the dashboards. Following the second session, therapists were asked to begin tracking five clients at a time with primary presenting problems of anxiety or depression. Similar to the B.E.S.T. project, using the dashboards, therapists monitored their use of psychotherapy modules as well as scores on standardized outcome measures and idiographic measures of student functioning/progress. Consultation occurred biweekly over the course of the academic year and included case review, training in additional practice modules, and discussion of progress monitoring indicators. Consultants reviewed dashboards for all active cases prior to each consultation meeting. Cases were selected for discussion for a variety of reasons, but primarily because of problematic client outcomes, as evidenced by progress monitoring data (i.e., deterioration, elevated scores).

Over one academic year, 7 participating clinicians (nearly all of whom held Masters degrees) were trained across six schools. Seventy-five percent of students tracked had a primary presenting problem of depression with the remainder presenting with anxiety or mixed anxiety and depression. Therapists’ dashboard-based reports of module use indicated that the most commonly administered modules included self-monitoring, cognitive restructuring for depression, psychoeducation for depression, problem solving, and skill building (see Lyon et al., 2011 for a full description of adaptations and findings).

Principles and Recommendations for Progress Monitoring and Feedback in Schools

In the context of the projects described, principles of progress monitoring and feedback were applied throughout, beginning with the training objectives and following through ongoing consultation and treatment termination. Based on both quantitative and qualitative data collected throughout the course of both the B.E.S.T. and Excellence projects, barriers to progress monitoring, ‘lessons learned,’ and recommendations for overcoming barriers to EBA were identified.

Principle 1: Select targets that are meaningful to the client

As a result of the B.E.S.T. project, standardized assessment measures were routinely introduced to cases being assessed for eligibility for CSCT. Specifically, the Strengths and Difficulties Questionnaire (SDQ; Goodman, 1997) was identified as the most practical quantitative instrument for use in schools because it has multiple formats (e.g., parent, teacher, and self-report), can be administered across a wide age range (4-17 years), is relatively short, and is in the public domain. As described above, the Excellence project had a narrower diagnostic focus. Primary problem areas of depression and/or anxiety were identified using clinicians’ routine intake procedures (which may or may not have involved initial standardized screening measures), but were then confirmed with standardized tools, such as the Short Mood and Feelings Questionnaire (S-MFQ; Angold et al., 1995). Importantly, in both projects standardized assessment measures were utilized to either identify or confirm the problem areas defined by youth and/or their caregivers as most meaningful. Also, in both projects, use of standardized measures generated information at the level of the local aggregate and case history evidence bases; measures could be aggregated to provide group data on programs or agencies and were also utilized for individual youth progress monitoring.

Once the primary presenting target areas were identified, clinicians were encouraged to begin developing their treatment plans, typically utilizing the general services research evidence base, facilitated by project consultants, and/or any case-specific evidence that may be informative. Undoubtedly, clinicians also implicitly or explicitly accessed the causal mechanism evidence base when making treatment planning decisions, dependent upon their graduate training experiences and theoretical orientations.

At the point of initial treatment planning, clinicians were encouraged to identify with their clients relevant treatment goals and measurable indicators for tracking. The indicators included both standardized tools and idiographic monitoring targets. For example, Excellence clinicians used the S-MFQ most frequently, administering it in 77% of all sessions (N = 377). This number generally corresponded to the percentage of students who had a primary presenting problem of depression. Related to anxiety, clinicians in the Excellence project were less likely to use standardized measures, using them in only about 5% of sessions (17% of those with a primary problem of anxiety or mixed depression and anxiety). Specifically, clinicians reported using the Leahy Anxiety Checklist (Leahy & Holland, 2000), the Revised Children's Anxiety and Depressive Scale (RCADS; Chorpita et al., 2000), and the Self-Report for Childhood Anxiety Related Emotional Disorders (SCARED; Birmaher et al., 1997, 1999), although each of these tools was used at low frequency (each in less than 3% of all sessions for youth with depression and anxiety). Whereas the use of depression measures was consistent with depression rates in the student sample, use of anxiety measures was lower than client presentation alone would predict. This may have occurred because the frequency of depression presentations gave providers ample opportunity to become comfortable with depression measures quickly, which in turn may have increased the likelihood of their subsequent use. In addition to standardized measures, idiographic monitoring targets were also tracked; for example, self-reported level of suicidality in each session (rated on a 1-10 scale, with higher numbers indicating greater thoughts and urges) was tracked in approximately 8% of all 487 sessions. Similarly, the number of times a student thought about suicide since the prior session was recorded in 4% of all sessions.

In the B.E.S.T. project, the SDQ (described previously) was required to be administered quarterly as part of the introduction of standardized measures into the participating agencies. In addition, clinicians were encouraged to administer the RCADS, alongside the SDQ, for those cases in which anxiety was considered a focus problem area, though the RCADS was not a measure required by the participating agencies. In the 19 cases where anxiety was the identified primary problem area, the RCADS was administered in 14 (74%). In 100% of cases, at least one idiographic indicator was measured, typically keyed to the primary and/or secondary presenting problem areas (e.g., frequency counts of behaviors such as tantrums, curse words, or positive peer interactions).

Principle 2: Monitor more than just symptoms

Functional outcomes are infrequently reported in clinical trials and, when they are, they are less likely to demonstrate improvements in response to intervention (Becker, Chorpita, & Daleiden, 2011). These findings underscore the importance of developing case history and local aggregate evidence related to functional indicators, as such information is likely to extend beyond the data available in the general services evidence base. As described previously, providing mental health services in a school context introduces a number of opportunities for combining EBA relevant to mental health outcomes with EBA relevant to educational outcomes. Educational outcomes can include both school data, such as attendance rates, frequency of tardies, and disciplinary events, and academically-oriented targets such as grade point average, credits earned, or the results of curriculum-based or standardized measures (Lyon et al., 2013a). Research has found that few studies incorporate both mental health and educational outcomes, although those that do have shown some positive impacts (Becker, Brandt, Stephan, & Chorpita, in press; Hoagwood et al., 2007; Farahmand et al., 2011).

Given these complexities, consultants from both projects worked with school-based clinicians to identify client-specific functional indicators as a component of progress monitoring. B.E.S.T. providers were also explicitly trained in functional behavior assessments (FBA; Crone & Horner, 2003). From 2012 to 2013, three FBA trainings were provided for CSCT teams and attendance varied across them (average of 30 clinicians per training). In that project, FBA was used in combination with the intake assessment measures and interviews to identify relevant progress monitoring indicators and their functions (e.g., running out of the classroom as a means to escape completing math worksheets). By identifying the function of a behavior, providers could better select a behavior's positive opposite and track its increase/improvement, which was also in line with the MBI.

School-based clinicians in the B.E.S.T. and Excellence projects were also coached to track more than mental health symptoms and to incorporate academic variables whenever possible. For instance, clinicians were encouraged to prioritize idiographic indicators that were most likely to show improvement. In the Excellence project, individualized monitoring targets (clients' top problems) were identified to help guide relevant constructs for progress monitoring and feedback. In the B.E.S.T. project, identifying targets such as these was also encouraged, particularly from a self-monitoring and observable behavior standpoint; moreover, 56% of the youth for whom disruptive behavior was a primary problem (n = 15) also had an educationally relevant target tracked, such as 'number of minutes spent in mainstream classroom,' 'percent of time in class per day,' or frequency of 'office discipline referrals' (ODRs). In Excellence, which was conducted in middle and high schools, educationally relevant monitoring targets were also collected, though they were somewhat less common; targets included the frequency of contact with a student's teacher or academic counselor. In both projects, the progress of educational indicators was generally consistent with symptom indicators (e.g., if symptom indicators were improving, so were educational indicators); however, they provided a richer picture of the severity of youth target problem areas as well as the degree of progress.

Principle 3: Provide feedback to the client

In both projects, clinicians were both recipients and providers of feedback related to student progress. Following the identification of problem areas and the development of treatment plans, clinicians were encouraged to identify progress monitoring indicators in collaboration with clients and to communicate this information to clients whenever possible (facilitated by the clinical dashboard, described below). A self-monitoring system, and/or progress monitoring targets identified by others (e.g., teachers, caregivers), may take time to refine, but it allows for more dialogue with clients and with adult caregivers. Indeed, for youth for whom 'self-monitoring' was endorsed as a delivered practice element (n = 15 in B.E.S.T. and n = 50 in Excellence), an average of 3.9 (B.E.S.T.) and 4.8 (Excellence) sessions were reported with this emphasis. Further, throughout the ongoing consultation meetings, clinicians reported a number of strategies for providing feedback to their clients regarding targets and progress. For example, some clinicians created handmade, idiographic self-monitoring scales with their clients, which they could reference at each session (e.g., colored faces representing different emotions or severity levels; wall thermometers). In addition, a number of clinicians in B.E.S.T. reported having clients enter their own data points into the clinical dashboard, which operated not only as a feedback system (i.e., clients could view their progress lines increasing, decreasing, or staying the same) but also as an engagement strategy (i.e., engaging with the computer as an investment in their own treatment progress and goal setting). Ultimately, feedback to clients both informs and is informed by the case history evidence base. For example, clinicians were coached to provide feedback to youth with identified 'attention problems' through more creative or interactive means.
Strategies such as these, which provide direct client feedback, may increase the likelihood that progress data are utilized. This is especially important given suggestions of potential iatrogenic effects when measures are administered or idiographic data are collected but not utilized in clinical decision-making about treatment (Wolpert, in press).

Principle 4: Provide visual/graphical feedback

Throughout the course of both projects, feedback was provided to clients, caregivers, and other informants. Whenever possible, clinicians were encouraged to provide feedback to clients visually via the clinical dashboard tool (Chorpita et al., 2008), although no data were collected on how frequently this occurred. In B.E.S.T., EBA data were also explicitly aggregated annually and presented visually (an aggregated clinical dashboard) to provide feedback to individual agencies as well as the school district, thus generating a local aggregate evidence base. Not only can the clinical dashboard function as an engagement strategy relative to progress monitoring, as discussed previously, it also facilitates a feedback-intervention loop in a manner aligned with the RtI model. Specifically, a clinician may input practice element information, derived from clinical interactions with individual youth, and daily or weekly progress relevant to that practice is displayed. Over time, the dashboard displays clinical and academic progress, which points to a finite number of actions: continue with the treatment plan until goals are met, change practices, maintain current practices, or review practices. For example, for a client with a focus area of anxiety, a school-based clinician can track the number of times the client raises his/her hand to speak in class as relaxation techniques are introduced. This information could be collected weekly from the client's teacher via a simple tracking form that involves making a tick mark each time the child speaks in class. Such information can be presented in meetings with the client each week, and a 'benchmark' line can be introduced to help with goal setting. Meeting benchmarks could be displayed visually on the graph and/or paired with tangible rewards.
Importantly, the version of the clinical dashboard used in both studies could incorporate up to five progress measures; therefore, the slope of each line may be positive or negative (and lines may cross) depending on what is being tracked. Figure 2 shows a de-identified clinical dashboard from the B.E.S.T. project.
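The decision logic the dashboard supports (compare the latest value of an idiographic indicator against a benchmark line and read off the trend) can be sketched in a few lines of code. This is an illustrative sketch only: the projects used an Excel-based dashboard, and the function names, the simple first-versus-last trend rule, and the sample data are assumptions for illustration.

```python
# Minimal sketch of a dashboard-style progress check (illustrative only;
# the projects tracked these data in an Excel-based clinical dashboard).

def weekly_trend(counts):
    """Classify the change between the first and last recorded weekly counts."""
    if len(counts) < 2 or counts[-1] == counts[0]:
        return "flat"
    return "improving" if counts[-1] > counts[0] else "worsening"

def progress_summary(counts, benchmark):
    """Summarize an increase-is-good indicator (e.g., hand-raises in class)
    against a benchmark line, as a clinician might read it off the graph."""
    return {
        "latest": counts[-1],
        "benchmark_met": counts[-1] >= benchmark,
        "trend": weekly_trend(counts),
    }

# Hypothetical data: weekly tick-mark counts of a client raising a hand
# to speak in class, with a benchmark of 5 hand-raises per week.
hand_raises = [1, 2, 2, 4, 6]
print(progress_summary(hand_raises, benchmark=5))
# {'latest': 6, 'benchmark_met': True, 'trend': 'improving'}
```

A decrease-is-good indicator (e.g., office discipline referrals) would simply invert the comparison, which is why dashboard lines may slope in either direction depending on what is tracked.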

Figure 2. Sample Clinical Dashboard Presentation Pane.


In addition to its ability to facilitate consultation, the clinical dashboard can be especially useful in situations where school-based clinicians work in teams. Specifically, the clinical dashboard file could be stored on shared networks such that both members of a B.E.S.T. CSCT team could access the file at different times of the day or week. Indeed, clinical dashboard files were also shared within IEP meetings or other school treatment team meetings, including those with a multidisciplinary emphasis (e.g., meetings involving psychiatrists).

Additional recommendations

Given the fast pace of a school environment, frequent, brief assessments are often better than more extensive assessments conducted infrequently. Additionally, the limited time for SBMH intervention does not lend itself to lengthy assessment measures. Beyond caseload size and service provision pressures on clinicians (Lyon et al., in press-b), extensive assessment measures may also be time- and labor-intensive for students and caregivers. Indeed, a busy caregiver or teacher is unlikely to complete a 100+ item questionnaire at the busiest times of the school year, and teachers are often asked to complete measures for multiple youth in their classrooms, which can be burdensome. In addition, in both projects, school-based clinicians were placed within their respective schools for the entirety of the school day. This may provide opportunities for real-time data collection throughout the day, which could be maximized by applying different data collection intervals to different outcome targets. However, with an average of 9.8 clients per caseload in B.E.S.T. (recall that supervisors saw an average of 2 cases) and 39.3 in the SBHCs where Excellence occurred, those data collection opportunities must be brief. One example of a brief, frequent progress indicator from the B.E.S.T. project was tracking 'points/levels earned,' which were based on the presence of positive behaviors that were both keyed to the MBI behavioral expectations in the school (e.g., safe behaviors, respectful behaviors) and individualized to a youth's difficulties, and which could be tallied per teacher, per class.
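A 'points earned' indicator of this kind amounts to a brief per-period tally against a short list of behavioral expectations. The sketch below illustrates the arithmetic; the expectation categories, 0/1 scoring, and point-sheet data are hypothetical examples, not the actual MBI point sheets used in the schools.

```python
# Illustrative sketch of a brief 'points earned' indicator tallied per
# class period, keyed to school-wide behavioral expectations plus an
# individualized target. Categories and data are hypothetical.

EXPECTATIONS = ("safe", "respectful", "responsible", "individual_target")

def daily_points(period_ratings):
    """Sum points across class periods; each teacher awards 0 or 1 per
    expectation for their own period."""
    return sum(
        sum(ratings.get(expectation, 0) for expectation in EXPECTATIONS)
        for ratings in period_ratings
    )

# A youth's point sheet for three class periods in one day.
day = [
    {"safe": 1, "respectful": 1, "responsible": 0, "individual_target": 1},
    {"safe": 1, "respectful": 0, "responsible": 1, "individual_target": 0},
    {"safe": 1, "respectful": 1, "responsible": 1, "individual_target": 1},
]
print(daily_points(day))  # 9 of a possible 12 points
```

Because each teacher's rating takes seconds, a tally like this fits the brief, frequent data collection that large caseloads demand.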

Barriers and Lessons Learned

Several barriers and lessons learned were identified throughout the course of these two projects. First, time was identified as a significant barrier. Interestingly, although inputting data into the clinical dashboard itself can take a matter of seconds, a number of processes surrounding the use of the dashboard became apparent throughout implementation and at times functioned as barriers. For example, trainee comfort with technology varied and may have impacted uptake of the clinical dashboard tool, such that individuals with less facility with computers and Microsoft Excel tended to have greater trouble keeping their clinical dashboards up to date (e.g., more frequently presented out-of-date dashboards at consultation meetings). Further, a follow-up interview with providers who participated, as well as those who chose not to participate, in Excellence revealed that time was the top concern noted (Lyon et al., 2013b). Within the B.E.S.T. project, time constraints became apparent in the sheer number of interactions with youth throughout the school day. One of the CSCT teams had 162 contacts with a youth within a semester, which was inherently related to a school culture that viewed CSCT teams as providing primarily crisis management (a barrier that was being addressed via the implementation of MBI). Also related to time constraints, consultation meetings frequently focused on methods for collecting data efficiently within the billable units clinicians acquired on their cases throughout the day (15-minute increments per federal billing guidelines). Within the B.E.S.T. project, data collection often had to be adapted to take advantage of existing data (e.g., ODRs or 'points' per classroom that were collected via the implementation of MBI systems) in order to address time inefficiencies.

In the Excellence project, although there was a different billing structure, time was still reportedly a concern, particularly because of Excellence clinicians' large caseloads. Often, streamlining progress indicators meant modifying them to be less precise and less immediate. For instance, for certain clients, daily or even weekly teacher ratings were difficult to collect (in terms of teacher compliance and/or clinician compliance with tracking frequency counts) and were therefore modified to an 'average' count of a behavior or the 'highest' instance of a behavior within a week. In Excellence, student self-reported indicators were used much more commonly for this reason as well.
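The streamlining described above, collapsing whatever daily ratings a teacher managed to return into a single weekly 'average' or 'highest instance' value, can be sketched as follows. The function name, the use of None for missing ratings, and the sample week are illustrative assumptions.

```python
# Sketch of streamlining a progress indicator: collapse sparse daily
# teacher counts into one weekly summary value. Hypothetical example.

def weekly_summary(daily_counts, mode="average"):
    """Summarize sparse daily behavior counts (None = no rating collected)
    as either the weekly 'average' or the 'highest' single instance."""
    observed = [count for count in daily_counts if count is not None]
    if not observed:
        return None  # no usable ratings were collected that week
    if mode == "average":
        return sum(observed) / len(observed)
    if mode == "highest":
        return max(observed)
    raise ValueError(f"unknown mode: {mode}")

# A week where the teacher returned ratings on only three of five days.
week = [3, None, 5, None, 1]
print(weekly_summary(week, "average"))  # 3.0
print(weekly_summary(week, "highest"))  # 5
```

The trade-off the projects faced is visible here: the weekly value is robust to missed days but hides within-week variation that daily tracking would capture.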

Finally, time was also a barrier relevant to billing requirements, including the amount and redundancy of state- and federally-mandated documentation. Although consultation often focused on efficiency relevant to EBA (e.g., completing dashboards while completing billing notes; using time within sessions to update dashboards and communicate with clients about their progress and treatment planning), CSCT teams in B.E.S.T. frequently took work home or worked over 40 hours per week to complete all of their requirements. In Excellence, some providers viewed completion of dashboards and associated EBA measures as additional paperwork (Lyon et al., 2013b). One recommendation for future research to address these barriers would be to build new infrastructure to support data tracking methods, particularly infrastructure in which clinical dashboards are integrated with required billing paperwork (e.g., into the electronic medical record information system), along with modified billing structures and support for administering and scoring assessment measures. This would streamline both data entry and access to the data needed to fulfill billing requirements. In a similar vein, modifying billing requirements to cover the additional case management activities pertinent to the school setting, such as IEP meetings and Student Intervention Team meetings, would directly address time inefficiencies.

In addition to time, the process of incorporating information from the four evidence bases to make treatment decisions cannot be immediately mastered. Across both projects, few clinicians were comfortable using standardized assessment measures or engaging in progress monitoring and feedback at the beginning of the initiatives, at least in part because these data collection methods were not required prior to the initiation of either project. Indeed, even clinicians who were comfortable with EBA tools and processes and with the technology required for tracking were, at times, examining progress indicators only in hindsight. Flowing through a sequence of clinical decisions in a step-wise fashion, such as those presented in the 'roadmap' by Chorpita, Bernstein, Daleiden, and the Research Network on Youth Mental Health (2008), which are informed by the four evidence bases described above, is a learning process for clinicians and is also inherently reliant upon identifying relevant, accurate, and practical progress indicators. Thus, this iterative process requires scaffolding, which the consultation meetings frequently provided. The difficulty of relying on progress data to make subsequent clinical decisions was further compounded, at times, by the pressures of the school context, in which time spent completing FBAs, collecting measures, or dialing in idiographic measurements was often time the youth spent out of classroom instruction, out of control, and/or exhibiting inappropriate behaviors. If billing requirements are adjusted to provide more expansive coverage of EBA methods, it is likely that clinicians will be able and willing to allot more time to this learning process.

Finally, although data collection is becoming more common practice in schools with the proliferation of RtI and PBIS models, data collection and use processes require additional work. For instance, within project B.E.S.T., data on ODRs, attendance, and curriculum-based measurement scores, among other academic indicators, were routinely collected for all students as part of the MBI initiative. How these data were utilized, and by whom, varied substantially. In some cases, only the school principal, school psychologist, and/or school counselor examined the aggregate data (and the frequency of these instances also varied). Whether the data collected by school staff were communicated back to the staff in a palatable format also varied substantially, as was evident to the project consultant (Borntrager) while sitting in on MBI team meetings at the participating schools. Without regular, understandable communication of these data, school staff will likely continue to report them only up to a point (compliance); research has suggested that fulfilling compliance regulations is not enough incentive to continue collecting data and/or to utilize them in decision-making (Kelly & Lueck, 2011). Clinicians would benefit from the development of a consultation protocol specifically focused on the interpretation and communication of EBA data, which could be integrated into their professional development trainings and supervision. Further, if clinicians were allowed to bill for staff and consultation meetings within the school setting, SBMH clinicians would likely be able to take on more of a leadership role, organizing routine, data-based meetings on individual youth as well as aggregating and interpreting data for the whole school staff. Although these changes were beginning to take place within the B.E.S.T. project (e.g., in one participating school, the CSCT teams were allotted case presentation time to cover data collection methods in the weekly school staff meeting), adapting billing requirements will likely be the largest sustainability factor for future EBA and practice implementation. Another strategy for addressing the sustainability of EBA practices could include additional training and professional development relative to educators' knowledge of and attitudes toward EBA, particularly if school staff 'see the value' of data collection and are able to utilize the outcomes in their own teaching practices.

Current and Future Directions

There are a number of recommendations for current and future directions that can be made based on the lessons learned within the B.E.S.T. and Excellence projects. These recommendations are summarized in Table 2 and are also relevant to the EBA literature (e.g., Lyon et al., 2013a). For example, given the difficulties encountered relevant to clinician comfort with and knowledge of EBA and its uses, SBMH agencies would benefit from specific, ongoing professional development in the tools and processes of EBA. In particular, explicit training in the incorporation of academic indicators into regular evidence-based practice and assessment monitoring systems would be beneficial. Simply providing training in the structure of EBA within the school context is unlikely to elicit sustainable adherence to these practices, however. Thus, future research should focus on the development of a protocol for consultation and supervision that is specific to SBMH and emphasizes the decision-making processes involved in EBA, as well as the implementation of these processes. Given the low-resource environments that schools represent, providing structured guidance in EBA for SBMH staff via specific consultation may be a more efficient method for improving accountability, and potentially student outcomes, than intensive training in extensive EBP programs (Evans & Weist, 2004).

Table 1. Characteristics of the B.E.S.T. and Excellence Projects.

Characteristic | B.E.S.T. | Excellence
Focus problem area | Any problem area | Depression and anxiety only
Most frequently reported problem | Disruptive behavior (33% of cases) | Depression (75% of cases)
Duration of project roll-out | 2 years | 1 year
N of schools involved | 4 | 6
N of clinicians trained | 22 | 7
N of clients treated | 85 | 66
Average N of clients treated per provider | 9.8 | 39.3
Average number of sessions per client | 24.2 | Not available
Total number of sessions | Not available | 487

Another issue evidenced through the 'lessons learned' in both projects is that infrastructure to support the implementation of EBA tools and processes is needed. Regardless of whether new infrastructure is developed or existing systems are repurposed, meaningful use of infrastructure for tracking educational data can be facilitated if districts and individual school systems prioritize professional development for a wide range of teachers and paraprofessionals on principles such as data tracking, confidentiality, behavior management strategies, and use of available data tracking systems. This approach will help to avoid the 'single user' phenomenon, whereby data are filtered to one individual who is familiar with the data tracking technology but not to other school staff who lack knowledge about the system. This phenomenon may increase the likelihood that practitioners neither feel 'ownership' over the data nor utilize them to make practice decisions. Anecdotal reports from projects in which stakeholders at multiple levels were brought together to review data (Higa-McMillan et al., 2011) suggest the value of such meetings in increasing engagement in data collection and use. Relatedly, professional engagement with outcome monitoring software may also be affected by existing documentation and billing practices, given that those requirements often consume valuable time that could be devoted to implementing strategies for data-driven decision-making, and that tracking systems may be viewed as redundant.

Table 2. Recommendations for EBA in Schools.

Strategy Objective
Training and professional development for SBMH staff in the tools and processes of EBA
  • Training in the administration and interpretation of standardized EBA tools

  • Training and ongoing professional development in the processes associated with EBA, such as identifying and managing idiographic targets

  • Training and ongoing professional development in the identification and incorporation of academic indicators and interventions into mental health practice

Consultation protocol development
  • Development of a SBMH-specific protocol for consultation and supervision in EBA tools and processes

  • The protocol should include explicit emphasis on the decision-making processes involved in EBA

Develop infrastructure to support the implementation of EBA tools and processes
  • Repurpose existing infrastructure to support the use of EBA (e.g., shared network drives, modifying spreadsheets for individual school purposes)

  • Develop new data collection and management systems that are accessible to all staff and remain HIPAA compliant

  • Infrastructure should also include administrative support for data collection, storage, and communication procedures

  • Regular staff meetings in which aggregate and individual data are communicated to those individuals who assist with data collection

Training and professional development for educators and wide range of paraprofessional staff on EBA principles
  • Training and professional development for educators and paraprofessionals should include emphasis on the tenets of EBA such as data tracking, knowledge of confidentiality, behavior management strategies, and use of available data tracking systems

Draw from parallel models of integrated/collaborative care for adults
  • Utilize adult models that facilitate the management of chronic mental health conditions (e.g., depression) in primary care settings (cf. Thota et al., 2012).

Acknowledgments

This publication was made possible, in part, by funding from the Montana Mental Health Settlement Trust grant entitled “Comprehensive Training Network for Children's Mental Health Services” awarded to the first author and also by grant number K08 MH095939, awarded to the second author from the National Institute of Mental Health.

Dr. Lyon is also an investigator with the Implementation Research Institute (IRI), at the George Warren Brown School of Social Work, Washington University in St. Louis; through an award from the National Institute of Mental Health (R25 MH080916) and the Department of Veterans Affairs, Health Services Research & Development Service, Quality Enhancement Research Initiative (QUERI).

Contributor Information

Cameo Borntrager, University of Montana.

Aaron R. Lyon, University of Washington.

References

  1. Adelman HS, Taylor L. Promoting mental health in schools in the midst of school reform. Journal of School Health. 2000;70:171–178. doi: 10.1111/j.1746-1561.2000.tb06467.x. [DOI] [PubMed] [Google Scholar]
  2. Becker K, Chorpita BF, Daleiden E. Improvement in symptoms versus functioning: How do our best treatments measure up? Administration and Policy in Mental Health and Mental Health Services Research. 2011;38:440–458. doi: 10.1007/s10488-010-0332-x. [DOI] [PubMed] [Google Scholar]
  3. Becker KD, Brandt NE, Stephan SH, Chorpita BF. A review of educational outcomes in the children's mental health treatment literature. Advances in School Mental Health Promotion in press. [Google Scholar]
  4. Bickman L, Douglas S, Breda C, de Andrade AR, Riemer M. Effects of routine feedback to clinicians on mental health outcomes of youths: Results of a randomized trial. Psychiatric Services. 2011;62:1423–1429. doi: 10.1176/appi.ps.002052011. [DOI] [PubMed] [Google Scholar]
  5. Birmaher B, Khetarpal S, Brent D, Cully M, Balach L, Kaufman J, McKenzie S. The Screen for Child Anxiety Related Emotional Disorders (SCARED): Scale construction and psychometric characteristics. Journal of the American Academy of Child & Adolescent Psychiatry. 1997;36:545–553. doi: 10.1097/00004583-199704000-00018. [DOI] [PubMed] [Google Scholar]
  6. Birmaher B, Brent D, Chiappetta L, Bridge J, Monga S, Baugher M. Psychometric properties of the Screen for Child Anxiety Related Emotional Disorders (SCARED): A replication study. Journal of the American Academy of Child & Adolescent Psychiatry. 1999;38:1230–1236. doi: 10.1097/00004583-199910000-00011. [DOI] [PubMed] [Google Scholar]
  7. Bradley R, Danielson L, Doolittle J. Responsiveness to Intervention: 1997 to 2007. Teaching Exceptional Children. 2007;39:8–12. [Google Scholar]
  8. Carlier I, Meuldijk D, Van Vliet I, Van Fenema E, Van der Wee N, Zitman FG. Routine outcome monitoring and feedback on physical or mental health status: Evidence and theory. Journal of Evaluation in Clinical Practice. 2012;18:104–110. doi: 10.1111/j.1365-2753.2010.01543.x. [DOI] [PubMed] [Google Scholar]
  9. Cauce AM, Domenech-Rodriguez M, Paradise M, Cochran BN, Shea JM, Srebnik D, Baydar N. Cultural and contextual influences in mental health help seeking: A focus on ethnic minority youth. Journal of Consulting and Clinical Psychology. 2002;70:44–55. doi: 10.1037//0022-006x.70.1.44. [DOI] [PubMed] [Google Scholar]
  10. Center for Mental Health in Schools. Moving beyond the three tier intervention pyramid toward a comprehensive framework for student and learning supports. Los Angeles, CA: Center for Mental Health in Schools; Feb, 2011. [Google Scholar]
  11. Chorpita BF, Daleiden E, Weisz J. Identifying and selecting the common elements of evidence-based intervention: A Distillation and Matching Model. Mental Health Services Research. 2005;7:5–20. doi: 10.1007/s11020-005-1962-6. [DOI] [PubMed] [Google Scholar]
  12. Chorpita BF, Bernstein A, Daleiden E, The Research Network on Youth Mental Health. Driving with roadmaps and dashboards: Using information resources to structure the decision models in service organizations. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35:114–123. doi: 10.1007/s10488-007-0151-x. [DOI] [PubMed] [Google Scholar]
  13. Chorpita BF, Yim L, Moffitt C, Umemoto LA, Francis SE. Assessment of symptoms of DSM-IV anxiety and depression in children: A Revised Child Anxiety and Depression Scale. Behaviour Research and Therapy. 2000;38:835–855. doi: 10.1016/s0005-7967(99)00130-8. [DOI] [PubMed] [Google Scholar]
  14. Crone D, Horner R. Building Positive Behavior Support Systems in Schools: Functional Behavioral Assessment. New York, NY: Guilford Press; 2003. [Google Scholar]
  15. Cytrynbaum S, Ginath Y, Birdwell J, Brandt L. Goal attainment scaling a critical review. Evaluation Review. 1979;3:5–40. [Google Scholar]
  16. Daleiden E, Chorpita BF. From data to wisdom: Quality improvement strategies supporting large-scale implementation of evidence-based services. Child and Adolescent Psychiatric Clinics of North America. 2005;14:329–349. doi: 10.1016/j.chc.2004.11.002. [DOI] [PubMed] [Google Scholar]
  17. Evans S, Weist M. Commentary: Implementing empirically supported treatments in the schools: What are we asking? Clinical Child and Family Psychology Review. 2004;7:263–267. doi: 10.1007/s10567-004-6090-0. [DOI] [PubMed] [Google Scholar]
  18. Farahmand F, Grant K, Polo A, Duffy S, DuBois D. School-based mental health and behavioral programs for low-income, urban youth: A systematic and meta-analytic review. Clinical Psychology-Science and Practice. 2011;18:372–390. [Google Scholar]
  19. Farmer EM, Burns BJ, Phillips SD, Angold A, Costello EJ. Pathways into and through mental health services for children and adolescents. Psychiatric Services. 2003;54:60–66. doi: 10.1176/appi.ps.54.1.60. [DOI] [PubMed] [Google Scholar]
  20. Franken A. Mental Health in Schools Act. §195 2013 [Google Scholar]
  21. Garland A, Kruse M, Aarons G. Clinicians and outcome measurement: What’s the use? The Journal of Behavioral Health Services & Research. 2003;30:393–405. doi: 10.1007/BF02287427. [DOI] [PubMed] [Google Scholar]
  22. Goodman R. The Strengths and Difficulties Questionnaire: A Research Note. Journal of Child Psychology and Psychiatry. 1997;38:581–586. doi: 10.1111/j.1469-7610.1997.tb01545.x. [DOI] [PubMed] [Google Scholar]
  23. Hatfield D, Ogles B. The use of outcome measures by psychologists in clinical practice. Professional Psychology: Research and Practice. 2004;35:485–491. [Google Scholar]
  24. Hawken LS, Vincent CG, Schumann J. Response to intervention for social behavior. Journal of Emotional and Behavioral Disorders. 2008;16:213–225. [Google Scholar]
  25. Haynes SN, Mumma GH, Pinson C. Idiographic assessment: Conceptual and psychometric foundations of individualized behavioral assessment. Clinical Psychology Review. 2009;29:179–191. doi: 10.1016/j.cpr.2008.12.003. [DOI] [PubMed] [Google Scholar]
  26. Higa-McMillan C, Powell CK, Daleiden E, Mueller C. Pursuing an evidence-based culture through contextualized feedback: Aligning youth outcomes and practices. Professional Psychology: Research and Practice. 2011;42:137–144. [Google Scholar]
  27. Hoagwood K, Olin S, Kerker BD, Kratochwill TR, Crowe M, Saka N. Empirically based school interventions targeted at academic and mental health functioning. Journal of Emotional and Behavioral Disorders. 2007;15:66–92. [Google Scholar]
  28. Hofferth SL, Sandberg JF. How American children spend their time. Journal of Marriage and Family. 2001;63:295–308. [Google Scholar]
  29. Jensen-Doss A, Hawley K. Understanding barriers to evidence-based assessment: Clinician attitudes toward standardized assessment tools. Journal of Clinical Child and Adolescent Psychology. 2010;39:885–896. doi: 10.1080/15374416.2010.517169. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. Kataoka S, Stein B, Nadeem E, Wong M. Who gets care? Mental health service use following a school-based suicide prevention program. Journal of the American Academy of Child & Adolescent Psychiatry. 2007;46:1341–1348. doi: 10.1097/chi.0b013e31813761fd. [DOI] [PubMed] [Google Scholar]
  31. Kelly M, Lueck C. Adopting a data-driven public health framework in schools: Results from a multi-disciplinary survey on school-based mental health practice. Advances in School Mental Health Promotion. 2011;4:5–12. [Google Scholar]
  32. Kluger A, DeNisi A. The effects of feedback interventions on performance: A historical review, a meta-analysis and a preliminary feedback intervention theory. Psychological Bulletin. 1996;119:254–284. [Google Scholar]
  33. Lambert M. Using outcome data to improve the effects of psychotherapy: Some illustrations. In: Lambert M, editor. Prevention of Treatment Failure: The Use of Measuring, Monitoring, and Feedback in Clinical Practice. Washington, DC: American Psychological Association; 2010. [Google Scholar]
  34. Lambert MJ, Whipple JL, Hawkins EJ, Vermeersch DA, Nielsen SL, Smart DW. Is it time for clinicians to routinely track patient outcome? A meta-analysis. Clinical Psychology: Science and Practice. 2003;10:288–301. [Google Scholar]
  35. Leahy R, Holland S, McGinn L. Treatment Plans and Interventions for Depression and Anxiety Disorders. 2nd. New York, NY: Guilford Press; 2012. [Google Scholar]
  36. Lyon AR, Borntrager C, Nakamura B, Higa-McMillan C. From distal to proximal: Routine educational data monitoring in school-based mental health. Advances in School Mental Health Promotion. 2013a;6:263–279. doi: 10.1080/1754730X.2013.832008. [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Lyon AR, Bruns EJ, Weathers E, Canavas N, Ludwig K, Vander Stoep A, Cheney D, McCauley E. Taking EBPs to school: Developing and testing a framework for applying common elements of evidence based practice to school mental health. Advances in School Mental Health Promotion in press-a. [Google Scholar]
  38. Lyon AR, Charlesworth-Attie S, Vander Stoep A, McCauley E. Modular psychotherapy for youth with internalizing problems: Implementation with therapists in school-based health centers. School Psychology Review. 2011;40:569–581. [Google Scholar]
  39. Lyon AR, Ludwig K, Romano E, Koltracht J, Vander Stoep A, McCauley E. Using modular psychotherapy in school mental health: Provider perspectives on intervention-setting fit. Journal of Clinical Child & Adolescent Psychology. In press-b. doi: 10.1080/15374416.2013.843460.
  40. Lyon AR, Ludwig K, Romano E, Leonard S, Vander Stoep A, McCauley E. “If it's worth my time, I will make the time”: School-based providers' decision-making about participating in an evidence-based psychotherapy consultation program. Administration and Policy in Mental Health and Mental Health Services Research. 2013b;40:467–481. doi: 10.1007/s10488-013-0494-4.
  41. Lyon AR, Ludwig K, Vander Stoep A, Gudmundsen G, McCauley E. Patterns and predictors of mental healthcare utilization in schools and other service sectors among adolescents at risk for depression. School Mental Health. 2013c;5:155–165. doi: 10.1007/s12310-012-9097-6.
  42. Mash E, Hunsley J. Evidence-based assessment of child and adolescent disorders: Issues and challenges. Journal of Clinical Child and Adolescent Psychology. 2005;34:362–379. doi: 10.1207/s15374424jccp3403_1.
  43. Michalak J, Holtforth MG. Where do we go from here? The goal perspective in psychotherapy. Clinical Psychology: Science and Practice. 2006;13:346–365.
  44. Owens JS, Murphy CE. Effectiveness research in the context of school-based mental health. Clinical Child and Family Psychology Review. 2004;7:195–209. doi: 10.1007/s10567-004-6085-x.
  45. Palmiter D. A survey of the assessment practices of child and adolescent clinicians. American Journal of Orthopsychiatry. 2004;74:122–128. doi: 10.1037/0002-9432.74.2.122.
  46. Prodente C, Sander M, Weist M. Furthering support for expanded school mental health programs. Children's Services: Social Policy, Research, and Practice. 2002;5:173–188.
  47. Proctor E, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36:24–34. doi: 10.1007/s10488-008-0197-4.
  48. Protect our Children and our Communities by Reducing Gun Violence. 2013. Retrieved July 1, 2013, from http://www.whitehouse.gov/sites/default/files/docs/wh_now_is_the_time_full.pdf.
  49. Riemer M, Rosof-Williams J, Bickman L. Theories related to changing clinician practice. Child and Adolescent Psychiatric Clinics of North America. 2005;14:241. doi: 10.1016/j.chc.2004.05.002.
  50. Shriver M, Kramer J. Application of the generalized matching law for description of student behavior in the classroom. Journal of Behavioral Education. 1997;7:131–149.
  51. Southam-Gerow M, Chorpita BF, Miller L, Gleacher A. Are children with anxiety disorders privately referred to a university clinic like those referred from the public mental health system? Administration and Policy in Mental Health and Mental Health Services Research. 2008;35:168–180. doi: 10.1007/s10488-007-0154-7.
  52. Southam-Gerow M, Weisz J, Kendall P. Youth with anxiety disorders in research and service clinics: Examining client differences and similarities. Journal of Clinical Child and Adolescent Psychology. 2003;32:375–385. doi: 10.1207/S15374424JCCP3203_06.
  53. Thota A, Sipe TA, Byard GJ, Zometa CS, Hahn RA, McKnight-Eily LR, Williams SP. Collaborative care to improve the management of depressive disorders: A community guide systematic review and meta-analysis. American Journal of Preventative Medicine. 2012;42:525–538. doi: 10.1016/j.amepre.2012.01.019.
  54. van der Kolk B. Developmental trauma disorder. Psychiatric Annals. 2005;35:401–408.
  55. Walker SC, Kerns S, Lyon AR, Bruns EJ, Cosgrove T. Impact of school-based health center use on academic outcomes. Journal of Adolescent Health. 2010;46:251–257. doi: 10.1016/j.jadohealth.2009.07.002.
  56. Weist M, Paternite C. Building an interconnected policy-training-practice-research agenda to advance school mental health. Education & Treatment of Children. 2006;29:173–196.
  57. Weisz JR, Chorpita BF, Frye A, Ng MY, Lau N, Bearman SK, Hoagwood KE. Youth Top Problems: Using idiographic, consumer-guided assessment to identify treatment needs and to track change during psychotherapy. Journal of Consulting and Clinical Psychology. 2011;79:369–380. doi: 10.1037/a0023307.
  58. Weisz J, Chorpita BF, Palinkas L, Schoenwald S, Miranda J, Bearman SK, the Research Network on Youth Mental Health. Testing standard and modular designs for psychotherapy treating depression, anxiety, and conduct problems in youth. Archives of General Psychiatry. 2011:E1–E9. doi: 10.1001/archgenpsychiatry.2011.147.
  59. Wolpert M. Uses and abuses of patient reported outcome measures (PROMs): Potential iatrogenic impact of PROMs implementation and how it can be mitigated. Administration and Policy in Mental Health and Mental Health Services Research. In press. doi: 10.1007/s10488-013-0509-1.
  60. Yeh M, McCabe K, Hough RL, Dupuis D, Hazen A. Racial/ethnic differences in parental endorsement of barriers to mental health services for youth. Mental Health Services Research. 2003;5:65–77. doi: 10.1023/a:1023286210205.