Adv Sch Ment Health Promot. Author manuscript; available in PMC 2014 Oct 1.
Published in final edited form as: Adv Sch Ment Health Promot. 2013 Sep 17;6(4). doi: 10.1080/1754730X.2013.832008

From distal to proximal: Routine educational data monitoring in school-based mental health

Aaron R Lyon 1, Cameo Borntrager 2, Brad Nakamura 3, Charmaine Higa-McMillan 4
PMCID: PMC3866920  NIHMSID: NIHMS514069  PMID: 24363781

Abstract

Research and practice in school-based mental health (SBMH) typically include educational variables only as distal outcomes, resulting from improvements in mental health symptoms rather than directly from mental health intervention. Although sometimes appropriate, this approach also has the potential to inhibit the integration of mental health and schools. The current paper applies an existing model of data-driven decision making (Daleiden & Chorpita, 2005) to detail how SBMH can better integrate routine monitoring of school and academic outcomes across four evidence bases: general services research, case history, local aggregate, and causal mechanism evidence. The importance of developing new consultation protocols specific to data-driven decision making in SBMH, as well as supportive infrastructure (e.g., measurement feedback systems) for the collection and use of educational data, is also described.

Keywords: data-driven decision making, school and academic data, school-based mental health, progress monitoring, evidence-based practice

Mental Health and School Success

Education sector services have long been identified as a key component of the youth mental health service delivery system (Burns et al., 1995; Farmer, Burns, Phillips, Angold, & Costello, 2003; Zahner, Pawelkiewicz, DeFrancesco, & Adnopoz, 1992). In recognition of its potential for impact, governmental reports and federal policies frequently include the goals of improving and expanding school mental health programs (e.g., President’s New Freedom Commission, 2003; Mental Health in Schools Act of 2013). Relative to other mental health service contexts, school-based mental health (SBMH) carries additional pressures to link the timely delivery of care to academic and school-related outcomes (Franklin, Kim, & Tripodi, 2009; Prodente, Sander, & Weist, 2002; Teich, Robinson, & Weist, 2007). This is often appropriate, given the considerable body of research documenting the co-occurrence of mental health problems and academic difficulties, as well as their combined, detrimental impact on long-term youth functioning (Lawrence et al., 2005; Roeser, Eccles, & Freedman-Doan, 1999). Indeed, these relationships are often referenced as justification for locating mental health services in schools and, in some circumstances, devoting educational resources to SBMH program development (e.g., IDEA, 2004). Nevertheless, despite the close relationship between psychological functioning and school success – and the increasing availability of SBMH programs nationwide – the mental health and educational systems remain inadequately integrated and, as such, unable to optimally support positive youth development (e.g., Atkins, Hoagwood, Kutash, & Seidman, 2010; Kutash, Duchnowski, & Lynn, 2006).

Educational Outcomes in SBMH Research and Practice

A particularly glaring missed opportunity for mental health and school integration is the low representation of educationally-relevant outcomes in SBMH research and practice. The term educational outcomes is sometimes used in this area of research and can include both school data, such as tardies, attendance rates, and disciplinary events, and academic variables, such as grades, credits earned, and the results of curriculum-based and standardized testing. Unfortunately, most models of SBMH include educational outcomes only at the distal level, if at all. In many of these models, such outcomes result from the cascading effects of mental health improvements, rather than directly from a mental health intervention (e.g., Stormshak, Connell, & Dishion, 2009). This traditional sequencing of intervention outcomes is often appropriate within the SBMH context, and not all behavioral health interventions are equipped to address educational outcomes directly. However, such an orientation may result in missed opportunities to create behavioral and mental health programs that have a more immediate impact on student functioning.

The majority of SBMH studies have failed to include educational outcomes, and among those that have, most reveal mixed findings regarding program impact. In an extensive review of SBMH interventions, Hoagwood and colleagues (2007) were able to identify only 24 studies that met their inclusion criteria for methodological rigor and addressed both mental health and educational outcomes. Among the 24, only 15 yielded positive effects for both types of variables. Similarly, Farahmand et al. (2011) conducted a meta-analysis of SBMH programs for low-income, urban youth and found a small mean effect size (0.24) among the studies that included educational outcomes as primary (i.e., related to program targets; n = 4) or secondary (n = 9) outcomes. Interestingly, this effect size matched the impact of programs on internalizing problems (0.24), but was considerably higher than their virtually nonexistent effect on externalizing problems (0.02). Although reviews such as these indicate that SBMH interventions may have the potential to affect academic outcomes to an extent comparable to their effect on measures of mental health functioning, the small number of studies currently limits the generalizability of these findings.

Despite some evidence for a meaningful impact, the frequency with which educational outcomes are omitted from research has the unfortunate consequence of perpetuating the view that SBMH falls outside of the core mission of schools, a perspective that can inhibit the extent to which the two are aligned to promote healthy student development (Hoagwood et al., 2007). Atkins and colleagues (2010) have suggested that education and mental health integration will be substantially enhanced when the goals of mental health service delivery include effective schooling and academic success. Focusing on school and academic data within the context of behavioral health interventions may represent an important point of integration between education and mental health. Nevertheless, this objective has remained elusive, and new approaches to the incorporation of educational outcomes in typical SBMH service delivery may be needed.

Data-Driven Decision Making in Clinical Care

Recently, there has been an explosion of research and theory detailing the importance of – and strategies for – using data to guide clinical decision making in youth mental health services (e.g., Bickman, 2008; Bickman, Kelley, Breda, de Andrade, & Riemer, 2011; Chorpita, Bernstein, Daleiden, & Research Network on Youth Mental Health, 2008; Higa-McMillan, Powell, Daleiden, & Mueller, 2011). Much of this work has centered on the design and implementation of decision-making models and support systems that can be used to monitor client outcomes and, in many cases, track the practices used by service providers during the course of treatment. Evidence from both the adult and youth psychotherapy literatures has supported the value of collecting data for the purposes of monitoring treatment progress and improving client engagement and outcomes (Bickman et al., 2011; Lambert et al., 2003; Shimokawa, Lambert, & Smart, 2010). In light of these findings, systematic monitoring of therapy progress to guide clinical decision making is increasingly recognized as a key component of evidence-based psychotherapy (Halford et al., 2012) and, in some cases, as an evidence-based practice in and of itself (Substance Abuse and Mental Health Services Administration, 2012).

Despite the promise of this approach, the structured collection and use of progress monitoring data appears to occur infrequently in routine clinical practice settings (Hatfield & Ogles, 2004; Palmiter, 2004). This is unfortunate, given findings that therapists struggle to identify client deterioration, a significant predictor of premature treatment dropout and diminished intervention benefit (Hatfield, McCullough, Frantz, & Krieger, 2010). Furthermore, work by Garland and colleagues (2003) has found that, even when presented with scored assessment profiles, clinicians are unlikely to incorporate outcome data into ongoing treatment planning without additional supports. Similarly, Young and colleagues (2007) documented that the quality and specificity of information contained in service documents often degrade when moving from referral to assessment and on to treatment planning, making it difficult for even the highest quality assessment to adequately influence intervention. Although little research has focused specifically on schools, existing evidence suggests that usual care practice in SBMH is similarly unlikely to include the structured use of idiographic (i.e., individualized, but not norm-referenced) or standardized assessment data to drive clinical practice (Kelly & Lueck, 2011; Lyon, Charlesworth-Attie, Vander Stoep, & McCauley, 2011; Weist, 1998).

Instrumental feedback – that is, feedback that provides the clinician with guidance about what could be contributing to client decline and strategies for changing trajectory – may be even more useful to clinicians than information on client progress alone. For instance, a series of findings from the adult literature has demonstrated that measuring therapeutic alliance, client commitment to change, or other key process variables in a way that informs treatment decisions can increase the magnitude of the effects of outcome monitoring (Harmon et al., 2007; Slade, Lambert, Harmon, Smart, & Bailey, 2008; Whipple et al., 2003). Within the youth mental health literature, evidence supports models in which both intervention outcomes and the delivery of specific treatment practices are monitored over time. Indeed, Weisz and colleagues (2012) recently documented the effectiveness of such an intervention approach, in which common elements of evidence-based treatment protocols were distilled from the empirical literature and then implemented in concert with assessment and outcome monitoring to inform decisions about treatment planning and adjustment.

Models such as the ones described above are generally intended to facilitate the use of proximal evidence in practice by integrating routine data collection into clinical decision-making processes and attending to multiple sources of clinically-useful information. In one such model, Daleiden and Chorpita (2005) differentiated four separate evidence bases relevant to clinical care. The first, general services research evidence, includes information systematically mined from the existing empirical literature through research articles and treatment protocols. Relative to some of the other evidence bases, this information source is relatively well developed, but not always accessible or easily integrated into practice due to time, training, and access constraints. Case history evidence is drawn from individualized, case-specific data derived from clinical interactions with clients. It may be organized or presented using a computer-based tracking system or “dashboard” to facilitate data tracking over time for individual clients (e.g., Chorpita et al., 2008). Local aggregate evidence (also referred to as “practice-based” evidence by Daleiden & Chorpita, 2005) uses the case-specific data (i.e., case history evidence) described above, but aggregates this information across cases into larger meaningful units (e.g., a therapist’s, provider agency’s, or region’s entire caseload) for program evaluation and administration purposes (e.g., Higa-McMillan et al., 2011). Finally, causal mechanism evidence refers to a more general and comprehensive understanding of etiological and treatment processes, including tacit knowledge and collective wisdom contained within the intervention team or drawn from theoretical models of therapeutic change. Among the four evidence bases, causal mechanism evidence is arguably the least standardized.

According to Daleiden and Chorpita (2005), because each evidence base has its own limitations, all of them should be integrated to inform treatment planning and clinical decision-making. Furthermore, different information sources may be given higher or lower priority based on the stage of treatment. For instance, therapists or supervisors may prioritize the general services research or local aggregate evidence for choosing an intervention when first starting treatment with a youth or family, until sufficient information about a particular case’s treatment response (i.e., case history evidence) can be obtained. Alternatively, if a supervisor is brought in to consult on an existing or longer-standing treatment case, she may give more priority to the case history evidence than to more distal forms of data, such as the broader treatment outcome literature. In this latter example, if a youth’s case history evidence base demonstrated a positive treatment response, the therapist would be encouraged to continue with the current therapeutic approach, regardless of whether that approach was an evidence-based practice formally identified through the general services evidence base (Daleiden & Chorpita, 2005). On the other hand, if the case history evidence suggested the youth was deteriorating and review of the practice data suggested that the therapist had not tried practices that the general services, local aggregate, and/or causal mechanism evidence would suggest for clients with similar problems, the therapist would be encouraged to implement a practice with research support. Leveraging these evidence bases at varying points during treatment provides opportunities for maximizing data-driven decision making, especially within the larger context of public mental health or school systems where it may be impractical to develop the capacity to provide all youth with an established, “brand-name” evidence-based treatment (Chorpita, Bernstein, & Daleiden, 2011).
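To make this stage-sensitive logic concrete, the sketch below encodes the decision rule just described. It is a minimal illustration, not an implementation of Daleiden and Chorpita’s (2005) actual procedure; the function names, the trend estimate, and the example practices are all hypothetical assumptions.

```python
# Minimal illustration of the stage-sensitive decision rule described above.
# All names, thresholds, and the trend estimate are hypothetical assumptions;
# this is not Daleiden and Chorpita's (2005) actual procedure.
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class CaseHistory:
    """Weekly scores on a progress indicator (higher = better functioning)."""
    scores: list[float] = field(default_factory=list)

    def trend(self) -> float:
        # Crude trend estimate: mean week-to-week change.
        if len(self.scores) < 2:
            return 0.0
        diffs = [b - a for a, b in zip(self.scores, self.scores[1:])]
        return sum(diffs) / len(diffs)


def recommend(case: CaseHistory, tried: set[str], supported: set[str]) -> str:
    if len(case.scores) < 2:
        # Insufficient case history evidence: prioritize general services
        # research or local aggregate evidence for initial planning.
        return "select initial practice from research/local aggregate evidence"
    if case.trend() > 0:
        # Positive response: continue the current approach, regardless of its
        # formal evidence-base status.
        return "continue current approach"
    untried = supported - tried
    if untried:
        # Deterioration plus untried research-supported practices: switch.
        return f"implement research-supported practice: {sorted(untried)[0]}"
    # No data-based lead remains: fall back on causal mechanism evidence.
    return "reformulate case using causal mechanism evidence"


# A deteriorating case with untried supported practices triggers a switch.
print(recommend(CaseHistory([12, 11, 9]), {"psychoeducation"},
                {"exposure", "behavioral activation"}))
```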

Although no comparable model of evidence-based decision-making has been articulated for SBMH specifically, increasing attention is being paid to the use of data in education sector service delivery (Carey & Dimmitt, 2008; Dimmitt, Carey, & Hatch, 2007; Kelly & Lueck, 2011). Nevertheless, difficulties remain surrounding what outcomes are most appropriate to track and how monitoring should be conducted in a given context. Although standardized assessment tools that measure mental health symptoms are an important core component of evidence-based service provision, some authors (e.g., Garland et al., 2013) have pointed out that other types of data may be preferable or equally important in some situations for monitoring treatment progress. Similarly, in their discussion of data-driven decision making, Daleiden and Chorpita (2005) suggested that alternative intervention targets may be particularly common and relevant in certain settings, such as academic targets in the education sector. In support of this point, we describe the relevance of educational data as key outcomes in SBMH services in the following sections.

Data-Driven Decision Making within the SBMH Context

The organizational and individual treatment factors inherent in school settings (e.g., school culture, clinical intervention timelines, daily access to students, opportunities for teaming around a student to support layers of services, billing practices, etc.; Stephan, Davis, Burke, & Weist, 2006) can vary substantially from those in traditional mental health clinics. Nevertheless, the logic and importance of using real-time progress tracking to steer a wide variety of treatment decisions should remain equally applicable. In the education sector, perceptions of the utility of progress monitoring are likely to be enhanced by the collection and use of contextually-relevant outcome data, such as indicators of school success. In pursuit of this goal, the utility of a range of educational outcomes in SBMH service delivery is described below within the evidence bases framework discussed by Daleiden and Chorpita (2005). Table 1 further details the applicability (including advantages and limitations) of different components of the framework to the school context. Although the four evidence bases are first discussed in isolation, it is important to note that their integration is essential to effective clinical decision making.

Table 1.

Overview of Daleiden & Chorpita’s (2005) four evidence bases applied to the promotion of educational outcomes in school-based mental health.

General services research evidence
Definition: Information mined from the existing empirical literature.
Advantages: Draws from generalizable, high-quality knowledge produced through systematic investigations.
Limitations: Research linking mental health interventions to school or academic outcomes is very limited.
When to prioritize: Early in treatment planning or when progress is suboptimal for an ongoing client (provided information relevant to the specific mental health and educational outcomes of interest is available).

Case history evidence
Definition: Case-specific data derived from clinical interactions with clients.
Advantages: Provides the most immediately relevant information about individual client progress in response to intervention. Highly consistent with RtI approaches in schools.
Limitations: Must be developed over time for each individual. Not available for initial intervention planning (unless treatment is being reinitiated). Some educational outcome data may be difficult to obtain.
When to prioritize: For all cases following intervention initiation, to guide decisions about maintaining or altering the selected intervention approach.

Local aggregate evidence
Definition: Case history evidence aggregated into larger units.
Advantages: Generates local knowledge that is likely to be highly applicable to the service providers, recipients, and stakeholders in a given context. Can inform larger policy decisions, resource allocation (e.g., new trainings), or the establishment of client improvement benchmarks.
Limitations: At the organization/agency level, requires significant infrastructure and resources to collect, integrate, manage, and interpret. At the clinician level, requires that a provider has been in practice and collecting data long enough to establish a caseload aggregate.
When to prioritize: Across all phases of intervention (e.g., early in treatment planning to identify effective practices for a similar population; during intervention to determine whether client progress is “on track”; toward the end of treatment to examine whether a client has met termination benchmarks).

Causal mechanism evidence
Definition: General understanding of etiological and treatment processes.
Advantages: Draws from sources of knowledge and theory that may not have been codified in the empirical literature, including those specific to the educational/school context. Can inform interventions even when no data are available.
Limitations: Least standardized of all the evidence bases, especially as related to mental health and educational outcomes. Little guidance exists about its systematic application, which may introduce unwanted bias.
When to prioritize: Across all phases of the intervention process, but may be emphasized when case history evidence is lacking or to guide the search and application of the general services research evidence base.

General Services Research Evidence

According to Daleiden and Chorpita (2005), the services evidence base is responsible for the frequently-discussed “evidence-based services model” of treatment delivery (APA Task Force, 1995), which focuses primarily on the use of interventions with demonstrated efficacy and effectiveness. In the general youth mental health services literature, this evidence base has been extensively developed over more than 40 years of study and over 600 clinical trials (Chorpita et al., 2011). Although multiple studies have engendered confidence in the use of this evidence base by documenting the superior outcomes achieved through the use of evidence-based practices, relative to usual care (e.g., Weisz et al., 2006), much has been written about the limitations of relying solely on the delivery of evidence-based practices to improve client outcomes (Chorpita et al., 2008; Kelley, Bickman, & Norwood, 2010).

Unfortunately, research about the effectiveness of SBMH on educational outcomes is limited, especially relative to the general youth mental health services literature. Notwithstanding promising findings for a small number of intervention programs (e.g., DuPaul, Kern, Gormley, & Volpe, 2011; Kern et al., 2007; Kataoka et al., 2011), review articles have documented that few studies simultaneously evaluate educational and mental health outcomes (Farahmand et al., 2011; Hoagwood et al., 2007). Chorpita and colleagues (PracticeWise, 2013) have developed a comprehensive system for harnessing much of the information included in the services research literature, but a comparable effort has not yet been completed in SBMH with respect to school and academic outcomes.

Providers may still incorporate components of the services evidence base into treatment planning by accessing comprehensive reviews or meta-analyses that have evaluated the impact of programs on educational outcomes. For example, a provider encountering a middle school student who presents with both disruptive behavior problems (aggression and fights at school) and excessive tardies may begin by consulting the review published by Hoagwood and colleagues (2007) in order to determine what types of interventions have empirical support for those problems. This review suggests that a group-based Social Moral Reasoning Development Program (SMRDP; Arbuthnot, 1992) – which targets youth with behavior problems and includes listening and communication skills, discussion of social dilemmas, perspective taking, and social role plays as intervention components – has been found to reduce tardies, improve academic achievement, and reduce disciplinary referrals. Recognizing that it may not be feasible to introduce an entire group protocol to address the problems of a single student, the provider may therefore choose to incorporate SMRDP intervention components into individualized intervention planning.

Case History Evidence

Youth are frequently referred to SBMH because they are experiencing academic problems. In many school settings, some impact on learning must be observed and assessed before a student can receive a referral for services (Bradshaw, Buckley, & Ialongo, 2008). Although these data are regularly used for progress monitoring in the context of academic-only interventions, such as those included in individualized education programs (IEPs), they are generally left out of case history evidence and clinical decision-making in SBMH. In situations where academics are a primary, or even secondary, reason for an individual’s treatment initiation, routine collection and monitoring of these outcomes is essential to inform the services provided.

Little attention has been paid to how school and academic outcomes can be used to inform data-driven decisions about clinical treatment progress on an individual basis. As described by Chorpita and colleagues (2008), building a case-specific evidence base is integral for youth, particularly those in need of more individualized, responsive interventions. This requires that treatment targets are accurately identified and that progress indicators (both behavioral and educational) and treatment components/practices are a good fit for an individual student. For example, an individualized progress indicator for a socially anxious youth may be school attendance. However, how specifically attendance should be measured depends on the student’s presentation. The “number of school days attended per week” may capture progress, given that many socially anxious youth struggle to maintain regular school attendance. For a youth with severe social anxiety who rarely attends a full day of school, however, the metric may need to be adjusted to detect smaller increments of response, such as the “percentage of the school day attended.” In this example, as interventions are delivered (e.g., a process of gradual exposure to the school setting), small successes in habituation could be observed, celebrated, and the intervention plan modified based on the youth’s progress.
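For illustration, the brief sketch below contrasts the two attendance metrics just described. The 6.5-hour school day and the example hours are invented values, not data from any cited study.

```python
# Hypothetical sketch contrasting the two attendance metrics described above.
# The 6.5-hour school day and the example hours are invented for illustration.

def days_attended_per_week(daily_hours):
    """Coarse metric: count days with any attendance in the week."""
    return sum(1 for h in daily_hours if h > 0)

def percent_of_school_day_attended(daily_hours, full_day_hours=6.5):
    """Finer-grained metric for a youth who rarely completes a full day."""
    if not daily_hours:
        return 0.0
    return 100 * sum(daily_hours) / (full_day_hours * len(daily_hours))

# A socially anxious youth attending partial days during gradual exposure:
# the coarse metric is flat (5 days in both weeks), while the finer metric
# registers incremental progress.
week1 = [1.0, 0.5, 2.0, 1.0, 1.5]   # hours attended each day
week2 = [2.5, 2.0, 3.0, 2.0, 3.5]
print(days_attended_per_week(week1), days_attended_per_week(week2))  # 5 5
print(round(percent_of_school_day_attended(week1), 1))               # 18.5
print(round(percent_of_school_day_attended(week2), 1))               # 40.0
```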

Local Aggregate Evidence

Use of educational data at the local aggregate level provides opportunities to demonstrate the value of services delivered as well as to engage in quality improvement initiatives. In SBMH, the ability to communicate the value of programs to key stakeholders is widely considered to be important due to the frequent pressures to justify resource allocation (e.g., funding, space, student and teacher time) to “non-educational” programs (Teich et al., 2007). In this vein, documenting that SBMH services can produce positive educational outcomes has been identified as one strategy to help combat perceptions that SBMH services are disconnected from the primary mission of schools (Prodente, Sander, & Weist, 2002). Depending on which stakeholder is being targeted (e.g., a principal, program director, or superintendent), educational data may be aggregated at different levels of a particular system (e.g., a school, SBMH agency, or school district). For these reasons, routine data collection and interpretation in the context of school-based services is a priority for science and practice agendas intent on forging effective, bidirectional collaborations between researchers, front-line service providers, and other key constituents (Kratochwill et al., 2011).

Quality improvement is also facilitated as the core unit of analysis moves beyond the individual client and data are aggregated at the levels of clinicians, supervisors, or entire service systems. For the purposes of program evaluation and improvement in its youth public mental health system, Hawaii’s Child and Adolescent Mental Health Division (CAMHD) has, within a continuous quality improvement approach, utilized aggregated data, such as client outcomes on the Child and Adolescent Functional Assessment Scale (CAFAS; Hodges, 1998), to drive policy and practice decisions for nearly a decade (Daleiden, Chorpita, Donkervoet, Arensdorf, & Brogan, 2006). Efforts have included the implementation of provider feedback reports and data “parties” (Higa-McMillan et al., 2011) at which outcome data aggregated at the program level within a provider agency (e.g., all youth receiving Therapeutic Foster Care at Agency A; all youth receiving Intensive In-Home Therapy at Agency A; all youth receiving Therapeutic Foster Care at Agency B; etc.) are reviewed. At these meetings, providers and CAMHD staff collaboratively reflect on the data and engage in action planning, such as targeted staff trainings, when data suggest limited client improvement and/or limited use of evidence-based practices. Within SBMH, such collaborative meetings could involve simultaneous program-level review of both mental health and educational outcomes in relation to the stated goals of each program. Although programs will likely vary in the extent to which either outcome category is considered primary, in cases where school and academic outcomes are explicit program objectives, the introduction of additional provider training (e.g., in applying a problem-solving framework to low school engagement) or referrals (e.g., tutoring/homework help) may be indicated.

In situations where sufficient case history evidence has been collected to form a meaningful local aggregate evidence base, SBMH providers interested in addressing educational outcomes may rely more heavily on that evidence than on the limited general services research base for guidance related to initial treatment planning. For instance, it may be determined that a 16-year-old Latina high school student who demonstrates inconsistent homework completion and moderate symptoms of depression should receive a set of interventions that have been found to be locally effective for youth with similar backgrounds and presentations (e.g., behavioral activation and problem solving). Simultaneously, practices for which there is local evidence of ineffectiveness can be de-prioritized. Each of the student’s problems may then be monitored weekly, using standardized and idiographic tools (e.g., a standardized depression measure and teacher-reported homework completion). These outcomes can be compared to local benchmarks to determine incremental progress (i.e., how much change, on average, similar youth demonstrate after two, four, and six weeks of intervention) and to guide decision-making regarding additional therapeutic strategies to employ, while simultaneously creating a case history evidence base for the individual student.
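The following sketch illustrates one way a provider might compare an individual student’s weekly scores to local aggregate benchmarks at two, four, and six weeks. The benchmark values and scores are invented for demonstration; a real program would derive its benchmarks from its own caseload data.

```python
# Illustrative only: comparing one student's progress to local benchmarks.
# Benchmark values are invented; lower scores indicate fewer symptoms.

# Local aggregate evidence: average change from baseline on a standardized
# depression measure among similar local youth after 2, 4, and 6 weeks.
local_benchmarks = {2: -2.0, 4: -4.5, 6: -6.0}

def on_track(baseline, weekly_scores, benchmarks):
    """Flag whether observed change meets the local average at each week."""
    return {week: (weekly_scores[week] - baseline) <= expected
            for week, expected in benchmarks.items()
            if week in weekly_scores}

scores = {2: 16.0, 4: 15.0, 6: 11.0}  # the student's weekly scores
print(on_track(baseline=18.0, weekly_scores=scores,
               benchmarks=local_benchmarks))
# {2: True, 4: False, 6: True} -> the week-4 lag might prompt the provider
# to add a therapeutic strategy while continuing to monitor.
```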

Causal Mechanism Evidence

Given the underdeveloped intersection of the general services evidence base for SBMH and school/academic outcomes, overarching intervention theories and frameworks take on increased importance. As described previously, this type of causal mechanism evidence draws from existing theory to inform clinical decisions. Although traditional theories of psychotherapy and human behavior change remain relevant to SBMH (e.g., Ajzen, 1991), focusing on educational outcomes while providing services in schools may also necessitate the incorporation of additional intervention models drawn from that context. For instance, routine monitoring of school and academic outcomes is highly compatible with the increasingly popular Response to Intervention (RtI) framework used in schools. Indeed, an important part of RtI is an explicit focus on data collection, as well as the use of these data in problem solving and decision-making about student progress and the need to adapt or maintain interventions (Bradley, Danielson, & Doolittle, 2007).

Although the types of information that can be categorized under causal mechanism evidence include well-known theories of learning, they may also include theoretical mechanisms as understood by real-world SBMH practitioners, but not yet articulated in the literature. Within the field of education, Whitehead (2009) has described living educational theories as those that are constructed by individual educators based on their own experiences and perspectives. Given the paucity of established school mental health theory, it may be that the “living theories” of student behavior change held by practitioners will be particularly important, especially initially, to understanding and promoting the link between SBMH services and educational outcomes. Indeed, practitioner construction of “living theories” is well in line with the development of contextualized evidence that emerges from close consideration of the local aggregate and case-specific history evidence bases.

Practitioner incorporation of the causal mechanism evidence base into SBMH service delivery may also include the flexible application of traditional mental health theory for more educational purposes, or the utilization of educational theories in mental health practice. Consider a therapist treating a male student who has been diagnosed with post-traumatic stress disorder (PTSD) and who exhibits crying spells when presented with unexpected information, which negatively impacts his ability to function in the classroom and interferes with learning. His therapist may incorporate knowledge from the causal mechanism evidence base by using trauma theory to inform the psychoeducation she provides to the student’s teacher regarding his reduced tolerance for change or surprises. Furthermore, the therapist may learn that the teacher was originally trained in the theory of direct instruction (Becker & Carnine, 1981), which is based on the notion that students learn best when teachers use explicit lesson plans, provide opportunities to perform skills, and measure progress incrementally. Drawing on the commonalities between this perspective and her own cognitive-behavioral therapy approach, the therapist works with the student and teacher to develop a plan in which the teacher provides a five-minute warning prior to requesting that the student perform skills taught via direct instruction.

Integrating the Four Evidence-Bases in SBMH

Examination of existing intervention strategies currently in use within the school context supports Daleiden and Chorpita’s (2005) constructs and logic. For instance, the evidence integration approach is consistent with the school-wide positive behavioral interventions and supports model (SWPBIS; Sugai & Horner, 2006), which emphasizes the measurement of mental health and educational outcomes and the use of data in decision making (case history evidence and local aggregate evidence), the application of practices with research support (general services research evidence), and systems to support implementation. Consistent with RtI, SWPBIS distinguishes among three tiers of intervention, based on the severity of a youth’s presentation and the intensity of services received (Tier I [lowest intensity] through Tier III [highest intensity]). Within this frame, Daleiden and Chorpita’s model is applicable through its focus on identifying and tracking “at-risk” (i.e., Tier II) youth in addition to monitoring those already identified as in need of intensive, individualized services (Tier III).

Another data-based program, Check and Connect (Anderson, Christenson, Sinclair, & Lehr, 2004), also tracks key school and academic data (e.g., attendance, grades, suspensions) to identify at-risk youth and apply targeted, individualized behavioral interventions to decrease the risk of school failure. School engagement is continuously monitored to inform intervention strategies. Similarly, the Check-In/Check-Out program (CICO; Crone, Horner, & Hawken, 2004) is a preventive strategy for youth identified as needing more targeted interventions to reduce problem behaviors in school. In CICO, progress monitoring is based on specific behavioral goals, and youth earn points based on the behavioral feedback they receive from adults throughout the day. Each youth’s progress monitoring plan is combined with targeted, evidence-based interventions, such as social skills groups, behavioral contracting, and reading groups (general services research evidence), and youth have the opportunity to receive positive adult feedback throughout the day based on their progress, at minimum during the “check in” in the morning and the “check out” in the afternoon (case history evidence). CICO data are typically analyzed for all youth involved in the program and can be compared across multiple time-points and in relation to school-wide data such as attendance and office discipline referrals (local aggregate evidence). Despite promising examples such as these, educational data remain underutilized, largely because clear monitoring and consultation protocols for collecting and using data to inform practice decisions are lacking and because the infrastructure for supporting the collection and use of student data is underdeveloped (Weist & Paternite, 2006).
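As a minimal sketch of the kind of monitoring CICO involves, the code below tallies daily point percentages against a behavioral goal. The point values and the 80% goal are hypothetical assumptions, not details drawn from Crone, Horner, and Hawken (2004).

```python
# Hypothetical CICO-style tally; point values and the 80% goal are invented.

DAILY_GOAL = 0.80  # proportion of available points the youth aims to earn

def daily_percent(points_earned, points_possible):
    return points_earned / points_possible if points_possible else 0.0

# Case history evidence: one youth's daily point cards across a school week,
# recorded as (points earned, points possible) from adult behavioral feedback.
week = [(22, 30), (25, 30), (18, 30), (27, 30), (26, 30)]

met_goal = [daily_percent(earned, possible) >= DAILY_GOAL
            for earned, possible in week]
print(f"Days meeting goal: {sum(met_goal)} of {len(week)}")  # 3 of 5

# Local aggregate evidence: pooling these daily percentages across all CICO
# participants would allow comparison against school-wide data over time.
```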

Monitoring and Consultation Protocols for Educational Data

An integrated clinical decision-making process – including initial assessment; selection of appropriate interventions; identification of valid, reliable, and sensitive indices of progress; collection and tracking of data; and utilization of those data to inform planning, practices, and adaptations – requires significant training and support. Even within the more general mental health literature, few well-defined protocols exist to guide practitioners through the process of identifying specific indicators of treatment progress, maximizing the ease and consistency of data collection, and utilizing the data to make clinical decisions. Furthermore, there are a number of contextual issues specific to school-based settings that may impede or facilitate progress monitoring. Stephan and colleagues (2006) described this as a “dance” that involves balancing clinical needs, requirements, and evidence-based practice implementation with the contextual and organizational factors unique to school settings, a process that requires substantial and consistent supervision and support. Based on data indicating that professional behavior change rarely occurs without extended consultation and coaching (Beidas & Kendall, 2010; Fixsen et al., 2005; Herschell et al., 2010; Lyon, Stirman, Kerns, & Bruns, 2011), uptake of routine school and academic data monitoring is unlikely to occur without carefully structured, ongoing support for school-based providers, as well as for other school staff who may be integral to collecting and maintaining important indicators of progress (e.g., teachers tracking student in-class engagement, front office staff responsible for maintaining databases of school absences).

In their review of the literature on youth mental health, Garland and colleagues (2013) identified a number of areas in need of improvement when it comes to translating knowledge into action. Specifically, they determined that there is “much room at the individual provider and client/family levels for training on the utility and value of outcome monitoring” (p. 16). In doing so, they highlighted recent research by Bickman and colleagues (2011), which found that clinical outcomes for youth were better when clinicians had weekly access to assessment feedback on youth symptoms and functioning and had received training in the integration of assessment-based feedback into practice. Unfortunately, few of the existing models for supporting outcome monitoring in practice were designed specifically for SBMH practitioners, and no guidance is available to inform this process as it relates to educational data. Indeed, the data-monitoring models that have been tested in schools have focused largely on treatment integrity within the context of specific treatment packages (e.g., Brown & Rahn-Blakeslee, 2009), limiting their applicability to the diverse range of youth who typically make up SBMH practitioner caseloads and the school-related problems with which those youth frequently present. There is great opportunity, therefore, for the field of SBMH to make significant advancements in methods of supporting providers to utilize educational outcomes during intervention in a manner consistent with the four evidence bases. In doing so, SBMH is also poised to become a leader within the larger mental health services field by establishing new frameworks to guide practitioners in the incorporation of key functional indicators into treatment across a range of service settings.

Infrastructure for Data Collection and Use

One important barrier to making explicit use of educational data in SBMH is an underdeveloped infrastructure to support the collection, organization, and use of relevant information. School-based practitioners, teachers, and other school staff have competing demands throughout the school day, which may reduce their ability and willingness to participate in quality improvement activities such as student mental health data tracking (Lyon et al., in press). In addition, although school personnel may be invested in collecting data to fulfill compliance regulations, data collection strategies are unlikely to be sustained without a clear awareness of their impact on quality of care (Kelly, 2011). Further, research suggests that large systems providing health and/or mental healthcare, conceivably including schools, sometimes adopt infrastructure (e.g., IT systems) without a priori assessment of task-fit and readiness for change (Zheng et al., 2013). Simple computer systems or other electronic infrastructure can streamline data collection processes and provide meaningful, real-time feedback to reinforce their use by key individuals, but without manageable data collection and management infrastructure to support tracking over time, valuable information is likely to be lost. Infrastructure, as it relates to the use of educational data in mental health interventions, may refer to (a) the construction or introduction of new infrastructure explicitly designed for that purpose or (b) the repurposing of existing non-clinical infrastructure (e.g., school district data systems) to support clinical objectives.

With respect to the construction or introduction of new infrastructure, measurement feedback systems (MFS; Bickman, 2008) are an increasingly popular type of computerized support in which feedback about client progress is delivered to mental health providers to assist in clinical decision making. In addition to outcomes, many MFS also allow regular measurement of treatment processes (e.g., practices used, therapeutic alliance) to be tracked. Bickman, Kelley, and Athay (2012) recently presented a specific measurement feedback system, Contextualized Feedback Systems, which allows for the collection of a variety of progress measures and the presentation of the data for immediate consumption by practitioners. Similarly, Higa-McMillan and colleagues (2011) described a different contextualized measurement feedback system in which the emphasis is on examining practices and outcomes in aggregate form (via electronically generated visual displays), while utilizing these data to inform successes, training needs, and goal setting for agencies. Beyond their direct impact on client outcomes, systems such as these are intended to facilitate staff organization, accountability, and communication. Unfortunately, no such system has been developed for specific use in school settings, and no existing system includes the explicit ability to incorporate educational indicators or specific intervention practices intended to improve school or academic functioning. Given that the number of MFS available for use in the delivery of mental health services has increased rapidly over the past 10 years, the “ground-up” development of novel systems for SBMH is likely to be less important and less cost-effective than the selection and adaptation of existing systems for use in the education sector.
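Because, as noted above, no existing MFS incorporates educational indicators, the sketch below is purely hypothetical: it shows what a feedback record pairing a mental health measure with educational indicators and delivered practices might look like. Every field name and the review rule are assumptions, not features of any system cited here.

```python
# Purely hypothetical sketch of an MFS record for SBMH; no cited system
# exposes this schema. Field names and the review rule are assumptions.
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class WeeklyFeedbackRecord:
    student_id: str
    week: int
    symptom_score: float        # e.g., a standardized symptom measure
    attendance_pct: float       # educational indicator: % of school day
    homework_completion: float  # educational indicator: proportion completed
    practices: list[str]        # practices delivered that week


def flag_for_review(records: list[WeeklyFeedbackRecord]) -> bool:
    """Simple feedback rule: flag a case when neither the symptom score nor
    either educational indicator has improved over the last three weeks."""
    if len(records) < 3:
        return False
    first, last = records[-3], records[-1]
    improved = (last.symptom_score < first.symptom_score
                or last.attendance_pct > first.attendance_pct
                or last.homework_completion > first.homework_completion)
    return not improved
```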

Repurposing existing school-based infrastructure may be another efficient and cost-effective method of supporting progress and practice monitoring within SBMH. One particularly appealing feature is that existing systems are already designed for use in schools to track a variety of important educational outcomes. For instance, district data systems are often intended to provide parents (and older students) with information about academic performance (e.g., homework completion, grades) and school-specific behavioral functioning (e.g., attendance, discipline) in order to bridge the school and family microsystems. The design of these systems as communication tools also supports a fundamental principle of progress and practice monitoring in SBMH: the importance of explicitly discussing progress monitoring information with clients. In an example of existing infrastructure, the School-Wide Information System (SWIS; May et al., 2003) is frequently used for tracking office discipline referrals (ODRs) within a SWPBIS model. ODR data are then used to make data-driven decisions about behavior management. Although SWIS is frequently used to examine aggregate data, individual youth ODR profiles can also be accessed and used to provide direct feedback to students or parents. It may be more feasible and cost-effective to repurpose a program such as SWIS for use with other progress and/or practice data in the context of a SBMH program than to develop another standalone program, which could be viewed as an additional burden on school personnel. Modifiable billing management systems that could allow for tracking progress and practices in a manner that meets auditing requirements, incorporates educational data, and supports contemporary demands for “accountability” represent an additional avenue for infrastructure repurposing.

A final consideration with respect to the use of infrastructure to track educational data, whether new or repurposed, relates to local and federal information sharing policies. For example, the information collected during healthcare interventions is generally subject to the Health Insurance Portability and Accountability Act (HIPAA), whereas the exchange of educational data is governed by the Family Educational Rights and Privacy Act (FERPA). Although both laws are intended to protect the confidentiality of individual information and avoid inappropriate or unauthorized disclosures, their approaches are not always compatible (Bergen, 2004). Furthermore, both policies may facilitate or inhibit the use of new or existing infrastructure in SBMH practice and will need to be addressed as these issues evolve. For the large proportion of schools that contract with local, external agencies to provide school-based services (more than 50%; Foster et al., 2005), explicit data-sharing agreements may need to be a component of those contracts to satisfy FERPA regulations. To this end, service recipient consent forms can be updated to reflect this situation in a manner consistent with both laws (Lever, Andrews, & Weist, 2008).

Summary

SBMH services play an essential role in helping students achieve both positive mental health and educational outcomes. School and academic indicators can vary widely in focus, ranging from proximal to distal with regard to their anticipated relationship to different types of interventions. Unfortunately, empirical support for interventions that impact mental health and educational outcomes, or that incorporate various forms of school and academic information into data-driven decision making, is limited. Nevertheless, systematic reviews provide reason for some optimism surrounding the potential impact of mental health interventions on educational indicators. The famous maxim “what gets measured gets done” (Behn, 2003) suggests that a more explicit focus on school and academic data in the context of SBMH services may further increase these effects. Decision-making models from mental health, such as the one put forth by Daleiden and Chorpita (2005) for identifying and leveraging various forms of evidence, can be readily applied to SBMH. In this paper, we suggested various ways in which educational outcomes can be utilized in alignment with Daleiden and Chorpita’s (2005) model to advance the goal of shifting them from distal to more proximal outcomes in SBMH service delivery. Nevertheless, despite the model’s potential utility, numerous barriers to its implementation remain, including the underdeveloped literature on monitoring and consultation protocols for educational data and the lack of infrastructure for collecting and using such data.

As suggested earlier, the likelihood that data-driven decision making approaches will be successfully implemented in routine service delivery settings is enhanced by their high degree of compatibility with existing educational policies and values (e.g., RtI). Indeed, the fit between new practices and different levels of the destination context is a common component of many contemporary implementation models (e.g., Aarons, Hurlburt, & Horwitz, 2011). Moreover, decision making approaches that integrate information across client functional domains are only likely to increase in relevance as healthcare reform (Affordable Care Act of 2010) moves delivery systems toward greater integration of mental health with other types of services, thus continuing to deemphasize the role of the specialty mental health sector (Hoagwood, 2013). As these changes occur, school mental health has an opportunity to take on a natural leadership role in advancing models that support this type of service integration, rooted in the use of data that span multiple service types and functional domains.

Despite high current (and future) compatibility with the school context, data-driven decision making may require significant time and resources to implement and some components of the model may be more feasibly or rapidly adopted than others. Fortunately, even in the absence of well-developed consultation protocols or technical infrastructure to support data monitoring and integration, there are multiple pathways through which SBMH programs can pursue a more comprehensive approach to integrating educational outcomes into data-driven decision making. For instance, focusing on the generation of individual case history evidence for all new or current students receiving services may represent a feasible starting point most likely to carry immediate quality improvement benefits. In sum, considerable opportunities exist to advance routinized, data-driven decision making models in SBMH. Accomplishing this goal carries great promise for supporting the integration of mental health and education agendas in a manner likely to enhance the long-term survival of education sector mental health services.

Acknowledgments

This publication was made possible, in part, by funding from grant number K08 MH095939, awarded to the first author from the National Institute of Mental Health (NIMH).

Dr. Lyon is an investigator with the Implementation Research Institute (IRI) at the George Warren Brown School of Social Work, Washington University in St. Louis, through an award from the National Institute of Mental Health (R25 MH080916) and the Department of Veterans Affairs, Health Services Research & Development Service, Quality Enhancement Research Initiative (QUERI).

References

  1. Aarons G, Hurlburt M, Horwitz S. Advancing a conceptual model of evidence-based practice implementation in child welfare. Administration and Policy in Mental Health. 2011;38:4–23. doi: 10.1007/s10488-010-0327-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  2. Adelman HS, Taylor L. Promoting mental health in schools in the midst of school reform. Journal of School Health. 2000;70:171–178. doi: 10.1111/j.1746-1561.2000.tb06467.x. [DOI] [PubMed] [Google Scholar]
  3. Ajzen I. The theory of planned behavior. Organizational Behavior and Human Decision Processes. 1991;50:179–211. [Google Scholar]
  4. American Psychological Association Task Force on Promotion and Dissemination of Psychological Procedures, Division of Clinical Psychology. Training in and dissemination of empirically-validated psychological treatments. The Clinical Psychologist. 1995;48:3–23. [Google Scholar]
  5. Anderson AR, Christenson SL, Sinclair MF, Lehr CA. Check & Connect: The importance of relationships for promoting engagement with school. Journal of School Psychology. 2004;42:95–113. [Google Scholar]
  6. Arbuthnot J. Sociomoral reasoning in behavior-disordered adolescents: Cognitive and behavioral change. In: McCord J, Tremblay R, editors. Preventing antisocial behavior. New York: Guilford Press; 1992. pp. 283–310. [Google Scholar]
  7. Atkins MS, Hoagwood KE, Kutash K, Seidman E. Toward the integration of education and mental health in schools. Administration and Policy in Mental Health and Mental Health Services Research. 2010;37:40–47. doi: 10.1007/s10488-010-0299-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Becker WC, Carnine DW. Direct Instruction: A behavior theory model for comprehensive educational intervention with the disadvantaged. In: Bijou SW, Ruiz R, editors. Behavior modification: Contributions to education. Hillsdale, NJ: Erlbaum; 1981. pp. 145–210. [Google Scholar]
  9. Behn RD. Why measure performance? Different purposes require different measures. Public Administration Review. 2003;63:586–606. [Google Scholar]
  10. Beidas RS, Kendall PC. Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice. 2010;17:1–30. doi: 10.1111/j.1468-2850.2009.01187.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Bergen MD. HIPAA-FERPA revisited. Journal of School Nursing. 2004;20:107–112. doi: 10.1177/10598405040200020901. [DOI] [PubMed] [Google Scholar]
  12. Bickman L. A measurement feedback system (MFS) is necessary to improve mental health outcomes. Journal of the American Academy of Child and Adolescent Psychiatry. 2008;47:1114. doi: 10.1097/CHI.0b013e3181825af8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  13. Bickman L, Kelley SD, Athay M. The technology of measurement feedback systems. Couple and Family Psychology: Research and Practice. 2012;1:274–284. doi: 10.1037/a0031022. [DOI] [PMC free article] [PubMed] [Google Scholar]
  14. Bickman L, Kelley SD, Breda C, de Andrade AR, Riemer M. Effects of routine feedback to clinicians on mental health outcomes of youths: results of a randomized trial. Psychiatric Services. 2011;62:1423–1429. doi: 10.1176/appi.ps.002052011. [DOI] [PubMed] [Google Scholar]
  15. Bradley R, Danielson L, Doolittle J. Responsiveness to intervention: 1997 to 2007. Teaching Exceptional Children. 2007;39:8–12. [Google Scholar]
  16. Bradshaw CP, Buckley JA, Ialongo NS. School-based service utilization among urban children with early onset educational and mental health problems: The squeaky wheel phenomenon. School Psychology Quarterly. 2008;23:169–186. [Google Scholar]
  17. Brown S, Rahn-Blakeslee A. Training school-based practitioners to collect intervention integrity data. School Mental Health. 2009;1:143–153. [Google Scholar]
  18. Burns BJ, Costello EJ, Angold A, Tweed D, Stangl D, Farmer EM, Erkanli A. Children's mental health service use across service sectors. Health Affairs. 1995;14:147–159. doi: 10.1377/hlthaff.14.3.147. [DOI] [PubMed] [Google Scholar]
19. Carey J, Dimmitt C. A model for evidence-based elementary school counseling: Using school data, research, and evaluation to enhance practice. The Elementary School Journal. 2008;108:422–430.
20. Chorpita BF, Bernstein AD, Daleiden EL. Empirically guided coordination of multiple evidence-based treatments: An illustration of relevance mapping in children's mental health services. Journal of Consulting and Clinical Psychology. 2011;79:470–480. doi: 10.1037/a0023982.
21. Chorpita BF, Bernstein A, Daleiden EL, Research Network on Youth Mental Health. Driving with roadmaps and dashboards: Using information resources to structure the decision models in service organizations. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35:114–123. doi: 10.1007/s10488-007-0151-x.
22. Chorpita BF, Daleiden EL, Ebesutani C, Young J, Becker KD, Nakamura BJ, Starace N. Evidence-based treatments for children and adolescents: An updated review of indicators of efficacy and effectiveness. Clinical Psychology: Science and Practice. 2011;18:153–171.
23. Crone DA, Horner RH, Hawken LS. Responding to problem behavior in schools: The behavior education program. New York: Guilford Press; 2004.
24. Daleiden E, Chorpita BF. From data to wisdom: Quality improvement strategies supporting large-scale implementation of evidence-based services. Child and Adolescent Psychiatric Clinics of North America. 2005;14:329–349. doi: 10.1016/j.chc.2004.11.002.
25. Daleiden EL, Chorpita BF, Donkervoet CM, Arensdorf AA, Brogan M. Getting better at getting them better: Health outcomes and evidence-based practice within a system of care. Journal of the American Academy of Child and Adolescent Psychiatry. 2006;45:749–756. doi: 10.1097/01.chi.0000215154.07142.63.
26. Dimmitt C, Carey J, Hatch T. Evidence-based school counseling: Making a difference with data-driven practices. Thousand Oaks, CA: Corwin; 2007.
27. DuPaul GJ, Kern L, Gormley MJ, Volpe RJ. Early intervention for young children with ADHD: Academic outcomes for responders to behavioral treatment. School Mental Health. 2011;3:117–126.
28. Farahmand FK, Grant KE, Polo AJ, Duffy SN. School-based mental health and behavioral programs for low-income, urban youth: A systematic and meta-analytic review. Clinical Psychology: Science and Practice. 2011;18:372–390.
29. Farmer EM, Burns BJ, Phillips SD, Angold A, Costello EJ. Pathways into and through mental health services for children and adolescents. Psychiatric Services. 2003;54:60–66. doi: 10.1176/appi.ps.54.1.60.
30. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005.
31. Foster S, Rollefson M, Doksum T, Noonan D, Robinson G, Teich J. School mental health services in the United States, 2002–2003 (DHHS Pub. No. SMA 05–4068). Rockville, MD: Substance Abuse and Mental Health Services Administration, Center for Mental Health Services; 2005.
32. Franklin C, Kim JS, Tripodi SJ. A meta-analysis of published school social work practice studies 1980–2007. Research on Social Work Practice. 2009;19:667–677.
33. Garland AF, Haine-Schlagel R, Brookman-Frazee L, Baker-Ericzen M, Trask E, Fawley-King K. Improving community-based mental health care for children: Translating knowledge into action. Administration and Policy in Mental Health and Mental Health Services Research. 2013;40:6–22. doi: 10.1007/s10488-012-0450-8.
34. Garland AF, Kruse M, Aarons G. Clinicians and outcome measurement: What’s the use? Journal of Behavioral Health Services & Research. 2003;30:393–405. doi: 10.1007/BF02287427.
35. Halford WK, Hayes S, Christensen A, Lambert M, Baucom DH, Atkins DC. Toward making progress feedback an effective common factor in couple therapy. Behavior Therapy. 2012;43:49–60. doi: 10.1016/j.beth.2011.03.005.
36. Harmon SC, Lambert MJ, Smart DM, Hawkins E, Nielsen SL, Slade K, Lutz W. Enhancing outcome for potential treatment failures: Therapist–client feedback and clinical support tools. Psychotherapy Research. 2007;17:379–392.
37. Hatfield D, McCullough L, Frantz SH, Krieger K. Do we know when our clients get worse? An investigation of therapists' ability to detect negative client change. Clinical Psychology & Psychotherapy. 2010;17:25–32. doi: 10.1002/cpp.656.
38. Hatfield DR, Ogles BM. The use of outcome measures by psychologists in clinical practice. Professional Psychology: Research and Practice. 2004;35:485–491.
39. Herschell AD, Kolko DJ, Baumann BL, Davis AC. The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review. 2010;30:448–466. doi: 10.1016/j.cpr.2010.02.005.
40. Higa-McMillan CK, Kimhan Powell CK, Daleiden EL, Mueller CW. Pursuing an evidence-based culture through contextualized feedback: Aligning youth outcomes and provider practices. Professional Psychology: Research and Practice. 2011;42:137–144.
41. Hoagwood KE. Don’t mourn: Organize. Reviving mental health services research for healthcare quality improvement. Clinical Psychology: Science and Practice. 2013;20:120–126.
42. Hoagwood KE, Olin SS, Kerker BD, Kratochwill TR, Crowe M, Saka N. Empirically based school interventions targeted at academic and mental health functioning. Journal of Emotional and Behavioral Disorders. 2007;15:66–92.
43. Hodges K. Child and Adolescent Functional Assessment Scale (CAFAS). Ann Arbor, MI: Functional Assessment Systems; 1998.
44. Kataoka S, Jaycox LH, Wong M, Nadeem E, Langley A, Tang L, Stein BD. Effects on school outcomes in low-income minority youth: Preliminary findings from a community-partnered study of a school trauma intervention. Ethnicity & Disease. 2011;21(S1):71–77.
45. Kelley SD, Bickman L, Norwood ED. Evidence-based treatments and common factors in youth psychotherapy. In: Duncan BL, Miller SD, Wampold BE, Hubble MA, editors. The heart and soul of change: Delivering what works in therapy. 2nd ed. Washington, DC: American Psychological Association; 2010. pp. 325–355.
46. Kelly MS. Data-driven decision making in school-based mental health: (How) is it possible? Advances in School Mental Health Promotion. 2011;4:2–4.
47. Kelly MS, Lueck C. Adopting a data-driven public health framework in schools: Results from a multi-disciplinary survey on school-based mental health practice. Advances in School Mental Health Promotion. 2011;4:5–12.
48. Kern L, DuPaul GJ, Volpe RJ, Sokol NG, Lutz GJ, Arbolino LA, VanBrakle JD. Multi-setting assessment-based intervention for young children at risk for attention deficit hyperactivity disorder: Initial effects on academic and behavioral functioning. School Psychology Review. 2007;36:237–255.
49. Kratochwill T, Hoagwood K, Kazak A, Weisz J, Hood K, Vargas L, Banez G. Practice-based evidence for children and adolescents: Advancing the research agenda in schools. School Psychology Review. 2012;41:215–235.
50. Kutash K, Duchnowski AJ, Lynn N. School-based mental health: An empirical guide for decision-makers. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, Research & Training Center for Children's Mental Health; 2006.
51. Lambert MC, Schmitt N, Samms-Vaughan ME, An JS, Fairclough M, Nutter CA. Is it prudent to administer all items for each Child Behavior Checklist cross-informant syndrome? Evaluating the psychometric properties of the Youth Self-Report dimensions with confirmatory factor analysis and item response theory. Psychological Assessment. 2003;15:550–568. doi: 10.1037/1040-3590.15.4.550.
52. Lever N, Andrews C, Weist MD. School mental health and HIPAA [White paper]. 2008. Retrieved from http://csmh.umaryland.edu
53. Lyon AR, Charlesworth-Attie S, Vander Stoep A, McCauley E. Modular psychotherapy for youth with internalizing problems: Implementation with therapists in school-based health centers. School Psychology Review. 2011;40:569–581.
54. Lyon AR, Ludwig K, Romano E, Leonard S, Vander Stoep A, McCauley E. "If it's worth my time, I will make the time": School-based providers' decision-making about participating in an evidence-based psychotherapy consultation program. Administration and Policy in Mental Health and Mental Health Services Research. In press. doi: 10.1007/s10488-013-0494-4.
55. Lyon AR, Stirman SW, Kerns SEU, Bruns EJ. Developing the mental health workforce: Review and application of training strategies from multiple disciplines. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38:238–253. doi: 10.1007/s10488-010-0331-y.
56. May S, Ard W III, Todd AW, Horner RH, Glasgow A, Sugai G. Schoolwide information system. Eugene, OR: Educational and Community Supports, University of Oregon; 2003.
57. Palmiter D. A survey of the assessment practices of child and adolescent clinicians. American Journal of Orthopsychiatry. 2004;74:122–128. doi: 10.1037/0002-9432.74.2.122.
58. Patient Protection and Affordable Care Act of 2010, Pub. L. No. 111-148, § 6301, 124 Stat. 727 (2010).
59. PracticeWise, LLC. PracticeWise Evidence-Based Services Database. 2013. Retrieved from http://www.practicewise.com
60. President’s New Freedom Commission on Mental Health. Achieving the promise: Transforming mental health care in America: Final report (DHHS Publication No. SMA-03–3832). Rockville, MD: Author; 2003.
61. Prodente CA, Sander MA, Weist M. Furthering support for expanded school mental health programs. Children’s Services: Social Policy, Research, and Practice. 2002;5:173–188.
62. Roeser R, Eccles J, Freedman-Doan C. Academic functioning and mental health in adolescence: Patterns, progressions, and routes from childhood. Journal of Adolescent Research. 1999;14:135–174.
63. Schweinhart LJ, Montie J, Xiang Z, Barnett WS, Belfield CR, Nores M. Lifetime effects: The High/Scope Perry Preschool study through age 40. Ypsilanti, MI: High/Scope Press; 2005.
64. Shimokawa K, Lambert M, Smart D. Enhancing treatment outcomes of patients at risk of treatment failure: Meta-analytic and mega-analytic review of a psychotherapy quality assurance system. Journal of Consulting and Clinical Psychology. 2010;78:298–311. doi: 10.1037/a0019247.
65. Slade K, Lambert M, Harmon C, Smart D, Bailey R. Improving psychotherapy outcome: The use of immediate electronic feedback and revised clinical support tools. Clinical Psychology & Psychotherapy. 2008;15:287–303. doi: 10.1002/cpp.594.
66. Stephan S, Davis E, Burke PC, Weist M. Supervision in school mental health. In: Helping others help children: Clinical supervision of child psychotherapy. Washington, DC: American Psychological Association; 2006. pp. 209–222.
67. Stormshak EA, Connell A, Dishion T. An adaptive approach to family-centered intervention in schools: Linking intervention engagement to academic outcomes in middle and high school. Prevention Science. 2009;10:221–235. doi: 10.1007/s11121-009-0131-3.
68. Substance Abuse and Mental Health Services Administration. Partners for Change Outcome Management System (PCOMS): International Center for Clinical Excellence. 2012 Jan. Retrieved from the National Registry of Evidence-based Programs and Practices Web site: http://www.nrepp.samhsa.gov/ViewIntervention.aspx?id=249
69. Sugai G, Horner RH. A promising approach for expanding and sustaining school-wide positive behavior support. School Psychology Review. 2006;35:245–259.
70. Teich JL, Robinson G, Weist MD. What kinds of mental health services do public schools in the United States provide? Advances in School Mental Health Promotion. 2007;1:13–22.
71. Weist MD. Mental health services in schools: Expanding opportunities. In: Ghuman HS, Sarles RM, editors. Handbook of child and adolescent outpatient, day treatment and community psychiatry. Philadelphia, PA: Taylor & Francis; 1998. pp. 347–358.
72. Weist M, Paternite C. Building an interconnected policy-training-practice-research agenda to advance school mental health. Education & Treatment of Children. 2006;29:173–196.
73. Weisz JR, Chorpita BF, Palinkas LA, Schoenwald SK, Miranda J, Bearman SK, Gibbons RD. Testing standard and modular designs for psychotherapy treating depression, anxiety, and conduct problems in youth: A randomized effectiveness trial. Archives of General Psychiatry. 2012;69:274–282. doi: 10.1001/archgenpsychiatry.2011.147.
74. Weisz JR, Jensen-Doss A, Hawley KM. Evidence-based youth psychotherapies versus usual clinical care: A meta-analysis of direct comparisons. American Psychologist. 2006;61:671–689. doi: 10.1037/0003-066X.61.7.671.
75. Whipple JL, Lambert MJ, Vermeersch DA, Smart DW, Nielsen SL, Hawkins EJ. Improving the effects of psychotherapy: The use of early identification of treatment failure and problem-solving strategies in routine practice. Journal of Counseling Psychology. 2003;50:59–68.
76. Whitehead J. Self-study, living educational theories, and the generation of educational knowledge. Studying Teacher Education. 2009;5:107–111.
77. Young J, Daleiden E, Chorpita B, Schiffman J, Mueller C. Assessing stability between treatment planning documents in a system of care. Administration & Policy in Mental Health. 2007;34:530–539. doi: 10.1007/s10488-007-0137-8.
78. Zahner G, Pawelkiewicz W, DeFrancesco J, Adnopoz J. Children’s mental health service needs and utilization patterns in an urban community: An epidemiological assessment. Journal of the American Academy of Child & Adolescent Psychiatry. 1992;31:951–960. doi: 10.1097/00004583-199209000-00025.
79. Zheng K, McGrath D, Hamilton A, Tanner C, White M, Pohl J. A case study in ambulatory practices. Journal of Decision Systems. 2013;18:117–140.
