Abstract
Background
Efficacious psychiatric treatments are not consistently deployed in community practice, and clinical outcomes are attenuated compared with those achieved in clinical trials. A major focus for mental health services research is to develop effective and cost-effective strategies that increase the use of evidence-based assessment, prevention, and treatment approaches in community settings.
Objective
The goal of this program of research is to apply insights from behavioral economics and participatory design to advance the science and practice of implementing evidence-based practice (EBP) for individuals with psychiatric disorders across the life span.
Methods
Project 1 (Assisting Depressed Adults in Primary care Treatment [ADAPT]) is patient-focused and leverages decision-making heuristics to compare ways to incentivize adherence to antidepressant medications in the first 6 weeks of treatment among adults newly diagnosed with depression. Project 2 (App for Strengthening Services In Specialized Therapeutic Support [ASSISTS]) is provider-focused and utilizes normative pressure and social status to increase data collection among community mental health workers treating children with autism. Project 3 (Motivating Outpatient Therapists to Implement: Valuing a Team Effort [MOTIVATE]) explores how participatory design can be used to design organizational-level implementation strategies to increase clinician use of EBPs. The projects are supported by a Methods Core that provides expertise in implementation science, behavioral economics, participatory design, measurement, and associated statistical approaches.
Results
Enrollment for project ADAPT started in 2018; results are expected in 2020. Enrollment for project ASSISTS will begin in 2019; results are expected in 2021. Enrollment for project MOTIVATE started in 2018; results are expected in 2019. Data collection had begun for ADAPT and MOTIVATE when this protocol was submitted.
Conclusions
This research will advance the science of implementation through efforts to improve implementation strategy design, measurement, and statistical methods. First, we will test and refine approaches to collaboratively design implementation strategies with stakeholders (eg, discrete choice experiments and innovation tournaments). Second, we will refine the measurement of mechanisms related to heuristics used in decision making. Third, we will develop new ways to test mechanisms in multilevel implementation trials. This trifecta, coupled with findings from our 3 exploratory projects, will lead to improvements in our knowledge of what causes successful implementation, what variables moderate and mediate the effects of those causal factors, and how best to leverage this knowledge to increase the quality of care for people with psychiatric disorders.
Trial Registration
ClinicalTrials.gov NCT03441399; https://www.clinicaltrials.gov/ct2/show/NCT03441399 (Archived by WebCite at http://www.webcitation.org/74dRbonBD)
International Registered Report Identifier (IRRID)
DERR1-10.2196/12121
Keywords: implementation science, behavioral economics, mental health
Introduction
Background
Worldwide, psychiatric disorders account for more years lived with disability than any other category of disease [1]. The risk of premature mortality of people with severe psychiatric disorders is elevated [2], and the annual burden to the US economy is approximately half a trillion dollars, less than half of which is due to the cost of treatment [3]. Efficacious treatments are not consistently deployed in community practice, and clinical outcomes are attenuated compared with those achieved in clinical trials [4-6]. A major frontier for mental health services research is to develop effective and cost-effective strategies that increase the use of evidence-based assessment, prevention, and treatment approaches in community settings [7]. Although the field of implementation science has offered many new frameworks that identify factors associated with the use of evidence-based practice (EBP) in health and mental health care [8,9], there is still much potential to be realized in developing and testing new approaches that more successfully increase the use of EBP. The goal of our Advanced Laboratories for Accelerating the Reach and Impact of Treatments for Youth and Adults with Mental Illness (ALACRITY) Center is to apply behavioral economics and participatory design to accelerate the reach and impact of treatments for individuals with psychiatric illness across the life span.
Gaps in the Field of Implementation Science
On average, it takes 17 years for 14% of research to make its way into practice, and the majority of research findings are never deployed in the community [10]. This finding has galvanized the development of implementation science, a rapidly evolving discipline that focuses on the scientific study of methods to increase the adoption, implementation, and sustainment of EBP [11]. Implementation strategies are the interventions of implementation science. Early implementation research tested strategies that primarily involved training clinicians in EBP, based on the assumption that clinicians did not use EBPs because they lacked knowledge and skills. Findings from a number of randomized controlled trials suggested that training improved clinicians’ knowledge of and attitudes toward EBP but did not lead to practice change [12-14]. This body of research highlighted contextual factors, such as clinician motivation and organizational culture, typically considered nuisance factors in efficacy trials, as important and understudied variables in their own right [15,16].
Implementation research in both health and mental health care began prioritizing the identification of determinants at clinician, organization, and system levels that affect implementation success or failure. Several heuristic implementation frameworks that attempted to capture constructs at each of these levels supported these studies [8,9,17]. Broadly, these studies used either qualitative methods to elucidate barriers to or facilitators of implementation [18-21] or quantitative methods to test associations between determinants and implementation outcomes such as fidelity to EBPs [22-26]. Three major gaps have emerged from this research that our center aims to address: (1) implementation science has not leveraged the rich literature from behavioral economics, (2) implementation research encourages stakeholder involvement but has not yet operationalized how best to do so, and (3) implementation research lacks causal theory.
Implementation Science Has Not Leveraged the Rich Literature From Behavioral Economics
Implementation studies historically have been premised on the assumption that clinicians make rational clinical choices that maximize utility for themselves and their patients [27]. A growing body of research from the field of behavioral economics suggests that this is not how clinicians make decisions. Behavioral economics includes a set of models and frameworks that recognize that individuals tend to make decisions under the constraints of bounded rationality [28]. In other words, clinicians do not always make decisions based on complete information, exhaustive analysis of all potential outcomes, and maximization of expected utility. Instead, individuals are influenced by myriad psychological, social, cognitive, and emotional factors and a wide range of simplifying cognitive heuristics or shortcuts when making decisions. In their decision making about which treatment to use and how to use it, clinicians likely are influenced by heuristics such as availability bias (the tendency to overweight a recently seen case because it is particularly salient), hindsight bias (the tendency to see past events as having been predictable after the fact), and status quo bias (the tendency to stick with the usual approach even when new and better approaches are available) [29,30].
To date, implementation research has not leveraged insights from behavioral economics to design implementation strategies. This approach is promising and has already been applied outside implementation science to shape physician and patient behavior in health care [31-34]. The application of this approach necessarily moves the field away from implementation strategies designed to increase knowledge and toward strategies such as changing the environment (ie, choice architecture) to make it easier to do the desired thing, making EBP use the default, and using incentives and rewards to leverage cognitive heuristics. Incentives refer to informing individuals that they will receive rewards if they perform a behavior. Rewards refer to giving an individual money, vouchers, or valued objects or status when the behavior is performed [35].
Implementation Research Encourages Stakeholder Involvement but Has Not Operationalized How to Do So
Although implementation science underscores the importance of stakeholder involvement in the implementation process [36], there has been little study on how to systematically involve stakeholders, such as patients, providers, administrators, payers, and policy makers, in the development, refinement, and testing of implementation strategies [37]. Engaging stakeholders systematically can increase the specificity, accuracy, and success of implementation strategies. Participatory design approaches, which emphasize active involvement of stakeholders in the design process, can be used to include stakeholder input in the process of designing and refining implementation strategies [38].
Implementation Science Lacks Causal Theory
Causal theory is largely underdeveloped in implementation science [11,39], and there is a limited understanding of the mechanisms by which implementation strategies work [29,40]. One major rate-limiting step is that randomized controlled trials of implementation strategies rarely incorporate formal tests of mediating mechanisms [40]. This is due in part to the underdevelopment of rigorous statistical methods to test mediating and moderating effects of hypothesized mechanisms in a multilevel context. Implementation trials almost always are clustered, with patients nested within clinicians and clinicians within organizations [8,9,17]. Furthermore, implementation strategies can be directed at patients, clinicians, or organizations, and strategies at 1 level may target behavior and outcomes at other levels. For example, changes in organizational climate may affect clinician behavior and patient outcomes. There are few validated statistical approaches to test these pathways, sometimes referred to as complex moderated mediation or conditional indirect effects in a multilevel context, thus limiting the forward movement of the field in understanding how or when implementation strategies are most effective.
A second factor that limits the development of causal theory in implementation science is the lack of standardized and validated measures that assess putative mechanisms that link strategies to outcomes. Important work is currently underway to address this measurement gap in some areas of implementation science [41]; however, constructs from the field of behavioral economics, including measures that assess cognitive heuristics, have been notably absent. This is an important gap given the potential promise of these heuristics as a lever for behavior change.
Even when measures are available to assess putative mediating mechanisms and investigators test these variables as mediators within randomized controlled trials, a third factor that limits the development of causal theory in implementation science is the failure of many implementation strategies to engage the targeted mechanisms [40]. A recent systematic review of 88 randomized controlled implementation trials in mental health service settings found no evidence that any implementation strategy engaged its targeted mechanisms of action [40]. One potentially important reason is that, to date, the design of implementation strategies has not incorporated systematic end-user feedback and perspectives. Studies have shown that interventions are significantly improved when designers systematically elicit end-user feedback about behavioral bottlenecks and other barriers to enactment of the targeted behavior and incorporate this feedback into the design [42].
The center is funded as part of the National Institute of Mental Health ALACRITY P50 to support the rapid development, testing, and refinement of novel and integrative approaches for optimizing the effectiveness of treatments for and prevention of mental disorders and organizing and delivering mental health services in community settings [43]. The major aim of the Penn ALACRITY center is to accelerate the pace at which effective treatments for mental disorders are deployed in community practice, thereby increasing their impact on improving quality of life for people with these disorders, and to advance the science of implementation. The Penn ALACRITY center is intended to support research that demonstrates high synergy across disciplines and that increases the public health impact of existing and emerging mental health interventions and service delivery strategies. The Penn ALACRITY center addresses these goals with the following objectives:
Objective 1: Apply innovative, interdisciplinary approaches from behavioral economics to implementation science.
Objective 2: Apply methods from participatory design to ensure that stakeholders’ voices are included in the development of implementation strategies in a systematic, rigorous, and collaborative manner.
Objective 3: Develop statistical approaches that allow for the elucidation of mechanisms and causal theory.
Methods
Overview
The Methods Core is the foundation of the Penn ALACRITY center. Specifically, it supports 3 incubators related to implementation strategy design, measurement, and statistical methods: (1) optimization of implementation strategy design through our design incubator, (2) refinement of measurement of mechanisms through our measurement incubator, and (3) development of novel approaches to test mechanisms in multilevel implementation trials through our statistical methods incubator. The Methods Core supports 3 exploratory projects that are wide in scope and span the most salient levels at which implementation takes place—the individual in treatment, the clinician, and the organization. Although each project stands alone, they are linked through common methods and measurement tools (see Table 1).
Table 1. Overview of the 3 exploratory projects.

| Attributes | Project 1: ADAPT^a | Project 2: ASSISTS^b | Project 3: MOTIVATE^c |
| --- | --- | --- | --- |
| Ecological level | Patient | Clinician | Organization |
| Population | Adults with depression | Youth with autism | A wide range of diagnoses and ages |
| Type of incentive | Financial | Social | Mix |
| Outcome | Medication adherence | Data collection | Acceptability of implementation strategies |

^a ADAPT: Assisting Depressed Adults in Primary care Treatment.
^b ASSISTS: App for Strengthening Services In Specialized Therapeutic Support.
^c MOTIVATE: Motivating Outpatient Therapists to Implement: Valuing a Team Effort.
Project 1 (Assisting Depressed Adults in Primary care Treatment [ADAPT]) compares the effectiveness of different schedules of financial incentives to increase medication adherence among adults recently diagnosed with depression in primary care settings. Project 2 (App for Strengthening Services In Specialized Therapeutic Support [ASSISTS]) examines the effectiveness of normative pressure in increasing the use of EBP among frontline clinicians working with children with autism in schools. Project 3 (Motivating Outpatient Therapists to Implement: Valuing a Team Effort [MOTIVATE]) develops and tests the acceptability of organization-focused implementation strategies to increase clinicians’ use of EBPs in community mental health clinics. In the section that follows, we will describe the major activities of the Methods Core, followed by more in-depth descriptions of each project.
Methods Core
Design Incubator: Optimize Implementation Strategy Design
The design incubator will test several participatory approaches to develop implementation strategies. Here, we describe 4 such methods: innovation tournaments, behavioral design, rapid-cycle prototyping, and discrete choice experiments.
Innovation tournaments [44,45] take a collaborative and systematic approach to addressing complex and relatively unstructured problems using ideas from end users. Innovation tournaments begin when a host calls for ideas in an area of interest. End users are invited to submit ideas, which go through sequential stages of screening and evaluation by crowdsourced peer review and expert input to filter and shape the raw submissions into the most promising ideas. At the end of the tournament, a few winning ideas are selected. Although innovation tournaments are solution focused, they have added benefits related to team building and to shifting organizational climate to be more egalitarian by giving end users direct input. Innovation tournaments have been used successfully in many contexts to increase stakeholder engagement, including quality improvement in health care [44,45], but have not been used to address the challenges of implementing EBP in mental health services. Two of our 3 projects (ASSISTS and MOTIVATE) include this approach. In ASSISTS, we use innovation tournaments to engage therapists in designing nonfinancial incentive strategies to improve the use of EBP among one-to-one aides working with school-age children with autism. Although we propose to leverage normative pressure to increase the use of 1 EBP (data collection), normative pressure can be applied in many ways, and our stakeholders may identify other incentives that are equally or more effective. In MOTIVATE, we use innovation tournaments to engage clinicians in identifying the best ways for organizations to use financial and nonfinancial incentives to help clinicians implement EBP.
Behavioral design is a systematic approach, informed by engineering and human-centered design principles, to understanding human behavior and applying those insights to the design of behavior change interventions [46,47]. In this 5-step approach, designers first define the problem and then diagnose it through a behavioral lens, using qualitative and quantitative data about the context of the target behavior. The diagnosis process yields hypotheses, informed by behavioral insights, about the drivers of and barriers to the desired behavior. Next, these hypotheses are translated into potential solutions, which are likewise informed by behavioral insights. One design approach, developed by the UK Behavioural Insights Team, is the EAST framework, which organizes design solutions around factors that make the desired behavior Easy, Attractive, Social, and Timely [48]. For example, creating default options (which people tend to stick with) makes a behavior very easy. Providing peer comparisons makes a behavior social, as most of us care how we perform relative to our peers and what others think of us. Designed solutions are then tested and scaled through rigorous experiments. Project MOTIVATE employs behavioral design to generate solutions to improve the implementation of EBPs in community mental health settings. The contextual data for the diagnosis phase will comprise, in part, the ideas submitted in the innovation tournament.
Rapid-cycle prototyping is an industry innovation that has recently been applied to health care and is a complementary approach to behavioral design. The goal of rapid-cycle prototyping is to test potential innovations more efficiently, less expensively, and more reliably than traditional clinical trials [49]. This approach leverages mini-pilots, or experiments integrated within operations, to learn how best to design strategies. Rapid-cycle prototyping does not rely on a finished product to test; rather, mock-ups or inexpensive versions are tested before the product is completed. For example, when IBM wanted to test how users would respond to speech recognition software, rather than first developing this complex technology, it placed a hidden typist in another room who could hear the speaker through a microphone [50]. Rapid-cycle prototyping has been used extensively at the Penn Medicine Center for Health Care Innovation in a variety of clinical contexts as a way to learn quickly from successive iterations of a new technology. We use rapid-cycle prototyping in 1 of our projects: in ASSISTS, the digital tool we build to collect data and improve implementation will rely heavily on rapid-cycle prototyping to iteratively test the interface, information content, and user response to a phone-based data collection app for providers of one-to-one behavioral support for children with autism.
Discrete choice experiments [51,52] are frequently applied in health economics as a way to assess the acceptability of programs. Although they represent another promising approach to increasing stakeholder engagement, they have not yet been used to inform the design of implementation strategies [53]. Discrete choice experiments are a technique for systematically eliciting individual preferences for options and their specific attributes. By systematically eliciting tradeoffs among constructed outcome combinations, discrete choice experiments generate data that can quantify the relative utility of, or satisfaction with, the presented option as well as its specific attributes. This technique allows for eliciting preferences for treatments that do not currently exist or that individuals have not yet experienced. We use discrete choice experiments in MOTIVATE to evaluate the acceptability of collaboratively developed implementation strategies targeting organizations to increase clinician EBP implementation.
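Although the cited methods papers [51-53] describe the approach in full, a minimal sketch of the random utility model that underlies most discrete choice analyses may help orient readers; the notation here is generic rather than specific to our planned experiments. Respondent i derives utility from alternative j as a function of its attributes, and the probability of choosing j from choice set C_i follows the conditional (McFadden) logit form:

```latex
U_{ij} = \beta^{\top} x_{ij} + \varepsilon_{ij},
\qquad
P(i \text{ chooses } j) =
  \frac{\exp\left(\beta^{\top} x_{ij}\right)}
       {\sum_{k \in C_i} \exp\left(\beta^{\top} x_{ik}\right)}
```

Here, x_ij encodes the attribute levels of alternative j, ε_ij is an independent extreme-value error, and the estimated part-worths β quantify how much each attribute contributes to choice.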
Measurement Incubator: Refine Measurement of Mechanisms Related to Clinician Factors
Implementation science frameworks posit that individual factors such as motivation, self-efficacy, knowledge, and attitudes are important in the implementation process [8,9]. To date, these factors have primarily been described and measured using health behavior theories such as the Theory of Planned Behavior [54] and Social Cognitive Theory [55]. Less explored in implementation research have been the psychological heuristics that shape decision making and characterize our decision-making styles and may both mediate and moderate the impact of implementation strategies. In our 3 exploratory projects, we hypothesize that psychological heuristics affect implementation strategy success. For example, loss aversion describes the human tendency to weigh losses more heavily than equivalent gains [30] and may shape how a consumer responds to a financial incentive to adhere to an EBP or may make managers reluctant to take a gamble on innovative practices that may put quality at risk. Present bias refers to the tendency of individuals to overvalue immediate or current rewards compared with future rewards [56]. Present-biased individuals may be more responsive to financial incentives; present-biased managers may require short-term rewards that provide more immediate feedback than those commonly seen in many pay-for-performance programs that involve incentive disbursement at the end of each year. Regret aversion refers to the tendency of individuals to reduce the possibility of regret when making choices and can be deployed in the design of lotteries or other tangible or intangible rewards systems; response to such designs is likely to vary with underlying heterogeneity in regret aversion. Individuals’ sensitivity to conformity and social referents similarly can be leveraged both through descriptive comparisons (eg, “this is how you are using EBP compared with your peers”) and injunctive comparisons (eg, “these are your supervisors’ expectations of how you will use EBP”) [57]. An individual’s unique pattern of cognitive biases and decision-making styles can be described as their behavioral phenotype [42] and may explain individual variation in responses to implementation strategies.
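To make these constructs concrete, 2 textbook formalizations from behavioral economics (standard models, not measures developed by our center) show how loss aversion and present bias can be expressed as estimable parameters:

```latex
v(x) =
\begin{cases}
x^{\alpha} & x \ge 0 \\
-\lambda\,(-x)^{\alpha} & x < 0
\end{cases}
\qquad
U_0 = u(c_0) + \beta \sum_{t=1}^{T} \delta^{t}\, u(c_t)
```

In the prospect theory value function, λ > 1 indexes loss aversion; in the quasi-hyperbolic (beta-delta) discounting model, β < 1 indexes present bias. Individual estimates of parameters such as these are natural candidates for the behavioral phenotype described above.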
Implementation science frameworks also posit the importance of organizational factors in explaining implementation success, emphasizing constructs such as organizational culture (the collective sense of how work is done in an organization), organizational climate (the collective sense of how the work environment affects psychological well-being), implementation climate (group perspective on whether use of an innovation is expected, supported, and rewarded), and implementation leadership (group beliefs about how capably a leader supports EBP implementation) [58-62]. These organizational constructs explain much of the variance in implementation outcomes [23]. They are generally measured by aggregating individual responses within organizations or organizational units following a demonstration of construct validity at the organization level. In addition to defining individual decision-making styles, behavioral phenotypes may be important organizational descriptors, which would require aggregating responses across individuals. As an exploratory objective, we will examine the validity and utility of aggregating individual behavioral phenotypes to the organizational level [63]. In other words, do people with similar behavioral phenotypes cluster in organizations, and do these aggregated phenotypes predict important implementation outcomes? Individual behavioral phenotypes could cluster within organizations if, for example, leaders only hire individuals whose psychological heuristics are consistent with their own or if the organizational culture changes employees’ heuristics. If individual behavioral phenotypes translate into an organizational construct, we will also explore questions related to organizational composition, such as how clinicians’ behavioral phenotypes compare with those of their administrators and which is more predictive of implementation and clinical outcomes.
We will refine existing, validated measures from the behavioral economics literature for assessing psychological heuristics. Once measured in patients, providers, and administrators, these phenotypes can be evaluated as mediators and moderators of implementation strategy effects.
Statistical Methods Incubator: Develop Novel Approaches to Test Mediating and Moderating Effects of Mechanisms
Mechanisms refer to processes that are responsible for change [64]; they can be considered the active ingredients that explain the specific ways in which implementation strategies affect implementation and client outcomes. The identification of mechanisms can lead to more efficient and tailored strategies based on the EBP of interest and the context in which it is implemented. The goals of the statistical methods incubator are to develop methods that quantify the magnitude and statistical significance of cross-level indirect effects in mediation models (the approach needed to test mechanisms) that span patient, clinician, and organizational levels and develop methods and guidelines for designing studies that have adequate statistical power to detect mediation effects in these multilevel trials. These methods are necessary to rigorously test the implementation strategies that are developed through our exploratory projects and pilot studies.
Although some research has begun to identify challenges and propose solutions for addressing mediation in simple 2-level mediation models (patients nested within providers) [65], little is known about the extension of these methods to 3-level models (patients within providers within organizations) or models that incorporate multiple measures over time. For example, questions remain about the extent to which various model specifications result in biased parameter estimates and the most effective strategies for overcoming these biases to obtain accurate parameter estimates and correct tests of statistical significance for indirect effects in 3-level models. Complications arise in these models because of the interdependence of observations within levels and because the relationships among lower-level variables can vary at different levels [65-67]. Although significant progress has been made in accounting for these design features when modeling direct effects, the modeling of indirect effects is more complicated and has not been examined with as much rigor. This deficit is particularly important in implementation studies of organizations where there can be considerable homogeneity within levels, and interventions at 1 level can have substantial effects on other levels—all of which complicates modeling [23,40,68].
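To make the modeling challenge concrete, consider a minimal sketch of a 2-level (2-1-1) mediation model, in which an implementation strategy X_j is assigned at the cluster level and the mediator M_ij and outcome Y_ij are measured on individuals within clusters (an illustrative specification, not the full set of models our incubator will study):

```latex
M_{ij} = \gamma_{0} + a X_{j} + u_{j} + e_{ij},
\qquad
Y_{ij} = \gamma_{0}' + c' X_{j} + b M_{ij} + u_{j}' + e_{ij}'
```

Because X_j varies only between clusters, the indirect effect a × b is carried entirely by the between-cluster component of M; if b is estimated without separating within-cluster from between-cluster variation in M, the 2 slopes are conflated and the indirect effect estimate can be biased [65]. Adding a third level multiplies these complications.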
There are no good guidelines to help investigators design multilevel studies so that they have adequate power to detect indirect effects of clinical interest. The importance of designing studies with adequate statistical power to detect meaningful effects is well understood and accepted [69]. Several resources are available to support researchers in ensuring that studies are adequately powered to detect the main effects of interventions in both single-level and multilevel trials [70-72]. As the field moves to an experimental approach that requires testing the mediating mechanisms that link implementation strategies to outcomes at multiple levels, we will have to examine indirect effects in studies that have sufficient statistical power to detect these effects, should they be present. Although methods are available to calculate statistical power for main effects in clustered trials, we know of no validated methods to calculate statistical power to detect cross-level indirect effects in multilevel trials. Without this information, investigators are unable to plan studies that are adequately powered to address questions of mechanisms. This work will build on and extend 2 approaches to testing indirect effects in multilevel models (multilevel structural equation modeling [73] and the centered within context with means reintroduced approach [65]) to address 3-level models with mediators, interventions, and randomization at different levels. It will result in generalizable instructions and guidelines for conducting mediation analysis in 3-level models, applicable to studying mechanisms in a wide range of implementation trials, along with empirical evidence that these approaches increase the precision and accuracy of indirect effect estimates.
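As an illustration of the kind of tool this incubator will produce, the following is a minimal, simulation-based sketch (not the center’s validated method) for estimating power to detect a cross-level indirect effect in a 2-1-1 design: clusters are randomized to a strategy, the mediator and outcome are simulated with random intercepts, the a and b paths are estimated with mixed models, and the indirect effect is tested with a Sobel z in each replication. All design values below are invented for illustration.

```python
"""Monte Carlo power sketch for a cross-level (2-1-1) indirect effect."""
import warnings

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

warnings.filterwarnings("ignore")  # silence mixed-model convergence chatter


def simulate_trial(n_clusters=40, n_per_cluster=10, a=0.4, b=0.3,
                   c_prime=0.1, icc=0.2, rng=None):
    """Simulate one trial with cluster-level X and individual-level M and Y."""
    rng = rng if rng is not None else np.random.default_rng()
    x = rng.permutation(np.repeat([0, 1], n_clusters // 2))  # 1:1 randomization
    rows = []
    for j in range(n_clusters):
        u_m, u_y = rng.normal(0, np.sqrt(icc), size=2)  # cluster random effects
        for _ in range(n_per_cluster):
            m = a * x[j] + u_m + rng.normal(0, np.sqrt(1 - icc))
            y = c_prime * x[j] + b * m + u_y + rng.normal(0, np.sqrt(1 - icc))
            rows.append((j, x[j], m, y))
    return pd.DataFrame(rows, columns=["cluster", "X", "M", "Y"])


def sobel_z(df):
    """Estimate paths a (X -> M) and b (M -> Y | X) with random intercepts."""
    fit_m = smf.mixedlm("M ~ X", df, groups=df["cluster"]).fit()
    fit_y = smf.mixedlm("Y ~ X + M", df, groups=df["cluster"]).fit()
    a_hat, se_a = fit_m.params["X"], fit_m.bse["X"]
    b_hat, se_b = fit_y.params["M"], fit_y.bse["M"]
    return (a_hat * b_hat) / np.sqrt(a_hat**2 * se_b**2 + b_hat**2 * se_a**2)


def estimate_power(reps=200, **design):
    """Share of simulated trials in which the indirect effect is detected."""
    rng = np.random.default_rng(seed=2018)
    hits = sum(abs(sobel_z(simulate_trial(rng=rng, **design))) > 1.96
               for _ in range(reps))
    return hits / reps


if __name__ == "__main__":
    print(f"Estimated power for the indirect effect: {estimate_power():.2f}")
```

The Sobel-type test used here is known to be conservative; the incubator’s work will extend simulations of this kind to 3-level designs, randomization at different levels, and Monte Carlo or bootstrap interval estimates.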
Exploratory Projects
Project 1: Assisting Depressed Adults in Primary Care Treatment (Patient-Level)
Improving the management of adult depression is one of the great challenges facing outpatient mental health care. As continuous antidepressant treatment tends to improve symptoms of depression [74-76], quality of life [74], and social functioning [77] as well as reduce health care costs [78], it is a cornerstone of evidence-based treatment for adult depression. Yet adults who initiate antidepressants for depression often discontinue within the first few weeks of treatment, before their medication becomes fully effective [79]. Although patient-level strategies have been highlighted in several implementation frameworks, there has been little empirical study of patient-level uptake of EBP [80]. We will conduct a pilot study to test whether modest, time-limited, escalating or de-escalating daily financial rewards for antidepressant use, based on behavioral economic theory, improve medication adherence and clinical outcomes of adults initiating treatment of depression. This will contribute to implementation science by elucidating how patient-facing implementation strategies can be used to increase patient engagement with EBP.
Tangible financial patient rewards have successfully increased a wide range of health behaviors [31,34,81-84], including medication adherence [85-88]. Financial rewards for medication adherence tend to have their strongest effects when they are provided frequently and close in time to when the medication is taken [88]. In depression, in contrast to many other conditions, it may be necessary to provide financial incentives for antidepressant adherence only during the initial weeks of treatment, when untreated depression makes the risk of nonadherence greatest and before the patient’s mood begins to improve [89,90]. After this point, antidepressants may help lift the patient’s mood, providing feedback that becomes self-reinforcing and facilitates better adherence.
Behavioral economics research has highlighted that the design and delivery of financial incentives significantly influence their effectiveness [91,92]. Antidepressant therapy requires continuous medication adherence. However, the therapy’s mechanism of action makes early adherence difficult: the rewards of antidepressants (decreased depressive symptoms) do not materialize instantaneously but only after several weeks of use, whereas the costs (inconvenience and side effects) accrue in the present. With this in mind, it is possible that providing larger initial incentives that fade over time may help people overcome initial inertia and get started. It is also possible, however, that an increasing daily incentive, which people generally prefer [93], may better leverage key behavioral principles, including the use of reference points (people compare rewards with what they have received previously, so increasing rewards are viewed positively), the endowment effect (ascribing more value to things because one owns them), and loss aversion, as patients who initiate treatment face an ever-greater lost opportunity if they discontinue medication as rewards increase [94].
We will compare the effects of usual care, escalating, and de-escalating patient financial incentives on daily antidepressant medication adherence and depression symptom control in nonelderly adults with clinical depression (see Table 2). A 3-arm pilot study will randomize 120 adults in outpatient treatment who are starting antidepressants for depression to receive (1) usual care (n=40), (2) usual care and escalating daily financial incentives (n=40), or (3) usual care and de-escalating daily financial incentives (n=40). Participants assigned to the escalating incentive arm will receive US $2 per day, increasing to US $7 per day, with daily feedback tied to the use of AdhereTech wireless medication devices on the Way to Health platform for the first 6 weeks of antidepressant adherence (US $189 maximum). Those assigned to the de-escalating incentive arm will receive an incentive that decreases from US $7 per day to US $2 per day for daily antidepressant adherence over the 6-week period (US $189 maximum). The study will achieve the following specific aims: (1) compare the effectiveness of usual care, escalating incentives, and de-escalating incentives in improving adherence to antidepressant therapy and reducing depressive symptoms 6 weeks after treatment initiation; (2) determine whether the 6-week escalating or de-escalating financial incentives continue to improve antidepressant adherence and depressive symptom control over the 6- to 12-week period following termination of the incentives; and (3) assess the similarity of the study groups with respect to potential negative effects of incentives, including regret over study participation. We will also explore the potentially moderating effects of 2 psychological biases (present bias and information avoidance) on the effectiveness of the interventions in improving daily antidepressant adherence. Habit formation and decision regret will be evaluated as secondary outcomes. Given that this study is one of the first of its kind to explore financial incentives for medication adherence in individuals with psychiatric disorders, we have paid careful attention to potential ethical concerns [95]. Of note, we will not prescribe any medication, and we leave the assessment of benefits relative to risks for each patient to that patient’s clinician. We have included the Decision Regret Scale [96], which we will monitor regularly to identify potential dissatisfaction with study participation. We also plan to conduct qualitative interviews with participants, asking specifically about their perspectives on financial incentives and adherence to antidepressants.
Table 2. Study periods for enrollment, interventions, and assessments in project ADAPT.

| Stages and time point | Screening | Baseline | Intervention period (weeks 1-6) | End of week 6 | End of week 12 |
| --- | --- | --- | --- | --- | --- |
| Enrollment | | | | | |
| Eligibility screen | X^a | X | —^b | — | — |
| Verbal informed consent | X | — | — | — | — |
| Randomization | — | X | — | — | — |
| Interventions | | | | | |
| Escalating financial incentives | — | — | X | — | — |
| De-escalating financial incentives | — | — | X | — | — |
| Control | — | — | X | — | — |
| Assessments | | | | | |
| Patient Health Questionnaire-9 | X | — | — | X | X |
| Generalized Anxiety Disorder-7 | — | X | — | X | X |
| Beck Hopelessness Scale | — | X | — | X | X |
| Theory of Planned Behavior | — | X | — | — | — |
| Information avoidance | — | X | — | — | — |
| Present bias | — | X | — | — | — |
| Decision Regret Scale | — | — | — | X | X |
| Habit formation | — | — | — | X | X |
| Daily antidepressant adherence (automated collection via electronic pill bottle) | — | — | X | X | X |
| Antidepressant prescription (via electronic health record abstraction) | — | — | X | X | X |

^a X denotes the study period in which each component occurs.
^b Not applicable.
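The protocol fixes the daily incentive endpoints (US $2 and US $7) and the US $189 maximum but does not state the exact shape of the schedules. The sketch below reconstructs one schedule consistent with those figures, assuming the daily rate changes in US $1 steps once per week (an assumption, not a detail from the protocol), and shows how the 2 arms differ for a patient whose adherence starts late.

```python
# Hypothetical reconstruction of ADAPT's 6-week incentive schedules.
# Assumption (not specified in the protocol): the daily rate moves in
# US $1 steps once per week, ie, $2/day in week 1 up to $7/day in week 6.
DAYS_PER_WEEK = 7

escalating = [rate for rate in range(2, 8) for _ in range(DAYS_PER_WEEK)]
deescalating = list(reversed(escalating))

assert len(escalating) == 42                         # 6 weeks of daily rewards
assert sum(escalating) == sum(deescalating) == 189   # stated US $189 maximum


def payout(schedule, adherent_days):
    """Total reward earned for adherence on the given (0-indexed) study days."""
    return sum(schedule[day] for day in adherent_days)


# A patient adherent only during the final 2 weeks earns far more under the
# escalating schedule, illustrating the loss-aversion rationale above:
late_start = range(28, 42)
print(payout(escalating, late_start), payout(deescalating, late_start))  # 91 35
```

Both arms cap at US $189; they differ only in when the money is concentrated, which is precisely the behavioral contrast the trial is designed to test.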
Project 2: App for Strengthening Services in Specialized Therapeutic Support (Clinician-Level)
Elementary school children with autism often need intensive support throughout the day [97]. Concerns about safety, behavioral challenges, and the need for a highly structured environment have resulted in an increased use of one-to-one aides at home, school, and in the community [98,99]. These aides, referred to as therapeutic support staff (TSS) in Philadelphia, usually have a bachelor’s or associate’s degree and receive limited training and supervision due to the community-based nature of their work, which often requires them to work in settings independent from their supervisors and peers [98,100]. Ideally, aides would use evidence-based interventions in the family of applied behavior analysis to help children reduce problem behaviors and increase adaptive behaviors [101,102].
Philadelphia’s Medicaid system spends more than US $80 million a year on TSS, about a third of the children’s mental health services budget. Although children with autism comprise 7% of all children served through this system, they account for 40% of TSS services. Administrators, advocates, and parents have decried the poor or unknown quality of TSS care and the outcomes associated with it, yet the very nature of TSS work makes it difficult to monitor. Our observations of TSS in the community [98], combined with our interviews with administrators and clinicians, suggest that TSS are not supported in providing high-quality, evidence-based care, in large part because of the isolating nature of their work and the limited opportunities for performance measurement, acknowledgment, and feedback.
This project will develop and test strategies to increase TSS’s self-efficacy, supervision, and sense of belonging to a professional community, with opportunities for peer comparison and supervisor recognition as mechanisms to increase the use of EBP. The target practice for this study is data collection. We chose data collection because (1) it is a component of every EBP for children with autism and is common to many mental health interventions for other children as well, (2) this foundational practice is essential to measuring and monitoring outcomes and has been associated with more positive outcomes in and of itself, and (3) it makes possible the objective assessment of the effectiveness of implementation strategies and supports iteration and improvement.
Our clinician-focused implementation strategy is based on 3 psychological principles informed by behavioral economics. The first is bounded rationality, defined as limited information, cognitive capacity, and willpower [103]. TSS may find data collection difficult because they are not sure what information to collect or how to collect it easily. The second is perceptions of social norms [57]. On the basis of typical practice, TSS may believe that their supervisors do not expect them to collect data and that none of their peers collect data. The third is hyperbolic discounting, in which people prefer more immediate gratification (not expending effort to collect data) at the expense of long-term outcomes (data ultimately used to show a child’s progress and inform future interventions) [56].
Philadelphia’s Department of Behavioral Health is making substantial investments in establishing and enforcing standards for autism intervention. Data collection will be an important part of these new standards. We will use a participatory design approach to build a digital app on the Penn Way to Health platform that allows TSS to (1) receive frequent communication and reminders about how and when to collect data, (2) easily collect and upload data, (3) observe how the child in their care is progressing, (4) observe how they compare with their peers in data collection, and (5) receive positive recognition from their supervisors and employers in response to frequent and accurate data collection.
We will use rapid-cycle prototyping, an iterative development process that involves multiple tests of our intervention’s utility, feasibility, acceptability, and potential for long-term system-wide implementation. To develop the components of this new tool, we will (1) conduct an innovation tournament among TSS and their supervisors to identify ways to increase TSS’s data collection; (2) observe and query 10 TSS workers in the field to examine how they collect data, the functional and structural barriers that impede their ability and willingness to collect data, and their intentions, attitudes, subjective norms, and self-efficacy regarding data collection; (3) use rapid prototyping and testing to create an app through Way to Health that makes data collection easier, more rewarding, and socially desirable and refine the app based on observation and data collection; (4) test the refined app with 30 TSS from 3 agencies; and (5) explore broader applicability of this technology with our partners to determine how use of other EBPs can be objectively and inexpensively measured and rewarded.
Project 3: Motivating Outpatient Therapists to Implement: Valuing a Team Effort (Organizational-Level)
In project MOTIVATE, we will partner with stakeholders to develop implementation strategies that target organizations and leverage established principles from behavioral economics to improve EBP implementation in community mental health clinics. In our work investigating the implementation of EBP over the past 5 years [23,104], agency administrators and policy makers have repeatedly told us that the most significant barrier to implementing EBP is the need for a significant organizational financial investment, which is challenging in the context of an under-resourced public mental health system [21,105-108].
In response to this challenge, payers, including Philadelphia’s Department of Behavioral Health and Intellectual disAbility Services (DBHIDS), are beginning to use financial incentives to motivate organizations to encourage EBP implementation. These efforts are based in part on evidence from a few published studies, which show that pay-for-performance schemes, that is, paying clinicians directly for the implementation of EBPs, result in greater use of EBP and higher fidelity [109-111]. However, studies have also shown that incenting organizations rather than clinicians is not highly effective in changing clinicians’ use of EBP [112-115]. This discrepancy highlights the importance of understanding how to help organizations leverage incentives to change clinician behavior most efficiently and effectively.
The limited impact of paying organizations to change clinician behavior may be due to a number of factors, ranging from poor incentive design to organizational incentives not being translated into incentives for frontline clinicians [116]. Organizational leaders likely lack training in how to design or use flow-through incentive funds effectively, nor is there research (or are there established implementation strategies) to guide this practice [115]. We address this gap through a participatory design process that integrates stakeholder input from clinicians, administrators, policy makers, and payers to develop incentive-oriented implementation strategies that target organizations [36].
Our participatory design approach incorporates 3 novel and promising methods for systematically eliciting and leveraging end-user input to design effective interventions. First, we will use an innovation tournament (described previously) to generate ideas from clinicians (the end users) about the best ways for organizations to use financial and nonfinancial incentives to facilitate clinician implementation of EBP (see description above). Second, we will refine the ideas generated through the innovation tournament using a behavioral diagnosis process. We will use a structured tool developed by ideas42 to comprehensively analyze the ideas submitted in the tournament and identify specific behavioral barriers impeding the use of EBPs by clinicians. For example, tournament ideas that indicate that clinicians run out of time to implement EBPs during the standard 50-min therapy session may suggest that the planning fallacy (ie, the tendency to consistently underestimate the time needed to complete an action) contributes to incomplete or infrequent use of EBPs in session. The behavioral diagnosis step will yield multiple hypothesized behavioral bottlenecks that, once confirmed or disconfirmed by stakeholders, will inform implementation strategy design.
Third, once a set of implementation strategies have been identified as potentially useful, we will use a discrete choice experiment to systematically elicit and quantify stakeholders’ preferences regarding how these strategies should be designed and structured [53]. Discrete choice experiments present potential users of a product or service with a series of forced-choice questions that require them to choose between alternative designs. For example, clinicians might be required to choose between a financial incentive in the form of a large annual bonus for high EBP fidelity or a small monthly payment for fidelity or they might choose to verify fidelity by tape recording all sessions or having 1 in-session observation. By systematically combining various attributes and levels, a discrete choice experiment quantifies the extent to which specific design features are desired. This information will then be used to inform the design of an incentive strategy that targets organizations to improve clinicians’ EBP delivery.
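To illustrate how such forced choices translate into quantified preferences, the following is a small numeric sketch under the conditional logit model introduced earlier; the attribute coding and part-worth values are invented for illustration and are not study estimates.

```python
import numpy as np

# Two hypothetical incentive designs described by 3 attributes (invented):
# [annual bonus in US $1000s, monthly payment in US $100s, all sessions taped (0/1)]
designs = np.array([
    [2.0, 0.0, 1.0],   # option A: large annual bonus, fidelity verified by taping
    [0.0, 3.0, 0.0],   # option B: small monthly payments, 1 in-session observation
])

# Hypothetical part-worths beta (in practice, estimated from observed choices):
beta = np.array([0.30, 0.25, -0.80])   # taping carries a strong disutility

utilities = designs @ beta                           # U_j = beta' x_j
probs = np.exp(utilities) / np.exp(utilities).sum()  # conditional logit shares
print(dict(zip(["A", "B"], probs.round(2))))         # {'A': 0.28, 'B': 0.72}
```

Estimated part-worths of this kind indicate, for example, how large a bonus would be needed to offset the disutility of session taping, which is the information used to design the organizational incentive strategy.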
Declarations
Current Study Status
We have begun recruitment and data collection for Projects 1 and 3; data collection is ongoing for both projects. Project 2 recruitment and data collection will begin in 2019. No publications containing the results of this study have been submitted or published in any journal.
Ethics Approval and Consent to Participate
The institutional review boards of the University of Pennsylvania and the City of Philadelphia approved all study procedures, and all ethical guidelines were followed. All individuals interested in participating in the research described in this protocol will provide written informed consent.
Setting
This work is occurring in close collaboration with community stakeholders vested in mental health in the City of Philadelphia. Our major partners include the DBHIDS, the School District of Philadelphia, the nonprofit organizations that serve the mental health needs of Philadelphia residents in the DBHIDS network, and the University of Pennsylvania Health System. Philadelphia is a racially and ethnically diverse city of 1.6 million residents, approximately a third of whom are enrolled in Medicaid. Philadelphia manages the behavioral health care of its Medicaid enrollees through Community Behavioral Health (CBH), a not-for-profit corporation that falls within the organizational structure of DBHIDS. CBH pays for mental health and substance use care for more than 100,000 consumers annually and has more than 250 organizations in network, including approximately 50 CBH centers and other specialized mental health and substance use agencies providing outpatient services.
The School District of Philadelphia serves approximately 140,000 students educated in 214 schools annually, making it the eighth largest school district in the United States. Furthermore, three-quarters of its students are eligible for free or reduced-price lunch. About two-thirds of students are African American and 10% are Latinx, and 107 languages other than English are spoken at home by District students.
The University of Pennsylvania Health System is the largest health system in the Delaware Valley. It includes 6 hospitals and 2 regional medical centers. Last year, the health system accommodated more than 5 million outpatient visits, 135,000 adult hospital admissions, and 350,000 emergency room visits.
All projects are approved by the appropriate institutional review boards.
Results
Enrollment for project ADAPT started in 2018; results are expected in 2020. Enrollment for project ASSISTS will begin in 2019; results are expected in 2021. Enrollment for project MOTIVATE started in 2018; results are expected in 2019. Data collection had begun for ADAPT and MOTIVATE when this protocol was submitted.
Discussion
Summary
Although hundreds of treatments for common psychiatric disorders have demonstrated efficacy, problems persist in optimizing their implementation in community practice [117,118]. When evidence-based interventions are adopted, they are often not implemented in the way they were designed and do not result in the same outcomes observed with highly selected patients under controlled conditions. The 3 projects described in this protocol address the problem of incomplete uptake of selected EBPs in community practice and will result in approaches that could lay the foundation for addressing implementation gaps at the levels of individuals in treatment, clinicians, and organizations. Our Methods Core will develop statistical, participatory design, and behavioral phenotyping approaches that increase the specificity and external validity of implementation strategies and the rigor with which we can determine how and for whom they work.
One innovation of our Penn ALACRITY center is the merging of mental health treatment, implementation science, behavioral economics, and participatory design. Implementation research in mental health has identified important, malleable characteristics of treatments, the individuals using them, and the organizations in which they work that affect their use and outcomes. Behavioral economists have identified a wide range of patterns in human decision making that may contribute to poor health outcomes, as well as methods of designing the environment to promote optimal decision making [91,92,119-127]. These complementary approaches may be helpful when considering the implementation challenges faced in the community. Participatory design actively involves stakeholders in designing new technologies to help ensure that the results meet their needs. Our Penn ALACRITY center activities will introduce a new level of rigor and innovative methods for eliciting and incorporating stakeholder input and will represent the first merging of these interdisciplinary perspectives. Future research can test the output of these participatory design approaches to evaluate their effectiveness in the design of implementation strategies.
Outcomes and Conclusions
We anticipate that our Penn ALACRITY center will result in the following outcomes. First, through the design, measurement, and statistical methods incubators, the Methods Core will generate guidelines for using participatory design approaches (ie, innovation tournaments, rapid-cycle prototyping, and discrete choice experiments) in the development of implementation strategies; a set of publicly available measures of behavioral phenotypes and a database that pools deidentified patient, provider, and organizational data; and practical statistical tools, guidelines, and information that facilitate the design of the next generation of mechanism-focused randomized trials in implementation science. Second, through our exploratory projects, we will generate data that will seed future implementation science studies that incorporate advances from behavioral economics and participatory design [128]. Third, through our scientific and dissemination activities, we hope both to (a) advance implementation science by integrating new conceptual models and methods to develop implementation strategies and (b) increase the use of EBPs in community settings to improve patient well-being and quality of care.
Acknowledgments
The authors would like to acknowledge the input from their community partners including the DBHIDS, Community Behavioral Health, the School District of Philadelphia, the University of Pennsylvania Health System, and the organizations and clinicians that serve the needs of individuals with mental health difficulties in the city of Philadelphia. The authors would also like to thank the following individuals who contributed to the proposed work: (1) coinvestigators (Sudeep Bhatia, PhD; David Asch, MD, MBA; Roy Rosin, MBA; Paul Allison, PhD; Fran Barg, PhD, MEd; Meena Bewtra, MD, MPH; Yong Chen, PhD; John Kimberly, PhD; and Rosemary Frasso, PhD, MSc, CPH); (2) clinical research coordinators (Vivian Byeon, BA, and Megan Reilly, MPH); (3) data manager (Ming Xie, MS); (4) consultants (Kristopher Preacher, PhD, and Reed Johnson, PhD); (5) Way to Health team (Devon Taylor, MPH; Christianne Sevinc, MPH; and Stephanie Brown, BA); (6) Your Big Idea Team (Deirdre Darragh, MA); (7) External Advisory Board members (C Hendricks Brown, PhD; John Landsverk, PhD; Joan Erney, JD; and Barbara Bunkle, PhD); and (8) Data Safety and Monitoring Board members (Marc Atkins, PhD; Marisa Domino, PhD; and Craig Newschaffer, PhD). This work is funded through the National Institute of Mental Health P50 MH113840 (principal investigators: RB, DM, and KV).
Abbreviations
- ADAPT
Assisting Depressed Adults in Primary care Treatment
- ALACRITY
Advanced Laboratories for Accelerating the Reach and Impact of Treatments for Youth and Adults with Mental Illness
- ASSISTS
App for Strengthening Services In Specialized Therapeutic Support
- CBH
Community Behavioral Health
- DBHIDS
Department of Behavioral Health and Intellectual disAbility Services
- EBP
evidence-based practice
- MOTIVATE
Motivating Outpatient Therapists to Implement: Valuing a Team Effort
- TSS
therapeutic support staff
Footnotes
Authors' Contributions: All authors contributed to the conceptualization and design of the proposed work (RB, AB, MC, ZC, JF, EBH, AL, DM, SM, MO, MP, RS, KV, NW, and KZ). RB, DM, and KV are principal investigators and are responsible for all center activities. AB, SM, and NW are the project directors for the Methods Core. SM, MO, and KV are project directors for Project 1. DM and MP are project directors for Project 2. RB, RS, and NW are project directors for Project 3. RB and DM drafted the initial manuscript. All authors read, provided critical feedback and editing, and approved the final manuscript.
Conflicts of Interest: None declared.
References
- 1. Patel V, Chisholm D, Parikh R, Charlson FJ, Degenhardt L, Dua T, Ferrari AJ, Hyman S, Laxminarayan R, Levin C, Lund C, Medina Mora ME, Petersen I, Scott J, Shidhaye R, Vijayakumar L, Thornicroft G, Whiteford H, DCP MNS Author Group. Addressing the burden of mental, neurological, and substance use disorders: key messages from Disease Control Priorities, 3rd edition. Lancet. 2016 Apr 16;387(10028):1672–85. doi: 10.1016/S0140-6736(15)00390-6.
- 2. Walker ER, McGee RE, Druss BG. Mortality in mental disorders and global disease burden implications: a systematic review and meta-analysis. JAMA Psychiatry. 2015 Apr;72(4):334–41. doi: 10.1001/jamapsychiatry.2014.2502.
- 3. Insel TR. Assessing the economic costs of serious mental illness. Am J Psychiatry. 2008 Jun;165(6):663–5. doi: 10.1176/appi.ajp.2008.08030366.
- 4. Evans TS, Berkman N, Brown C, Gaynes B, Weber RP. Disparities within serious mental illness. Agency for Healthcare Research and Quality; 2016 (accessed 2018-12-08). https://effectivehealthcare.ahrq.gov/sites/default/files/pdf/mental-illness-disparities_technical-brief.pdf
- 5. Girlanda F, Fiedler I, Becker T, Barbui C, Koesters M. The evidence-practice gap in specialist mental healthcare: systematic review and meta-analysis of guideline implementation studies. Br J Psychiatry. 2017 Jan;210(1):24–30. doi: 10.1192/bjp.bp.115.179093.
- 6. Melendez-Torres GJ, Dickson K, Fletcher A, Thomas J, Hinds K, Campbell R, Murphy S, Bonell C. Systematic review and meta-analysis of effects of community-delivered positive youth development interventions on violence outcomes. J Epidemiol Community Health. 2016 Dec;70(12):1171–7. doi: 10.1136/jech-2015-206132.
- 7. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011 Mar;38(2):65–76. doi: 10.1007/s10488-010-0319-7.
- 8. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011 Jan;38(1):4–23. doi: 10.1007/s10488-010-0327-7.
- 9. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009 Aug 7;4:50. doi: 10.1186/1748-5908-4-50.
- 10. Balas EA, Boren SA. Managing clinical knowledge for health care improvement. Yearb Med Inform. 2000;(1):65–70.
- 11. Williams NJ, Beidas RS. Annual research review: the state of implementation science in child psychology and psychiatry: a review and suggestions to advance the field. J Child Psychol Psychiatry. 2018 Aug 25. doi: 10.1111/jcpp.12960.
- 12. Beidas RS, Edmunds JM, Marcus SC, Kendall PC. Training and consultation to promote implementation of an empirically supported treatment: a randomized trial. Psychiatr Serv. 2012 Jul;63(7):660–5. doi: 10.1176/appi.ps.201100401.
- 13. Miller WR, Yahne CE, Moyers TB, Martinez J, Pirritano M. A randomized trial of methods to help clinicians learn motivational interviewing. J Consult Clin Psychol. 2004 Dec;72(6):1050–62. doi: 10.1037/0022-006X.72.6.1050.
- 14.Sholomskas DE, Syracuse-Siewert G, Rounsaville BJ, Ball SA, Nuro KF, Carroll KM. We don't train in vain: a dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. J Consult Clin Psychol. 2005 Feb;73(1):106–15. doi: 10.1037/0022-006X.73.1.106. http://europepmc.org/abstract/MED/15709837 .2005-01321-013 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Beidas RS, Kendall PC. Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clin Psychol (New York) 2010 Mar;17(1):1–30. doi: 10.1111/j.1468-2850.2009.01187.x. http://europepmc.org/abstract/MED/20877441 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Herschell AD, Kolko DJ, Baumann BL, Davis AC. The role of therapist training in the implementation of psychosocial treatments: a review and critique with recommendations. Clin Psychol Rev. 2010 Jun;30(4):448–66. doi: 10.1016/j.cpr.2010.02.005. http://europepmc.org/abstract/MED/20304542 .S0272-7358(10)00040-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629. doi: 10.1111/j.0887-378X.2004.00325.x. http://europepmc.org/abstract/MED/15595944 .MILQ325 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Locke J, Wolk CB, Harker C, Olsen A, Shingledecker T, Barg F, Mandell D, Beidas R. Pebbles, rocks, and boulders: the implementation of a school-based social engagement intervention for children with autism. Autism. 2017 Nov;21(8):985–94. doi: 10.1177/1362361316664474. http://europepmc.org/abstract/MED/28954537 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Palinkas LA, Um MY, Jeong CH, Chor KH, Olin S, Horwitz SM, Hoagwood KE. Adoption of innovative and evidence-based practices for children and adolescents in state-supported mental health clinics: a qualitative study. Health Res Policy Syst. 2017 Mar 29;15(1):27. doi: 10.1186/s12961-017-0190-z. https://health-policy-systems.biomedcentral.com/articles/10.1186/s12961-017-0190-z .10.1186/s12961-017-0190-z [DOI] [PMC free article] [PubMed] [Google Scholar]
- 20.Stein BD, Celedonia KL, Kogan JN, Swartz HA, Frank E. Facilitators and barriers associated with implementation of evidence-based psychotherapy in community settings. Psychiatr Serv. 2013 Dec 1;64(12):1263–6. doi: 10.1176/appi.ps.201200508. http://europepmc.org/abstract/MED/24292731 .1784232 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Beidas RS, Stewart RE, Adams DR, Fernandez T, Lustbader S, Powell BJ, Aarons GA, Hoagwood KE, Evans AC, Hurford MO, Rubin R, Hadley T, Mandell DS, Barg FK. A multi-level examination of stakeholder perspectives of implementation of evidence-based practices in a large urban publicly-funded mental health system. Adm Policy Ment Health. 2016 Nov;43(6):893–908. doi: 10.1007/s10488-015-0705-2. http://europepmc.org/abstract/MED/26658692 .10.1007/s10488-015-0705-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Higa-McMillan CK, Nakamura BJ, Morris A, Jackson DS, Slavin L. Predictors of use of evidence-based practices for children and adolescents in usual care. Adm Policy Ment Health. 2015 Jul;42(4):373–83. doi: 10.1007/s10488-014-0578-9. [DOI] [PubMed] [Google Scholar]
- 23.Beidas RS, Marcus S, Aarons GA, Hoagwood KE, Schoenwald S, Evans AC, Hurford MO, Hadley T, Barg FK, Walsh LM, Adams DR, Mandell DS. Predictors of community therapists' use of therapy techniques in a large public mental health system. JAMA Pediatr. 2015 Apr;169(4):374–82. doi: 10.1001/jamapediatrics.2014.3736. http://europepmc.org/abstract/MED/25686473 .2118581 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Beidas R, Skriner L, Adams D, Wolk CB, Stewart RE, Becker-Haimes E, Williams N, Maddox B, Rubin R, Weaver S, Evans A, Mandell D, Marcus SC. The relationship between consumer, clinician, and organizational characteristics and use of evidence-based and non-evidence-based therapy strategies in a public mental health system. Behav Res Ther. 2017 Dec;99:1–10. doi: 10.1016/j.brat.2017.08.011.S0005-7967(17)30169-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Benjamin Wolk C, Marcus SC, Weersing VR, Hawley KM, Evans AC, Hurford MO, Beidas RS. Therapist- and client-level predictors of use of therapy techniques during implementation in a large public mental health system. Psychiatr Serv. 2016 May 1;67(5):551–7. doi: 10.1176/appi.ps.201500022. http://europepmc.org/abstract/MED/26876658 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Brookman-Frazee L, Haine RA, Baker-Ericzén M, Zoffness R, Garland AF. Factors associated with use of evidence-based practice strategies in usual care youth psychotherapy. Adm Policy Ment Health. 2010 May;37(3):254–69. doi: 10.1007/s10488-009-0244-9. http://europepmc.org/abstract/MED/19795204 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Becker GS. The Economic Approach to Human Behavior. Chicago, IL: The University of Chicago Press; 1978. [Google Scholar]
- 28.Fiske S, Taylor SE. Social Cognition: From Brains to Culture. Thousand Oaks, CA: SAGE Publications; 2013. [Google Scholar]
- 29.Tversky A, Kahneman D. The framing of decisions and the psychology of choice. Science. 1981 Jan 30;211(4481):453–8. doi: 10.1126/science.7455683. [DOI] [PubMed] [Google Scholar]
- 30.Kahneman D, Tversky A. Prospect theory: an analysis of decision under risk. Econometrica. 1979;47:263–91. doi: 10.2307/1914185. [DOI] [Google Scholar]
- 31.Volpp KG, Troxel AB, Pauly MV, Glick HA, Puig A, Asch DA, Galvin R, Zhu J, Wan F, DeGuzman J, Corbett E, Weiner J, Audrain-McGovern J. A randomized, controlled trial of financial incentives for smoking cessation. N Engl J Med. 2009 Feb 12;360(7):699–709. doi: 10.1056/NEJMsa0806819.360/7/699 [DOI] [PubMed] [Google Scholar]
- 32.Patel MS, Day SC, Halpern SD, Hanson CW, Martinez JR, Honeywell Jr S, Volpp KG. Generic medication prescription rates after health system-wide redesign of default options within the electronic health record. JAMA Intern Med. 2016 Dec 1;176(6):847–8. doi: 10.1001/jamainternmed.2016.1691.2520677 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Doshi JA, Lim R, Li P, Young PP, Lawnicki VF, State JJ, Troxel AB, Volpp KG. A synchronized prescription refill program improved medication adherence. Health Aff (Millwood) 2016 Aug 1;35(8):1504–12. doi: 10.1377/hlthaff.2015.1456.35/8/1504 [DOI] [PubMed] [Google Scholar]
- 34.Halpern SD, French B, Small DS, Saulsgiver K, Harhay MO, Audrain-McGovern J, Loewenstein G, Brennan TA, Asch DA, Volpp KG. Randomized trial of four financial-incentive programs for smoking cessation. N Engl J Med. 2015;372:2108–17. doi: 10.1056/NEJMoa1414293. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 35.Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, Eccles MP, Cane J, Wood CE. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013 Aug;46(1):81–95. doi: 10.1007/s12160-013-9486-6. [DOI] [PubMed] [Google Scholar]
- 36.Chambers DA, Azrin ST. Research and services partnerships: partnership: a fundamental component of dissemination and implementation research. Psychiatr Serv. 2013 Jun;64(6):509–11. doi: 10.1176/appi.ps.201300032.1691166 [DOI] [PubMed] [Google Scholar]
- 37.Pellecchia M, Mandell DS, Nuske HJ, Azad G, Benjamin Wolk C, Maddox BB, Reisinger EM, Skriner LC, Adams DR, Stewart R, Hadley T, Beidas RS. Community-academic partnerships in implementation research. J Community Psychol. 2018;46(7):941–52. doi: 10.1002/jcop.21981. [DOI] [PubMed] [Google Scholar]
- 38.Mitchell V, Ross T, May A, Sims R, Parker C. Empirical investigation of the impact of using co-design methods when generating proposals for sustainable travel solutions. CoDesign. 2016;12(4):205–20. doi: 10.1080/15710882.2015.1091894. [DOI] [Google Scholar]
- 39.Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, Walsh-Bailey C, Weiner B. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018 May 7;6:136. doi: 10.3389/fpubh.2018.00136. doi: 10.3389/fpubh.2018.00136. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Williams NJ. Multilevel mechanisms of implementation strategies in mental health: integrating theory, research, and practice. Adm Policy Ment Health. 2016 Sep 1;43(5):783–98. doi: 10.1007/s10488-015-0693-2. http://europepmc.org/abstract/MED/26474761 .10.1007/s10488-015-0693-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 41.Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, Boynton MH, Halko H. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017 Aug 29;12(1):108. doi: 10.1186/s13012-017-0635-3. https://implementationscience.biomedcentral.com/articles/10.1186/s13012-017-0635-3 .10.1186/s13012-017-0635-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Kangovi S, Asch DA. Behavioral phenotyping in health promotion: embracing or avoiding failure. J Am Med Assoc. 2018 May 22;319(20):2075–6. doi: 10.1001/jama.2018.2921.2680117 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.National Institutes of Health. 2016. [2018-09-05]. Advanced Laboratories for Accelerating the Reach and Impact of Treatments for Youth and Adults with Mental Illness (ALACRITY) Research Centers (P50) https://grants.nih.gov/grants/guide/pa-files/PAR-16-354.html .
- 44.Terwiesch C, Ulrich K. Innovation Tournaments: Creating and Selecting Exceptional Opportunities. Boston, MA: Harvard Business Review Press; 2009. [Google Scholar]
- 45.Terwiesch C, Mehta SJ, Volpp KG. Innovating in health delivery: the Penn Medicine innovation tournament. Healthc (Amst) 2013 Jun;1(1-2):37–41. doi: 10.1016/j.hjdsi.2013.05.003.S2213-0764(13)00016-X [DOI] [PubMed] [Google Scholar]
- 46.Tantia P. The new science of designing for humans. Stanford Social Innovation Review. 2017;15(2):29–33. https://ssir.org/articles/entry/the_new_science_of_designing_for_humans . [Google Scholar]
- 47.Datta S, Mullainathan S. Behavioral design: a new approach to development policy. Rev Income Wealth. 2014 Feb 4;60(1):7–35. doi: 10.1111/roiw.12093. [DOI] [Google Scholar]
- 48.Service O, Hallsworth M, Halpern D, Algate F, Gallagher R, Nguyen S, Ruda S, Sanders M, Pelenur M, Gyani A, Harper H, Reinhard J, Kirkman E. The Behavioural Insights Team. 2014. [2018-12-08]. EAST: Four Simple Ways to Apply Behavioural Insights https://www.behaviouralinsights.co.uk/publications/east-four-simple-ways-to-apply-behavioural-insights/
- 49.Asch DA, Rosin R. Innovation as discipline, not fad. N Engl J Med. 2015 Aug 13;373(7):592–4. doi: 10.1056/NEJMp1506311. [DOI] [PubMed] [Google Scholar]
- 50.Zahran S. Software Process Improvement: Practical Guidelines for Business Success. Harlow, UK: Pearson Education Limited; 1998. [Google Scholar]
- 51.Lancsar E, Louviere J. Conducting discrete choice experiments to inform healthcare decision making: a user's guide. Pharmacoeconomics. 2008;26(8):661–77. doi: 10.2165/00019053-200826080-00004.2684 [DOI] [PubMed] [Google Scholar]
- 52.Louviere JJ, Islam T, Wasi N, Street D, Burgess L. Designing discrete choice experiments: Do optimal designs come at a price? J Consum Res. 2008 Aug 1;35(2):360–75. doi: 10.1086/586913. [DOI] [Google Scholar]
- 53.Salloum RG, Shenkman EA, Louviere JJ, Chambers DA. Application of discrete choice experiments to enhance stakeholder engagement as a strategy for advancing implementation: a systematic review. Implement Sci. 2017 Nov 23;12(1):140. doi: 10.1186/s13012-017-0675-8. https://implementationscience.biomedcentral.com/articles/10.1186/s13012-017-0675-8 .10.1186/s13012-017-0675-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 54.Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991 Dec;50(2):179–211. doi: 10.1016/0749-5978(91)90020-T. [DOI] [Google Scholar]
- 55.Bandura A. Self-efficacy: toward a unifying theory of behavioral change. Psychol Rev. 1977;84(2):191–215. doi: 10.1037/0033-295X.84.2.191. [DOI] [PubMed] [Google Scholar]
- 56.O'Donoghue T, Rabin M. Doing it now or later. Am Econ Rev. 1999;89(1):103–24. doi: 10.1257/aer.89.1.103. [DOI] [Google Scholar]
- 57.Cialdini RB. Influence: Science and Practice. Boston, MA: Allyn & Bacon; 2000. [Google Scholar]
- 58.Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Armstrong KS, Chapman JE. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010 Aug;78(4):537–50. doi: 10.1037/a0019160. http://europepmc.org/abstract/MED/20658810 .2010-14877-009 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 59.Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009 Jan;36(1):24–34. doi: 10.1007/s10488-008-0197-4. http://europepmc.org/abstract/MED/19104929 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60.Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS) Implement Sci. 2014 Oct 23;9:157. doi: 10.1186/s13012-014-0157-1. https://implementationscience.biomedcentral.com/articles/10.1186/s13012-014-0157-1 .s13012-014-0157-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–74. doi: 10.1146/annurev-publhealth-032013-182447. http://europepmc.org/abstract/MED/24641560 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62.Green AE, Albanese BJ, Cafri G, Aarons GA. Leadership, organizational climate, and working alliance in a children's mental health service system. Community Ment Health J. 2014 Oct;50(7):771–7. doi: 10.1007/s10597-013-9668-5. http://europepmc.org/abstract/MED/24323137 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 63.Camerer CF, Malmendier U. Behavioral Economics and Its Applications. Princeton, NJ: Princeton University Press; 2007. Behavioral Economics of Organizations; pp. 235–90. [Google Scholar]
- 64.Kazdin AE. Mediators and mechanisms of change in psychotherapy research. Annu Rev Clin Psychol. 2007;3:1–27. doi: 10.1146/annurev.clinpsy.3.022806.091432. [DOI] [PubMed] [Google Scholar]
- 65.Zhang Z, Zyphur MJ, Preacher KJ. Testing multilevel mediation using hierarchical linear models: problems and solutions. Organ Res Methods. 2008;12(4):695–719. doi: 10.1177/1094428108327450. [DOI] [Google Scholar]
- 66.Scott MA, Simonoff JS, Marx BD, editors. The SAGE Handbook of Multilevel Modeling. London, England: Sage Publications; 2013. [Google Scholar]
- 67.Raudenbush SW, Bryk AS. Hierarchical Linear Models: Applications and Data Analysis Methods. Thousand Oaks, CA: Sage Publications; 2001. [Google Scholar]
- 68.Williams NJ, Glisson C, Hemmelgarn A, Green P. Mechanisms of change in the ARC organizational strategy: increasing mental health clinicians' EBP adoption through improved organizational culture and capacity. Adm Policy Ment Health. 2017 Mar;44(2):269–83. doi: 10.1007/s10488-016-0742-5. http://europepmc.org/abstract/MED/27236457 .10.1007/s10488-016-0742-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 69.Cohen J. Statistical Power Analysis for the Behavioral Sciences. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988. [Google Scholar]
- 70.Cohen J. A power primer. Psychol Bull. 1992 Jul;112(1):155–9. doi: 10.1037/0033-2909.112.1.155. [DOI] [PubMed] [Google Scholar]
- 71.Scherbaum CA, Ferreter JM. Estimating statistical power and required sample sizes for organizational research using multilevel modeling. Organ Res Methods. 2008 Apr 8;12(2):347–67. doi: 10.1177/1094428107308906. [DOI] [Google Scholar]
- 72.Snijders T. Power and Sample Size in Multilevel Linear Models. In: Everitt B, Howell D, editors. Encyclopedia of Statistics in Behavioral Science. Chichester, UK: Wiley; 2005. pp. 1570–3. [Google Scholar]
- 73.Preacher KJ, Zyphur MJ, Zhang Z. A general multilevel SEM framework for assessing multilevel mediation. Psychol Methods. 2010 Sep;15(3):209–33. doi: 10.1037/a0020141.2010-18042-001 [DOI] [PubMed] [Google Scholar]
- 74.Demyttenaere K, Andersen HF, Reines EH. Impact of escitalopram treatment on quality of life enjoyment and satisfaction questionnaire scores in major depressive disorder and generalized anxiety disorder. Int Clin Psychopharmacol. 2008 Sep;23(5):276–86. doi: 10.1097/YIC.0b013e328303ac5f.00004850-200809000-00005 [DOI] [PubMed] [Google Scholar]
- 75.Charbonneau A, Rosen AK, Owen RR, Spiro 3rd A, Ash AS, Miller DR, Kazis L, Kader B, Cunningham F, Berlowitz DR. Monitoring depression care: in search of an accurate quality indicator. Med Care. 2004 Jun;42(6):522–31. doi: 10.1097/01.mlr.0000127999.89246.a6.00005650-200406000-00004 [DOI] [PubMed] [Google Scholar]
- 76.Melfi CA, Chawla AJ, Croghan TW, Hanna MP, Kennedy S, Sredl K. The effects of adherence to antidepressant treatment guidelines on relapse and recurrence of depression. Arch Gen Psychiatry. 1998 Dec;55(12):1128–32. doi: 10.1001/archpsyc.55.12.1128. [DOI] [PubMed] [Google Scholar]
- 77.Miranda J, Chung JY, Green BL, Krupnick J, Siddique J, Revicki DA, Belin T. Treating depression in predominantly low-income young minority women: a randomized controlled trial. J Am Med Assoc. 2003 Jul 2;290(1):57–65. doi: 10.1001/jama.290.1.57.290/1/57 [DOI] [PubMed] [Google Scholar]
- 78.Cantrell CR, Eaddy MT, Shah MB, Regan TS, Sokol MC. Methods for evaluating patient adherence to antidepressant therapy: a real-world comparison of adherence and economic outcomes. Med Care. 2006 Apr;44(4):300–3. doi: 10.1097/01.mlr.0000204287.82701.9b.00005650-200604000-00002 [DOI] [PubMed] [Google Scholar]
- 79.Olfson M, Marcus SC, Tedeschi M, Wan GJ. Continuity of antidepressant treatment for adults with depression in the United States. Am J Psychiatry. 2006 Jan;163(1):101–8. doi: 10.1176/appi.ajp.163.1.101.163/1/101 [DOI] [PubMed] [Google Scholar]
- 80.Kreuter MW, Bernhardt JM. Reframing the dissemination challenge: a marketing and distribution perspective. Am J Public Health. 2009 Dec;99(12):2123–7. doi: 10.2105/AJPH.2008.155218.AJPH.2008.155218 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 81.Marcus AC, Kaplan CP, Crane LA, Berek JS, Bernstein G, Gunning JE, McClatchey MW. Reducing loss-to-follow-up among women with abnormal Pap smears. Results from a randomized trial testing an intensive follow-up protocol and economic incentives. Med Care. 1998 Mar;36(3):397–410. doi: 10.1097/00005650-199803000-00015. [DOI] [PubMed] [Google Scholar]
- 82.Kullgren JT, Troxel AB, Loewenstein G, Asch DA, Norton LA, Wesby L, Tao Y, Zhu J, Volpp KG. Individual- versus group-based financial incentives for weight loss: a randomized, controlled trial. Ann Intern Med. 2013 Apr 2;158(7):505–14. doi: 10.7326/0003-4819-158-7-201304020-00002. http://europepmc.org/abstract/MED/23546562 .1671710 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 83.John LK, Loewenstein G, Troxel AB, Norton L, Fassbender JE, Volpp KG. Financial incentives for extended weight loss: a randomized, controlled trial. J Gen Intern Med. 2011 Jun;26(6):621–6. doi: 10.1007/s11606-010-1628-y. http://europepmc.org/abstract/MED/21249462 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 84.Volpp KG, John LK, Troxel AB, Norton L, Fassbender J, Loewenstein G. Financial incentive-based approaches for weight loss: a randomized trial. J Am Med Assoc. 2008 Dec 10;300(22):2631–7. doi: 10.1001/jama.2008.804. http://europepmc.org/abstract/MED/19066383 .300/22/2631 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 85.Volpp KG, Loewenstein G, Troxel AB, Doshi J, Price M, Laskin M, Kimmel SE. A test of financial incentives to improve warfarin adherence. BMC Health Serv Res. 2008 Dec 23;8:272. doi: 10.1186/1472-6963-8-272. https://bmchealthservres.biomedcentral.com/articles/10.1186/1472-6963-8-272 .1472-6963-8-272 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 86.Kimmel SE, Troxel AB, Loewenstein G, Brensinger CM, Jaskowiak J, Doshi JA, Laskin M, Volpp K. Randomized trial of lottery-based incentives to improve warfarin adherence. Am Heart J. 2012 Aug;164(2):268–74. doi: 10.1016/j.ahj.2012.05.005. http://europepmc.org/abstract/MED/22877814 .S0002-8703(12)00321-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Asch DA, Troxel AB, Stewart WF, Sequist TD, Jones JB, Hirsch AG, Hoffer K, Zhu J, Wang W, Hodlofski A, Frasch AB, Weiner MG, Finnerty DD, Rosenthal MB, Gangemi K, Volpp KG. Effect of financial incentives to physicians, patients, or both on lipid levels: a randomized clinical trial. J Am Med Assoc. 2015 Nov 10;314(18):1926–35. doi: 10.1001/jama.2015.14850. http://europepmc.org/abstract/MED/26547464 .2468891 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 88.Petry NM, Alessi SM, Byrne S, White WB. Reinforcing adherence to antihypertensive medications. J Clin Hypertens (Greenwich) 2015 Jan;17(1):33–8. doi: 10.1111/jch.12441. doi: 10.1111/jch.12441. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 89.Akincigil A, Bowblis JR, Levin C, Walkup JT, Jan S, Crystal S. Adherence to antidepressant treatment among privately insured patients diagnosed with depression. Med Care. 2007 Apr;45(4):363–9. doi: 10.1097/01.mlr.0000254574.23418.f6. http://europepmc.org/abstract/MED/17496721 .00005650-200704000-00014 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 90.Nierenberg AA, Farabaugh AH, Alpert JE, Gordon J, Worthington JJ, Rosenbaum JF, Fava M. Timing of onset of antidepressant response with fluoxetine treatment. Am J Psychiatry. 2000 Sep;157(9):1423–8. doi: 10.1176/appi.ajp.157.9.1423. [DOI] [PubMed] [Google Scholar]
- 91.Loewenstein G, Asch DA, Volpp KG. Behavioral economics holds potential to deliver better results for patients, insurers, and employers. Health Aff (Millwood) 2013 Jul;32(7):1244–50. doi: 10.1377/hlthaff.2012.1163.32/7/1244 [DOI] [PubMed] [Google Scholar]
- 92.Loewenstein G, Brennan T, Volpp KG. Asymmetric paternalism to improve health behaviors. J Am Med Assoc. 2007 Nov 28;298(20):2415–7. doi: 10.1001/jama.298.20.2415.298/20/2415 [DOI] [PubMed] [Google Scholar]
- 93.Petry NM, Peirce JM, Stitzer ML, Blaine J, Roll JM, Cohen A, Obert J, Killeen T, Saladin ME, Cowell M, Kirby KC, Sterling R, Royer-Malvestuto C, Hamilton J, Booth RE, Macdonald M, Liebert M, Rader L, Burns R, DiMaria J, Copersino M, Stabile PQ, Kolodner K, Li R. Effect of prize-based incentives on outcomes in stimulant abusers in outpatient psychosocial treatment programs: a national drug abuse treatment clinical trials network study. Arch Gen Psychiatry. 2005 Oct;62(10):1148–56. doi: 10.1001/archpsyc.62.10.1148.62/10/1148 [DOI] [PubMed] [Google Scholar]
- 94.Camerer C, Hua Ho T. Experience-weighted attraction learning in normal form games. Econometrica. 1999 Jul;67(4):827–74. doi: 10.1111/1468-0262.00054. [DOI] [Google Scholar]
- 95.Giles EL, Robalino S, Sniehotta FF, Adams J, McColl E. Acceptability of financial incentives for encouraging uptake of healthy behaviours: a critical review using systematic methods. Prev Med. 2015 Apr;73:145–58. doi: 10.1016/j.ypmed.2014.12.029.S0091-7435(14)00511-8 [DOI] [PubMed] [Google Scholar]
- 96.Brehaut JC, O'Connor AM, Wood TJ, Hack TF, Siminoff L, Gordon E, Feldman-Stewart D. Validation of a decision regret scale. Med Decis Making. 2003;23(4):281–92. doi: 10.1177/0272989X03256005. [DOI] [PubMed] [Google Scholar]
- 97.Myers SM, Johnson CP, American Academy of Pediatrics Council on Children With Disabilities Management of children with autism spectrum disorders. Pediatrics. 2007 Nov;120(5):1162–82. doi: 10.1542/peds.2007-2362.peds.2007-2362 [DOI] [PubMed] [Google Scholar]
- 98.Azad GF, Locke J, Downey MM, Xie M, Mandell DS. One-to-one assistant engagement in autism support classrooms. Teach Educ Spec Educ. 2015 Nov 1;38(4):337–46. doi: 10.1177/0888406415603208. http://europepmc.org/abstract/MED/26807003 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 99.Fisher M, Pleasants SL. Roles, responsibilities, and concerns of paraeducators: findings from a statewide survey. Remedial Spec Educ. 2011 Feb;33(5):287–97. doi: 10.1177/0741932510397762. [DOI] [Google Scholar]
- 100.Koegel RL, Kim S, Koegel LK. Training paraprofessionals to improve socialization in students with ASD. J Autism Dev Disord. 2014 Sep;44(9):2197–208. doi: 10.1007/s10803-014-2094-x. http://europepmc.org/abstract/MED/24671749 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 101.Young HE, Falco RA, Hanita M. Randomized, controlled trial of a comprehensive program for young students with autism spectrum disorder. J Autism Dev Disord. 2016 Feb;46(2):544–60. doi: 10.1007/s10803-015-2597-0.10.1007/s10803-015-2597-0 [DOI] [PubMed] [Google Scholar]
- 102.Odom SL, Boyd BA, Hall LJ, Hume K. Evaluation of comprehensive treatment models for individuals with autism spectrum disorders. J Autism Dev Disord. 2010 Apr;40(4):425–36. doi: 10.1007/s10803-009-0825-1. [DOI] [PubMed] [Google Scholar]
- 103.Simon HA. Bounded rationality and organizational learning. Organ Sci. 1991;2(1):125–34. doi: 10.1287/orsc.2.1.125. [DOI] [Google Scholar]
- 104.Beidas RS, Aarons G, Barg F, Evans A, Hadley T, Hoagwood K, Marcus S, Schoenwald S, Walsh L, Mandell DS. Policy to implementation: evidence-based practice in community mental health--study protocol. Implement Sci. 2013 Mar 24;8:38. doi: 10.1186/1748-5908-8-38. https://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-8-38 .1748-5908-8-38 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 105.Stewart RE, Adams DR, Mandell DS, Hadley TR, Evans AC, Rubin R, Erney J, Neimark G, Hurford MO, Beidas RS. The perfect storm: collision of the business of mental health and the implementation of evidence-based practices. Psychiatr Serv. 2016 Feb;67(2):159–61. doi: 10.1176/appi.ps.201500392. http://europepmc.org/abstract/MED/26522680 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 106.Chou AF, Wallace N, Bloom JR, Hu TW. Variation in outpatient mental health service utilization under capitation. J Ment Health Policy Econ. 2005 Mar;8(1):3–14. [PubMed] [Google Scholar]
- 107.Hadley TR. Financing changes and their impact on the organization of the public mental health system. Adm Policy Ment Health. 1996 May;23(5):393–405. doi: 10.1007/BF02112900. [DOI] [Google Scholar]
- 108.Honberg R, Kimball A, Diehl S, Usher L, Fitzpatrick M. National Alliance on Mental Illness. 2011. Nov, [2018-12-10]. State Mental Health Cuts: The Continuing Crisis https://www.nami.org/getattachment/About-NAMI/Publications/Reports/StateMentalHealthCuts2.pdf .
- 109.Garner BR, Godley SH, Dennis ML, Hunter BD, Bair CM, Godley MD. Using pay for performance to improve treatment implementation for adolescent substance use disorders: results from a cluster randomized trial. Arch Pediatr Adolesc Med. 2012 Oct;166(10):938–44. doi: 10.1001/archpediatrics.2012.802.1308504 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 110.Garner BR, Godley SH, Bair CM. The impact of pay-for-performance on therapists' intentions to deliver high-quality treatment. J Subst Abuse Treat. 2011 Jul;41(1):97–103. doi: 10.1016/j.jsat.2011.01.012. http://europepmc.org/abstract/MED/21315539 .S0740-5472(11)00019-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 111.Shepard DS, Calabro JA, Love CT, McKay JR, Tetreault J, Yeom HS. Counselor incentives to improve client retention in an outpatient substance abuse aftercare program. Adm Policy Ment Health. 2006 Nov;33(6):629–35. doi: 10.1007/s10488-006-0054-2. [DOI] [PubMed] [Google Scholar]
- 112.Song Z, Rose S, Safran DG, Landon BE, Day M, Chernew ME. Harvard Library Office for Scholarly Communication. [2018-12-14]. Payment reform in Massachusetts: effect of global payment on heath care spending and quality 4 years into the alternative quality contract https://dash.harvard.edu/bitstream/handle/1/12407606/ZSong%20Honors%20Thesis%209-22-14%20Redacted%20Version%203%200.pdf?sequence=5 .
- 113.McWilliams JM, Chernew ME, Landon BE, Schwartz AL. Performance differences in year 1 of pioneer accountable care organizations. N Engl J Med. 2015 May 14;372:1927–36. doi: 10.1056/NEJMsa1414929. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 114.McWilliams JM, Hatfield LA, Chernew ME, Landon BE, Schwartz AL. Early performance of accountable care organizations in medicare. N Engl J Med. 2016 Jun 16;374(24):2357–66. doi: 10.1056/NEJMsa1600142. http://europepmc.org/abstract/MED/27075832 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 115.Acevedo A, Lee MT, Garnick DW, Horgan CM, Ritter GA, Panas L, Campbell K, Bean-Mortinson J. Agency-level financial incentives and electronic reminders to improve continuity of care after discharge from residential treatment and detoxification. Drug Alcohol Depend. 2018 Feb 1;183:192–200. doi: 10.1016/j.drugalcdep.2017.11.009.S0376-8716(17)30596-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 116.Stewart RE, Lareef I, Hadley TR, Mandell DS. Can we pay for performance in behavioral health care? Psychiatr Serv. 2017 Feb 1;68(2):109–11. doi: 10.1176/appi.ps.201600475. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 117.Garland AF, Haine-Schlagel R, Brookman-Frazee L, Baker-Ericzen M, Trask E, Fawley-King K. Improving community-based mental health care for children: translating knowledge into action. Adm Policy Ment Health. 2013 Jan;40(1):6–22. doi: 10.1007/s10488-012-0450-8. http://europepmc.org/abstract/MED/23212902 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 118.Pincus HA, Page AE, Druss B, Appelbaum PS, Gottlieb G, England MJ. Can psychiatry cross the quality chasm? Improving the quality of health care for mental and substance use conditions. Am J Psychiatry. 2007 May;164(5):712–9. doi: 10.1176/ajp.2007.164.5.712.164/5/712 [DOI] [PubMed] [Google Scholar]
- 119.Volpp KG, Asch DA, Galvin R, Loewenstein G. Redesigning employee health incentives--lessons from behavioral economics. N Engl J Med. 2011 Aug 4;365(5):388–90. doi: 10.1056/NEJMp1105966. http://europepmc.org/abstract/MED/21812669 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 120.Asch DA, Muller RW, Volpp KG. Automated hovering in health care--watching over the 5000 hours. N Engl J Med. 2012 Jul 5;367(1):1–3. doi: 10.1056/NEJMp1203869. [DOI] [PubMed] [Google Scholar]
- 121.Volpp KG, Pauly MV, Loewenstein G, Bangsberg D. P4P4P: an agenda for research on pay-for-performance for patients. Health Aff (Millwood) 2009;28(1):206–14. doi: 10.1377/hlthaff.28.1.206. http://europepmc.org/abstract/MED/19124872 .28/1/206 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 122.Halpern SD, Asch DA, Volpp KG. Commitment contracts as a way to health. BMJ. 2012 Jan 30;344:e522. doi: 10.1136/bmj.e522. http://europepmc.org/abstract/MED/22290100 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 123.Volpp KG, Loewenstein G, Asch DA. Assessing value in health care programs. J Am Med Assoc. 2012;307(20):2153–4. doi: 10.1001/jama.2012.3619. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 124.Volpp KG, Loewenstein G, Asch DA. Choosing wisely: low-value services, utilization, and patient cost sharing. J Am Med Assoc. 2012 Oct 24;308(16):1635–6. doi: 10.1001/jama.2012.13616. http://europepmc.org/abstract/MED/23093160 .1386618 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 125.Loewenstein G, Asch DA, Friedman JY, Melichar LA, Volpp KG. Can behavioural economics make us healthier? BMJ. 2012 May 23;344:e3482. doi: 10.1136/bmj.e3482. http://www.bmj.com/cgi/pmidlookup?view=long&pmid=22623635 . [DOI] [PubMed] [Google Scholar]
- 126.Loewenstein G, John LK, Volpp K. Using Decision Errors to Help People Help Themselves. In: Shafer E, editor. The Behavioral Foundations of Public Policy. Princeton, NJ: Princeton University Press; 2012. [Google Scholar]
- 127.Volpp KG, Loewenstein G. Using ideas from behavioral economics to promote improvements in health behaviors. In: Kahan S, Gielen AC, Fagan PJ, Green LW, editors. Health Behavior Change in Populations. Baltimore, MD: Johns Hopkins University Press; 2014. [Google Scholar]
- 128.Falk-Krzesinski HJ, Börner K, Contractor N, Fiore SM, Hall KL, Keyton J, Spring B, Stokols D, Trochim W, Uzzi B. Advancing the science of team science. Clin Transl Sci. 2010 Oct;3(5):263–6. doi: 10.1111/j.1752-8062.2010.00223.x. http://europepmc.org/abstract/MED/20973925 . [DOI] [PMC free article] [PubMed] [Google Scholar]