Abstract
Background
The public health impact of behavioral and biobehavioral interventions to prevent and treat mental health and substance use problems hinges on developing methods to strategically maximize their effectiveness, affordability, scalability, and efficiency.
Methods
The multiphase optimization strategy (MOST) is an innovative, principled framework that guides the development of multicomponent interventions. Each phase of MOST (Preparation, Optimization, Evaluation) has explicit goals and a range of appropriate research methods to achieve them. Methods for attaining Optimization and Evaluation phase goals are well-developed. However, methods used in the Preparation phase are often highly researcher-specific, and concrete ways to achieve Preparation phase goals are a priority area for further development.
Results
We propose that the discover, design, build, and test (DDBT) framework provides a theory-driven and methods-rich roadmap for achieving the goals of the Preparation phase of MOST, including specifying the conceptual model, identifying and testing candidate intervention components, and defining the optimization objective. The DDBT framework capitalizes on strategies from the field of human-centered design and implementation science to drive its data collection methods.
Conclusions
MOST and DDBT share many conceptual features, including an explicit focus on implementation determinants, being iterative and flexible, and designing interventions for the greatest public health impact. The proposed synthesized DDBT/MOST approach integrates DDBT into the Preparation phase of MOST, thereby providing a framework for rigorous and efficient intervention development research to bolster the success of intervention optimization.
Plain Language Summary
1. What is already known about the topic? Optimizing behavioral interventions to balance effectiveness with affordability, scalability, and efficiency requires a significant investment in intervention development.
2. What does this paper add? This paper provides a structured approach to integrating human-centered design principles into the Preparation phase of the multiphase optimization strategy (MOST).
3. What are the implications for practice, research, or policy? The proposed synthesized model provides a framework for rigorous and efficient intervention development research in the Preparation phase of MOST that will ensure the success of intervention optimization and contribute to improving the public health impact of mental health and substance use interventions.
Keywords: multiphase optimization strategy, intervention design, community-based interventions, mental health intervention, substance abuse prevention
Introduction
An optimization approach to building and adapting behavioral interventions prioritizes continual and quantifiable progress toward solving individual and societal problems, such as addressing the unmet need for mental health and substance use services, promoting critical health behaviors, and preventing the development of mental and physical health problems related to adverse childhood events. Intervention optimization is a process that seeks to identify an intervention that yields the best-expected outcome, given the real-world conditions imposed by the setting in which the intervention will be implemented (Collins, 2018). To increase the potential for public health impact, interventions should be designed for the contexts in which they will be used, prioritizing implementability as much as effectiveness. Often, the intervention design itself is overlooked as an implementation determinant, with more attention paid to strategies to improve intervention adoption and sustainment (Lewis et al., 2021). Given the widely documented gap of nearly two decades between the development of evidence-based interventions in scientific laboratories and their arrival in community settings (Estabrooks et al., 2018), some intervention developers are now adopting an implementation-forward approach: identifying implementation determinants at the outset and considering them throughout the intervention development process. Though adopting an implementation-forward orientation is a laudable goal, the challenge is finding a way to develop interventions for implementation contexts systematically and effectively.
Intervention optimization is an expedient way to develop interventions for successful implementation. The multiphase optimization strategy (MOST) is an innovative, principled framework that guides the development and optimization of multicomponent interventions such that they are effective, affordable, scalable, and efficient (Collins, 2018). MOST outlines three phases that comprise the development of interventions: (a) Preparation, (b) Optimization, and (c) Evaluation. Each phase has a specific purpose and activities, but phase goals may be achieved using various research methods dictated by the research question and type of intervention being developed. In the Preparation phase, researchers derive/revise the conceptual model for the intervention, develop a set of candidate components, define the optimization objective of the intervention package (i.e., the operational definition of the best-expected outcome under key constraints of the implementation setting), and conduct pilot testing of the candidate components. In the Optimization and Evaluation phases, researchers identify the experimental design that best matches the research question and can yield the most valuable scientific information with the least resources expended.
Research methods used to complete the Preparation phase activities are less clearly defined, as these activities are commonly researcher/project-specific and based on theory and empirical evidence. However, the information gathered in the Preparation phase is critically important, as it lays the foundation for the entire optimization process. Intervention optimization using the MOST framework requires a systematic approach overall, and the Preparation phase would benefit from more systematic yet flexible approaches to accomplishing its goals. Concrete ways to achieve Preparation phase activities, such as defining the optimization objective, which often aligns with implementation determinants informed by stakeholders, are a priority development area. In a recent commentary detailing key directions for MOST research in the next decade, Collins and colleagues noted that within the MOST framework, “approaches for obtaining information and managing input from stakeholders have not yet been developed” (Collins et al., 2021, p. 2003; see Windsor et al., 2021 for a notable exception).
We propose that the discover, design, build, and test (DDBT) framework, which centers on an array of human-centered design (HCD) strategies (e.g., cognitive walkthroughs, usability testing, direct observation; Lyon, Brewer, et al., 2020), may serve as a systematic guide to accomplish the goals of the MOST Preparation phase, including effectively designing novel or adapted intervention components in ways that improve their implementability from the outset. Although DDBT may be applicable across multiple phases of MOST, integrating DDBT into the Preparation phase of MOST provides a theory-driven and methods-rich roadmap for preparing a new or adapted intervention for optimization. In DDBT, HCD strategies can be used to achieve specific Preparation phase objectives, such as refining the conceptual model, identifying and designing candidate intervention components, specifying the optimization objective, and conducting pilot studies.
We begin with a brief overview of the two frameworks on which our argument is built—MOST and DDBT. We then discuss the overlap in core principles shared by MOST and DDBT approaches to intervention development. Next, we discuss practical applications of this synthesized approach by describing how the HCD strategies leveraged in DDBT can be used to address each Preparation phase goal. We present an applied example of a project using this synthesized approach to design a digital intervention for children coping with post-separation/divorce interparental conflict that is effective, affordable, scalable, and efficient in the context of the family court. The intention is to provide scaffolding for researchers interested in using DDBT to accomplish the goals of the MOST Preparation phase.
The MOST Framework
MOST is an engineering-inspired framework for developing behavioral or biobehavioral interventions that strategically balances intervention effectiveness with affordability, scalability, and efficiency (Collins, 2018). The guiding tenet of MOST is that interventions should be optimized to maximize efficacy and/or effectiveness within the constraints of a well-defined optimization objective, and only after an intervention has been optimized should it be evaluated in an RCT (Collins, 2018). Thus, “optimization” in MOST refers to a process of systematically determining (a) the most effective combination of intervention components that (b) satisfies the real-world implementation constraints specified by the optimization objective. Intervention components are defined as any part of the intervention that can be separated for experimental study (Collins, 2018). A candidate component may include the intervention content (e.g., cognitive restructuring, behavioral activation), engagement strategies (e.g., coaching calls, text reminders), content delivery structures (e.g., intervention algorithms, measurement-based care), or delivery fidelity approaches (e.g., clinician checklist, training, and supervision protocol). MOST studies often also include a “constant component” that is not under experimental control but is included in all conditions based on clinical or theoretical reasons (e.g., psychoeducation, established minimum best practice). When investigators build interventions based on knowledge of the relative contribution of each intervention component, the result is optimized interventions that are not only effective but also, over time, incrementally more efficient, affordable, and scalable (i.e., able to be implemented at scale without extensive adaptation). Notably, MOST aims to continually optimize interventions across time and settings, making them more effective and efficient with each iteration.
MOST is distinct from the classical treatment package approach to intervention development. The classical treatment package approach examines behavioral interventions as a package (i.e., the effect of all components collectively). It seeks to establish the efficacy of an intervention package through an RCT prior to making necessary adaptations to the intervention for it to be implemented in the intended treatment setting. Consequently, successful implementation often requires more than one resource-intensive RCT to establish preliminary efficacy, assess the effectiveness of the treatment package in the intended setting, and examine implementation strategies and outcomes (Aarons et al., 2017; Curran et al., 2012; Onken et al., 2014). Following this approach often takes more than 15 years to yield an effective and implementable intervention (Karlin & Cross, 2014).
In contrast, MOST prioritizes implementation objectives from the outset. It analyzes component effects to identify an optimized intervention comprising only components that meet the prespecified optimization objective before evaluating an intervention package in a resource-intensive RCT. See Collins (2018) and Guastaferro and Collins (2019) for a comprehensive introduction to MOST. Using MOST, intervention developers move iteratively through three phases (Preparation, Optimization, and Evaluation) to optimize an intervention package. In the Preparation phase, researchers (a) identify or develop a conceptual model to guide intervention component development; (b) identify, create, and/or refine candidate intervention components; (c) conduct pilot studies, as needed, to assess and improve the feasibility and acceptability of each intervention component before transitioning to the Optimization phase; and (d) identify the optimization objective for the intervention package (e.g., maximum cost per patient vs. highest effectiveness).
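As a simplified illustration of how an optimization objective constrains component selection, the sketch below chooses the subset of candidate components with the largest total estimated effect subject to a per-patient cost cap. The component names, costs, effects, and the additive-effects assumption are all hypothetical, invented here to make the idea concrete; they are not drawn from the article or from Collins (2018).

```python
from itertools import chain, combinations

# Hypothetical candidate components with assumed per-patient costs and
# estimated effects (illustrative numbers only, not from the article).
components = {
    "coaching":  {"cost": 40.0, "effect": 4.0},
    "reminders": {"cost": 5.0,  "effect": 1.5},
    "mbc":       {"cost": 15.0, "effect": 2.0},   # measurement-based care
    "workbook":  {"cost": 10.0, "effect": 1.0},
}

def best_package(budget):
    """Return the component subset with the largest summed effect whose
    summed cost stays within the per-patient budget, i.e., the optimization
    objective 'highest effectiveness at no more than `budget` per patient'."""
    names = list(components)
    subsets = chain.from_iterable(
        combinations(names, k) for k in range(len(names) + 1)
    )
    feasible = [s for s in subsets
                if sum(components[c]["cost"] for c in s) <= budget]
    return max(feasible, key=lambda s: sum(components[c]["effect"] for c in s))

print(best_package(30.0))  # → ('reminders', 'mbc', 'workbook')
```

Note how the cap changes the answer: the strongest single component ("coaching") is excluded at a $30 cap but included once the budget allows it, which is exactly the trade-off an optimization objective makes explicit.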
Then, in the Optimization phase, researchers conduct an optimization trial to identify components of the optimized intervention based on the estimated effects of those components (independently and in combination). Optimization trials are rigorous, randomized experiments that use various experimental designs (e.g., factorial trial, sequential multiple assignment randomized trial, micro-randomized trial) to collect the empirical information needed to optimize an intervention for efficacy and implementation. For example, a factorial experimental design allows researchers to examine the main and interaction effects for all intervention components considered for inclusion in the optimized intervention package. Although optimization trials typically focus on behavioral markers of change (e.g., weight loss, minutes of exercise), component effects could be conceptualized as any outcome that is of interest to the investigator, including implementation determinants (e.g., usability, engagement). Implementation determinants could be considered candidate intervention components in the Optimization phase if, for example, the purpose of the optimization trial is to assess implementability effects of design alternatives. As such, the Preparation and Optimization phases guide the systematic (and often iterative) development and refinement of an optimized intervention package to be evaluated in the final Evaluation phase (Freedland, 2020). At the end of the Optimization phase, the researcher has identified the optimized intervention package, considering the optimization objective.
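To make the factorial logic concrete, the following sketch simulates a hypothetical 2^3 factorial optimization trial and estimates each component's main effect as the mean difference between its on and off conditions, averaged over the other factors. The component names and true effect sizes are invented for illustration; a real analysis would use ANOVA or regression with effect coding as described by Collins (2018).

```python
import itertools
import random

random.seed(0)

# Hypothetical 2^3 factorial optimization trial: three candidate components
# (coaching calls, text reminders, measurement-based care), each on or off.
components = ["coaching", "reminders", "mbc"]

# Assumed true component effects, invented purely for this simulation.
true_effects = {"coaching": 4.0, "reminders": 1.5, "mbc": 0.5}

def simulate_outcome(condition):
    """Outcome = baseline + sum of active components' effects + noise."""
    signal = 50.0 + sum(
        true_effects[c] for c, on in zip(components, condition) if on
    )
    return signal + random.gauss(0, 5)

# Every on/off combination is an experimental condition (8 cells);
# simulate 50 participants randomized to each cell.
conditions = list(itertools.product([0, 1], repeat=len(components)))
data = [(cond, simulate_outcome(cond)) for cond in conditions for _ in range(50)]

def main_effect(index):
    """Mean outcome with the component on minus mean with it off,
    averaged over all levels of the other components -- the source of
    a factorial design's efficiency (every participant informs every effect)."""
    on = [y for cond, y in data if cond[index] == 1]
    off = [y for cond, y in data if cond[index] == 0]
    return sum(on) / len(on) - sum(off) / len(off)

for i, name in enumerate(components):
    print(f"{name}: estimated main effect = {main_effect(i):.2f}")
```

Because all 400 simulated participants contribute to each contrast, the three main effects are estimated with the precision of a 400-person two-arm trial, which is the efficiency argument for factorial optimization trials.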
Lastly, in the Evaluation phase, the optimized intervention package is evaluated against a suitable comparison, often but not always in a two-arm RCT. Like the classical treatment package approach, MOST includes an RCT experimental design for intervention evaluation. However, using MOST, the optimized intervention package subjected to experimentation in the RCT has been through two initial phases (Preparation and Optimization) focused on iteratively developing an affordable, effective, and scalable intervention package. Most intervention scientists will be familiar with many of the activities outlined in the Preparation phase. The advantage of MOST is that it provides a framework to achieve these activities in a systematic and intentional way. As a result, researchers using MOST are able to incorporate real-world implementation constraints from the outset of intervention design rather than struggle to overcome these constraints later. In addition to the design of the intervention itself, the MOST framework applies to optimizing implementation strategies to promote the adoption and sustained use of interventions (Broder-Fingert et al., 2019). Ultimately, MOST researchers aim to illuminate ways to continually improve the effectiveness, affordability, scalability, and efficiency of the intervention.
MOST is a comprehensive and iterative implementation-forward framework for optimizing multicomponent interventions based on empirical information about component effects and real-world constraints (Guastaferro & Collins, 2021). Researchers across several public health priorities have used MOST to successfully optimize multicomponent interventions including, but not limited to, smoking cessation (Piper et al., 2016, 2018; Schlam et al., 2016), weight loss (Spring et al., 2020), and prevention of sexually transmitted infections in college students (Tanner et al., 2021; Wyrick et al., 2020).
The DDBT Model
The DDBT model was introduced as a framework to study and improve the usability of intervention components as a key determinant of intervention implementation (Lyon, 2019). Usability is the degree to which the individuals for whom an intervention is intended can accomplish specified goals easily, effectively, and efficiently in a satisfactory manner and in the specific context where it will be implemented (Lyon & Koerner, 2016). DDBT is a three-phase model that combines elements of HCD and implementation science to inform the systematic development of intervention components. The overall goal of DDBT is to maximize usability and implementation outcomes—and ultimately service recipient outcomes—of novel or redesigned interventions. The first phase, Discover, involves learning about the contextual constraints of the intervention (i.e., the setting in which it will be implemented), the needs and preferences of individuals/users, and optimal design elements of the intervention components (e.g., digital vs. in-person). In this first phase, stakeholders play a fundamental role. Stakeholder engagement is critical in DDBT, and HCD more broadly, as user preferences are a key intervention-level implementation determinant. In other industries, consumer preferences and interest in a new product drive its advancement through the development process. In other words, if a product is not of interest to potential consumers, it will not be developed. This mindset is not common for behavioral interventions, which likely contributes to challenges with intervention implementation, especially adoption and engagement. Stakeholders are broadly defined as any individual who will potentially impact the implementation of the intervention.
Stakeholders may be primary users, such as clinicians or other providers who will deliver the intervention, and/or patients, clients, participants, or families who will directly use the intervention, or secondary users, such as administrators who make decisions about which interventions to adopt. Their input determines the primary issues or user needs to be addressed by the intervention, identification of anticipated multilevel implementation determinants, and how current interventions are working in the targeted setting.
The second phase of DDBT, Design and Build, involves iteratively developing and refining intervention components (e.g., via rapid prototyping) so that they are designed to meet the needs of key stakeholders—especially those who will directly use, or benefit from, the intervention (i.e., primary users)—and align with system constraints. DDBT intentionally does not prescribe specific design processes, but researchers using this model employ a variety of techniques (see Dopp et al., 2019 for a glossary of techniques). Typically, the evaluation includes multiple approaches to data collection and information gathering, such as usability testing and observations. Intervention theory can, and we argue should, be considered an input when determining which intervention components should be tested, revised, and ultimately retained. Although it is not required by DDBT, many researchers using this model have considered intervention components using a tasks and packaging breakdown model (see Table 2 in Lyon, Brewer, et al., 2020). See Bearss et al. (2022) for an applied example.
Table 2.
Example Application: DDBT Design and Build Activities for the MOST Preparation Phase.
Note. IPC = interparental conflict; AIM = Acceptability of Intervention Measure; IAM = Intervention Appropriateness Measure; DDBT = discover, design, build, and test; MOST = multiphase optimization strategy.
The third and final phase of DDBT, Test, involves small-scale pilot testing of the newly designed intervention components to ensure they are acceptable to end-users as well as feasible and appropriate for the setting before being formally tested for efficacy or effectiveness (Leon et al., 2011; Lyon et al., 2019). Many researchers adopt a mixed methods approach to this phase, using qualitative methods and quantitative ratings (e.g., Weiner et al., 2017) to assess feasibility, contextual appropriateness, and acceptability. Like MOST, DDBT is applicable to the development or redesign of both client-facing interventions and implementation strategies (Lyon et al., 2019).
Across all three phases of the DDBT framework, HCD strategies are used to achieve the key objectives. A wide range of HCD strategies can bolster intervention implementation. Most prioritize basing the design of intervention components on the preferences and needs of the individuals who will ultimately use the intervention and the setting in which it will be implemented (Dopp et al., 2019). In intervention development, HCD strategies are used to gather data on the wishes, preferences, beliefs, values, and ideas of key stakeholders and develop design solutions. There are at least 30 distinct HCD strategies relevant to intervention development/redesign and implementation (see Dopp et al., 2019 for full review), including design-oriented focus groups and surveys, co-creation sessions, heuristic evaluations, and observational usability tests. Often, intervention developers adopt a multi-method approach to make use of several HCD strategies. For example, following a focus group or qualitative survey to learn about potential interest in and preferences for the intervention, an intervention developer may conduct a co-creation session during which the intervention developer and potential user(s) design an intervention component prototype side-by-side. The intervention developer may decide to conduct user testing sessions with another group of potential users to directly observe them interacting with the prototype, and then conduct an experience sampling study in which potential users engage with the prototype for a period of time during the course of their daily activities. Key themes in HCD research include early and repeated involvement of stakeholders in all phases of the intervention development process and a mixed-method, iterative research design.
Alignment of Principles Underlying MOST and DDBT
We suggest that combining the MOST and DDBT frameworks enhances the intervention optimization process. Importantly, the two frameworks share many core principles, illustrating their shared vision of an implementation-forward approach to intervention development. Beyond that, the two frameworks offer complementary perspectives and research methods that provide a theory-driven and methods-rich roadmap for preparing a new or adapted intervention for optimization.
First, MOST and DDBT both take a systematic, phased, and iterative approach to intervention development. They share the assumption that interventions should be developed based on factors that define successful implementation in real-world settings and ultimately contribute to public health impact. Notably, MOST and DDBT are “problem-agnostic,” suitable for any complex (i.e., multicomponent) psychosocial, behavioral, or public health intervention or implementation strategy. This principle applies to all behavioral intervention development processes broadly defined, including the establishment of novel interventions as well as redesigning interventions for new contexts or populations, regardless of the mode of delivery (i.e., digital, in-person, hybrid, individual, group) or recipient of the intervention (e.g., direct to consumers, training models for clinicians, organization-level implementation strategies). Importantly, both frameworks explicitly acknowledge that intervention researchers should be responsive to changes in systems and constraints over time (e.g., during application in new contexts) and understand that interventions may require redesign to maintain fit.
Second, although intervention effectiveness is a critical element of MOST and DDBT, both contend that intervention developers should balance effectiveness with intervention-level implementation determinants such as affordability, scalability, and efficiency. Prioritization of these determinants ultimately depends on the constraints of the intended setting in which the intervention will be implemented and the users or beneficiaries of the intervention. Thus, the goals of DDBT and MOST are complementary and inextricably linked. For example, an intervention must be highly usable (DDBT goal) to be maximally effective (MOST goal), successfully implementable at scale (DDBT and MOST goal), and efficient (MOST goal).
Third, and possibly most important, DDBT and MOST emphasize the critical role of engaging stakeholders, intervention gatekeepers, and intervention users or beneficiaries in the intervention development process. Input from these sources directly informs the intervention's implementability and needs to be prioritized early in the development process (i.e., in the Preparation phase of MOST and throughout the phases of DDBT). DDBT provides explicit methods and a structure for gathering this information as part of the MOST Preparation phase, and MOST provides the structure to rigorously evaluate the candidate intervention components that emerge from the DDBT process.
Using HCD Strategies of DDBT to Accomplish The Preparation Phase Goals of MOST
The DDBT framework can be used to accomplish the goals of the MOST Preparation phase: determining the key optimization objective for the intended setting, creating theory- and evidence-based intervention components, and pilot testing to assess and improve components’ usability, feasibility, acceptability, and appropriateness (all key implementation determinants; Lyon & Bruns, 2019; Weiner et al., 2017). The DDBT framework primarily employs HCD methods to design or refine interventions and/or implementation strategies with the overarching goal of improving usability and other key intervention-level implementation determinants.
Figure 1 outlines the MOST framework and describes how the DDBT framework can be embedded within the Preparation phase of MOST to meet specified goals. As we outline each step, we describe an ongoing example project that applied the synthesized DDBT/MOST approach to develop a digital intervention for children aged 9–12 years who experience high levels of interparental conflict in the context of parental separation/divorce.
Figure 1.
Application of DDBT to accomplish the Preparation phase of MOST. Note. Adapted with permission from Collins (2018). DDBT = discover, design, build, and test; MOST = multiphase optimization strategy.
The first phase of the DDBT framework, Discover, addresses two MOST Preparation phase activities: identifying or developing the conceptual model and the optimization objective of the intervention (Table 1). The conceptual model is based on theory and extant empirical evidence; it clarifies hypothesized mechanisms and associated intervention targets to inform component selection. In the case of limited theory and/or empirical literature, the conceptual model can be informed by other sources including, but not limited to, secondary data analyses, clinical experience, unpublished literature, and conference presentations (Collins, 2018). The Discover phase uses methods designed to contextualize the conceptual model by deepening understanding of the implementation context. Engaging in mixed-method research activities, intervention developers prioritize structured contextual observation of primary users and other stakeholders (i.e., providers and consumers/patients) who are relevant to natural processes in the intended implementation setting. For example, structured contextual observation could include shadowing providers to capture workflows in the intended setting and/or watching intervention session recordings while providers explain their thought processes and the challenges they faced (Lyon, 2019).
Table 1.
Example Application: DDBT Discover Activities for the MOST Preparation Phase.
Note. IPC = interparental conflict; CFIR = consolidated framework for implementation research.
Additionally, the DDBT Discover phase emphasizes HCD methods to engage primary user stakeholders to ensure interventions meet their needs. These methods are essential for identifying optimization objectives in the MOST Preparation phase. An exemplary Discover phase method is a systematic user-identification process, which begins with creating an overly inclusive list of potential users, clarifying the most relevant subset of user characteristics, and then describing, prioritizing, and selecting the main primary user groups (Lyon, Koerner, et al., 2020). Primary users of interventions most often include providers and intervention recipients, whose needs and constraints must be directly addressed in the intervention design. When an existing intervention is being redesigned, the Discover phase typically involves direct user testing (e.g., cognitive walkthroughs, task-based usability testing) of the original intervention components to identify modification targets. User testing is an efficient approach to understanding how an intervention component operates with users: studies are typically conducted with samples as small as five participants (Turner et al., 2006) and with lower-cost, “lower-fidelity” prototypes (such as paper versions of redesign solutions for a digital intervention) to make the most efficient use of available resources (Lyon, Brewer, et al., 2020; Lyon, Koerner, et al., 2020). Another illustrative Discover phase method identifies stakeholder needs via semistructured interviews, focus groups, and survey studies that target the aspects of an intervention that might be misaligned with a particular context. There are related examples in the literature of MOST researchers using mixed methods approaches to identify or clarify optimization objectives for interventions addressing smoking cessation and substance use prevention (Bernstein et al., 2018; Whitesell et al., 2019).
These stakeholder engagement and data collection methods can enable efficient and effective work with primary end-users to identify preliminary optimization objectives for the intended treatment setting.
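The five-participant figure cited above can be motivated by a standard problem-discovery model from the usability literature: if a given usability problem would be encountered by a proportion p of users, the chance that at least one of n testers encounters it is 1 - (1 - p)^n. The p = 0.31 value below is a commonly cited average per-user detectability, used here only for illustration and not taken from this article.

```python
def discovery_rate(p, n):
    """Probability that a usability problem with per-user detectability p
    is observed at least once across n test participants."""
    return 1 - (1 - p) ** n

# With an assumed average detectability of p = 0.31, five participants
# surface roughly 84% of problems, and returns diminish quickly after that.
for n in (1, 3, 5, 10):
    print(f"n={n:2d}: expected share of problems found = {discovery_rate(0.31, n):.2f}")
```

This is why DDBT-style user testing favors several small, iterative rounds over one large study: each redesigned prototype can be re-tested with a handful of users at low cost.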
The second phase of the DDBT framework, Design and Build, addresses two MOST Preparation phase activities: identifying a set of candidate intervention components and conducting pilot studies (Table 2). The Design and Build methods allow researchers to iteratively develop, evaluate, and refine prototypes of intervention components to meet the needs of the identified stakeholders (Lyon et al., 2019) and then prioritize intervention components for testing (see Lyon, Coifman, et al., 2021; Lyon, Pullmann, et al., 2021; note that when an intervention/strategy is being redesigned, this prioritization typically happens in the Discover phase). Specifically, researchers build prototypes of new intervention components based on findings and insights from the Discover phase and rapidly evaluate and iterate these prototypes with small samples in user testing sessions to improve component usability and other implementation determinants. In the Design and Build phase, stakeholders engage in user testing methods similar to those used in the Discover phase (e.g., think-aloud protocol, structured observation [Beidas et al., 2014; Lyon et al., 2019]), and the primary goal is to understand whether the identified usability problems and implementation constraints have been adequately addressed. In addition to the focus on usability, researchers may use HCD methods to iteratively improve components’ feasibility, acceptability, and appropriateness.
The third phase of the DDBT framework, Test, addresses the final MOST Preparation phase activities of identifying a set of candidate intervention components to be empirically examined in the Optimization phase and conducting intervention component pilot studies (Table 3). The Test phase focuses explicitly on ensuring components are usable, feasible, acceptable, and appropriate in the intended treatment setting. Test phase methods include small-scale pilot testing of the newly designed/refined intervention components using various designs, such as a one- or two-arm pre-post trial or a pilot randomized factorial trial, and are often conceptualized as pilot hybrid effectiveness-implementation trials (Curran et al., 2012). Primary outcomes in the Test phase include usability (Lyon, Pullmann, et al., 2021) and other implementation determinant measures completed by primary users (e.g., providers, consumers, patients), such as the Acceptability of Intervention Measure (AIM), Intervention Appropriateness Measure (IAM), and Feasibility of Intervention Measure (FIM; Weiner et al., 2017). Intervention recipients sometimes complete outcome measures to determine the feasibility of assessment and preliminarily evaluate intervention impact. Qualitative interviews may be conducted to explain and elaborate on quantitative data. Additional implementation feasibility data may also be collected, such as the number of training hours providers need to achieve adequate intervention fidelity and the costs to implement the intervention, including provider training time, delivery time, and ongoing supervision to ensure intervention fidelity over time.
Table 3.
Example Application: DDBT Test Activities for the MOST Preparation Phase.
Note. DDBT = discover, design, build, and test; MOST = multiphase optimization strategy.
Summary, Limitations, and Future Directions
In this article, we reviewed MOST and DDBT, two innovative, implementation-forward frameworks for developing, improving, or redesigning behavioral interventions. Given their many shared conceptual features, and to address the identified need for concrete ways to achieve MOST Preparation phase goals, we proposed a synthesized model (DDBT/MOST) that uses the DDBT framework as a theory-driven and methods-rich roadmap for achieving MOST Preparation phase goals. The Preparation phase provides the critical foundation for the subsequent phases of MOST and cannot be overlooked or undervalued. A recent systematic review by Landoll et al. (2022) proposed a checklist for Preparation phase activities to standardize reporting of this phase of MOST and enhance rigor and reproducibility. Although the checklist helps ensure that Preparation phase activities are reported accurately, the authors note there is room for methodological guidance on accomplishing the goals of this phase. Specifically, concrete methods for carrying out Preparation phase activities systematically will equip intervention researchers with critical tools to conduct rigorous and efficient Preparation phase research, reaping benefits that will flow to all subsequent phases of intervention development, including optimization and evaluation trials. The DDBT framework provides one approach, informed by HCD, to achieve the Preparation phase goals of MOST.
It is important to note that DDBT is not the only framework for systematically achieving the Preparation phase activities; many other approaches similarly take a phased approach to intervention development (see O’Cathain et al., 2019 for a taxonomy of intervention development approaches). DDBT integrates elements from many of the key categories highlighted by O’Cathain et al. (2019), including intervention development that is target population-centered, implementation-based, and stepped or phase-based. However, beyond its natural fit with the fundamental principles of MOST, DDBT offers several unique advantages. First, DDBT provides a useful (but not prescribed) set of methods for intervention evaluation and redesign. Second, DDBT explicitly centers stakeholder feedback throughout the intervention development process. Third, DDBT makes use of HCD strategies (e.g., user testing, observation) to systematically identify usability issues and drive redesign solutions. Nevertheless, DDBT is one of many intervention development approaches, and diverse approaches will be useful for different contexts, needs, and objectives.
DDBT also has important limitations. DDBT is user-centered but deliberately does not always involve full sharing of power with users/participants. This limitation stems from the HCD roots of DDBT and contrasts with a fully participatory process grounded in the rich traditions of community-based participatory research (Minkler & Wallerstein, 2011). DDBT is also primarily focused on optimizing interventions for usability/implementability and less focused on optimizing for effectiveness. In the Test phase, determining preliminary efficacy is important but secondary to determining usability, feasibility, acceptability, and appropriateness. This is one reason that DDBT is an excellent framework for achieving the Preparation phase goals of MOST prior to the Optimization phase. Finally, DDBT is an overarching framework to guide redesign processes; it is not a step-by-step guide for redesigning intervention components. A prescriptive “how to” guide would have benefits, such as making the redesign process concrete and ensuring standardization, but it would also have drawbacks, such as reduced flexibility and the assumption of a one-size-fits-all redesign process.
The proposed synthesized DDBT/MOST approach to developing interventions should be evaluated for practical application and studied more broadly. For example, we see great additional potential for incorporating DDBT/HCD methods in the Optimization and Evaluation phases of MOST. In line with the MOST guiding principle of continual improvement (Collins, 2018), and as highlighted by Huffman et al. (2020), ongoing feedback from stakeholders and end-users is critical throughout intervention development phases. For example, there may be valuable uses of DDBT/HCD in the MOST Optimization phase to evaluate outcomes other than effectiveness, such as component usability. In such cases, Optimization phase activities could include factorial designs in which competing intervention component design options are evaluated using in vivo usability assessment techniques in service settings (Lyon, Brewer, et al., 2020; Lyon, Koerner, et al., 2020).
The proposed synthesized DDBT/MOST approach integrates DDBT into the Preparation phase of MOST, thereby providing a framework for rigorous and efficient research that centers the user voice in intervention development. This systematic approach will enhance the rigor and reproducibility of the Preparation phase activities of MOST, bolster the success of intervention optimization, and contribute to advancements in intervention and implementation science more broadly.
Footnotes
The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Dr. Lyon is an Associate Editor of Implementation Research and Practice; as such, he was not involved in the peer review process for this manuscript. Dr. Guastaferro is a Guest Editor for Implementation Research and Practice; as such, she was not involved in the peer review process for this manuscript.
Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Karey L. O’Hara's work on this paper was supported by a career development award provided by the National Institute of Mental Health (K01MH120321). Lindsey M. Knowles’s work on this paper was supported by a grant from the National Multiple Sclerosis Society (MB1706-27847). Aaron R. Lyon's work on this paper was also supported by NIMH (P50MH115837). The content is solely the authors' responsibility and does not necessarily represent the official views of the National Institutes of Health.
ORCID iDs: Karey L. O’Hara https://orcid.org/0000-0001-7429-1021
Kate Guastaferro https://orcid.org/0000-0002-5616-9708
Aaron R. Lyon https://orcid.org/0000-0003-3657-5060
References
- Aarons G. A., Sklar M., Mustanski B., Benbow N., Brown C. H. (2017). “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implementation Science, 12(1), 111. 10.1186/s13012-017-0640-6
- Bearss K., Tagavi D., Lyon A. R., Locke J. (2022). Iterative redesign of a caregiver-mediated intervention for use in educational settings. Autism, 26(3), 666–677. 10.1177/13623613211066644
- Beidas R. S., Cross W., Dorsey S. (2014). Show me, don’t tell me: Behavioral rehearsal as a training and analogue fidelity tool. Cognitive and Behavioral Practice, 21(1), 1–11. 10.1016/j.cbpra.2013.04.002
- Bernstein S. L., Dziura J., Weiss J., Miller T., Vickerman K. A., Grau L. E., Pantalon M. V., Abroms L., Collins L. M., Toll B. (2018). Tobacco dependence treatment in the emergency department: A randomized trial using the multiphase optimization strategy. Contemporary Clinical Trials, 66, 1–8. 10.1016/j.cct.2017.12.016
- Broder-Fingert S., Kuhn J., Sheldrick R. C., Chu A., Fortuna L., Jordan M., Rubin D., Feinberg E. (2019). Using the multiphase optimization strategy (MOST) framework to test intervention delivery strategies: A study protocol. Trials, 20(1), 728. 10.1186/s13063-019-3853-y
- Brooke J. (1996). SUS: A quick and dirty usability scale. Usability Evaluation in Industry, 189(194), 4–7.
- Collins L. M. (2018). Optimization of behavioral, biobehavioral, and biomedical interventions: The multiphase optimization strategy (MOST). Springer.
- Collins L. M., Strayhorn J. C., Vanness D. J. (2021). One view of the next decade of research on behavioral and biobehavioral approaches to cancer prevention and control: Intervention optimization. Translational Behavioral Medicine, 11(11), 1998–2008. 10.1093/tbm/ibab087
- Compas B. E., Jaser S. S., Bettis A. H., Watson K. H., Gruhn M. A., Dunbar J. P., Williams E., Thigpen J. C. (2017). Coping, emotion regulation, and psychopathology in childhood and adolescence: A meta-analysis and narrative review. Psychological Bulletin, 143(9), 939–991. 10.1037/bul0000110
- Cummings E. M., Schermerhorn A. C., Davies P. T., Goeke-Morey M. C., Cummings J. S. (2006). Interparental discord and child adjustment: Prospective investigations of emotional security as an explanatory mechanism. Child Development, 77(1), 132–152. 10.1111/j.1467-8624.2006.00861.x
- Curran G. M., Bauer M., Mittman B., Pyne J. M., Stetler C. (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care, 50(3), 217–226. 10.1097/MLR.0b013e3182408812
- Damschroder L. J., Aron D. C., Keith R. E., Kirsh S. R., Alexander J. A., Lowery J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50. 10.1186/1748-5908-4-50
- Davidson R. D., O’Hara K. L., Beck C. J. A. (2014). Psychological and biological processes in children associated with high conflict parental divorce. Juvenile and Family Court Journal, 65(1), 29–44. 10.1111/jfcj.12015
- Dopp A. R., Parisi K. E., Munson S. A., Lyon A. R. (2019). A glossary of user-centered design strategies for implementation experts. Translational Behavioral Medicine, 9(6), 1057–1064. 10.1093/tbm/iby119
- Estabrooks P. A., Brownson R. C., Pronk N. P. (2018). Dissemination and implementation science for public health professionals: An overview and call to action. Preventing Chronic Disease, 15, E162. 10.5888/pcd15.180525
- Freedland K. E. (2020). Pilot trials in health-related behavioral intervention research: Problems, solutions, and recommendations. Health Psychology, 39(10), 851–862. 10.1037/hea0000946
- Grych J. H. (1998). Children’s appraisals of interparental conflict: Situational and contextual influences. Journal of Family Psychology, 12(3), 437–453. 10.1037/0893-3200.12.3.437
- Guastaferro K., Collins L. M. (2019). Achieving the goals of translational science in public health intervention research: The multiphase optimization strategy (MOST). American Journal of Public Health, 109(Suppl. 2), S128–S129. 10.2105/AJPH.2018.304874
- Guastaferro K., Collins L. M. (2021). Optimization methods and implementation science: An opportunity for behavioral and biobehavioral interventions. Implementation Research and Practice, 2, 26334895211054364. 10.1177/26334895211054363
- Huffman J. C., Millstein R. A., Celano C. M., Healy B. C., Park E. R., Collins L. M. (2020). Developing a psychological–behavioral intervention in cardiac patients using the multiphase optimization strategy: Lessons learned from the field. Annals of Behavioral Medicine, 54(3), 151–163. 10.1093/abm/kaz035
- Karlin B. E., Cross G. (2014). From the laboratory to the therapy room: National dissemination and implementation of evidence-based psychotherapies in the U.S. Department of Veterans Affairs Health Care System. The American Psychologist, 69(1), 19–33. 10.1037/a0033888
- Landoll R. R., Vargas S. E., Samardzic K. B., Clark M. F., Guastaferro K. (2022). The preparation phase in the multiphase optimization strategy (MOST): A systematic review and introduction of a reporting checklist. Translational Behavioral Medicine, 12(2), 291–303. 10.1093/tbm/ibab146
- Leon A. C., Davis L. L., Kraemer H. C. (2011). The role and interpretation of pilot studies in clinical research. Journal of Psychiatric Research, 45(5), 626–629. 10.1016/j.jpsychires.2010.10.008
- Lewis C. C., Mettert K., Lyon A. R. (2021). Determining the influence of intervention characteristics on implementation success requires reliable and valid measures: Results from a systematic review. Implementation Research and Practice, 2, 263348952199419. 10.1177/2633489521994197
- Long N., Slater E., Forehand R., Fauber R. (1988). Continued high or reduced interparental conflict following divorce: Relation to young adolescent adjustment. Journal of Consulting and Clinical Psychology, 56(3), 467–469. 10.1037/0022-006X.56.3.467
- Lyon A. (2019). How implementable is that evidence-based practice? Designing and supporting streamlined and contextually-appropriate innovations in behavioral health [Webinar]. http://cepim.northwestern.edu/calendar-events/2019-02-05
- Lyon A. R., Brewer S. K., Areán P. A. (2020a). Leveraging human-centered design to implement modern psychological science: Return on an early investment. The American Psychologist, 75(8), 1067–1079. 10.1037/amp0000652
- Lyon A. R., Bruns E. J. (2019). User-centered redesign of evidence-based psychosocial interventions to enhance implementation—Hospitable soil or better seeds? JAMA Psychiatry, 76(1), 3. 10.1001/jamapsychiatry.2018.3060
- Lyon A. R., Coifman J., Cook H., McRee E., Liu F. F., Ludwig K., Dorsey S., Koerner K., Munson S. A., McCauley E. (2021a). The cognitive walkthrough for implementation strategies (CWIS): A pragmatic method for assessing implementation strategy usability. Implementation Science Communications, 2(1), 78. 10.1186/s43058-021-00183-0
- Lyon A. R., Koerner K. (2016). User-centered design for psychosocial intervention development and implementation. Clinical Psychology, 23(2), 180–200. 10.1111/cpsp.12154
- Lyon A. R., Koerner K., Chung J. (2020b). Usability evaluation for evidence-based psychosocial interventions (USE-EBPI): A methodology for assessing complex intervention implementability. Implementation Research and Practice, 1, 263348952093292. 10.1177/2633489520932924
- Lyon A. R., Munson S. A., Renn B. N., Atkins D. C., Pullmann M. D., Friedman E., Areán P. A. (2019). Use of human-centered design to improve implementation of evidence-based psychotherapies in low-resource communities: Protocol for studies applying a framework to assess usability. JMIR Research Protocols, 8(10), e14990. 10.2196/14990
- Lyon A. R., Pullmann M. D., Jacobson J., Osterhage K., Al Achkar M., Renn B. N., Munson S. A., Areán P. A. (2021b). Assessing the usability of complex psychosocial interventions: The Intervention Usability Scale. Implementation Research and Practice, 2, 263348952098782. 10.1177/2633489520987828
- Minkler M., Wallerstein N. (2011). Community-based participatory research for health: From process to outcomes. John Wiley & Sons.
- O’Cathain A., Croot L., Sworn K., Duncan E., Rousseau N., Turner K., Yardley L., Hoddinott P. (2019). Taxonomy of approaches to developing interventions to improve health: A systematic methods overview. Pilot and Feasibility Studies, 5(1), 41. 10.1186/s40814-019-0425-6
- O’Hara K. L., Rhodes C. A., Wolchik S. A., Sandler I. N., Yun-Tein J. (2021). Longitudinal effects of postdivorce interparental conflict on children’s mental health problems through fear of abandonment: Does parenting quality play a buffering role? Child Development, 92(4), 1476–1493. 10.1111/cdev.13539
- Onken L. S., Carroll K. M., Shoham V., Cuthbert B. N., Riddle M. (2014). Reenvisioning clinical science: Unifying the discipline to improve the public health. Clinical Psychological Science, 2(1), 22–34. 10.1177/2167702613497932
- Piper M. E., Cook J. W., Schlam T. R., Jorenby D. E., Smith S. S., Collins L. M., Mermelstein R., Fraser D., Fiore M. C., Baker T. B. (2018). A randomized controlled trial of an optimized smoking treatment delivered in primary care. Annals of Behavioral Medicine, 52(10), 854–864. 10.1093/abm/kax059
- Piper M. E., Fiore M. C., Smith S. S., Fraser D., Bolt D. M., Collins L. M., Mermelstein R., Schlam T. R., Cook J. W., Jorenby D. E., Loh W.-Y., Baker T. B. (2016). Identifying effective intervention components for smoking cessation: A factorial screening experiment. Addiction, 111(1), 129–141. 10.1111/add.13162
- Rhoades K. A. (2008). Children’s responses to interparental conflict: A meta-analysis of their associations with child adjustment. Child Development, 79(6), 1942–1956. 10.1111/j.1467-8624.2008.01235.x
- Sandler I. N., Tein J.-Y., West S. G. (1994). Coping, stress, and the psychological symptoms of children of divorce: A cross-sectional and longitudinal study. Child Development, 65(6), 1744–1763. 10.1111/j.1467-8624.1994.tb00846.x
- Schlam T. R., Fiore M. C., Smith S. S., Fraser D., Bolt D. M., Collins L. M., Mermelstein R., Piper M. E., Cook J. W., Jorenby D. E., Loh W.-Y., Baker T. B. (2016). Comparative effectiveness of intervention components for producing long-term abstinence from smoking: A factorial screening experiment. Addiction, 111(1), 142–155. 10.1111/add.13153
- Schleider J. L., Dobias M. L., Sung J. Y., Mullarkey M. C. (2020). Future directions in single-session youth mental health interventions. Journal of Clinical Child & Adolescent Psychology, 49(2), 264–278. 10.1080/15374416.2019.1683852
- Spring B., Pfammatter A. F., Marchese S. H., Stump T., Pellegrini C., McFadden H. G., Hedeker D., Siddique J., Jordan N., Collins L. M. (2020). A factorial experiment to optimize remotely delivered behavioral treatment for obesity: Results of the Opt–IN study. Obesity, 28(9), 1652–1662. 10.1002/oby.22915
- Tanner A. E., Guastaferro K. M., Rulison K. L., Wyrick D. L., Milroy J. J., Bhandari S., Thorpe S., Ware S., Miller A. M., Collins L. M. (2021). A hybrid evaluation-optimization trial to evaluate an intervention targeting the intersection of alcohol and sex in college students and simultaneously test an additional component aimed at preventing sexual violence. Annals of Behavioral Medicine, 55(12), 1184–1187. 10.1093/abm/kaab003
- Turner C. W., Lewis J. R., Nielsen J. (2006). Determining usability test sample size. International Encyclopedia of Ergonomics and Human Factors, 3(2), 3084–3088.
- Weiner B. J., Lewis C. C., Stanick C., Powell B. J., Dorsey C. N., Clary A. S., Boynton M. H., Halko H. (2017). Psychometric assessment of three newly developed implementation outcome measures. Implementation Science, 12(1). 10.1186/s13012-017-0635-3
- Whitesell N. R., Mousseau A. C., Keane E. M., Asdigian N. L., Tuitt N., Morse B., Zacher T., Dick R., Mitchell C. M., Kaufman C. E. (2019). Integrating community-engagement and a multiphase optimization strategy framework: Adapting substance use prevention for American Indian families. Prevention Science, 20(7), 1136–1146. 10.1007/s11121-019-01036-y
- Windsor L. C., Benoit E., Pinto R. M., Gwadz M., Thompson W. (2021). Enhancing behavioral intervention science: Using community-based participatory research principles with the multiphase optimization strategy. Translational Behavioral Medicine, 11, 1596–1605. 10.1093/tbm/ibab032
- Wyrick D. L., Tanner A. E., Milroy J. J., Guastaferro K., Bhandari S., Kugler K. C., Thorpe S., Ware S., Miller A. M., Collins L. M. (2022). It matters: Optimization of an online intervention to prevent sexually transmitted infections in college students. Journal of American College Health, 70(4), 1212–1222. 10.1080/07448481.2020.1790571