Abstract
Although classic models of implementation emphasized the importance of innovation characteristics in their adoption and sustained use, contemporary implementation research and practice have deprioritized these variables. Human-centered design (HCD) is an approach that grounds product development in information collected about the people and settings that will ultimately use those products. HCD has strong roots in psychological theory, but its application is typically limited to the development of digital technologies. HCD is rarely applied to the design of psychosocial innovations — including both service-recipient-facing interventions and implementation strategies — within the applied psychological disciplines. The current paper reviews the psychological origins of HCD and details pathways through which HCD theories and methods can be leveraged to advance the "core tasks" of contemporary implementation research and practice in psychology. These include (1) identification of multilevel implementation determinants through specification of user needs and contexts; (2) tailoring of implementation strategies, such as contextually driven intervention redesign; and (3) evaluation of implementation mechanisms and outcomes, including disentangling how the core HCD focus on usability relates to closely associated implementation variables such as acceptability, feasibility, and appropriateness. Collectively, these applications provide directions through which to leverage the mature field of HCD, maximize psychology's return on its early theoretical investment, and promote the large-scale impact of findings from across the applied fields of psychology.
Public Health Significance:
Most effective innovations from psychological science are rarely used in routine practice, greatly reducing their public health impact. Human-centered design provides methods and tools through which innovations — and the implementation processes that support them — can be developed or redesigned to ensure public health impact.
Keywords: Implementation, Human-centered design, User-centered design, Adaptation
Editor’s note.
This article is part of a special issue, “Expanding the Impact of Psychology Through Implementation Science,” published in the Xxxxxx 2020 issue of American Psychologist. Shannon Wiltsey Stirman and Rinad S. Beidas served as editors of the special issue, with Anne E. Kazak as advisory editor.
The field of implementation tackles a central problem facing psychologists—ensuring that the knowledge generated through psychological science is implemented and sustained in non-research contexts (Chambers, 2008). Implementation activities can be broadly categorized into implementation research—the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice (Eccles & Mittman, 2006)—and implementation practice—conducting applied work to improve the quality of service systems through the use of research evidence, but without the primary goal of producing generalizable knowledge (Franks & Bory, 2015). Within implementation research and practice, success relies on attention to barriers and facilitators (i.e., determinants) across multiple levels, including the process of implementation as it unfolds over time, characteristics of individuals, aspects of the inner setting (i.e., immediate organization) and outer setting (e.g., inter-organizational linkages; policy), and traits of the intervention itself (Damschroder et al., 2009). Despite wide recognition that implementation research and practice must attend to these multiple levels, recent efforts have focused much more on the organization/system and individual levels, whereas the interventions themselves have received less attention (Lyon & Bruns, 2019).
The recent emergence of a science of intervention adaptation (Baumann, Cabassa, & Stirman, 2017; Chambers & Norton, 2016) has sparked new interest in understanding the characteristics of interventions that bolster and/or detract from implementation success. In examining these aspects of interventions, several researchers have turned to human-centered design (HCD; a.k.a. user-centered design), a field that grounds the development of a product in an understanding of the needs and preferences of the people who will use it (Norman & Draper, 1986). Applying HCD principles to the goals of the implementation field provides new opportunities for improving the design, adaptation, implementation, and sustainment of both interventions and implementation strategies (i.e., methods or techniques to enhance the adoption, implementation, and sustainment of new practices). This paper reviews the psychological origins of HCD, describes current applications in psychology and related disciplines, and details the pathways through which HCD theories and methods can be mobilized to advance the "core tasks" of the contemporary field of implementation, as applied to modern psychological science.
Human-Centered Design (HCD) Overview
HCD is an approach initially intended for product development and the design of management systems (Rubin & Chisnell, 2008). Key HCD terms are defined in Supplemental File 1. HCD involves the contextualized application of "human factors" knowledge (i.e., knowledge of human capabilities and limitations) to improve usability and user experience (Kasdaglis & Stowers, 2016). Typically articulated as a multi-step, iterative process involving initial exploration and articulation of user needs, development of prototypes, and revision of designs in response to data collection, HCD involves the human perspective (i.e., that of the end user) in all steps of creating a solution. It entails numerous trial-and-error applications of the solution, with each iteration resulting in a new, more usable, and often innovative design (Norman & Draper, 1986). HCD is not a method itself, but a field that makes use of quantitative, qualitative, and mixed-methods approaches to align products with contexts of use. Information gathering for HCD is often inherently mixed-methods, relying heavily on qualitative inquiry to understand contextual constraints and user needs, as well as quantitative data to compare alternative designs or evaluate products over multiple iterations (Albert & Tullis, 2013). In this way, HCD closely parallels the use of mixed methods in implementation research. The unique contributions of HCD are most apparent in the specific processes through which qualitative and quantitative data are collected, especially data about a specific product or intervention (e.g., direct, structured interactions with products to elicit usability issues). Given this, HCD techniques have particular merit when improved usability or contextual appropriateness is an explicit objective.
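To make the iterative cycle concrete, the minimal sketch below (ours, not drawn from the HCD literature) caricatures the loop in Python; the numeric "quality" scores and the 0.9 stopping threshold are illustrative stand-ins for real user research and usability-testing evidence.

```python
# Minimal, runnable caricature of the iterative HCD cycle described above.
# The numeric "quality" scores and the 0.9 target are illustrative stand-ins
# for real user research and usability-testing evidence.
import random

def explore_user_needs():
    """Discover phase: interviews, field observations, needs articulation."""
    return {"usability_target": 0.9}  # hypothetical threshold

def test_with_users(design):
    """Usability testing: returns an overall quality estimate for a prototype."""
    return design["quality"]

def revise(design):
    """Redesign in response to findings; each iteration should be more usable."""
    design["quality"] = min(1.0, design["quality"] + random.uniform(0.05, 0.15))
    return design

needs = explore_user_needs()
design = {"quality": 0.5}  # initial low-fidelity prototype
iterations = 0
while test_with_users(design) < needs["usability_target"]:
    design = revise(design)
    iterations += 1
print(f"Reached usability target after {iterations} redesign iterations")
```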
HCD is most commonly applied to the development and evaluation of digital technologies (Altman, Huang, & Breland, 2018; Minichiello, Hood, & Harkness, 2018), but it has also been applied to numerous non-technology problems, such as improving school lunches (Kalil, 2015) or enhancing patient healthcare experiences (IDEO, 2012). Lyon, Dopp, Brewer, Kientz, and Munson (in press) recently described how HCD can advance mental health service accessibility (via streamlined and scalable products), effectiveness (via engaging and targeted innovations), and equity (via contextually and culturally appropriate innovations) by codifying the perspectives of stakeholders. Additionally, building on the multi-stage models that exemplify HCD, emerging frameworks such as Discover, Design-Build, Test (DDBT; Lyon et al., 2019) articulate how a staged HCD process can be applied to both interventions and implementation strategies. DDBT begins with evaluation of unadapted innovations and the target context (Discover) to drive iterative redesign and user testing (Design-Build) before piloting and evaluating implementation and service outcomes in the real world (Test).
A central goal of HCD is improving the usability of solutions (Hartson, 1998; Supplemental File 1). Usability is defined as the extent to which a product can be used effectively by specified users to achieve specified goals in a specified context of use (ISO, 1998). Although usability was originally conceptualized as "ease of use," more contemporary definitions tend to include both ease of use and "usefulness" (Hartson, 1998; Lund, 2001). Maximizing usability ensures that new products, tools, or processes have interfaces that are intuitive, require little learning, and can be easily adopted by typical users (ease of use), and that they align with the contextualized needs of the users for whom they are intended (usefulness) (Kasdaglis & Stowers, 2016). Both aspects are important, as easy-to-use products may not be useful to stakeholders, and vice versa. Despite this broad applicability across domains, many of the principles and practices of contemporary HCD were originally derived from psychology. Nevertheless, HCD remains underutilized as a means of improving the implementation of products from psychological science.
HCD’s Psychological Roots
As displayed in Table 1, HCD has strong roots in psychological theory (e.g., cognitive psychology, organizational psychology, social psychology). Indeed, a landmark design text, "The Design of Everyday Things" (Norman, 2013), was originally published under the title "The Psychology of Everyday Things" (Norman, 1988; emphasis ours). "Software psychology" was a term used in the 1970s and 1980s to describe work infusing a consideration of human characteristics into the design of software and computer systems (e.g., Shneiderman, 1980). The research and practice of software psychology and the "psychology of human-computer interaction" (Card, Newell, & Moran, 1983) contributed several concepts to modern-day HCD, including intentionally describing human beings as they interact with software and systems, developing guidelines for designing systems with an understanding of human factors, and assessing the usability of systems (Carroll, 1997; Patel & Kushniruk, 1998). This work relied heavily on traditional laboratory-based research designs from cognitive psychology and tended to include unrepresentative populations and situations. As a result, the guidelines yielded during this period did not meet the needs of developers designing for real-world users (Carroll, 1997).
Table 1.
HCD principles linked to psychological theory
| Design goals for psychosocial innovations (Lyon & Koerner, 2016) based on HCD usability principles (e.g., Nielsen, 1994) | Definitions | Roots in psychological theory |
|---|---|---|
| Learnability | Users can rapidly build understanding of, or facility in, intervention use. | Cognitive load theory (Sweller, 1988); working memory (Baddeley, 1992) |
| Memorability | Users can easily remember and apply intervention components (without many added supports). | Cognitive complexity theory (Kieras & Polson, 1985); cognitive load theory (Sweller, 1988) |
| Efficiency | Minimal time, effort, and cost are required for intervention use and problem resolution. | Activity theory (Kuutti, 1995); task representation theory (Card, Moran, & Newell, 1980; Card et al., 1983) |
| Low cognitive load | Minimal thinking is required to complete intervention tasks (e.g., tasks are simple / involve few steps). | Cognitive load theory (Sweller, 1988) |
| Error reduction | Users can prevent or rapidly recover from misapplications of intervention components. | Error detection and compensation (Gehring, Goss, Coles, Meyer, & Donchin, 1993) |
| Satisfaction | Intervention is viewed as acceptable and valuable, especially compared to alternate products in the health marketplace. | Theory of reasoned action (Fishbein & Ajzen, 1975) |
| Capitalizing on context | Intervention incorporates/addresses the static properties of the destination context that limit intervention use. | Ecological psychology (Barker, 1968); sociocultural theory of cognitive development (Vygotsky, 1978) |
An additional barrier limiting software psychology's applied value was its conceptual reliance on information processing theory's understanding of users as human computers—"information processors"—interacting with computer systems (Patel & Kushniruk, 1998). A shift occurred in the 1990s, when the need for usable frameworks inspired a number of new theoretical perspectives focused on the specific contexts surrounding humans and computer systems. These included "distributed cognition" and "activity theory," the latter of which has conceptual roots in Lev Vygotsky's sociocultural perspective and "zone of proximal development" (Carroll, 1997). Jean Piaget's theories of learning were also instrumental in generating thought about how learning occurs as people interact with objects (Lourenço, 2012). Perspectives such as activity theory spanned cognitive, developmental, and cultural psychology by describing how individuals' interactions with their environments shape individual, social, and cultural development (Patel & Kushniruk, 1998). Importantly, this evolution in thinking included the infusion of ethnographic and participatory research methods into the field of human-computer interaction (HCI; Hughes, King, Rodden, & Andersen, 1994; Olson & Olson, 2003). Thus, the orientation of HCI research and practice (with HCD traditionally falling under the broad umbrella of HCI) has shifted from a perceptual/cognitive perspective to a social/organizational one (Olson & Olson, 2003), a shift that has facilitated HCD's alignment with implementation. Contemporary HCD is now a robust and multifaceted discipline that is increasingly entering mainstream culture and the popular press (e.g., Gardner, 2017; Parker-Pope, 2016). Supported by a growing diversity of perspectives and multidisciplinary design teams (e.g., collaborating cognitive psychologists, anthropologists, and sociologists; Patel & Kushniruk, 1998; Rogers, 2004), HCD continues to evolve new research agendas and new methods to support them (Carroll, 1997; Sears, Lazar, Ozok, & Meiselwitz, 2008). HCD's contextual design principles and user-centered/participatory design methods are now poised to advance the goals of the implementation field.
HCD’s Utility for Implementation Research and Practice
It is commonly cited that two-thirds of implementation efforts fail (Damschroder et al., 2009; Lewis et al., 2018) and, despite growing attention, progress in closing the longstanding research-to-practice gap has been slow. Leveraging HCD to facilitate the implementation of evidence-based innovations represents a unique contribution to the field and an overdue "return on investment" for the early contributions of psychological science. As noted above, HCD principles are now being applied well beyond digital technologies, including in cancer care (Mullaney, Pettersson, Nyholm, & Stolterman, 2012), evidence-based psychotherapies (Lyon & Bruns, 2019), and complex psychosocial implementation strategies (Lyon et al., 2019).
There is much that HCD can do to improve the uptake of psychosocial innovations arising from the psychological sciences. Applied fields of psychology (e.g., clinical, organizational, educational, forensic, sports) have developed evidence-based strategies to address issues such as improving worker engagement, leadership training, competency assessment in the justice system, and training police in de-escalation practices (Bakker, 2011; de Tribolet-Hardy, Kesic, & Thomas, 2015; Rogers & Johansson-Love, 2009; Scott & Webber, 2008). However, as has been observed across many disciplines, most of these best-practice strategies have met with poor uptake (e.g., Deadrick & Gibson, 2009). A major challenge is that these and other practices developed by psychologists are rarely designed in collaboration with end users or in the intended destination setting. HCD can address these implementation challenges by redesigning evidence-based practices, implementation strategies, or both in ways that incorporate the target audience's needs, learning styles, and operational contexts.
Classic models of implementation (e.g., Damschroder et al., 2009; Rogers, 1962; Wensing & Grol, 2005) have emphasized the importance of innovation characteristics in their adoption. This is perhaps most notable in Rogers' (1962) Diffusion of Innovations theory, which described a range of innovation characteristics that facilitate or inhibit large-scale use (e.g., relative advantage, trialability, adaptability, complexity). However, the bulk of contemporary implementation research has deemphasized innovation-level factors, focusing much more explicitly on individual factors (e.g., practitioner attitudes) and organizational factors (e.g., organizational climate) (Dopp, Parisi, Munson, & Lyon, 2019b; Lewis, Weiner, Stanick, & Fischer, 2015; Waltz, Powell, Fernández, Abadie, & Damschroder, 2019). Although frequently overlooked, attention to innovation characteristics aligns well with a growing focus in implementation research and practice on co-creation, defined as deep involvement of key stakeholders to create the infrastructure and context that enable and sustain the use of evidence in practice (Metz, Boaz, & Robert, 2019). HCD provides a set of specific principles and methods that can help accomplish more general co-creation goals (e.g., a focus on local knowledge, brokering connections and building trust, ongoing collaboration, investment and support) that are theorized to enhance implementation outcomes (Proctor et al., 2011) by explicitly involving end users in intervention and implementation strategy development.
Example Applications of HCD to Improve Implementation of Psychological Science
Applications of HCD that are not focused on digital technologies are relatively nascent, but emerging examples can be found in the applied psychological disciplines, including educational, engineering, and clinical/counseling psychology. Minichiello and colleagues (2018) detailed how HCD tools can improve educational instruction and found that, while nearly all applications of HCD in education have focused on the development of digital technologies, a small subset directly applied HCD to improve curricula. In one example relevant to both educational and engineering psychology, Turns et al. (2015) developed and presented detailed, research-based personas (i.e., research-based profiles of hypothetical users and use case situations; Grudin & Pruitt, 2002) to both students and graduate student instructors in an undergraduate engineering course to drive curriculum improvements. Personas were used as prompts for group discussions about teaching and learning in an attempt to replace naive stereotypes. The authors reported that the personas had utility in surfacing each user group's (learners' and instructors') unstated biases related to learning and teaching. Although the authors did not assess specific curriculum improvements that resulted, they concluded that using personas to address common issues in instruction was likely to be valuable to curriculum developers.
In clinical psychology, psychosocial interventions (e.g., psychotherapy, counseling, case management) are a preferred mode of treatment for service recipients (McHugh, Whitton, Peckham, Welge, & Otto, 2013). However, clinicians rarely use evidence-based psychosocial interventions, in part because they were often designed without input from end users (Institute of Medicine, 2015). To improve the usability of Problem Solving Therapy (PST; Nezu, Nezu, & D'Zurilla, 2012), one team employed HCD techniques such as reviewing the modifications previously trained providers had made, which modifications those providers found most useful, and the top challenges they encountered in supporting patients' use of out-of-session clinical content. After three prototype iterations, the team generated a new intervention that was found to be easier to learn and sustain and to produce more robust clinical outcomes than PST (Alexopoulos et al., 2015).
In addition to programs and practices focused on service recipients, HCD can also be applied to implementation strategies, which — just like psychotherapies, educational curricula, and many other types of innovations developed by the applied psychological sciences — also tend to be complex psychosocial interventions (Proctor, Powell, & McMillen, 2013). In a study relevant to school and counseling psychology, Lyon et al. (2018) used cognitive walkthroughs, a method to simulate and assess users’ internal cognitive models for particular tasks or situations (Mahatody, Sagar, & Kolski, 2010), to iteratively design a post-training consultation package for school-based providers interested in engaging in data-driven decision making about services. The approach identified a set of 12 usability issues for the consultation strategy (e.g., case presentations often carried a danger of lengthy or meandering discussions that interfered with other topics) which were addressed via redesigned materials and guidance for consultants and learners (e.g., modeling brevity; strategies to conclude lengthy presentations). Findings indicated that the redesigned strategy was effective in improving the use of data to drive practice decisions.
Addressing the “Core Tasks” of Implementation via HCD
Beyond the examples described above, there are broader opportunities for HCD to advance implementation by contributing new ways to address its core objectives. Increasingly, implementation research and practice have focused on (1) clearly identifying barriers and facilitators (i.e., determinants) of implementation success (Krause et al., 2014); (2) developing strategies to address those determinants (Powell et al., 2015); and (3) evaluating implementation outcomes (Proctor et al., 2011), as well as the mechanisms through which implementation strategies bring about those outcomes (Lewis et al., 2018). Collectively, these objectives reflect the blueprint for many implementation research studies and a basic causal model for implementation practice initiatives. Below, each objective is described along with the specific HCD approaches that can be applied to help achieve it.
Identification of Determinants
Researchers and practitioners have detailed numerous barriers and facilitators of evidence-based practice uptake and sustainment. A large portion of this work has been conducted within the applied psychological disciplines, especially clinical, counseling, school, educational, and health psychology (e.g., Forman, Olin, Hoagwood, Crowe, & Saka, 2009; Jones, Crabb, Turnbull, & Oxlad, 2014; Klingner, Ahwee, Pilonieta, & Menendez, 2003; Pagoto et al., 2007), but also in fields such as organizational psychology and management (e.g., Belling, James, & Ladkin, 2004). Most commonly, papers have presented post-mortem accounts of projects that did not fully achieve their goals, in the form of retrospectively constructed descriptions of "lessons learned." This initial work was critically important for developing a comprehensive understanding of the myriad factors that can contribute to success or failure. Much of the vast body of knowledge these studies generated has now been synthesized, with 601 unique determinants of practice change identified (Krause et al., 2014). Often, implementation determinants are organized within multilevel frameworks, which specify potentially important factors at the outer context, inner context, individual, innovation, and process levels (e.g., Aarons, Hurlburt, & Horwitz, 2011; Damschroder et al., 2009).
As the number of potential implementation determinants has grown, methods of identifying, measuring, and prioritizing them have struggled to keep pace. In the most comprehensive systematic review of implementation assessment instruments to date, Lewis and colleagues (2015) identified over 420 unique instruments, most of which assess determinants. Despite this large number, findings indicated that the vast majority of instruments are neither high quality nor equally distributed across system levels (e.g., 98 instruments at the individual level, but only 4 measuring the outer context). HCD methods can help identify and prioritize determinants in ways that address some of the shortcomings of existing approaches and support implementation and sustainment, both of which have been notoriously difficult to achieve. Recent work by Dopp et al. (2019a) has detailed HCD strategies with high relevance to implementation; Table 2 displays a subset of these strategies.
Table 2.
HCD techniques to identify multilevel implementation determinants
| System level (Damschroder et al., 2009) | Example HCD techniques (Dopp et al., 2019a) |
|---|---|
| Outer setting | Experience models: Specification of relationships between innovation attributes, people, organizations, and places in different use cases or scenarios to identify key aspects that may enable or inhibit implementation. Design charrette sessions: Participatory workshops in which key stakeholders engage in creative or complex decision making in response to user research. |
| Inner setting | Process maps: Represent system-level processes in ways that highlight the critical individuals, their actions, the targets of the actions, and situational triggers as they unfold over time. Observational field visits: Observe the settings in which an innovation will be used to gather information about the everyday activities, environments, interactions, objects, and users in that setting. |
| Individuals | Define target users and their needs: Identify and prioritize target users and meaningful user subgroups (including primary, secondary, and non-users). Personas: Research-based profiles of hypothetical users and use case situations. |
| Innovation | Usability testing: Observe potential users interacting with an innovation to examine how easily, quickly, and satisfyingly they perform specific tasks. Competitive user experience research: Evaluate competing products and how they are used, and compare them to one another, and to the developing innovation, on a set of key dimensions to identify predictors of adoption. |
Outer setting.
HCD methods can be leveraged at multiple levels to achieve the core implementation task of determinant identification. At the level of the outer setting, qualitative techniques such as experience models—which specify relationships between innovation attributes, people, organizations, and places in different use cases or scenarios (i.e., "experiences")—can facilitate the identification of key aspects of multiple settings that may enable or inhibit implementation (i.e., determinants) (Hanington & Martin, 2012). Experience models can present system-level examples of the most common (or potentially problematic) characteristics of outer contexts within which a new innovation may be implemented. Stakeholders with knowledge of the broader system (e.g., funders or policymakers) or its subsystems (e.g., organizational administrators or service providers) can then review the models to evaluate whether they align with their experiences and expectations, fill in gaps, and anticipate unintended consequences (e.g., differential alignment of interventions among agencies can create opportunities for inequities), thus identifying which collections of outer setting determinants may be most likely to relate to implementation success. For example, in sports psychology, explicitly setting goals and delivering feedback is an evidence-based approach to increasing performance (Ward, 2011); however, this technique is not applied universally. If an athletic organization such as the National Women's Soccer League wanted to learn more about how the outer context affects the extent to which teams engage in routine goal setting and feedback, it might construct outer-context profiles for teams that do and do not engage in the practice (including linkages with state-level athletic organizations, coaching policies, etc.), share those profiles with team managers, owners, and players, and identify possible policy changes (e.g., contingent funding or other incentives) that could target outer setting determinants to facilitate adoption and sustained use.
Inner setting.
The inner organizational context is well studied within the implementation literature, with many applications of principles and techniques from related disciplines, such as industrial-organizational psychology (Aarons, Ehrhart, Farahnak, & Hurlburt, 2015; Weiner, 2009). Nevertheless, identifying and prioritizing the most critical organizational determinants remains challenging because of the complexity and dynamism of organizational systems. Research has identified various inner setting variables relevant to both implementation and sustainment (e.g., intervention fit with agency mission; Palinkas, Campbell, & Saldana, 2018). Many strategies apply sets of techniques to address a particular organizational determinant (e.g., implementation climate) but may not be equipped to identify or address other determinants at the organizational level (e.g., the networks and communication channels among employees). HCD techniques that can be brought to bear here include process maps, which leverage user research (i.e., what is known about the needs and expectations of the individuals in a setting) to represent system-level processes in ways that highlight critical individuals, their actions, the targets of those actions, and situational triggers over time (Dopp et al., 2019a). Applied in forensic psychology, process maps might detail all interactions between offenders, correctional officers, and other professionals during initial incarceration. User research could surface specific emotional experiences and needs (e.g., fear, anger) from different perspectives in different situations (e.g., short- versus long-term confinement) that could help identify individuals at greater safety risk (e.g., for self-harm or violence), as sketched below.
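As a minimal sketch of how a process map's elements might be represented in Python, each record below captures a critical individual, action, target, and situational trigger in temporal order; the forensic intake steps are invented for illustration and are not drawn from any cited study.

```python
# Minimal sketch of a process map as ordered steps, each recording the
# critical individual, action, target, and situational trigger described
# above. The forensic intake events are hypothetical examples.
from typing import NamedTuple

class Step(NamedTuple):
    order: int
    actor: str
    action: str
    target: str
    trigger: str

intake_process = [
    Step(1, "correctional officer", "conducts screening interview",
         "offender", "arrival at facility"),
    Step(2, "mental health professional", "administers risk assessment",
         "offender", "screening flags distress (e.g., fear, anger)"),
    Step(3, "correctional officer", "assigns housing unit",
         "offender", "risk assessment complete"),
]

# Surface steps where user research suggests elevated safety risk.
flagged = [s for s in intake_process if "distress" in s.trigger]
for s in flagged:
    print(f"Step {s.order}: {s.actor} -> {s.action} ({s.trigger})")
```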
Individuals.
Along with organizational determinants, variables at the level of the individuals involved in implementation—most often individual service providers, but also service recipients—are among the most commonly studied in implementation research (Lewis et al., 2015). Nevertheless, this knowledge has not translated into adequate market segmentation (i.e., dividing markets into specific target groups) for the potential users of new psychological innovations. Most often, there is an implicit assumption that these innovations, such as evidence-based psychosocial interventions for mental health problems (e.g., cognitive behavioral therapy protocols for depression), are designed for adoption by all potential users (Lyon & Koerner, 2016). In reality, however, an innovation intended for all possible users (e.g., all therapists or all depressed clients) is unlikely to effectively meet the needs of any particular subgroup. This perspective is neatly captured by the mantra, "if you are designing for everyone, you are designing for no one." HCD has a strong orientation toward the explicit identification and segmentation of user groups and the prioritization of specific subgroups affected by the problem a design is intended to address. These include primary users (whose needs should be prioritized), secondary users (whose needs can be met if they do not compromise the needs of the primary users), and even negative users (for whom the innovation is explicitly not intended) (Cooper, Reimann, & Cronin, 2007). Lyon and Koerner (2016) articulated how, for most mental health treatments, master's-level therapists are an implicit primary user group, given that they provide the bulk of mental health services nationwide (Hyde, 2013). Other primary users typically include service recipients, whose needs and preferences can be explicitly incorporated into services via HCD in a manner consistent with current movements toward patient-centered care (Selby & Lipstein, 2014). Secondary users, rarely considered, include stakeholders such as system administrators, who often make system-wide intervention adoption decisions.
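A minimal sketch of explicit user segmentation follows; the primary and secondary assignments mirror the examples in the text (master's-level therapists and service recipients as primary users, system administrators as secondary users), while the negative-user example is a hypothetical addition of ours.

```python
# Illustrative sketch of explicit user segmentation for a psychosocial
# intervention, following the primary/secondary/negative distinction
# described above. The negative-user example is hypothetical.
from enum import Enum

class UserGroup(Enum):
    PRIMARY = "needs prioritized in design decisions"
    SECONDARY = "needs met only if primary needs are not compromised"
    NEGATIVE = "explicitly not designed for"

segments = {
    "master's-level therapists": UserGroup.PRIMARY,
    "service recipients": UserGroup.PRIMARY,
    "system administrators": UserGroup.SECONDARY,
    "researchers seeking a lab-only protocol": UserGroup.NEGATIVE,
}

for user, group in segments.items():
    print(f"{user}: {group.name} ({group.value})")
```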
Despite the utility of applying HCD methods to specify well-defined user groups, there are inevitable tradeoffs among the degree of end-user specification, the scale of implementation, and the need for usability. Although usability is always important, early users of niche innovations are often highly specialized, "super-expert" outliers who have the skills, motivation, or patience to adopt products irrespective of their usability. This phenomenon was clearly present in the early years of computing, when usability was considered relatively unimportant for initial adopters of computer technologies (Hartson, 1998). Just like early computer adopters, the original users of many innovations from the applied psychological disciplines (such as evidence-based psychotherapies) were highly specialized and unrepresentative of the general workforce, often doctoral-level providers or highly motivated graduate students (Weisz et al., 2005). In contrast, as large-scale use of innovations has become an increasingly explicit goal, the target users for these innovations have inevitably shifted to the general population of practitioners. The field now faces a pressing need to address a more diverse user pool, but this has not been accompanied by a corresponding increase in attention to innovation usability. Figure 1 displays the theoretical relationship between an increasingly general user base and the criticality of usability for successful adoption. Given that specialized experts make up only a small percentage of potential users, this relationship is hypothesized to be curvilinear, with usability's importance increasing rapidly beyond a small number of "super-experts." Within psychological science, evidence for this relationship comes from a recent study evaluating the usability of an exposure protocol for anxiety with a stratified sample of novice, intermediate, and expert clinician users (Lyon, Koerner, & Chung, under review). Findings support the theoretical relationship between implementation scale and the importance of usability presented in Figure 1, as expert clinicians encountered considerably fewer usability problems than novice and intermediate clinicians.
Figure 1.
Theoretical association between the scale of implementation and the criticality of innovation usability.
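One hypothetical functional form consistent with the curvilinear relationship in Figure 1 (an illustrative assumption on our part, not a model estimated in any cited study) is a logistic curve in which the criticality of usability remains low across the small super-expert segment and then rises steeply as the user base generalizes:

$$C(s) = \frac{C_{\max}}{1 + e^{-k\,(s - s_0)}}$$

where $s$ indexes the scale or generality of the target user base, $s_0$ marks the boundary of the initial super-expert segment, $C_{\max}$ is the maximum criticality of usability, and $k > 0$ governs how sharply criticality rises beyond that point.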
Innovations.
Despite applicability across system levels, HCD is perhaps best suited to contribute novel approaches to identifying implementation determinants at the innovation level, which is currently underemphasized and underdeveloped in contemporary implementation research (Lyon et al., 2019). For example, relative advantage (i.e., stakeholder perceptions of the advantage of implementing an innovation versus an alternative; Gustafson et al., 2003) is commonly referenced in implementation frameworks (e.g., Damschroder et al., 2009; Rogers, 2010). However, methods for directly assessing relative advantage are few (Lewis et al., 2015). Mixed-methods HCD techniques such as competitive user experience research, which involves developing an understanding of how people use and perceive competing products to identify dimensions along which products are unique and similar (Bergen & Peteraf, 2002), have utility for understanding relative advantage. Information from this technique can be used to construct a competitor array in which the key dimensions along which products vary are assigned weightings (0.01 to 0.99, totaling 1.0 across dimensions) based on their criticality for implementation.
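To illustrate the arithmetic, the sketch below scores two products against a hypothetical competitor array; the dimensions, weights, and 1-5 ratings are invented for illustration and are not drawn from Bergen and Peteraf (2002) or any cited study.

```python
# Hypothetical sketch of a competitor array for relative-advantage analysis.
# Dimension names, weights, and ratings are invented; weights must total
# 1.0 across dimensions, per the weighting scheme described above.

dimensions = {  # dimension -> implementation-criticality weight
    "ease of learning": 0.40,
    "session time required": 0.25,
    "material costs": 0.15,
    "fit with existing workflows": 0.20,
}
assert abs(sum(dimensions.values()) - 1.0) < 1e-9

# 1-5 ratings of each product on each dimension (hypothetical data).
ratings = {
    "new intervention": {"ease of learning": 4, "session time required": 3,
                         "material costs": 5, "fit with existing workflows": 4},
    "usual care":       {"ease of learning": 5, "session time required": 4,
                         "material costs": 4, "fit with existing workflows": 5},
}

def weighted_score(product: str) -> float:
    """Weighted sum across dimensions; higher = greater relative advantage."""
    return sum(weight * ratings[product][dim]
               for dim, weight in dimensions.items())

for product in ratings:
    print(f"{product}: {weighted_score(product):.2f}")
```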
Innovation usability, in particular, is a key determinant of whether an innovation is seen as acceptable and contextually appropriate, widely adopted, and sustained (Lyon & Bruns, 2019). Usability testing, an inherently mixed-methods technique, is a cornerstone of HCD (Hartson, 1998), and one that has recently begun to be applied to the complex programs developed by the applied psychological disciplines. Lyon et al. (under review) described a methodology for evaluating the ease of use and usefulness of psychosocial interventions that involves (1) identifying end users, (2) defining intervention components, (3) testing components, and (4) organizing and prioritizing usability issues. Applied to an exposure protocol for anxiety, the method yielded quantitative usability ratings and 13 qualitative usability problems (e.g., the protocol failed to block contraindicated therapist behaviors that inhibit effectiveness) to be addressed through redesign.
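As a minimal sketch of step (4), usability issues might be organized as structured records and prioritized for redesign; the issue fields and the severity-by-scope ordering rule below are our illustrative assumptions, not the authors' published coding scheme.

```python
# Hypothetical sketch of organizing and prioritizing usability issues
# (step 4 above). The severity and scope fields, and the ordering rule,
# are illustrative assumptions rather than a published scheme.
from dataclasses import dataclass

@dataclass
class UsabilityIssue:
    description: str
    severity: int   # 1 (cosmetic) .. 4 (blocks task completion)
    scope: int      # number of intervention components affected

issues = [
    UsabilityIssue("Protocol does not block contraindicated therapist "
                   "behaviors that inhibit effectiveness", severity=4, scope=3),
    UsabilityIssue("Homework form wording confuses novice clinicians",
                   severity=2, scope=1),
]

# Prioritize for redesign: most severe and most pervasive issues first.
for issue in sorted(issues, key=lambda i: (i.severity, i.scope), reverse=True):
    print(f"[sev {issue.severity} / scope {issue.scope}] {issue.description}")
```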
Selecting and Tailoring Strategies
Given the large number of identified determinants, selecting and tailoring implementation strategies to match determinants has become an increasing focus of implementation research and practice (Baker et al., 2010). However, methods for accomplishing this goal are only just emerging (Grol, Wensing, Eccles, & Davis, 2013; Lewis et al., 2018). Some tailoring methods explicitly incorporate information about implementation strategy feasibility (e.g., concept mapping; Waltz et al., 2015), but this aspect remains underdeveloped, potentially increasing the risk of producing packages of strategies that are, themselves, difficult to implement. Just as HCD methods are relevant to the redesign of psychosocial interventions to improve their usability and implementability, they can also be leveraged to improve the ease of use and usefulness of implementation strategies, enhancing feasibility (Lyon et al., 2018).
Furthermore, strategies that directly target the innovation level are far fewer than those targeting levels such as the inner setting or individual service providers (Waltz et al., 2019). Dopp and colleagues (2019b) determined that just 4% of strategies primarily focus on the innovation level. Nevertheless, deliberate innovation redesign to enhance contextual appropriateness, uptake, and sustainment is a promising implementation strategy that is currently underutilized. As Lyon and Bruns (2019) describe, "attending primarily to the hospitability of systems and individuals and not attending to issues inherent to the interventions that interface with those systems is much akin to a farmer solely focusing on the characteristics of the soil and not the quality or germination potential of the seeds" (p. 3). As noted previously, HCD has the potential to augment existing compilations of implementation strategies with techniques that aim to improve the design quality of interventions, thus enhancing their implementability and decreasing the number of strategies required to promote their adoption (von Thiele Schwarz, Aarons, & Hasson, 2019).
Implementation Outcomes and Mechanisms
Implementation outcomes are defined as the effects of deliberate actions to implement new treatments, practices, and services (Proctor et al., 2011). Implementation mechanisms are processes or events through which implementation strategies operate to bring about desired implementation outcomes (Lewis et al., 2018). HCD is well situated to support the evaluation of both in at least two ways. First, HCD—and the concept of usability—can contribute to ongoing discussions about the conceptual clarity of implementation outcomes and their overlap with related constructs. Although the measurement of behavioral implementation outcomes—including adoption, penetration, fidelity, and sustainment—varies, the distinctions among these constructs are relatively clear and consistent. In contrast, perceptual implementation outcomes such as acceptability, feasibility, and appropriateness are more ambiguous, interrelated, and potentially overlapping with the concept of usability. Proctor and colleagues (2011) defined acceptability as the degree to which a practice or innovation is palatable or satisfactory; appropriateness as the perceived fit of the practice or innovation for a given setting or individual; and feasibility as the extent to which an innovation can be carried out in a given setting. In contrast to these perceptual outcomes, usability is a characteristic of the innovation itself, ideally derived from directly observed interactions with that product (e.g., via usability testing; Albert & Tullis, 2013; Lyon et al., in press). For this reason, usability is best considered a determinant of implementation rather than an implementation outcome. This conceptual relationship, and the extent to which each construct depends on the innovation itself versus its interaction with particular users and use contexts, are displayed in the Venn diagram in Supplemental File 2. Among their recommendations for implementation research, Williams and Beidas (2019) indicated that "more clarity on the relationship between determinants and specific implementation outcomes is needed" (p. 437). Future research could evaluate the usability of a range of innovations to determine how much variance it accounts for in acceptability, feasibility, and appropriateness. Studies should similarly assess the extent to which usability impacts behavioral implementation outcomes (e.g., adoption), as some usability components (e.g., ease of use) may act on these outcomes more directly than others (e.g., usefulness).
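One way the proposed analysis might be sketched, assuming simulated data and the statsmodels library (the variable names and effect sizes below are invented, not findings), is a simple regression of each perceptual outcome on usability:

```python
# Hypothetical sketch of the proposed analysis: regress each perceptual
# implementation outcome on innovation usability to estimate variance
# explained. Data are simulated; effect sizes are assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
usability = rng.normal(size=n)  # e.g., standardized usability scores

outcomes = {
    "acceptability":   0.6 * usability + rng.normal(scale=0.8, size=n),
    "feasibility":     0.4 * usability + rng.normal(scale=0.9, size=n),
    "appropriateness": 0.3 * usability + rng.normal(scale=1.0, size=n),
}

X = sm.add_constant(usability)  # intercept + usability predictor
for name, y in outcomes.items():
    fit = sm.OLS(y, X).fit()
    print(f"{name}: R^2 = {fit.rsquared:.2f}, b = {fit.params[1]:.2f}")
```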
Second, in addition to being a determinant of implementation outcomes, innovation usability is a key theoretical mechanism through which innovation redesign is likely to improve implementation and service outcomes. If usability improves as the result of streamlining an intervention (e.g., by removing less essential components) and implementation outcomes also improve, then usability is a candidate mediator of that improvement. Exploring this connection is the primary objective of the University of Washington ALACRITY Center (UWAC), funded by the National Institute of Mental Health (Lyon et al., 2019). Across 15 pilot projects, the UWAC will document (1) usability issues for interventions and strategies, (2) redesign solutions, and (3) implementation and service outcomes. By integrating findings across all projects, the UWAC will begin to answer the question of whether redesign solutions improve usability and, in turn, implementation and clinical outcomes. Future applications may include explicit measurement approaches for gathering information about psychosocial innovation usability, as well as guidance documents and methods for the systematic redesign of interventions and strategies.
Barriers and Limitations
Despite considerable potential, some barriers may interfere with the widespread application of HCD within implementation science and practice. First, because it requires additional early-stage data collection, HCD can add burden (e.g., time) during initial implementation stages, as explicit evaluation of interventions and implementation strategies is used to guide their adaptation and ensure contextual appropriateness. Nevertheless, early use of HCD is widely understood to decrease the likelihood that costly design mistakes will be carried forward to later stages of development, suggesting that HCD processes may still be cost effective. Second, although HCD has numerous methods for evaluating and understanding relevant contextual (i.e., inner and outer setting) factors, it offers fewer tools for directly addressing those variables. In this way, the organizational focus of many implementation strategies – which may include higher-level techniques to directly influence implementation contexts, such as changing liability laws, accessing new funding, or training leaders (Powell et al., 2015) – is complementary and synergistic with HCD. Third, the work of implementation and HCD professionals is largely disconnected and siloed, creating a challenge for applying HCD in contemporary implementation research and practice. A recent concept mapping study by Dopp and colleagues (2020) assessed overlap between implementation strategies and HCD strategies and found that only three of the ten identified strategy clusters integrated both disciplines, with the remaining seven reflecting strategies from only HCD or only implementation. Although this underscores the potential for unique contributions from each field, better promotion of interdisciplinary collaborations may be needed (Dopp et al., 2019b).
Conclusion
Applications of HCD to identify implementation determinants, develop strategies, and assess implementation outcomes reflect opportunities for psychology to realize a return on its early investment by reaping the benefits of the methods and principles that its theories have helped to generate. Although the various interrelated contributions of psychological theories to HCD are complicated and sometimes murky, much of contemporary HCD owes its success to the field of psychology. Most applications of HCD remain focused on digital innovations (Altman et al., 2018; Minichiello et al., 2018), but clear avenues exist for their application to psychosocial products. Given the persistent research-to-practice gap in the implementation of psychological science and the associated lack of movement in large-scale public health impact for many evidence-based practices, now is the time to mobilize HCD methods to advance the field of implementation within psychology and beyond.
Supplementary Material
Acknowledgements
Preparation of this manuscript was supported, in part, by grants R34MH109605 (PI: Lyon), F32MH116623 (PI: Brewer), and P50MH115837 (PI: Areán), awarded by the National Institute of Mental Health.
References
- Aarons GA, Ehrhart MG, Farahnak LR, & Hurlburt MS (2015). Leadership and Organizational Change for Implementation (LOCI): A randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implementation Science, 10(1), 11. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aarons GA, Hurlburt M, & Horwitz SM (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23. 10.1007/s10488-010-0327-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Albert W, & Tullis T (2013). Measuring the user experience: Collecting, analyzing, and presenting usability metrics (2 edition). Amsterdam ; Boston: Morgan Kaufmann. [Google Scholar]
- Alexopoulos GS, Raue PJ, Kiosses DN, Seirup JK, Banerjee S, & Arean PA (2015). Comparing engage with PST in late-life major depression: A preliminary report. The American Journal of Geriatric Psychiatry, 23(5), 506–513. 10.1016/j.jagp.2014.06.008 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Altman M, Huang TT, & Breland JY (2018). Design thinking in health care. Preventing Chronic Disease, 15 10.5888/pcd15.180128 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Baddeley A (1992). Working memory. Science, 255(5044), 556–559. 10.1126/science.1736359 [DOI] [PubMed] [Google Scholar]
- Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, & Robertson N (2010). Tailored interventions to overcome identified barriers to change: Effects on professional practice and health care outcomes. Cochrane Database of Sys. Reviews, (3). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bakker AB (2011). An evidence-based model of work engagement. Current Directions in Psychological Science, 20(4), 265–269. [Google Scholar]
- Barker RG (1968). Ecological psychology: Concepts and methods for studying the environment of human behavior. Stanford University Press. [Google Scholar]
- Baumann AA, Cabassa LJ, & Stirman SW (2017). Adaptation in dissemination and implementation science (2nd ed.; Brownson RC, Colditz GA, & Proctor EK, Eds.). [Google Scholar]
- Belling R, James K, & Ladkin D (2004). Back to the workplace: How organisations can improve their support for management learning and development. Journal of Management Development, 23(3), 234–255. [Google Scholar]
- Bergen M, & Peteraf MA (2002). Competitor identification and competitor analysis: A broad-based managerial approach. Managerial and Decision Economics, 23(4–5), 157–169. 10.1002/mde.1059 [DOI] [Google Scholar]
- Card SK, Moran TP, & Newell A (1980). Computer text-editing: An information-processing analysis of a routine cognitive skill. Cognitive Psychology, 12(1), 32–74. 10.1016/0010-0285(80)90003-1 [DOI] [Google Scholar]
- Card SK, Newell A, & Moran TP (1983). The psychology of human-computer interaction. Hillsdale, NJ, USA: L. Erlbaum Associates Inc. [Google Scholar]
- Carroll JM (1997). Human-computer interaction: Psychology as a science of design. Annual Review of Psychology, 48(1), 61–83. 10.1146/annurev.psych.48.1.61 [DOI] [PubMed] [Google Scholar]
- Chambers DA (2008). Advancing the science of implementation: A workshop summary. Administration and Policy in Mental Health and Mental Health Services Research, 35(1–2), 3–10. 10.1007/s10488-007-0146-7 [DOI] [PubMed] [Google Scholar]
- Chambers DA, & Norton WE (2016). The adaptome: Advancing the science of intervention adaptation. American Journal of Preventive Medicine, 51(4), S124–S131. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cooper A, Reimann R, & Cronin D (2007). About Face 3: The essentials of interaction design (3rd edition). Indianapolis, IN: Wiley. [Google Scholar]
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50 10.1186/1748-5908-4-50 [DOI] [PMC free article] [PubMed] [Google Scholar]
- de Tribolet-Hardy F, Kesic D, & Thomas SDM (2015). Police management of mental health crisis situations in the community: Status quo, current gaps and future directions. Policing and Society, 25(3), 294–307. [Google Scholar]
- Deadrick DL, & Gibson PA (2009). Revisiting the research–practice gap in HR: A longitudinal analysis. Human Resource Management Review, 19(2), 144–153. [Google Scholar]
- Dopp AR, Parisi KE, Munson SA, & Lyon AR (2019a). A glossary of user-centered design strategies for implementation experts. Translational Beh Med, 9(6), 1057–1064. [DOI] [PubMed] [Google Scholar]
- Dopp AR, Parisi KE, Munson SA, & Lyon AR (2019b). Integrating implementation and user-centered design strategies to enhance the impact of health services: Protocol from a concept mapping study. Health Research Policy and Systems, 17:1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dopp AR, Parisi KE, Munson SA, & Lyon AR (2020). Aligning implementation and user-centered design strategies to enhance the impact of health services: results from a concept mapping study. Implementation Science Communications, 1:17. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Eccles MP, & Mittman BS (2006). Welcome to implementation science. Implementation Science, 1(1), 1 10.1186/1748-5908-1-1 [DOI] [Google Scholar]
- Fishbein M, & Ajzen I (1975). Belief, attitude, intention and behaviour: An introduction to theory and research (Vol. 27). London: Addison-Wesley. [Google Scholar]
- Forman SG, Olin SS, Hoagwood KE, Crowe M, & Saka N (2009). Evidence-based interventions in schools: Developers’ views of implementation barriers and facilitators. School Mental Health, 1(1), 26 10.1007/s12310-008-9002-5 [DOI] [Google Scholar]
- Franks RP, & Bory CT (2015). Who supports the successful implementation and sustainability of evidence-based practices? Defining and understanding the roles of intermediary and purveyor organizations. New Directions for Child and Adolescent Development, 2015(149), 41–56. 10.1002/cad.20112 [DOI] [PubMed] [Google Scholar]
- Gardner L (2017, September 10). Can design thinking redesign higher ed? The Chronicle of Higher Education. Retrieved from https://www.chronicle.com/article/Can-Design-Thinking-Redesign/241126 [Google Scholar]
- Gehring WJ, Goss B, Coles MG, Meyer DE, & Donchin E (1993). A neural system for error detection and compensation. Psychological Science, 4(6), 385–390. [Google Scholar]
- Grol R, Wensing M, Eccles M, & Davis D (2013). Improving patient care: The implementation of change in health care. John Wiley & Sons. [Google Scholar]
- Grudin J, & Pruitt J (2002). Personas, participatory design and product development: An infrastructure for engagement. In Binder J, Gregory J, & Wagner I (Eds.), PDC; (pp. 144–152). Retrieved from http://rossy.ruc.dk/ojs/index.php/pdc/article/view/249 [Google Scholar]
- Gustafson DH, Sainfort F, Eichler M, Adams L, Bisognano M, & Steudel H (2003). Developing and testing a model to predict outcomes of organizational change. Health Services Research, 38(2), 751–776. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hanington B, & Martin B (2012). Universal methods of design. Rockport Publishers. [Google Scholar]
- Hartson HR (1998). Human-computer interaction: Interdisciplinary roots and trends. Journal of Systems and Software, 43, 103–118. 10.1016/S0164-1212(98)10026-2 [DOI] [Google Scholar]
- Hughes J, King V, Rodden T, & Andersen H (1994). Moving out from the control room: Ethnography in system design. Proceedings of the 1994 ACM Conference on Computer Supported Cooperative Work, 429–439. 10.1145/192844.193065 [DOI] [Google Scholar]
- Hyde PS (2013). Report to Congress on the Nation’s substance abuse and mental health workforce issues. US Dept. for Health and Human Serv., Substance Abuse and Mental Health Serv. (January 2013), 10 Retrieved from http://www.cimh.org/sites/main/files/file-attachments/samhsa_bhwork_0.pdf [Google Scholar]
- IDEO. (2012). A hospital centered on the patient experience. Retrieved October 10, 2019, from: https://www.ideo.com/case-study/a-hospital-centered-on-the-patient-experience [Google Scholar]
- Institute of Medicine (2015). Psychosocial interventions for mental and substance use disorders: A framework for establishing evidence-based standards. Retrieved from: http://www.ncbi.nlm.nih.gov/books/NBK305126/ [PubMed]
- International Organization for Standardization. (1998). Ergonomic requirements for office work with visual display terminals (VDTs) - Part 11: Guidance on usability. Geneva, Switzerland: International Organization for Standardization, 9241. [Google Scholar]
- Jones L, Crabb S, Turnbull D, & Oxlad M (2014). Barriers and facilitators to effective type 2 diabetes management in a rural context: A qualitative study with diabetic patients and health professionals. Journal of Health Psychology, 19(3), 441–453.
- Kalil T (2015, September 4). Using human-centered design to make government work better and cost less [The White House: President Barack Obama]. Retrieved October 10, 2019, from https://obamawhitehouse.archives.gov/blog/2015/09/04/using-human-centered-design-make-government-work-better-and-cost-less
- Kasdaglis N, & Stowers K (2016). Beyond human factors: The role of human centered design in developing a safety-critical system. In Stephanidis C (Ed.), HCI International 2016 – Posters' Extended Abstracts (pp. 345–351). Springer International Publishing.
- Kieras D, & Polson PG (1985). An approach to the formal analysis of user complexity. International Journal of Man-Machine Studies, 22(4), 365–394.
- Klingner JK, Ahwee S, Pilonieta P, & Menendez R (2003). Barriers and facilitators in scaling up research-based practices. Exceptional Children, 69(4), 411–429.
- Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, et al. (2014). Identifying determinants of care for tailoring implementation in chronic diseases: An evaluation of different methods. Implementation Science, 9(1), 102.
- Kuutti K (1995). Activity theory as a potential framework for human-computer interaction research. In Nardi B (Ed.), Context and consciousness: Activity theory and human-computer interaction (1st ed., pp. 17–44). Cambridge, MA: The MIT Press.
- Lavery D, Cockton G, & Atkinson MP (1997). Comparison of evaluation methods using structured usability problem reports. Behaviour & Information Technology, 16(4–5), 246–266. 10.1080/014492997119824
- Lewis CC, Klasnja P, Powell B, Tuzzio L, Jones S, Walsh-Bailey C, & Weiner B (2018). From classification to causality: Advancing understanding of mechanisms of change in implementation science. Frontiers in Public Health, 6, 136.
- Lewis CC, Stanick CF, Martinez RG, Weiner BJ, Kim M, Barwick M, & Comtois KA (2015). The Society for Implementation Research Collaboration Instrument Review Project: A methodology to promote rigorous evaluation. Implementation Science, 10, 2.
- Lewis CC, Weiner BJ, Stanick C, & Fischer SM (2015). Advancing implementation science through measure development and evaluation: A study protocol. Implementation Science, 10(1), 102. 10.1186/s13012-015-0287-0
- Lourenço O (2012). Piaget and Vygotsky: Many resemblances, and a crucial difference. New Ideas in Psychology, 30(3), 281–295. 10.1016/j.newideapsych.2011.12.006
- Lund A (2001). Measuring usability with the USE questionnaire. Usability Interface, 8(2), 3–6.
- Lyon AR, & Bruns EJ (2019). User-centered redesign of evidence-based psychosocial interventions to enhance implementation—Hospitable soil or better seeds? JAMA Psychiatry, 76(1), 3–4.
- Lyon AR, Coifman J, Cook H, Liu F, Ludwig K, Dorsey S, … McCauley E (2018, December 3). The Cognitive Walk-through for Implementation Strategies (CWIS): A pragmatic methodology for assessing strategy usability. Presented at the 11th Annual Conference on the Science of Dissemination and Implementation, Washington, DC.
- Lyon AR, Comtois KA, Kerns SE, Landes SJ, & Lewis CC (in press). Closing the science-practice gap in implementation before it widens. In Shlonsky A, Mildon R, & Albers B (Eds.), Effective implementation science. Springer.
- Lyon AR, Dopp AR, Brewer SK, Kientz JA, & Munson SA (in press). Designing the future of children's mental health services. Administration and Policy in Mental Health and Mental Health Services Research.
- Lyon AR, & Koerner K (2016). User-centered design for psychosocial intervention development and implementation. Clinical Psychology: Science and Practice, 23(2), 180–200.
- Lyon AR, Koerner K, & Chung J (under review). Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EPBI): A methodology for assessing complex innovation implementability.
- Lyon AR, Munson SA, Renn BN, Atkins DA, Pullmann MD, Friedman E, & Arean PA (2019). Human-centered design to improve implementation of evidence-based psychotherapies in low-resource communities: UW ALACRITY Center Methods Core protocol. JMIR Research Protocols, 8(10).
- Mahatody T, Sagar M, & Kolski C (2010). State of the art on the Cognitive Walkthrough Method, its variants and evolutions. International Journal of Human–Computer Interaction, 26(8), 741–785. 10.1080/10447311003781409
- McHugh RK, Whitton SW, Peckham AD, Welge JA, & Otto MW (2013). Patient preference for psychological vs. pharmacological treatment of psychiatric disorders: A meta-analytic review. The Journal of Clinical Psychiatry, 74(6), 595.
- Metz A, Boaz A, & Robert G (2019). Co-creative approaches to knowledge production: What next for bridging the research to practice gap? Evidence & Policy: A Journal of Research, Debate and Practice, 15(3), 331–337.
- Minichiello A, Hood JR, & Harkness DS (2018). Bringing user experience design to bear on STEM education: A narrative literature review. Journal for STEM Education Research, 1(1), 7–33. 10.1007/s41979-018-0005-3
- Mullaney T, Pettersson H, Nyholm T, & Stolterman E (2012). Thinking beyond the cure: A case for human-centered design in cancer care. International Journal of Design, 6(3).
- Nezu AM, Nezu CM, & D'Zurilla TJ (2012). Problem-solving therapy: A treatment manual (1st ed.). 10.1891/9780826109415
- Norman DA (1988). The psychology of everyday things. New York, NY: Basic Books.
- Norman DA (2013). The design of everyday things. New York, NY: Basic Books.
- Norman DA, & Draper SW (Eds.). (1986). User centered system design: New perspectives on human-computer interaction (1st ed.). Hillsdale, NJ: CRC Press.
- Olson GM, & Olson JS (2003). Human-computer interaction: Psychological aspects of the human use of computing. Annual Review of Psychology, 54(1), 491–516.
- Pagoto SL, Spring B, Coups EJ, Mulvaney S, Coutu M-F, & Ozakinci G (2007). Barriers and facilitators of evidence-based practice perceived by behavioral science health professionals. Journal of Clinical Psychology, 63(7), 695–705.
- Palinkas LA, Campbell M, & Saldana L (2018). Agency leaders' assessments of feasibility and desirability of implementation of evidence-based practices in youth-serving organizations using the stages of implementation completion. Frontiers in Public Health, 6, 161.
- Parker-Pope T (2016, January 4). "Design thinking" for a better you. The New York Times. Retrieved October 9, 2019, from https://well.blogs.nytimes.com/2016/01/04/design-thinking-for-a-better-you/
- Patel VL, & Kushniruk AW (1998). Interface design for health care environments: The role of cognitive science. Proceedings of the AMIA Symposium, 29–37.
- Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, … Kirchner JE (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10, 21. 10.1186/s13012-015-0209-1
- Proctor EK, Powell BJ, & McMillen JC (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8, 139. 10.1186/1748-5908-8-139
- Proctor EK, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, … Hensley M (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38(2), 65–76. 10.1007/s10488-010-0319-7
- Rogers EM (1962). Diffusion of innovations. Free Press of Glencoe.
- Rogers EM (2010). Diffusion of innovations (5th ed.). New York, NY: Simon and Schuster.
- Rogers R, & Johansson-Love J (2009). Evaluating competency to stand trial with evidence-based practice. Journal of the American Academy of Psychiatry and the Law Online, 37(4), 450–460.
- Rogers Y (2004). New theoretical approaches for human-computer interaction. Annual Review of Information Science and Technology, 38(1), 87–143.
- Rubin J, & Chisnell D (2008). Handbook of usability testing: How to plan, design and conduct effective tests. Indianapolis, IN: John Wiley & Sons.
- Scott S, & Webber C (2008). Evidence-based leadership development: The 4L framework. Journal of Educational Administration, 46, 762–776.
- Sears A, Lazar J, Ozok A, & Meiselwitz G (2008). Human-centered computing: Defining a research agenda. International Journal of Human–Computer Interaction, 24(1), 2–16.
- Selby JV, & Lipstein SH (2014). PCORI at 3 years—progress, lessons, and plans. New England Journal of Medicine, 370(7), 592–595.
- Shneiderman B (1980). Software psychology: Human factors in computer and information systems. Cambridge, MA: Winthrop Publishers.
- Sweller J (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285. 10.1207/s15516709cog1202_4
- Turns J, Borgford-Parnell J, & Ferro T (2015). Exploring the usefulness of personas in engineering education. 6th Research in Engineering Education Symposium, Dublin, Ireland.
- von Thiele Schwarz U, Aarons GA, & Hasson H (2019). The Value Equation: Three complementary propositions for reconciling fidelity and adaptation in evidence-based practice implementation. BMC Health Services Research, 19(1), 868.
- Vygotsky LS (1978). Mind in society: The development of higher psychological processes (Rev. ed.; Cole M, John-Steiner V, Scribner S, & Souberman E, Eds.). Cambridge, MA: Harvard University Press.
- Waltz TJ, Powell BJ, Fernández ME, Abadie B, & Damschroder LJ (2019). Choosing implementation strategies to address contextual barriers: Diversity in recommendations and future directions. Implementation Science, 14(1), 42.
- Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, … Kirchner JE (2015). Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science, 10, 109. 10.1186/s13012-015-0295-0
- Ward P (2011). Goal setting and performance feedback. In Behavioral sport psychology (pp. 99–112). New York, NY: Springer.
- Weiner BJ (2009). A theory of organizational readiness for change. Implementation Science, 4(1), 67. 10.1186/1748-5908-4-67
- Weisz JR, Doss AJ, & Hawley KM (2005). Youth psychotherapy outcome research: A review and critique of the evidence base. Annual Review of Psychology, 56, 337–363.
- Wensing M, & Grol R (2005). Characteristics of successful interventions. In Grol R, Wensing M, & Eccles MP (Eds.), Improving patient care: The implementation of change in clinical practice (pp. 60–70). Edinburgh, UK: Elsevier/Butterworth Heinemann.
- Williams NJ, & Beidas RS (2019). Annual Research Review: The state of implementation science in child psychology and psychiatry: A review and suggestions to advance the field. Journal of Child Psychology and Psychiatry, 60(4), 430–450.