Journal of Clinical and Translational Science
2021 Sep 13;5(1):e180. doi: 10.1017/cts.2021.850

A stakeholder engagement method navigator webtool for clinical and translational science

Bethany M Kwan 1,2,3,4, Kate Ytell 1,2, Marilyn Coors 5,6, Matthew DeCamp 1,5,7, Brad Morse 1,7, Julie Ressalam 5, Jenna E Reno 1,2,3, Meleah Himber 5, Julie Maertens 2, Robyn Wearner 3, Kaylee Gordon 4, Matthew K Wynia 1,5,7,8
PMCID: PMC8596067  PMID: 34849255

Abstract

Stakeholder engagement is increasingly expected by funders and valued by researchers in clinical and translational science, yet many researchers lack access to expert consultation or training in selecting appropriate stakeholder engagement methods. We describe the development of a novel Stakeholder Engagement Navigator webtool. We conducted an environmental scan and literature review, along with investigator interviews, surveys, and group discussions facilitated by engagement experts. We formally reviewed and cataloged 29 distinct engagement methods. To develop the webtool, we used an iterative design process that followed the Design Thinking phases: empathize, define, ideate, prototype, and test. As prioritized during user testing, the Stakeholder Engagement Navigator webtool both educates investigators and guides them in selecting an engagement method based on key criteria. The V1.0 Navigator webtool filters methods first by purpose of engagement (noted by 62% of users as the highest-priority criterion), then by budget, time per stakeholder interaction, and total interactions. The tool is available at DICEMethods.org. The Stakeholder Engagement Navigator is a user-centered, interactive webtool suitable for use by researchers seeking guidance on appropriate stakeholder engagement methods for clinical and translational research projects.

Keywords: Stakeholder engagement, translational science, research methods, tools, design thinking

Introduction

In the last decade, expectations for engaging patients and other stakeholders throughout the conceptualization, conduct, and dissemination of research have become an established part of the clinical and translational research culture [1-3]. Funders such as the Patient-Centered Outcomes Research Institute (PCORI) and, increasingly, the National Institutes of Health (NIH) encourage or require partnerships with stakeholders throughout the research process [4,5]. A stakeholder is an individual or group responsible for or affected by health- and healthcare-related decisions that can be informed by research evidence, and includes groups like payers, practitioners, and policymakers as well as patients, families, and communities [6]. Stakeholder engagement can improve public commitment to research, make research and the products of research more patient-centered, and enhance the likelihood of successful dissemination and implementation in real-world settings [7-9]. General principles and processes of participatory research [10,11] have been well established. There are multiple methods for stakeholder engagement in research, and empirical evidence on the strengths and weaknesses of specific methods is emerging [12,13], though gaps remain regarding the comparative effectiveness of various engagement methods in relation to specific research contexts and purposes [14].

Researchers seeking guidance on appropriate methods of stakeholder engagement for their projects may turn to clinical translational science award (CTSA) programs or others that offer training and consultation on community and stakeholder engagement [15,16], but these programs typically have limited resources (i.e., expert personnel) for providing consultations. Scalable infrastructure could support improvements in stakeholder-engaged research [17], and self-directed, web-based interactive tools are emerging solutions across clinical and translational research [18,19].

The Data Science to Patient Value (“D2V”) initiative at the University of Colorado Anschutz Medical Campus supports advanced research in data science and data-driven health care, including through pilot award funding and other research support. The D2V Dissemination, Implementation, Communication, Ethics and Engagement (DICE) core consults with researchers who wish to engage stakeholders throughout the process of designing, conducting, and disseminating their work. The process of “designing for dissemination” requires engagement of patients, clinicians, and other health system stakeholders, and the DICE core includes leaders from our local CTSA, bioethicists, clinicians, health services researchers, D&I scientists, and communication, instructional design, and user-centered design experts (see supplementary material for details on DICE core team composition). As with other consultation services, however, the core is a limited resource that cannot meet all campus needs.

The D2V DICE core therefore undertook an iterative process of design, development, and testing of an interactive web-based tool (henceforth “webtool”) to guide researchers in learning about, selecting, and using a variety of methods for stakeholder-engaged research for their grant writing, protocol planning, implementation, and evidence dissemination.

The design process addressed: 1) What education and expert guidance do health researchers need to select and use engagement methods? 2) What criteria of engagement methods and the research context are relevant to decisions about which method to use? 3) What features of a webtool would help researchers with self-directed selection and use of engagement methods?

Methods

Overview of Design and Development Process

The design and development of the engagement methods webtool was guided by user-centered design processes (Fig. 1). The DICE core followed the Design Thinking stages described by IDEO.org: Empathize, Define, Ideate, Prototype, and Test [20]. Design Thinking stages are iterative in nature: teams often return to prior stages with new insights as they progress. The team participated in a self-paced Design Thinking course from IDEO.org during early stages of the project. Multiple prototypes were developed, revised, and refined over time.

Fig. 1. Stakeholder engagement navigator webtool design thinking process. CCTSI, Colorado Clinical and Translational Sciences Institute; D2V, Data Science to Patient Value.

Intended Audience and Context for Use

The webtool is designed with health services researchers in mind, though the engagement methods cataloged and the information on stakeholder engagement provided are not limited to use in health services research. The webtool is designed for use by researchers at all career levels and in any stage of a research project, from planning through implementation to dissemination.

Empathize Stage

During “Empathize” stage activities, our goal was to understand the educational and resource needs of our intended audience and to catalog existing resources. Steps in this stage included D2V pilot grantee consultations and educational events, ethnographic interviews, a literature review of engagement methods, and an environmental scan of comparable engagement-method and consumer-product selection tools, websites, and recommender and filtering systems.

DICE core members conducted formal consultations with D2V pilot grantees (n = 7) across two annual funding cycles, using these as opportunities to explore current resources available to help investigators select methods and to understand when and why researchers make decisions about engagement methods for their research. The DICE core also collaborated with the Colorado Clinical and Translational Sciences Institute (CCTSI)’s community engagement core to provide a required 2-hour educational workshop on principles of community engagement for pilot grantees; workshop evaluations revealed unmet needs for guidance on why to engage stakeholders, what to engage them in, and how to engage them.

A literature review combined with the expertise of the DICE core team identified an initial list of about 40 engagement methods, many from the appendix of Engaging Stakeholders To Identify and Prioritize Future Research Needs by O’Haire et al. [21]. This list was eventually refined to a total of 31.

The environmental scan revealed two exemplar engagement selection webtools: Engage2020’s Action Catalogue [22] (a general societal engagement tool from the European Commission) and Participedia (a global crowdsourcing platform for participatory political processes) [23]. Neither tool focuses on the specific needs of health researchers, but both provided valuable design ideas for our webtool (see supplementary material). A final Empathize stage activity was four ethnographic interviews with research faculty and staff who provide expert consultation and guidance on research methods, including engagement methods. Insights from the environmental scan and ethnographic interviews led to "how might we" design questions, focusing on three areas: 1) helping investigators understand the stakeholder engagement process as well as the timeline and resources involved; 2) working with investigators to identify their goals in performing stakeholder engagement; and 3) guiding investigators to understand what resources they already have, which ones they need, and how best to apply them.

Define Stage

The purpose of the “Define” stage was to clarify and state the core needs and problems of users. Define stage activities included 1) development of user personas and use cases for the webtool and 2) classification of the 31 engagement methods according to criteria relevant to selection and use. The team developed five use cases to illustrate the needs of our key audience (health services researchers) with varying levels of experience with stakeholder engagement methods (see supplementary material). These were used to clarify which design features were important to which types of users and to reduce costs by preventing errant design upfront.

Formal cataloging of the key criteria of each engagement method was based on three steps: a card-sorting activity, a team review, and user surveys. A group of twelve national experts in stakeholder engagement and six members of the DICE team met in-person to undertake a card-sorting activity [24] in which they were asked to organize the list of methods into high-level categories (see supplementary material). Group discussion yielded the idea that the term “methods” did not sufficiently encompass all types of engagement activities; some were more appropriately called “approaches” (e.g., high-level frameworks that do not specify a set process, such as Community-Based Participatory Research), while “methods” were defined as sets of activities with explicit, step-by-step procedures.

The DICE team then undertook a method review process designed to classify each of the 31 methods according to seven dimensions:

1. Bidirectionality (the method supports collaborative discussion with two-way communication rather than unidirectional data collection);
2. “Longitudinality” (the method supports relationships with stakeholders over time rather than a single interaction); together with bidirectionality, this captures consistency with the definition of engagement;
3. Purpose of engagement (including “Identify and explore new perspectives or understanding,” “Expand and diversify stakeholder outreach,” and “Disseminate findings to relevant audiences,” adapted from PCORI’s Engagement Rubric [25]);
4. Cost (high, medium, or low);
5. Duration (time required for any given stakeholder interaction);
6. Level of training/expertise needed to carry out the method effectively; and
7. Strength of the evidence base.

These dimensions were identified as relevant to choosing an engagement method, both scientifically (e.g., appropriateness of the method for achieving engagement goals) and pragmatically (e.g., budget, access to stakeholders, team expertise, and timeline).

The method review process was similar to an NIH-style grant review; two reviewers were assigned to review detailed descriptions of each method and complete a form indicating scores for each of the dimensions using a 9-point scale (see supplementary material). In addition, reviewers were asked to select relevant engagement purposes for each method from the list of purposes above [25]. The review panel convened for discussion, with the primary and secondary reviewers providing their ratings and justification for their selections. A key decision was that strategies that did not align with the definition of engagement (i.e., supporting bidirectional, longitudinal engagement) were re-labeled “tools” rather than methods.
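The panel's scoring and relabeling logic can be illustrated with a small sketch. The cutoff value, field names, and function are hypothetical assumptions for illustration only; the actual panel reached classifications through reviewer discussion, not an automated rule.

```python
# Illustrative sketch of the NIH-style review logic described above.
# The threshold and dimension names are assumptions, not the panel's rules.

def classify_strategy(reviewer_scores, engagement_cutoff=5):
    """Average two reviewers' 1-9 scores per dimension; a strategy that
    averages below the cutoff on bidirectionality or longitudinality is
    labeled a 'tool' rather than a 'method'."""
    avg = {
        dim: sum(scores) / len(scores)
        for dim, scores in reviewer_scores.items()
    }
    is_method = (avg["bidirectionality"] >= engagement_cutoff
                 and avg["longitudinality"] >= engagement_cutoff)
    return ("method" if is_method else "tool"), avg

label, avg = classify_strategy({
    "bidirectionality": (8, 7),   # (primary reviewer, secondary reviewer)
    "longitudinality": (3, 4),    # single-interaction strategy scores low
    "evidence_base": (6, 6),
})
print(label)   # "tool": longitudinality averages 3.5, below the cutoff
```

In the actual process, divergent primary and secondary scores were reconciled in the convened panel discussion rather than by simple averaging.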

To determine user perspectives on important criteria for method selection, the DICE core and a CCTSI partner worked together to administer a 5-item feature prioritization survey at a community engagement seminar in February 2020 (see supplementary material). Twenty-one respondents ranked the importance of website features, rated the helpfulness and importance of method selection criteria, and conveyed their interest in using an engagement method selection tool.

Ideate Stage

The “Ideate” stage purpose was to develop an organizing framework and brainstorm webtool content, features, and organization. Based on Empathize and Define stage activities, it was determined that the webtool should include three core features: education on the principles and purposes of stakeholder engagement, a method selection tool, and guidance on seeking expert consultation. Ultimately, a simple filtering technique was selected. The tool would assess user engagement goals and resources, use this assessment to filter recommended methods, and then offer informational sheets, called “strategy fact sheets,” for the selected method or tool(s). This system would also easily allow addition of more methods and tools over time.
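The simple filtering technique described above can be sketched as follows. The `Strategy` record, its field names, and the criteria values are illustrative assumptions, not the webtool's actual data model:

```python
from dataclasses import dataclass, field

# Hypothetical record for one engagement strategy; fields are illustrative.
@dataclass
class Strategy:
    name: str
    kind: str                                   # "method" or "tool"
    purposes: set = field(default_factory=set)  # engagement purposes served
    cost: str = "medium"                        # "low" | "medium" | "high"
    duration_hours: float = 1.0                 # time per stakeholder interaction

def filter_strategies(catalog, purpose, max_cost, max_duration_hours):
    """Narrow the catalog: purpose first, then pragmatic constraints."""
    cost_rank = {"low": 0, "medium": 1, "high": 2}
    return [
        s for s in catalog
        if purpose in s.purposes
        and cost_rank[s.cost] <= cost_rank[max_cost]
        and s.duration_hours <= max_duration_hours
    ]

catalog = [
    Strategy("Focus groups", "tool", {"explore perspectives"}, "low", 1.5),
    Strategy("Community engagement studio", "method", {"explore perspectives"}, "medium", 2.0),
    Strategy("Deliberative polling", "method", {"explore perspectives"}, "high", 8.0),
]

matches = filter_strategies(catalog, "explore perspectives", "medium", 4.0)
print([s.name for s in matches])   # ['Focus groups', 'Community engagement studio']
```

Because each strategy is just a tagged record, adding new methods and tools over time amounts to appending entries to the catalog, which is the extensibility property noted above.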

Next, the DICE core storyboarded the layout and features of the webtool, led by a team member with expertise in instructional design. This process generated the initial informational and navigational architecture, which was further tested and refined by reference to the use cases and how each would progress, step-by-step, through the method selection process (see supplementary material).

In parallel with storyboarding, we developed an organizing framework reflecting the prioritized webtool content (Fig. 2) and educational content conveying principles and purposes of engagement and defining engagement approaches, methods, and tools.

Fig. 2. Stakeholder engagement method navigator webtool organizing framework. IAP2, International Association for Public Participation; PCORI, Patient-Centered Outcomes Research Institute.

Prototype and Test Stages

Prototyping and testing were carried out to iterate successively more interactive, complete prototypes and to evaluate usability and usefulness. During the Design Thinking course, the DICE core developed an initial static prototype using the Justinmind prototyping and wireframing tool (justinmind.com) [26]. This prototype specified general content and desired functionality to provide users with a sample of engagement strategies based on a few criteria.

Following the storyboarding and using the organizing framework, DICE core members developed an interactive prototype that included pages for educational content and the selection tool. After several rounds of iteration based on input from team members, a team member with expertise in user-centered design led contextual inquiry [27] user testing (see supplementary material) with four individuals representing key user types, identified based on the use cases developed during the Define stage. The resulting webtool underwent further task-oriented Think Aloud [28] testing (see supplementary material) with 16 participants attending a virtual pragmatic research conference in August 2020. Participants were separated into breakout rooms and navigated through the website with the goal of completing specific tasks while vocalizing their cognitive process aloud. Users were given hypothetical scenarios (e.g., “You are a junior researcher with a small budget being asked to find a method suitable for engaging patients on a national scale”) and then instructed to navigate the website within that scenario. User feedback guided one additional round of iteration, yielding the V1.0 webtool.

Results

The V1.0 webtool can be accessed at DICEMethods.org. Select screenshots from the tool are shown in Fig. 3. Overall, webtool design was informed by 77 unique individuals, including 7 D2V pilot grantees, 4 participants in ethnographic interviews, 12 external engagement experts, 20 usability testers, 21 survey respondents, and 14 DICE core team members.

Fig. 3. Screenshots from the stakeholder engagement method navigator webtool.

Education and Guidance in Engagement Method Selection

The first question we sought to answer was: What education and expert guidance do health researchers need to select and use engagement methods?

Insights from the Empathize and Define stages included the following: Researchers understand that stakeholder engagement is valuable and want to include it in their research design and implementation; however, researchers are not familiar with stakeholder engagement methods. Researchers need an efficient means to learn how to conduct stakeholder engagement and to include engagement methods in grant proposals.

These findings drove team discussion and synthesis around the content and scope of education and guidance to be included in the webtool. Educating investigators on the basics of stakeholder engagement was judged as important as guiding method selection. As a result, the webtool includes two main sections, an “Education Hub” and a “Find Engagement Strategies” section, which occupy equal real estate at the top of the website. The Education Hub includes a section on stakeholder engagement basics, such as definitions of stakeholder engagement, its importance in research, a breakdown of core principles of stakeholder engagement, and a guide for identifying stakeholders and establishing their roles. For those already familiar with stakeholder engagement, the Education Hub also includes “A Deeper Dive,” which describes how to distinguish among approaches, methods, and tools and provides users with details on the different engagement approaches they might use as frameworks for their projects.

Engagement Method Cataloging and Selection Criteria

The second question we sought to answer was: What criteria of engagement methods and the research context are relevant to decisions about which method to use?

A consultation intake form developed for D2V engagement consultations served as an early prototype for gathering method selection criteria; it included details such as stage of clinical or translational research (T1-T4), types of stakeholders to be engaged, engagement purpose, available funding to pay stakeholders, and other features selected to help guide expert consultations, only some of which were ultimately included in the selection tool (see supplementary material). From the card-sorting activity conducted during the Define stage, potential method classifications suggested by engagement experts included longitudinality, deliberative approaches, hypothesis-generating methods, and modality (virtual or in-person).

From team sensemaking following the Empathize, Define, and Ideate stage activities, the DICE core ultimately determined that the set of 31 engagement strategies should first be distinguished as approaches, methods, or tools (see supplementary material). We developed definitions for each of these terms, which can be found in Table 1. Four distinguishing criteria emerged as relevant to engagement method selection, addressing both scientific fit (purpose of engagement) and practical fit with resources (budget, duration of individual engagement interactions, and timeline). Results from the feature prioritization survey were used to determine how these criteria would be prioritized by the filtering process. In order of importance as ranked by respondents, the most important criterion in determining what method to use for stakeholder engagement in research was “How well the method achieves the specific goal,” followed by “Skills/personnel required to conduct the engagement method,” “Strengths or evidence base supporting the method,” “Time required to conduct the engagement method,” and “Cost required to conduct the engagement method.” Based on these findings, engagement purpose became the primary filter for the method selection tool.

Table 1.

Stakeholder engagement approaches, methods, and tools: definitions, explanations, and examples

Approach
  Definition: An organizing framework used to establish and understand stakeholders’ roles in decision-making, as well as their roles in the conceptualization, conduct, and dissemination of research.
  Explanation: Approaches provide an overarching framework or set of principles for engagement that help the researcher define the roles of stakeholders in the research process, but do not necessarily prescribe the use of any particular methods or tools.
  Placement in the webtool: Educational section.
  Examples: community-based participatory research; deliberative processes/public deliberation; experience-based co-design.

Method
  Definition: A set of specific tools, techniques, and processes used to enact the ‘high-level’ purposes of engagement: identify and convene stakeholders, create reciprocal relationships (level the playing field), engage in bidirectional communication, elicit perspectives, and make decisions over time and in partnership.
  Explanation: Methods include step-by-step procedures for engagement and often come in pre-existing packages of tools and process guidance (e.g., Boot Camp Translation, citizen juries, Community Engagement Studio), though users may also customize packages of discrete tools and techniques to match their own needs and resources.
  Placement in the webtool: Method selection feature.
  Examples: Boot Camp Translation; Community Engagement Studio; deliberative polling; Delphi technique; stakeholder panel/advisory committee.

Tool
  Definition: A specific activity for gathering information, facilitating group discussion, brainstorming, etc. Different tools are often used in combination, and specific methods sometimes prescribe the use of specific tools alone or in combination.
  Explanation: Tools are not complete, standalone ways to accomplish the purposes of engagement, but are used in the context of a method for engagement. Researchers might select multiple tools to create a custom method that matches their specific needs and resources (e.g., a combination of surveys and facilitated discussions used as part of engagement efforts).
  Placement in the webtool: Method selection feature.
  Examples: focus groups; key informant interviews; nominal group technique; simple ethnography; survey/questionnaire; town hall meeting.

Within the method selection feature of the webtool, engagement purposes are categorized by research stage (planning, implementation, and/or dissemination). Users then further refine recommended methods based on anticipated budget (e.g., accounting for personnel effort, stakeholder incentives, other materials), timeline for project completion, and anticipated availability of the stakeholders (e.g., would they be available for only brief interactions or potentially able to attend longer sessions?). DICE core members developed orienting questions (Fig. 3a) for each distinguishing feature, which are posed both at the beginning of the selection tool (so that researchers may gather what they will need to know in advance) and at the corresponding step in the selection tool. For each method and tool, “strategy fact sheets” describe information on budget, time frame, workload, appropriate applications, materials and personnel needed, and a “how-to” section.

Engagement Navigator Webtool Features

Our final question was: What features of a webtool would help researchers with self-directed selection and use of engagement methods?

Based on the environmental scan of comparable tools from the Empathize stage, we developed a modification of an interactive “bubble” feature that displayed results on Engage2020’s Action Catalogue [22]. We carried this concept over into our webtool but streamlined the user experience by first asking screening questions about the stage of research in which the user plans to incorporate stakeholder engagement activities and their purpose for engaging stakeholders. These questions narrow the engagement methods and tools in the interactive results display to only those applicable to the user’s project. As the user inputs additional information about their project, the methods and tools that are less applicable decrease in size but remain accessible, so that the user can still learn about them if they wish.
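The shrink-but-stay-accessible display behavior might be implemented along these lines. This is a hypothetical sketch; the function name and scaling constants are assumptions, not taken from the actual webtool:

```python
# Sketch of relevance-weighted "bubble" sizing: strategies matching more of
# the user's criteria render larger, but even zero-match strategies keep a
# minimum size so they remain visible and clickable for learning purposes.

def bubble_size(matched_criteria, total_criteria, min_px=24, max_px=96):
    """Linear map from the fraction of criteria matched to a pixel diameter."""
    fraction = matched_criteria / total_criteria if total_criteria else 0
    return round(min_px + fraction * (max_px - min_px))

# A strategy matching all 4 criteria, half of them, and none of them:
sizes = [bubble_size(m, 4) for m in (4, 2, 0)]
print(sizes)   # [96, 60, 24]
```

The nonzero floor is the design choice that distinguishes this display from a hard filter: less applicable strategies are de-emphasized, not hidden.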

The contextual inquiry usability testing from the iterative Prototype and Test stage activities informed design changes. Most design changes to early prototypes related to usability of the engagement method selection results page (see supplementary material). In general, users felt that there was too much content on the results page and that the cognitive effort required to make sense of the page was overly burdensome. Therefore, the design team prioritized streamlining the filtering system feedback by refining visual cues and the visual layout of the content. The results of the Think Aloud usability testing further refined the webtool prototype (see supplementary material). Overall, results from this round of user testing highlighted changes to the visual cues, words, or phrasing used in the interface rather than to the interface functionality itself; that is, user requests focused on quick comprehension.

Discussion

Using an iterative, multi-stage design thinking process, the DICE core identified researcher needs and desired features of a webtool that would facilitate education and guidance in selection of stakeholder engagement methods. As depicted in our organizing framework, the webtool educational content incorporates engagement principles, approaches, and frameworks, while a method selection tool guides users through a methods filtering interface. Our V1.0 webtool includes educational content for beginners (“The Basics”) and more advanced scholars (“A Deeper Dive”). Relevant distinguishing features of engagement methods included both scientific relevance of the methods for specific engagement goals, as well as feasibility and appropriateness considerations (budgets, timelines, stakeholder availability, and team expertise).

Some novel design-based conclusions stemmed from following user experience (UX) best practices during the development process [29]. The DICE core spent time with end-users to understand our audience, who generally think “less is more” with regard to academic web-based content and dislike content and/or visual overload. The concise and playful selector tool allowed end-users to accomplish the goal of identifying appropriate stakeholder engagement strategies easily and efficiently. The design balanced reducing the cognitive load of manipulating the system with the recognition that this audience has a high baseline cognitive capability.

Since we began our design process, PCORI released their web-based Engagement Tool and Resource Repository (https://www.pcori.org/engagement/engagement-resources/Engagement-Tool-Resource-Repository), which provides links to resources to support conduct and evaluation of engagement efforts, organized by resource focus, phase of research, health conditions, stakeholder type, and populations. The PCORI repository contains multiple resources for researchers but does not provide recommendations on selecting engagement methods as our webtool does.

Implications for CTSAs

This work advances the science and practice of clinical and translational research in two ways. First, we demonstrate the utility (but also the resource intensiveness) of design thinking methods for developing these sorts of web-based research guidance tools. To be done well, webtool development requires substantial investments of time and financial resources; in our case, a Dean’s transformational research initiative was the main source of funding, with additional support from the CCTSI. Second, this work informs how researchers choose and curate methods for stakeholder engagement in clinical and translational research. We used an NIH-style review process to rate methods based on our experience and the findings of a literature review. This revealed an obvious research gap: the evidence base behind most engagement methods is limited, and few randomized controlled trials (RCTs) exist that evaluate methods or compare them head-to-head. The literature describes how researchers use different methods of engagement and the positive impact engagement has on research but is relatively silent on whether using particular engagement methods for specific purposes could better or more efficiently achieve the different goals of engagement [3,30,31]. Just as particular scientific methods are uniquely suited to answering specific questions, our project advances the science of engagement by allowing researchers to begin tailoring engagement methods to the task at hand.

Next Steps

Next steps for the Stakeholder Engagement Navigator webtool include developing and implementing a dissemination strategy, designing and conducting evaluations to assess real-world utility, and devising processes to help ensure that the webtool is maintained, updated, and sustained over time.

Acknowledgements

The authors wish to acknowledge the contributions of Suzanne Millward for guiding the DICE team through the beginning stages of the Design Thinking process, Erika Blacksher for substantive contributions to the design of the webtool, the valuable input received from user testing participants, as well as the support of the University of Colorado’s Data Science to Patient Value initiative. Supported by NIH/NCATS Colorado CTSA Grant Number UL1 TR002535. Contents are the authors’ sole responsibility and do not necessarily represent official NIH views.

Supplementary material

For supplementary material accompanying this paper visit http://dx.doi.org/10.1017/cts.2021.850.


Disclosures

The authors have no conflicts of interest to declare.

References

1. Manafò E, Petermann L, Vandall-Walker V, Mason-Lai P. Patient and public engagement in priority setting: a systematic rapid review of the literature. PLoS One 2018; 13(3): e0193579.
2. Kendall C, Fitzgerald M, Kang RS, et al. Still learning and evolving in our approaches: patient and stakeholder engagement among Canadian community-based primary health care researchers. Research Involvement and Engagement 2018; 4(1): 47.
3. Price A, Albarqouni L, Kirkpatrick J, et al. Patient and public involvement in the design of clinical trials: an overview of systematic reviews. Journal of Evaluation in Clinical Practice 2018; 24(1): 240–253.
4. Selker HP, Wilkins CH. From community engagement, to community-engaged research, to broadly engaged team science. Journal of Clinical and Translational Science 2017; 1(1): 5–6.
5. Fleurence R, Selby JV, Odom-Walker K, et al. How the Patient-Centered Outcomes Research Institute is engaging patients and others in shaping its research agenda. Health Affairs 2013; 32(2): 393–400.
6. Concannon TW, Meissner P, Grunbaum JA, et al. A new taxonomy for stakeholder engagement in patient-centered outcomes research. Journal of General Internal Medicine 2012; 27(8): 985–991.
7. Klein G, Gold LS, Sullivan SD, et al. Prioritizing comparative effectiveness research for cancer diagnostics using a regional stakeholder approach. Journal of Comparative Effectiveness Research 2012; 1(3): 241–255.
8. Poger JM, Mayer V, Duru OK, et al. Network engagement in action: stakeholder engagement activities to enhance patient-centeredness of research. Medical Care 2020; 58(Suppl 6 1): S66–S74.
9. Bodison SC, Sankaré I, Anaya H, et al. Engaging the community in the dissemination, implementation, and improvement of health-related research. Clinical and Translational Science 2015; 8(6): 814–819.
10. Kwon SC, Tandon SD, Islam N, Riley L, Trinh-Shevrin C. Applying a community-based participatory research framework to patient and family engagement in the development of patient-centered outcomes research and practice. Translational Behavioral Medicine 2018; 8(5): 683–691.
11. Kirwan JR, de Wit M, Frank L, et al. Emerging guidelines for patient engagement in research. Value in Health 2017; 20(3): 481–486.
12. Harrison JD, Auerbach AD, Anderson W, et al. Patient stakeholder engagement in research: a narrative review to describe foundational principles and best practice activities. Health Expectations 2019; 22(3): 307–316.
13. Bishop AC, Elliott MJ, Cassidy C. Moving patient-oriented research forward: thoughts from the next generation of knowledge translation researchers. Research Involvement and Engagement 2018; 4(1): 23.
14. Pearson CR, Duran B, Oetzel J, et al. Research for improved health: variability and impact of structural characteristics in federally funded community engaged research. Progress in Community Health Partnerships: Research, Education, and Action 2015; 9(1): 17–29.
15. DiGirolamo A, Geller AC, Tendulkar SA, Patil P, Hacker K. Community-based participatory research skills and training needs in a sample of academic researchers from a clinical and translational science center in the Northeast. Clinical and Translational Science 2012; 5(3): 301–305.
16. Shea CM, Young TL, Powell BJ, et al. Researcher readiness for participating in community-engaged dissemination and implementation research: a conceptual framework of core competencies. Translational Behavioral Medicine 2017; 7(3): 393–404.
17. Hendricks D, Shanafelt A, Riggs S, Call K, Eder M. Proceedings of Advancing the Science of Community Engaged Research (CEnR). Innovative and effective methods of stakeholder engagement in translational research: abstracts. BMC Proceedings 2019; 13(2): P1.
18. Meissner P, Cottler LB, Eder M, Michener JL. Engagement science: the core of dissemination, implementation, and translational research science. Journal of Clinical and Translational Science 2020; 4(3): 216–218.
19. Ford B, Rabin B, Morrato EH, Glasgow RE. Online resources for dissemination and implementation science: meeting demand and lessons learned. Journal of Clinical and Translational Science 2018; 2(5): 259–266.
20. IDEO. Design thinking; 2020. (https://www.ideou.com/pages/design-thinking)
21. O’Haire C, McPheeters M, Nakamoto E, et al. AHRQ methods for effective health care. In: Engaging Stakeholders to Identify and Prioritize Future Research Needs. Rockville, MD: Agency for Healthcare Research and Quality (US), 2011.
22. Engage2020. Action catalogue; 2015. (http://actioncatalogue.eu/)
23. Participedia. Welcome to participedia; 2020. (https://participedia.net/about)
24. Spencer D, Warfel T. Card sorting: a definitive guide; 2004. (https://boxesandarrows.com/card-sorting-a-definitive-guide/)
25. Engagement Rubric for Applicants. PCORI (Patient-Centered Outcomes Research Institute) website; 2014. (https://www.pcori.org/sites/default/files/Engagement-Rubric.pdf)
26. Customized Synergy. Computer program; 2014. (http://customizedsynergy.com)
27. Whiteside J, Bennett J, Holtzblatt K. Usability engineering: our experience and evolution. In: Helander M, ed. Handbook of Human-Computer Interaction. Amsterdam: North-Holland; 1988: 791–817.
28. Trulock V. Think Aloud Protocol. Understanding HCI website; 2008.
29. Gualtieri M. Best Practices in User Experience (UX) Design. Design Compelling User Experiences to Wow Your Customers. Cambridge, MA: Forrester Research, Inc., 2009.
30. Gierisch JM, Hughes JM, Williams JW Jr, Gordon AM, Goldstein KM. Qualitative exploration of engaging patients as advisors in a program of evidence synthesis: cobuilding the science to enhance impact. Medical Care 2019; 57(10 Suppl 3): S246–S252.
31. Forsythe LP, Carman KL, Szydlowski V, et al. Patient engagement in research: early findings from the Patient-Centered Outcomes Research Institute. Health Affairs 2019; 38(3): 359–367.


Articles from Journal of Clinical and Translational Science are provided here courtesy of Cambridge University Press