Implementation Science Communications. 2024 Sep 3;5:95. doi: 10.1186/s43058-024-00630-8

Applying cognitive walkthrough methodology to improve the usability of an equity-focused implementation strategy

Kelly A Aschbrenner 1,10, Emily R Haines 2, Gina R Kruse 3, Ayotola O Olugbenga 4, Annette N Thomas 4, Tanveer Khan 5, Stephanie Martinez 6, Karen M Emmons 7, Stephen J Bartels 6,7,8,9
PMCID: PMC11373107  PMID: 39227912

Abstract

Background

Our research team partnered with primary care and quality improvement staff in Federally Qualified Community Health Centers (CHCs) to develop Partnered and Equity Data-Driven Implementation (PEDDI) to promote equitable implementation of evidence-based interventions. The current study used a human-centered design methodology to evaluate the usability of PEDDI and generate redesign solutions to address usability issues in the context of a cancer screening intervention.

Methods

We applied the Cognitive Walkthrough for Implementation Strategies (CWIS), a pragmatic assessment method with steps that include group testing with end users to identify and prioritize usability problems. We conducted three facilitated 60-min CWIS sessions with end users (N = 7) from four CHCs that included scenarios and related tasks for implementing a colorectal cancer (CRC) screening intervention. Participants rated the likelihood of completing each task and identified usability issues and generated ideas for redesign solutions during audio-recorded CWIS sessions. Participants completed a pre-post survey of PEDDI usability. Our research team used consensus coding to synthesize usability problems and redesign solutions from transcribed CWIS sessions.

Results

Usability ratings (scale 0–100; higher scores indicate higher usability) of PEDDI averaged 66.3 (SD = 12.4) prior to the CWIS sessions. Scores averaged 77.8 (SD = 9.1) following the three CWIS sessions, improving usability ratings from “marginal acceptability” to “acceptable”. Ten usability problems were identified across four PEDDI tasks, with 2–3 types of usability problems per task. CWIS participants suggested redesign solutions that included making data fields for social determinants of health and key background variables for identifying health equity targets mandatory in the electronic health record, and using asynchronous communication tools to elicit ideas from staff for adaptations.

Conclusions

Usability ratings indicated PEDDI was in the acceptable range following CWIS sessions. Staff identified usability problems and redesign solutions that provide direction for future improvements in PEDDI. In addition, this study highlights opportunities to use the CWIS methodology to address inequities in the implementation of cancer screening and other clinical innovations in resource-constrained healthcare settings.

Keywords: Implementation strategies, Human-centered design, Usability, Cognitive walkthrough, Cancer screening, Cancer prevention, Health equity


Contributions to the literature.

  • This article describes the application of an innovative human-centered design methodology to evaluate the usability of an equity-focused implementation strategy in the context of cancer screening.

  • Usability problems and redesign solutions were identified by end users who were healthcare professionals from community health centers.

  • This study highlights new opportunities to apply the CWIS methodology to advance equity-focused implementation of clinical interventions.

Background

Early detection of cancer saves lives, improves treatment options and outcomes, and improves quality of life after treatment [1]. Despite these benefits, screening for colorectal cancer (CRC), lung cancer, and female breast and cervical cancers is underused in the United States [2], with lower rates of cancer screening among minoritized racial and ethnic groups, sexual and gender minoritized communities, people living in rural areas, and lower-income individuals [3–5]. Barriers to cancer screening include insurance coverage and costs, structural barriers (e.g., lack of transportation, lack of childcare), and cultural barriers (e.g., lack of language services, stigma about screening procedures) [6–10]. Addressing cancer screening inequities in routine clinical practice requires the ability to identify patient groups underserved during implementation of evidence-based cancer screening interventions and to adapt outreach and/or interventions to address barriers faced by the identified underserved groups [11, 12].

Most prior research addressing inequities in cancer screening has focused on developing multi-component interventions designed to address barriers for specific populations [13–16]. While this approach to screening development is necessary and important, it may be challenging to integrate numerous different interventions tailored for specific patient populations in resource-constrained healthcare settings such as Federally Qualified Community Health Centers (CHCs). CHCs face many challenges in delivering cancer-preventive care, including workforce shortages and lack of resources [17, 18]. An alternative to implementing different multi-component interventions for specific populations is to implement what works for a broad population, identify performance and outcome gaps in underserved groups, and make adaptations to address these gaps [19, 20].

To advance strategies for equitable implementation of cancer screening interventions, our team partnered with primary care and quality improvement staff at CHCs to develop Partnered and Equity Data-Driven Implementation (PEDDI) [21]. PEDDI is an implementation strategy that involves an external facilitator whose role is to serve as a change agent who provides interactive problem solving and support to an internal team of healthcare professionals responsible for implementing an evidence-based intervention (EBI) [22]. The external facilitator guides the team to obtain and use data collected in routine practice to: (a) identify patient groups experiencing gaps in outreach, use, and/or benefit from an EBI, and (b) rapidly adapt screening outreach and/or the intervention to address identified gaps. PEDDI consists of five steps: (1) plan to obtain data needed to make subgroup comparisons; (2) select variables for comparisons to identify gaps and obtain clinic data; (3) identify gaps and prioritize specific gaps as health equity target(s); (4) identify and plan feasible adaptations to outreach and/or intervention; and (5) conduct rapid cycle testing of the planned adaptations [21]. While PEDDI was developed in the context of a CRC intervention, it is intended to have broader application as an implementation strategy for promoting equitable implementation of EBIs in clinical settings.
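The data-driven core of PEDDI (compare an outcome across patient subgroups, then flag gaps as candidate health equity targets) can be sketched in a few lines of code. This is a hypothetical illustration only: the record fields (`language`, `fit_returned`), the flat list-of-records input, and the 10-percentage-point gap threshold are our assumptions, not part of PEDDI.

```python
# Hypothetical sketch of the subgroup-comparison logic in PEDDI: compute
# FIT-kit return rates by subgroup and flag subgroups that trail the
# best-performing group. Field names and the threshold are assumptions.
from collections import defaultdict

def completion_by_subgroup(patients, group_field):
    """Return {subgroup: completion rate} from a list of patient records."""
    totals, completed = defaultdict(int), defaultdict(int)
    for p in patients:
        g = p[group_field]
        totals[g] += 1
        if p["fit_returned"]:
            completed[g] += 1
    return {g: completed[g] / totals[g] for g in totals}

def flag_gaps(rates, threshold=0.10):
    """Flag subgroups whose rate trails the best subgroup by >= threshold."""
    best = max(rates.values())
    return [g for g, r in rates.items() if best - r >= threshold]
```

For example, if English-preferring patients return kits at 67% and Spanish-preferring patients at 33%, `flag_gaps` would surface the Spanish-preferring subgroup as a candidate equity target for the subsequent adaptation and rapid cycle testing steps.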

We previously conducted a pilot study examining the feasibility of PEDDI in the context of a paired CRC and social needs screening intervention at CHCs [21]. With external facilitation, CHCs obtained and used data to identify gaps in outreach and completion of CRC screening with respect to race/ethnicity, gender, age, and language. Adaptations to improve access and use of the intervention included cultural, linguistic, and health literacy tailoring. CHC teams reported that external facilitation that included systematic review of data was helpful in identifying and prioritizing gaps. None of the four CHCs completed rapid cycle testing of adaptations, largely due to competing priorities during the COVID-19 response. After assessing the practicality of PEDDI as an implementation strategy in CHCs, we took the next step of assessing, through usability testing, how easy it is for healthcare professionals (i.e., end users) to use PEDDI to reach their goals of equitable implementation.

This report describes a second pilot study applying an innovative human-centered design (HCD) methodology to improve the usability of PEDDI, building on initial pilot testing [21] as part of iterative implementation strategy development. HCD is an approach applied in many disciplines, including implementation science [23–25], to develop usable and desirable innovations from the perspective of the people and systems in which the innovations will be used [26]. We applied an HCD methodology, Cognitive Walkthrough for Implementation Strategies (CWIS) [25], a pragmatic assessment method developed by implementation researchers that includes group testing with end users to identify and prioritize usability problems. This report describes how the CWIS methodology was applied specifically to address inequities in the implementation of a cancer screening intervention in resource-constrained healthcare settings. We present findings from the usability evaluation and redesign solutions generated by CHC partners. We discuss the implications of the study methods and findings for advancing equity-focused implementation processes and partnerships.

Materials and methods

We conducted this study at the Harvard Implementation Science Center for Cancer Control Equity (ISCCCE), which is funded by the National Cancer Institute as one of seven Implementation Science Centers in Cancer Control nationwide [27]. Harvard ISCCCE, in collaboration with the Massachusetts League of Community Health Centers (MLCHC), has partnerships with a network of 30 CHCs across Massachusetts [28]. With support from the MLCHC, ISCCCE investigators recruited healthcare professionals from CHCs in Massachusetts to participate in a project to improve the usability of PEDDI. The Harvard Longwood Campus IRB and Dartmouth Health Human Research Protection Program IRB independently reviewed the study protocol and determined that the submission was not research as defined by U.S. Department of Health and Human Services regulations.

Participants

We used purposive sampling [29] to invite participants who held roles at CHCs representing end users of an equity-focused implementation strategy. This included members of primary care teams (defined broadly as physicians, nurses, medical assistants, care managers, and community health workers) and quality improvement staff. As noted in prior literature [29], the relatively small sample sizes in HCD projects are both a strength and a limitation of HCD. Small end user samples enable in-depth interactions with participants, which can lead to rich, detailed feedback with nuanced insights on usability and potential modifications that can be missed with larger samples using methods that prioritize breadth over depth (e.g., surveys with pre-determined questions). We aimed to recruit a small sample of end users to discuss usability issues in depth. Our research team collaborated with MLCHC leadership to generate a list of potential participants and their contact information. We then emailed 14 CHC primary care and quality improvement staff, from four CHCs that participated in our prior PEDDI pilot study and five CHCs that were new to PEDDI, inviting them to participate in the CWIS project; the invitation included information about study procedures, anticipated time commitment, and incentives. Eight staff agreed to participate in the study. Seven of the eight completed the usability testing sessions: four participants from two CHCs that were new to PEDDI and three participants from two CHCs that participated in the initial pilot study. The final sample that participated in group testing included one Chief Operating Officer, two Directors of Quality Improvement, one Director of Operations, one Population Health Manager, one Quality Improvement Manager, and one Community Health Worker/Medical Interpreter. Participants were compensated with a $50 gift card for each of three 60-min CWIS sessions.

Cognitive Walkthrough for Implementation Strategies (CWIS) methodology

We applied CWIS [25] to improve the usability of PEDDI from the perspective of end users who were healthcare professionals employed at CHCs, building on initial pilot testing as part of iterative implementation strategy development. Cognitive walkthroughs involve walking end users through the work (i.e., key tasks) that needs to be done to successfully complete an implementation process and critically evaluating each task to identify aspects that could be challenging to users. As described below, we applied the six steps of the CWIS methodology with minor modifications to reduce complexity: 1) determine necessary strategy pre-conditions; 2) hierarchical task analysis; 3) task prioritization; 4) convert top tasks to testing scenarios; 5) pragmatic group testing with representative users; and 6) usability issue identification and redesign. Our research team carried out these steps through internal meetings and email communication, and engaged CHC participants through a combination of: a) online surveys; b) CWIS sessions; c) document review; and d) email correspondence.

CWIS Step 1: Determine the necessary preconditions for the implementation strategy

The first step in the CWIS methodology is to articulate the preconditions necessary for an implementation strategy to be effective. Preconditions include characteristics of the settings and end users believed to be most appropriate for the implementation strategy [25]. We designed PEDDI to be used in resource-constrained healthcare settings by teams composed of both clinical and quality improvement or population health staff who are routinely involved in the planning, implementation, delivery, and/or evaluation of evidence-based interventions.

CWIS Step 2: Hierarchical task analysis

The second step in the CWIS methodology involves performing hierarchical task analysis by identifying the behavioral and physical tasks (e.g., making requests, reviewing data) and cognitive tasks (e.g., identifying patient gaps) that comprise the implementation strategy [25]. Three members of our research team (KA, GK, and SB) identified the overall tasks and subtasks in the PEDDI model. We generated the initial list of tasks by discussing: (1) what users need to do in each of the 5 Steps of the PEDDI model; (2) the subtasks required to carry out the associated larger tasks; and (3) what happens before and after completing each of the tasks. The first author refined the task list based on feedback from the research team and presented the refined list to research team members to confirm that all relevant tasks had been identified.

CWIS Step 3: Task prioritization

We prioritized tasks for group usability testing among CHC participants based on: (1) the anticipated likelihood that end users might encounter issues or errors when completing a task, and (2) the importance of completing the task correctly for successful implementation [25]. Our research team prioritized the five tasks aligned with the five PEDDI steps and associated subtasks listed in Table 1.

Table 1.

PEDDI Steps, Tasks and Associated Subtasks

Overall Steps and Tasks, with Subtasks

1. Plan to obtain the data needed to make subgroup comparisons
  1a. Identify where the data will come from (e.g., download data from a population health management system linked to an electronic health record)
  1b. Identify how the request for data will be made (e.g., ask CHC information technology team for assistance)
  1c. Identify who on the team will obtain the data and in what time frame (include a plan for follow-up requests and how delays in obtaining the data will be handled)

2. Select variables for comparisons to identify gaps and obtain clinic data
  2a. Create a list of demographic and other variables known to be associated with gaps in CRC screening using home kits (e.g., social determinants of health)
  2b. Consider completeness and availability of data (e.g., Is the data routinely collected and available?)
  2c. Check for completeness and availability of data (e.g., examine data frequencies)
  2d. Select a subset of variables to define patient subgroups for comparisons
  2e. Use or modify an existing template with instructions to organize the data for comparisons, or create a new template for your organization

3. Once data is obtained, identify gaps and prioritize specific gaps as health equity targets
  3a. Review data comparisons to identify gaps in select outcomes (e.g., outreach, reach, return). Decide whether you will address one or more outcomes
  3b. Prioritize one or more gaps as health equity targets based on: 1) organizational priorities that support addressing the gap; and 2) available staff and other resources needed to address the gap
  3c. Identify 1–2 gaps to prioritize for the first round of adaptation and rapid cycle testing

4. Identify and plan feasible adaptations to outreach and/or intervention to address the gaps prioritized in Step 3
  4a. Brainstorm with your healthcare team possible adaptations to the outreach and/or intervention
  4b. Identify whether adaptations will be made to the: 1) content, 2) training, or 3) other
  4c. Identify the reasons for the adaptation, including the intent or goal of the adaptation, and confirm the adaptation is aligned with the goal
  4d. Develop a list of any resources, training, and/or approvals needed to make adaptations
  4e. Develop a protocol for the adaptation and train the team in the adapted outreach and/or intervention

5. Conduct rapid cycle testing of the planned adaptations
  5a. Determine how you will test the adaptation on a small scale: identify which staff will be involved; assign responsibilities; develop documentation; and develop data collection procedures
  5b. Perform the rapid cycle test. If you are not able to do the test as planned, document any changes you must make in order to perform the test
  5c. Examine the data collected on the adaptation. Determine whether the adaptation was successful, partially successful, or not at all successful
  5d. Use the results of the rapid cycle test to decide on your next steps. Determine whether you need to further modify the adaptation and, if so, what the adaptation will look like for evaluation in the next testing cycle

CWIS Step 4: Convert tasks to testing scenarios

Step 4 in the CWIS methodology involves generating scenarios that end users are likely to encounter while performing each of the tasks and associated subtasks, which are evaluated during group usability testing [25]. Our team converted the top five tasks and subtasks prioritized in CWIS Step 3 into scenarios for CWIS testing. Each scenario included: (1) a brief written description of the scenario, overall task and subtasks; (2) a script for a facilitator to use when introducing each task and subtasks; and (3) an image that represents the scenario by displaying the task and related subtasks.

CWIS Step 5: Group testing with representative users

Following an orientation to the project and an overview of the PEDDI model, participants were asked to complete the Implementation Strategy Usability Scale (ISUS), adapted from the System Usability Scale [30] by the CWIS developers [25]. ISUS includes 10 items assessing the overall usability of an implementation strategy on a five-point scale (1 = Strongly Disagree to 5 = Strongly Agree). Example items include: “I think that I would like to use this [implementation strategy] frequently” and “I found the [implementation strategy] unnecessarily complex.” For each odd-numbered question, 1 is subtracted from the answer value. For each even-numbered question, the answer value is subtracted from 5. The new values are summed to create a total score, which is then multiplied by 2.5, with higher scores indicating greater usability. Participants received a $50 incentive for completing the ISUS at baseline prior to the CWIS sessions and $50 for completing the ISUS post-test following the CWIS sessions.
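The ISUS scoring arithmetic described above can be expressed compactly. The sketch below is our illustration (the function name and input validation are ours, not from the scale's authors), following the stated rule: subtract 1 from odd-numbered items, subtract even-numbered items from 5, sum, and multiply by 2.5 to reach the 0–100 range.

```python
def isus_score(responses):
    """Score ten 1-5 ISUS responses on the 0-100 usability scale.

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response); the sum (0-40) is multiplied by 2.5.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses, each from 1 to 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based: even i = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5
```

A respondent who strongly agrees with every positively worded (odd-numbered) item and strongly disagrees with every negatively worded (even-numbered) item scores 100; uniformly neutral responses score 50.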

A facilitator from the research team (first author), with assistance from a study coordinator, conducted three online CWIS sessions lasting approximately 60 min each, guiding participants through four scenarios and 17 associated subtasks. Because of time constraints, after presenting the scenarios for Steps 1 and 2 of PEDDI, we revised our initial plan to present all five scenarios: we skipped Step 3 so we could complete usability testing for Steps 4 and 5, which were either partially completed (Step 4) or not completed at all (Step 5) during the initial PEDDI pilot study. After each scenario was presented, participants were asked to rate each task based on their anticipated likelihood of successfully completing it on a four-point Likert scale (1 = No (very small chance of success); 2 = Probably not (small chance of success); 3 = Probably (probable chance of success); and 4 = Yes (very good chance of success)). The facilitator presented the results of the ratings to the group during the sessions. Participants were asked to discuss anticipated usability problems and share any redesign solutions both verbally and on a digital interactive whiteboard that was accessible during the session. Sessions were audio recorded and transcribed so that usability problems and redesign solutions could be synthesized by the research team.

CWIS Step 6: Usability issue identification and redesign

Following CWIS methodology [25], our team used a consensus coding approach to identify usability problems noted in the transcriptions from each of the three CWIS sessions. We supplemented transcripts with data recorded from the whiteboard sessions. We used a team-based approach to coding the data in which the lead investigator and study coordinator independently coded the transcripts from the three CWIS sessions to identify usability issues and redesign solutions voiced by participants during the sessions [31]. The two coders met to compare coded data, resolve any discrepancies, and finalize the list of usability issues and redesign solutions identified in the data. We then created a table that displayed each task in PEDDI and its associated usability issues and redesign solutions. We used a structured approach to member checking [32] that involved sharing the table of usability issues and redesign solutions with participants via email and asking them to share their reactions and thoughts about whether anything should be added or removed from the table. Three of the seven participants provided written feedback on the content of the table, which was incorporated into the final version. The results of the usability issue identification and redesign solutions are presented below.

Results

Overall strategy usability ratings

Seven participants completed both pre-test and post-test ISUS assessments. ISUS ratings (scale 0–100; higher scores indicating greater usability) at pre-test (prior to the CWIS sessions) ranged from 50.0 to 82.5, with a mean of 66.4 (median 67.5; standard deviation = 12.4). At post-test (following the CWIS sessions), ISUS ratings ranged from 65.0 to 90.0, with a mean of 77.8 (median 77.5; standard deviation = 9.1). Based on descriptors developed for the original System Usability Scale [33], the mean score at pre-test indicates marginal acceptability consistent with a grade of C+, while the mean score at post-test indicates usability is acceptable consistent with a grade of B+. For each of the four overall task ratings that occurred during CWIS sessions, participants responded with an anticipated success rating of either 3 = Probably (probable chance of success) or 4 = Yes (a very good chance of success), indicating a high likelihood of anticipated success.

Usability problems and redesign solutions

Consensus coding yielded 10 distinct usability problems across four PEDDI tasks, with 2–3 types of usability problems per task. Table 2 displays each of the usability problems.

Table 2.

PEDDI Usability Problems and Redesign Solutions

1. Plan to obtain the data needed to make subgroup comparisons
Subtask: Identify where the data will come from (e.g., download data from a population health management system linked to an electronic health record)

Usability problem: Staff do not know how to use data reporting tools
  • Staff have not been trained or are not proficient in use of available data reporting tools in the electronic health record to generate the subgroup comparisons that will enable them to identify gaps in outreach and/or return of FIT kits
Redesign solutions:
  • Train and/or retrain staff to use the data reporting tools in the electronic health record to generate subgroup comparisons

Usability problem: Data on patient outreach for CRC is not routinely collected
  • Data on patient outreach is not systematically collected in a way that can be examined in practice
  • When data is collected, it does not get saved in the patient’s chart in a way that an analyst could use it to quantify outreach. This would require manual data collection
  • There would be burden in understanding why the outreach did not happen
  • Having unique data points for outreach for a single clinical measure is a burden when the CHC typically focuses on outreach for multiple measures
Redesign solutions:
  • Create a workflow to systematically track patient outreach activities
  • Pull lists of patients for which a FIT was ordered
  • Show the team evidence that examining gaps in outreach leads to improved outreach
  • Examine outreach overall, not specific to CRC screening

2. Select variables for comparisons to identify gaps and obtain clinic data
Subtask: Consider completeness and availability of data (e.g., Is the data routinely collected and available?)

Usability problem: Missing data on key variables
  • Social determinants of health data may be missing
  • When demographic data is collected it is not always complete
Redesign solutions:
  • Make data fields mandatory in the electronic health record
  • Examine patient groups with missing data as their own group
  • Find alternatives to asking the front desk staff to collect demographic data and other sensitive information

Usability problem: Delays in getting data reports
  • Reports needed to identify patient gaps are not always available in a timely manner
Redesign solutions:
  • Use monthly reports on screening completions

Usability problem: Not feasible to evaluate more than one variable
  • When examining data broken down by race and ethnicity combined, the cells are often too small to be able to see the story
Redesign solutions:
  • Examine broader categories of socioeconomic status or insurance type to capture a larger population
  • Examine both broad categories and subgroups at risk within the categories

4. Identify and plan feasible adaptations to outreach and/or intervention to address the gaps prioritized in Step 3
Subtask: Brainstorm with your healthcare team possible adaptations to the outreach and/or intervention

Usability problem: Difficult to find time for brainstorming with the healthcare team
  • Getting everyone together as a team to brainstorm is difficult
  • Iterative brainstorming would not be feasible in most cases
Redesign solutions:
  • Use an existing meeting time for brainstorming
  • Form a subgroup to brainstorm on the specific topic
  • Elicit ideas and perspectives from key individuals and compile the list of possible adaptations
  • Perform root cause analysis to see which adaptation might have the most leverage

Subtask: Identify whether adaptations will be made to the: 1) content, 2) training, or 3) other

Usability problem: Difficult to adapt strategies for a subpopulation
  • Applying the same adapted outreach and/or intervention to an entire subpopulation may not be practical when clinicians are focused on tailoring care to the individual in front of them
Redesign solutions:
  • Provide clarity on the goal of the adaptation

Subtask: Develop a list of resources, training, and/or approvals needed to make adaptations

Usability problem: Ideas for improvement can exceed resources
  • Ideas generated during brainstorming sessions can exceed the scope of available resources
Redesign solutions:
  • Use a facilitator who can keep solutions within the project scope and available resources
  • Identify trainings planned for other purposes and leverage those trainings for the project

5. Conduct rapid cycle testing of the planned adaptations
Subtask: Determine how to test the adaptation on a small scale: identify which staff will be involved; assign responsibilities; develop documentation; develop data collection procedures

Usability problem: Barriers to performing rapid cycle testing
  • Numerous barriers can get in the way of performing rapid cycle testing, including workforce shortages and competing demands
Redesign solutions:
  • Start with one team, make adaptations, and, if successful, develop training materials and roll out to the center

Subtask: Examine the data collected on the adaptation. Determine whether the adaptation was successful, partially successful, or not at all successful

Usability problem: Risk of losing momentum during adaptation process
  • There is a risk of losing momentum among the team if the adaptations don’t have impact and it’s not apparent what change you could make to try again
Redesign solutions:
  • Qualitative feedback about how it went can help keep the team on track, interested, and engaged
  • Data on proximal indicators can help keep the team on track and moving forward

Within the overall task of selecting variables for comparisons, the subtask “consider completeness and availability of data” had several usability problems, including missing data on key variables needed to make subgroup comparisons that would enable the healthcare team to identify gaps in outreach and/or return of the fecal immunochemical test (FIT) kit. As one CHC participant noted on the digital whiteboard, “Demographic data is collected but it is not always comprehensive. We made a few fields mandatory in the [EHR] for new patients moving forward.” Another participant shared that the CHC had initiated a quality improvement project to improve data collection: “We started a pilot project where a lot of the information on [demographic and social determinants of health] is going to be collected. We literally have a pink sheet that the information could be collected on because part of the what we noticed is that getting that information from the front desk itself is a privacy issue. So, we are trying to see if there is another avenue of getting that information.”

Within the overall task of adapting CRC screening outreach and/or intervention to address identified gaps, the subtasks with usability problems included: (1) brainstorming possible adaptations; (2) identifying whether adaptations will be made to the: a) content; b) training, or c) other; and (3) developing a list of resources, training, and/or approvals needed to make adaptations. Participants identified challenges with iterative brainstorming sessions and suggested that understanding the reasons for the gap can help with efficient selection of adaptations. As one participant commented: “I would think [brainstorming] would be one time. If you try an intervention and it doesn't work, you might go back. But we wouldn't spend multiple brainstorming sessions before picking an intervention. One thing that we do with a more complex problem is, sometimes it's really hard to know which one of these potential adaptations to pick, and so we do try to do a little bit of what we would call root cause work to see where we might have the most leverage. To make a change, and that would also be a brainstorming thing, like what are all the factors contributing to the barrier.”

Another participant commented that agreeing on the scope of adaptations can be a challenge that a meeting facilitator could help with: “In brainstorming, a lot of the ideas (and I'm biased in this) that come out are ‘Oh, we need a report that shows us this’ or ‘We need our EHR to do that’ or ‘We need another person whose sole job it is to do this.’ I think a lot of that starts at the brainstorming session and it's helpful if there's a facilitator or someone to sort of say ‘OK, well the scope of this project is not to create new hire. We're not going be able to create a new position. We don't have time to implement a new tool in the EHR. What is in the framework of our scope. What can we do?’”

Discussion

The current study applied a novel HCD methodology (CWIS) to evaluate the usability of PEDDI as a strategy to promote equitable implementation of a CRC screening intervention and generate redesign solutions to address the identified usability issues with healthcare partners. The results of this study highlight new opportunities to use the CWIS methodology to specifically address inequities in the implementation of innovations in resource-constrained healthcare settings. Participant usability ratings of PEDDI improved from “marginal acceptability” to “acceptable” following three CWIS sessions. PEDDI usability problems included missing data on key variables needed to make subgroup comparisons that enable the healthcare team to identify gaps in outreach and/or return of the cancer screening intervention; impracticality of holding iterative brainstorming sessions to identify adaptations; and lack of skills, training and capacity to perform rapid cycle testing.
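The 0–100 usability ratings and the “marginal acceptability” and “acceptable” labels reported here follow System Usability Scale (SUS) conventions [30, 33]. As a hedged illustration only (assuming the standard 10-item SUS instrument with alternating positively and negatively worded items; the function names are ours, not part of the study protocol), SUS scoring and Bangor et al.'s acceptability ranges can be sketched as:

```python
def sus_score(responses):
    """Compute a SUS score (0-100) from ten Likert responses (1-5).
    Odd-numbered items are positively worded (contribute response - 1);
    even-numbered items are negatively worded (contribute 5 - response)."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i = 0 corresponds to item 1
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5  # rescale 0-40 raw sum to 0-100

def acceptability(score):
    """Map a 0-100 score to Bangor et al.'s approximate acceptability
    ranges: below 50 not acceptable, 50-70 marginal, 70+ acceptable."""
    if score >= 70:
        return "acceptable"
    if score >= 50:
        return "marginal"
    return "not acceptable"

# Consistent with the study's means: 66.3 pre-CWIS is marginal,
# 77.8 post-CWIS is acceptable.
print(acceptability(66.3))  # marginal
print(acceptability(77.8))  # acceptable
```

The 50/70 cut points are the commonly cited approximations from Bangor et al. [33]; the study itself reports only the resulting labels.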

CWIS participants suggested redesign solutions that included making data fields for social determinants of health and key background variables for identifying health equity targets mandatory in the electronic health record (EHR) and using asynchronous communication tools to elicit ideas from staff for adaptations. CHC participants also identified barriers to rapid cycle testing in the CHC context, including workforce shortages and competing demands with other quality improvement activities. These are important findings as adaptations and rapid cycle testing methods have been identified as critical strategies for advancing health equity in implementation science [34–36]. In the section below, we discuss key findings and implications for PEDDI redesign and future research. We also discuss new opportunities to apply the CWIS methodology to advance equity-focused implementation processes and partnerships.

Participants reported that data on patient outreach are not routinely collected in practice, and when such data are collected, they are not saved in the patient’s chart in a way that an analyst could quantify outreach. This creates a significant barrier to obtaining data to identify gaps in outreach for cancer screening. As a redesign solution, participants suggested creating a workflow to systematically track patient outreach. One promising way to implement this solution is to use patient portal messaging to send and automatically track outreach messages for cancer screening. In a randomized quality improvement trial conducted at a large, integrated academic health system, implementation of electronic patient portal messaging before mailing FIT kits led to a significant increase in CRC screening and improvement in time to completion [37]. However, socio-demographic disparities in patient portal access and use may limit the effectiveness of EHR portals for patient messaging and outreach tracking in CHCs, particularly among underserved groups [38, 39]. To address this barrier, we recommend that PEDDI facilitators partner with healthcare teams to provide training and support for using the patient portal and to tailor messages toward patient groups experiencing inequities, as suggested in a review of studies on optimal patient portal use [40]. Apart from the EHR, developing tracking systems for staff who conduct outreach and integrating the tracking into the workflow may facilitate more systematic tracking of outreach.

Within the overall task of selecting variables for comparisons, the subtask “consider completeness and availability of data” had several usability problems, including missing data on key variables needed to make subgroup comparisons that enable the healthcare team to identify gaps in outreach and/or return of the FIT kit. The concept of completeness of EHR data relates to whether the data are actually present [41]. A recent systematic review of the quality of social determinants of health data in the EHR reported studies showing that incomplete data led to validity issues when data were not missing at random [42]. Efforts to improve demographic and social determinants of health data collection have been reported in the research literature [43, 44]. In a study of initiating and implementing social determinants of health data collection in eight CHCs [45], healthcare staff and professionals reported that effective integration of EHR-based screening was facilitated by having a staff member who advocated for the adoption of EHR-based social determinants of health screening, provided suggestions for workflow, and promoted uptake of screening by clinic staff. Future versions of PEDDI should identify and train a data collection champion who can promote collection of the key demographic and social determinants of health data needed to identify health equity targets.
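The completeness concept from the data quality framework cited above [41] can be made concrete with a small sketch. The field names, sample values, and the 0.8 threshold below are illustrative assumptions of ours, not the participating CHCs' actual EHR schema or any threshold from the study:

```python
# Hypothetical patient records; None marks a value absent from the EHR.
records = {
    "race_ethnicity":     ["Black", None, "Hispanic", "White", None],
    "preferred_language": ["en", "es", None, "en", "en"],
    "insurance_type":     ["Medicaid", "Medicaid", "Uninsured", None, "Medicare"],
}

def completeness(values):
    """Completeness = fraction of records where the value is present."""
    return sum(v is not None for v in values) / len(values)

report = {field: completeness(vals) for field, vals in records.items()}

# Flag fields too incomplete to support reliable subgroup comparisons
# (0.8 is an arbitrary illustrative cutoff).
flagged = [field for field, c in report.items() if c < 0.8]
print(report)    # per-field completeness
print(flagged)   # candidates for mandatory-field or champion-led fixes
```

In practice this kind of per-field completeness report could help a team decide which demographic and social determinants of health fields to make mandatory or assign to a data collection champion.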

Participants identified barriers to holding adaptation brainstorming sessions with busy frontline providers and staff to address gaps in cancer screening outreach and intervention. This is a significant usability problem for PEDDI, as adapting EBIs to address inequities is a critical component of equitable implementation science [46]. To date, the implementation research literature has focused on developing pragmatic methods for documenting and tracking adaptations [47, 48], with less attention given to methods for including healthcare partners in identifying and selecting feasible, acceptable, and appropriate adaptations. Participants suggested using asynchronous communication tools (e.g., email, EHR-based messaging) to elicit ideas for potential adaptations from CHC team members. Asynchronous communication tools are commonly used in clinical settings [49, 50] and could be leveraged to facilitate brainstorming and decision-making regarding outreach and intervention adaptations.

The PEDDI process involves conducting Plan-Do-Study-Act (PDSA) cycles to iteratively test planned adaptations on a small scale. Rapid cycle testing (including PDSAs) is a tool used in both improvement practice and implementation science to make small, deliberate changes and evaluate whether the changes lead to improvements [51, 52]. Participants identified a lack of staff skills, training, and capacity to perform rapid cycle testing beyond key members of the QI team as a barrier to applying PDSA cycles to evaluate adaptations to cancer screening outreach and intervention. As others have noted [53, 54], the skills, knowledge, resources, and support required to conduct rigorous and effective PDSA cycles in healthcare settings are often underestimated. As an alternative, future iterations of PEDDI should include more pragmatic and feasible approaches to rapid cycle evaluation that can be used in resource-constrained healthcare settings to generate actionable results related to adaptations. A recent National Cancer Institute workshop on advancing rapid cycle research in cancer care delivery [34] emphasized the importance of identifying pragmatic ways to conduct rapid cycle testing in settings that may require additional resources.

By applying the CWIS methodology, our team partnered with end users from primary care and quality improvement teams in CHCs to identify usability problems with PEDDI tasks and generate redesign solutions. There are many examples in the implementation science literature of implementation challenges [55], with factors that include lack of staff awareness and difficulty accessing critical information needed for successful implementation [56]. Such issues may be addressed by using HCD techniques, such as CWIS, with implementation partners to enhance strategy usability prior to implementation. CWIS may be particularly important for evaluating the usability of new implementation strategies in resource-constrained healthcare settings where there is a timely need to fit innovations into context. Adapting Strategies to Promote Implementation Reach and Equity (ASPIRE) [57] is a process that guides the user to evaluate the assumptions underlying an implementation strategy and its potential to widen disparities, and then to adapt the strategy to ensure equity is considered. Future research could apply CWIS methodology within an ASPIRE framework to evaluate the extent to which components of implementation strategies (e.g., adaptation and rapid cycle testing) are feasible and acceptable to end users.

Limitations

The current application of the CWIS methodology to evaluate the usability of PEDDI has a number of limitations that should be considered when interpreting the study results. First, CWIS sessions included seven end users, and the clinical setting was limited to CHCs. While multiple end user groups (e.g., clinical, administrative, leadership) from primary care and quality improvement teams were represented in this study, the perceived usability of PEDDI tasks may vary depending on available resources and other contextual factors related to the clinical setting. Plans for future research on PEDDI include evaluating the implementation strategy with additional end users across multiple CHCs. Second, participants did not actually complete the PEDDI tasks during the CWIS sessions; having them do so may have produced different results. In addition, three of the seven participants had participated in the previous feasibility study of PEDDI, which may have influenced user satisfaction and confidence with task completion in the current usability study. Third, the project was limited to three 60-min sessions in which to complete PEDDI task ratings, identify usability problems, and generate redesign solutions. Redesign solutions were not generated for PEDDI Step 3, and it is possible that additional redesign solutions would have been generated in additional sessions. Fourth, CHC participants completed post-test usability ratings of a version of PEDDI that did not explicitly incorporate the redesign solutions generated during CWIS sessions. It is possible that participants found PEDDI to be more usable at post-test because they were engaged in the process of critically evaluating the usability of each major task and sharing thoughts about possible redesign solutions. Finally, the current application of the PEDDI process was limited to partnering with an internal implementation team consisting of healthcare professionals. Future versions of PEDDI aimed at developing and implementing equity-focused adaptations will also include end users such as patients and family members, embracing community-engaged participatory methods in co-designing implementation strategies [58, 59].

Conclusions

Implementation strategies hold promise for advancing health equity for cancer prevention by addressing inequities in cancer screening initiatives. As an implementation strategy, PEDDI supports implementation of evidence-based cancer screening interventions and systematically identifies and addresses gaps in outreach and intervention use and benefit among underserved groups. The current project used a novel HCD methodology to evaluate the usability of PEDDI to promote equitable implementation of CRC screening. CWIS usability ratings indicated the PEDDI protocol was in the acceptable range following the CWIS sessions. The usability issues and redesign solutions generated by representative users provide direction for future improvements in the usability of PEDDI, including improving data collection on patient outreach, streamlining the process for selecting and evaluating adaptations to cancer screening outreach and/or intervention, and developing pragmatic approaches for conducting rapid tests of change in resource-constrained healthcare settings. Finally, the enhanced usability of PEDDI sets the stage for future uses of the CWIS methodology and the PEDDI implementation strategy for other equity-focused implementation processes, health conditions, healthcare contexts, and partnerships.

Acknowledgements

The authors acknowledge and appreciate the efforts of all participating community health centers.

Abbreviations

EHR

Electronic Health Record

EBI

Evidence-based Intervention

FIT

Fecal Immunochemical Test

CHC

Community Health Center

CRC

Colorectal Cancer

CWIS

Cognitive Walkthrough for Implementation Strategies

HCD

Human-Centered Design

PEDDI

Partnered and Equity Data-Driven Implementation

ISCCCE

Implementation Science Center for Cancer Control Equity

PDSA

Plan-Do-Study-Act

MLCHC

Massachusetts League of Community Health Centers

Authors’ contributions

KA, GK, EH, and SB conceptualized and designed the study. KA and SM conducted the CWIS sessions and collected and analyzed the data. KA, GK, EH, SB, KE, AT, TK, and AO interpreted the results and wrote and/or edited sections of the text. All authors read and approved the final manuscript.

Funding

This manuscript was made possible with support from the Implementation Science Center for Cancer Control Equity, a National Cancer Institute funded program (P50 CA244433). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Cancer Institute.

Availability of data and materials

De-identified data from this study are not available in a public archive. De-identified data from this study may be made available as allowable according to institutional IRB standards. Requests for data should be made to the corresponding author.

Declarations

Ethics approval and consent to participate

The Harvard Longwood Campus IRB and Dartmouth Health Human Research Protection Program independently reviewed the study protocol and determined that the submission was not research as defined by U.S. Department of Health and Human Services regulations.

Consent for publication

Not applicable.

Competing interests

GK has a family financial interest in a digital health company, Dimagi, Inc. All other authors declare no relevant financial or non-financial interests.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1. President’s Cancer Panel. Closing Gaps in Cancer Screening: Connecting People, Communities, and Systems to Improve Equity and Access. A Report from the President’s Cancer Panel to the President of the United States. Bethesda, MD; 2022.
  • 2. Ma Z, Richardson LC. Cancer Screening Prevalence and Associated Factors Among US Adults. Prev Chronic Dis. 2022;19:220063. 10.5888/pcd19.220063
  • 3. Spencer JC, Kim J, Wang L, Tiro JA, Feldman S, Kamimeni A, et al. Racial and ethnic disparities in the cervical cancer screening cascade in three US health care settings. J Clin Oncol. 2022;40(28):109. 10.1200/JCO.2022.40.28_suppl.109
  • 4. Domogauer J, Cantor T, Quinn G, Stasenko M. Disparities in cancer screenings for sexual and gender minorities. Curr Probl Cancer. 2022;46(5):100858. 10.1016/j.currproblcancer.2022.100858
  • 5. National Cancer Institute and National Institutes of Health. Rural-Urban Disparities in Cancer. Available from: https://gis.cancer.gov/mapstory/rural-urban/index.html.
  • 6. Ponce-Chazarri L, Ponce-Blandón JA, Immordino P, Giordano A, Morales F. Barriers to Breast Cancer-Screening Adherence in Vulnerable Populations. Cancers (Basel). 2023;15(3):604. 10.3390/cancers15030604
  • 7. Lau J, Lim T-Z, Jianlin Wong G, Tan K-K. The health belief model and colorectal cancer screening in the general population: A systematic review. Prev Med Rep. 2020;20:101223. 10.1016/j.pmedr.2020.101223
  • 8. Fuzzell LN, Perkins RB, Christy SM, Lake PW, Vadaparampil ST. Cervical cancer screening in the United States: Challenges and potential solutions for underscreened groups. Prev Med. 2021;144:106400. 10.1016/j.ypmed.2020.106400
  • 9. Muthukrishnan M, Arnold LD, James AS. Patients’ self-reported barriers to colon cancer screening in federally qualified health center settings. Prev Med Rep. 2019;15:100896. 10.1016/j.pmedr.2019.100896
  • 10. Lin Y, Liang L-J, Ding R, Prosper AE, Aberle DR, Hsu W. Factors Associated With Nonadherence to Lung Cancer Screening Across Multiple Screening Time Points. JAMA Netw Open. 2023;6(5):e2315250. 10.1001/jamanetworkopen.2023.15250
  • 11. Richardson-Parry A, Baas C, Donde S, Ferraiolo B, Karmo M, Maravic Z, et al. Interventions to reduce cancer screening inequities: the perspective and role of patients, advocacy groups, and empowerment organizations. Int J Equity Health. 2023;22(1):19. 10.1186/s12939-023-01841-6
  • 12. Fiscella K, Humiston S, Hendren S, Winters P, Jean-Pierre P, Idris A, et al. Eliminating disparities in cancer screening and follow-up of abnormal results: what will it take? J Health Care Poor Underserved. 2011;22(1):83–100. 10.1353/hpu.2011.0023
  • 13. Duffy SW, Myles JP, Maroni R, Mohammad A. Rapid review of evaluation of interventions to improve participation in cancer screening services. J Med Screen. 2017;24(3):127–45. 10.1177/0969141316664757
  • 14. Liu D, Schuchard H, Burston B, Yamashita T, Albert S. Interventions to Reduce Healthcare Disparities in Cancer Screening Among Minority Adults: a Systematic Review. J Racial Ethn Health Disparities. 2021;8(1):107–26. 10.1007/s40615-020-00763-1
  • 15. Richardson-Parry A, Silva M, Valderas JM, Donde S, Woodruff S, van Vugt J. Interactive or tailored digital interventions to increase uptake in cervical, breast, and colorectal cancer screening to reduce health inequity: a systematic review. Eur J Cancer Prev. 2023;32(4):396–409. 10.1097/CEJ.0000000000000796
  • 16. Rogers CR, Matthews P, Xu L, Boucher K, Riley C, Huntington M, et al. Interventions for increasing colorectal cancer screening uptake among African-American men: A systematic review and meta-analysis. PLoS ONE. 2020;15(9):e0238354. 10.1371/journal.pone.0238354
  • 17. Chuang E, Pourat N, Chen X, Lee C, Zhou W, Daniel M, et al. Organizational Factors Associated with Disparities in Cervical and Colorectal Cancer Screening Rates in Community Health Centers. J Health Care Poor Underserved. 2019;30(1):161–81. 10.1353/hpu.2019.0014
  • 18. Huguet N, Hodes T, Holderness H, Bailey SR, DeVoe JE, Marino M. Community Health Centers’ Performance in Cancer Screening and Prevention. Am J Prev Med. 2022;62(2):e97–106. 10.1016/j.amepre.2021.07.007
  • 19. Kerkhoff AD, Farrand E, Marquez C, Cattamanchi A, Handley MA. Addressing health disparities through implementation science—a need to integrate an equity lens from the outset. Implement Sci. 2022;17(1):13. 10.1186/s13012-022-01189-5
  • 20. Baumann AA, Cabassa LJ. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv Res. 2020;20(1):190. 10.1186/s12913-020-4975-3
  • 21. Aschbrenner KA, Kruse G, Emmons KM, Singh D, Barber-Dubois ME, Miller AM, et al. Stakeholder and Equity Data-Driven Implementation: a Mixed Methods Pilot Feasibility Study. Prev Sci. 2024;25(Suppl 1):136–46. 10.1007/s11121-022-01442-9
  • 22. Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, et al. Role of “external facilitation” in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006;1:23. 10.1186/1748-5908-1-23
  • 23. Chen E, Neta G, Roberts MC. Complementary approaches to problem solving in healthcare and public health: implementation science and human-centered design. Transl Behav Med. 2021;11(5):1115–21. 10.1093/tbm/ibaa079
  • 24. Dopp AR, Parisi KE, Munson SA, Lyon AR. A glossary of user-centered design strategies for implementation experts. Transl Behav Med. 2019;9(6):1057–64. 10.1093/tbm/iby119
  • 25. Lyon AR, Coifman J, Cook H, McRee E, Liu FF, Ludwig K, et al. The Cognitive Walkthrough for Implementation Strategies (CWIS): a pragmatic method for assessing implementation strategy usability. Implement Sci Commun. 2021;2(1):78. 10.1186/s43058-021-00183-0
  • 26. Norman DA. User Centered System Design: New Perspectives on Human-Computer Interaction. 1st ed. CRC Press; 1986. 10.1201/9780367807320
  • 27. Oh AY, Emmons KM, Brownson RC, Glasgow RE, Foley KL, Lewis CC, et al. Speeding implementation in cancer: The National Cancer Institute’s Implementation Science Centers in Cancer Control. J Natl Cancer Inst. 2023;115(2):131–8. 10.1093/jnci/djac198
  • 28. Kruse GR, Lee RM, Aschbrenner KA, Daly JG, Dargon-Hart S, Davies ME, et al. Embedding community-engaged research principles in implementation science: The implementation science center for cancer control equity. J Clin Transl Sci. 2023;7(1):e82. 10.1017/cts.2023.32
  • 29. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research. Adm Policy Ment Health. 2015;42(5):533–44. 10.1007/s10488-013-0528-y
  • 30. Sauro J. A Practical Guide to the System Usability Scale: Background, Benchmarks & Best Practices. Measuring Usability LLC; 2011.
  • 31. Saldaña J. The Coding Manual for Qualitative Researchers. Thousand Oaks, CA: Sage Publications Ltd; 2009.
  • 32. McKim C. Meaningful Member-Checking: A Structured Approach to Member-Checking. American Journal of Qualitative Research. 2023;7(2):41–52.
  • 33. Bangor A, Kortum PT, Miller JT. An Empirical Evaluation of the System Usability Scale. International Journal of Human-Computer Interaction. 2008;24(6):574–94. 10.1080/10447310802205776
  • 34. Norton WE, Kennedy AE, Mittman BS, Parry G, Srinivasan S, Tonorezos E, et al. Advancing rapid cycle research in cancer care delivery: a National Cancer Institute workshop report. J Natl Cancer Inst. 2023;115(5):498–504. 10.1093/jnci/djad007
  • 35. Chambers DA. Advancing adaptation of evidence-based interventions through implementation science: progress and opportunities. Front Health Serv. 2023;3:1204138. 10.3389/frhs.2023.1204138
  • 36. Baumann AA, Shelton RC, Kumanyika S, Haire-Joshu D. Advancing healthcare equity through dissemination and implementation science. Health Serv Res. 2023;58(S3):327–44. 10.1111/1475-6773.14175
  • 37. Goshgarian G, Sorourdi C, May FP, Vangala S, Meshkat S, Roh L, et al. Effect of Patient Portal Messaging Before Mailing Fecal Immunochemical Test Kit on Colorectal Cancer Screening Rates: A Randomized Clinical Trial. JAMA Netw Open. 2022;5(2):e2146863. 10.1001/jamanetworkopen.2021.46863
  • 38. Nishii A, Campos-Castillo C, Anthony D. Disparities in patient portal access by US adults before and during the COVID-19 pandemic. JAMIA Open. 2022;5(4):ooac104. 10.1093/jamiaopen/ooac104
  • 39. Richwine C, Johnson C, Patel V. Disparities in patient portal access and the role of providers in encouraging access and use. J Am Med Inform Assoc. 2023;30(2):308–17. 10.1093/jamia/ocac227
  • 40. Zhao JY, Song B, Anand E, Schwartz D, Panesar M, Jackson GP, et al. Barriers, Facilitators, and Solutions to Optimal Patient Portal and Personal Health Record Use: A Systematic Review of the Literature. AMIA Annu Symp Proc. 2017;2017:1913–22.
  • 41. Kahn MG, Callahan TJ, Barnard J, Bauck AE, Brown J, Davidson BN, et al. A Harmonized Data Quality Assessment Terminology and Framework for the Secondary Use of Electronic Health Record Data. EGEMS (Wash DC). 2016;4(1):1244.
  • 42. Cook LA, Sachs J, Weiskopf NG. The quality of social determinants data in the electronic health record: a systematic review. J Am Med Inform Assoc. 2021;29(1):187–96. 10.1093/jamia/ocab199
  • 43. Vega Perez RD, Hayden L, Mesa J, Bickell N, Abner P, Richardson LD, et al. Improving Patient Race and Ethnicity Data Capture to Address Health Disparities: A Case Study From a Large Urban Health System. Cureus. 2022;14(1):e20973.
  • 44. De Lew N, Sommers BD. Addressing Social Determinants of Health in Federal Programs. JAMA Health Forum. 2022;3(3):e221064. 10.1001/jamahealthforum.2022.1064
  • 45. Gruß I, Bunce A, Davis J, Dambrun K, Cottrell E, Gold R. Initiating and Implementing Social Determinants of Health Data Collection in Community Health Centers. Popul Health Manag. 2021;24(1):52–8. 10.1089/pop.2019.0205
  • 46. Shelton RC, Brownson RC. Enhancing Impact: A Call to Action for Equitable Implementation Science. Prev Sci. 2024;25(Suppl 1):174–89. 10.1007/s11121-023-01589-z
  • 47. Haley AD, Powell BJ, Walsh-Bailey C, Krancari M, Gruß I, Shea CM, et al. Strengthening methods for tracking adaptations and modifications to implementation strategies. BMC Med Res Methodol. 2021;21(1):133. 10.1186/s12874-021-01326-6
  • 48. Rabin BA, McCreight M, Battaglia C, Ayele R, Burke RE, Hess PL, et al. Systematic, Multimethod Assessment of Adaptations Across Four Diverse Health Systems Interventions. Front Public Health. 2018;6:102. 10.3389/fpubh.2018.00102
  • 49. Steitz BD, Sulieman L, Warner JL, Fabbri D, Brown JT, Davis AL, et al. Classification and analysis of asynchronous communication content between care team members involved in breast cancer treatment. JAMIA Open. 2021;4(3):ooab049. 10.1093/jamiaopen/ooab049
  • 50. Steitz BD, Unertl KM, Levy MA. An Analysis of Electronic Health Record Work to Manage Asynchronous Clinical Messages among Breast Cancer Care Teams. Appl Clin Inform. 2021;12(4):877–87. 10.1055/s-0041-1735257
  • 51. Leeman J, Rohweder C, Lee M, Brenner A, Dwyer A, Ko LK, et al. Aligning implementation science with improvement practice: a call to action. Implement Sci Commun. 2021;2(1):99. 10.1186/s43058-021-00201-1
  • 52. Ogrinc G, Dolansky M, Berman AJ, Chambers DA, Davies L. Different approaches to making and testing change in healthcare. BMJ. 2021;374:n1010. 10.1136/bmj.n1010
  • 53. Kaplan HC, Provost LP, Froehle CM, Margolis PA. The Model for Understanding Success in Quality (MUSIQ): building a theory of context in healthcare quality improvement. BMJ Qual Saf. 2012;21(1):13–20. 10.1136/bmjqs-2011-000010
  • 54. Reed JE, Card AJ. The problem with Plan-Do-Study-Act cycles. BMJ Qual Saf. 2016;25(3):147–52. 10.1136/bmjqs-2015-005076
  • 55. Moyal-Smith R, Etheridge JC, Karlage A, Sonnay Y, Yuan CT, Havens JM, et al. Defining re-implementation. Implement Sci Commun. 2023;4(1):60.
  • 56. Gransjøen AM, Wiig S, Lysdahl KB, Hofmann BM. Development and conduction of an active re-implementation of the Norwegian musculoskeletal guidelines. BMC Res Notes. 2018;11(1):785. 10.1186/s13104-018-3894-4
  • 57. Gaias LM, Arnold KT, Liu FF, Pullmann MD, Duong MT, Lyon AR. Adapting strategies to promote implementation reach and equity (ASPIRE) in school mental health services. Psychol Sch. 2022;59(12):2471–85. 10.1002/pits.22515
  • 58. Villalobos A, Blachman-Demner D, Percy-Laurry A, Belis D, Bhattacharya M. Community and partner engagement in dissemination and implementation research at the National Institutes of Health: an analysis of recently funded studies and opportunities to advance the field. Implement Sci Commun. 2023;4(1):77. 10.1186/s43058-023-00462-y
  • 59. Pinto RM, Park S, Miles R, Ong PN. Community engagement in dissemination and implementation models: A narrative review. Implement Res Pract. 2021;2:2633489520985305. 10.1177/2633489520985305


