Learn Health Syst. 2023 Jan 27;7(3):e10359. doi: 10.1002/lrh2.10359

The implementation checklist: A pragmatic instrument for accelerating research‐to‐implementation cycles

Stephanie Prausnitz 1, Andrea Altschuler 1, Lisa J Herrinton 1, Andrew L Avins 1, Douglas A Corley 1
PMCID: PMC10336492  PMID: 37448453

Abstract

Introduction

Learning health systems require rapid‐cycle research and nimble implementation processes to maximize innovation across disparate specialties and operations. Existing detailed research‐to‐implementation frameworks require extensive time commitments and can overwhelm physician‐researchers with clinical and operational responsibilities, inhibiting widespread adoption of these frameworks. A short, pragmatic checklist to inform implementation processes may substantially improve uptake and implementation efficiency across a variety of health systems.

Methods

We conducted a systematic review of existing implementation frameworks to identify core concepts. Through comprehensive engagement with 25 stakeholders (operational leaders, embedded physician‐researchers, and delivery scientists), we iteratively integrated these concepts to create and implement a final concise instrument.

Results

A systematic review identified 894 publications describing implementation frameworks, including 15 systematic reviews. From these, domains were extracted from three commonly utilized frameworks: the Quality Implementation Framework (QIF), the Consolidated Framework for Implementation Research (CFIR), and the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE‐AIM) framework. Iterative testing and stakeholder‐guided revision of a five‐domain, four‐page draft yielded a concise, one‐page implementation planning instrument for use at project outset and periodically throughout implementation planning. The instrument addresses end‐user feasibility concerns while retaining the main goals of more complex tools. It was then systematically integrated into projects within the Kaiser Permanente Northern California Delivery Science and Applied Research program to address stakeholder engagement, efficiency, project planning, and operational implementation of study results.

Conclusion

A streamlined one‐page implementation planning instrument, incorporating core concepts of existing frameworks, provides a pragmatic, robust framework for evidence‐based healthcare innovation cycles that is being broadly implemented within a learning health system. These streamlined processes could inform other settings needing a best practice rapid‐cycle research‐to‐implementation tool for large numbers of diverse projects.

Keywords: embedded research, implementation science, learning health system, program evaluation, quality improvement

1. INTRODUCTION

Rapid evidence‐to‐implementation cycles embedded within clinical and operational activities are central to successful learning health systems. 1 In some learning health systems, embedded research teams, including both clinicians and health services research methodologic experts, engage with stakeholders to design analyses relevant to operational change, create evidence, and collaborate with operational leaders to inform next‐step implementation. However, the steps between answering relevant questions and implementation pose a particular problem for the translation of evidence to action, especially for physician‐researchers in such organizations, who may lack the time, best‐practice expertise, and authority to operationalize new evidence‐based workflow changes.

Experts have developed implementation and evaluation models and frameworks to guide the research‐implementation‐evaluation continuum. 2 Each model and framework was developed to aid in the planning, implementation, and/or evaluation of healthcare innovations 3 and each requires substantial detail, resources, and time for completion and iterative updating. This complexity has markedly impeded their use in real‐world, time‐constrained healthcare settings seeking to complete large numbers of rapid‐cycle healthcare innovation projects. 4 Ironically, these carefully developed implementation tools are not being widely implemented.

Detailed, theory‐based frameworks and process models help rigorously identify potential facilitators, barriers, and processes; in contrast, end‐user feasibility requires brevity 3 and ease of use. Abbreviating complex, multi‐domain questionnaire, planning, and evaluation tools into shorter formats that maintain key concepts may provide comparable results while increasing uptake and practical application. For example, a health‐related quality of life assessment was shortened from 36 questions across eight domains to 12 questions without substantial loss of information. 5 Similarly, the National Institutes of Health's PROMIS measures provide brief, practical assessments of patient‐reported outcome domains. 6 Concise checklist‐type tools that pose the right question at the right time have proven feasible for guiding effective implementation of operational healthcare initiatives, even in complex settings such as operating rooms. 7 , 8

The current report describes the rationale, development, evaluation, revision, and adoption of a concise research‐to‐implementation planning instrument within a delivery science and applied research program embedded within a large learning healthcare system. Its creation integrated core concepts of existing frameworks and utilized a systematic process of end‐user stakeholder engagement. The checklist format promotes end‐user acceptability, uptake, and completion, which we anticipate will increase operational implementation of evidence‐based innovation.

2. METHODS

2.1. Setting and rationale

Kaiser Permanente Northern California (KPNC) is an integrated healthcare system serving >4.5 million patients across 21 medical centers. The Permanente Medical Group's Delivery Science and Applied Research (DARE; www.kp.org/dare) program is an internally funded initiative with four primary project support mechanisms: a Physician Researcher Program supporting specialty‐embedded physician‐researchers and their evidence‐based innovation projects through four‐year terms; a Delivery Science Grant program using two‐year project‐specific grants; a Rapid Analytics Unit for rapid‐start, short‐term projects of up to 12 months' duration; and a Targeted Analysis Program 9 for answering focused questions within six months. In each funding mechanism, physician‐researchers are paired with trained research scientists. These mechanisms are supplemented by specialty research networks (communities of clinician‐investigators embedded within specialties), research training, publication and meeting support, and an administrative core to facilitate connections with stakeholders, develop broad communications tools, and foster internal and external dissemination. 10 Projects are identified, developed, and completed in consultation with the executive clinician‐leaders who lead the medical group's regional clinical operations.

2.2. Needs identification

Qualitative process evaluations of the first seven Rapid Analytics Unit (RAU) projects 11 , 12 , 13 , 14 , 15 , 16 , 17 , 18 were conducted to assess common facilitators of, and barriers to, successful project design and implementation. Conducted by an NIH‐funded sociologist (AA), these evaluations included in‐depth interviews with investigators and key project and clinical stakeholders. The DARE program then completed a similar process with the first six members of the Physician Researcher Program (PRP) after completion of their first four‐year funding cycle.

Both the RAU and PRP evaluations highlighted (1) perceived barriers to converting analytic findings into next‐step clinical‐operational implementation and (2) a desire for broad, early, structured stakeholder engagement as a key element of ultimate implementation success. Physician‐researchers also frequently reported underestimating the data/technological coordination and project management needed for successful implementation. For example, one physician‐researcher developed and tested an electronic health record‐based method for identifying patients with familial hypercholesterolemia that was ultimately implemented successfully in KPNC. 16 However, implementation first required translating the research project's programming into the programming language used for clinical operations, causing a substantial but avoidable delay in the research‐to‐implementation process.

Review of the common, complex implementation frameworks used to address such challenges raised concerns about concept redundancy and about the practicality of completing them for the large number of rapid‐cycle evidence‐to‐implementation projects led by physician‐researchers in KPNC. In the absence of a published short, practical tool, we initiated the creation of a streamlined implementation planning instrument to aid collaborative implementation planning by DARE program members and operational leaders. We used an iterative, consensus‐based process to identify best practices within existing dissemination and implementation frameworks and then developed a tool with face validity that aligned with organizational needs. The tool's main goals were to: identify and engage stakeholders at project initiation to maximize efficiency; provide teams with early implementation planning tools using a best‐practice framework from the existing literature; and engender a common implementation planning platform and language across DARE's different evidence‐based innovation mechanisms for physician‐researchers and operational leaders.

2.3. Instrument development

Commonly used implementation frameworks were identified using the search terms “implementation” and “framework,” with results confined to systematic reviews. This search yielded 894 results; excluding disease‐specific systematic reviews of single frameworks left fifteen systematic reviews of different frameworks and metrics. Instrument domains were extracted from three frameworks identified by the co‐authors as commonly used and aligned with the goals stated above: the Quality Implementation Framework (QIF) process model, 19 to aid evidence translation; the Consolidated Framework for Implementation Research (CFIR) implementation determinants framework, 20 , 21 , 22 to systematically assess potential barriers and facilitators to implementation; and the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE‐AIM) framework, 23 , 24 to recognize whether results are implemented. The extracted domains from these three frameworks were mapped onto a single document to identify overlapping concepts. In an iterative process, the five co‐authors developed a second version that consolidated repetitive categories into a nine‐page instrument; differences in recommendations across iterations were resolved by consensus discussion.
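The report does not specify the database or tooling used for this search. As a minimal illustrative sketch under those assumptions, a comparable query could be run against PubMed through NCBI's E‐utilities via Biopython's Entrez module, with PubMed's systematic‐review subset filter standing in for "confined to systematic reviews"; the query string and result limit below are assumptions, not the authors' actual search strategy.

# Minimal sketch: search "implementation" and "framework," confined to
# systematic reviews, assuming PubMed via NCBI E-utilities (Biopython).
from Bio import Entrez

Entrez.email = "researcher@example.org"  # NCBI requires a contact email

# [sb] applies PubMed's systematic-review subset filter.
query = "implementation AND framework AND systematic[sb]"
handle = Entrez.esearch(db="pubmed", term=query, retmax=50)
record = Entrez.read(handle)
handle.close()

print(f"Total matches: {record['Count']}")   # cf. the 894 results reported
print("First PMIDs:", record["IdList"][:10])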

Given the desire for conciseness expressed in prior oral and written feedback from research scientists and program directors, this draft was condensed into a four‐page document with five domains: an introduction stating the document's purpose; questions to initiate discussions regarding the host setting for potential follow‐up interventions; a timetable guiding discussions with key stakeholders around creating an implementation structure; questions guiding management of the specific implementation context, including a project management timeline and communication management tools; and a template of questions for evaluating implementation success.

The four‐page document was electronically distributed to thirteen DARE physician‐researchers, the key intended users, to evaluate the instrument's utility, acceptability, and practicality. 25 Additionally, two physician‐researchers piloted its completion with ongoing projects. Feedback was then gathered through oral evaluations within a focused, topic‐specific meeting and through supplementary written comments. Similar oral and written feedback methods were used within each DARE funding mechanism, ultimately including the 25 team members described below, and addressed three main dimensions: utility, acceptability, and practicality. The resulting recommendations were summarized by DARE staff and used by the authors to create a one‐page version, which underwent further evaluation and testing with the same groups.

The 25‐person design and testing team included a PhD‐level sociologist with expertise in qualitative assessments and stakeholder engagement (AA); five PhD or MD health services researchers (including internal medicine [ALA], epidemiology [LJH], general health services, and mental health); two research program‐level directors (with specialties in epidemiology, gastroenterology [DAC], and pediatrics); one PhD‐level analyst; three masters‐level program managers (SP); and thirteen embedded physician‐researchers, most of whom are both clinical program/operational leads and actively engaged in research‐to‐implementation initiatives in their specialties (cardiology, emergency medicine, gastroenterology, population health, women's health, infectious disease, neurology, psychiatry, radiology, urology, and vascular surgery).

3. RESULTS

The consistent request, especially from physician‐researchers, was: can the document be shorter while retaining the needed information? Many details from traditional instruments, even when abbreviated, were perceived as burdensome and/or “impractical.” Participants primarily desired early and iterative engagement with key operational and data/technology stakeholders to enhance awareness, buy‐in, and post‐analysis planning. As one participant stated:

Some implementations are simple tweaks—changing the inclusion or exclusion criteria for an ongoing intervention. This still can change the cost of an existing program, though, by hundreds of thousands of dollars, so stakeholders need to agree. The key is to make an exhaustive list of stakeholders affected by the work, make sure they understand what you are doing, and start seeing your results early.

The resulting revisions created a document with three core elements drawn from existing frameworks: (1) purpose, (2) questions to guide stakeholder engagement, and (3) main transition‐to‐implementation goals and timeline. Two key stakeholder groups, regional clinical specialty leaders and applicable data/technology leaders, were identified as needed for all projects, given their system‐wide decision‐making authority to realign workflows, personnel, and information technology for implementation. Deleted topics included highly specific details regarding implementation personnel, procedures, and metrics, as well as step‐by‐step questions about roles and discussion documentation. For example, estimating clinical effort for operational implementation, staff recruitment, assessments of organizational capacity, and similar concepts, though conceptually useful, were perceived as highly labor‐intensive and “speculative” before study results were available. The document's last section included the RE‐AIM implementation evaluation framework. The physician‐researchers supported the importance of measuring an implemented project's effectiveness and impact; as one noted,

The implementation checklist should include a clear plan regarding how effectiveness/impact will be assessed. At least for my project, the commitment to ongoing evaluation and expectation that it would inform iterative improvement/revisions over time has been key to implementation.

However, since not all research projects will be implemented, this evaluation component was removed from the instrument and repurposed as a separate, companion template for projects that progress to implementation. The revised one‐page format was then re‐evaluated, re‐tested, and approved by the end‐users using the same methods and goals described above; these end‐users explicitly noted the revised checklist was clearer, less redundant, and feasible for regular use, while preserving the core elements they felt were needed for implementation planning. After final wording clarifications based on this feedback, the instrument was finalized for use.
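For teams wishing to adapt this approach, the three core elements could be encoded as a simple, reusable project template. The sketch below is hypothetical: its field names and helper method are our assumptions for illustration, not the published checklist, which appears in Figure 1.

# Hypothetical sketch of the instrument's three core elements as a
# reusable template; field names are illustrative assumptions only.
from dataclasses import dataclass, field


@dataclass
class ImplementationChecklist:
    # Element 1: purpose of the project and the decision it will inform.
    purpose: str = ""

    # Element 2: stakeholder engagement. Regional clinical specialty
    # leaders and data/technology leaders were identified as needed for
    # all projects; "other" covers project-specific stakeholders.
    stakeholders: dict = field(default_factory=lambda: {
        "regional_clinical_specialty_leaders": [],
        "data_technology_leaders": [],
        "other": [],
    })

    # Element 3: main transition-to-implementation goals and timeline.
    implementation_goals: list = field(default_factory=list)
    timeline: dict = field(default_factory=dict)  # milestone -> target date

    def missing_required_stakeholders(self) -> list:
        """List required stakeholder groups with no one engaged yet."""
        required = ("regional_clinical_specialty_leaders",
                    "data_technology_leaders")
        return [g for g in required if not self.stakeholders.get(g)]

A structure like this could be revisited at each work‐in‐progress review, mirroring the instrument's intended use at project outset and periodically thereafter.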

4. CONCLUSION

A sequential process produced a streamlined implementation tool that is concise and acceptable to the embedded researcher/operational teams in which it is being actively used (Figure 1). The revisions condensed key elements from more complex implementation frameworks into a practical one‐page document. Keeping the instrument to one page, designing it for collaborative use by physician‐researchers and operational leaders, and extensively incorporating end‐user stakeholder engagement decreased the tool's administrative burden and facilitated end‐user acceptance, making it more feasible both for initial planning and for iterative updating as analytic results emerge.

FIGURE 1. The Delivery Science and Applied Research (DARE) Program Implementation Planning Instrument, also known as the “Implementation Checklist”.

Delivery science and applied research teams in other healthcare settings can use and further refine this implementation planning instrument for rapid‐cycle evidence‐based innovation efforts. Although this briefer instrument retains core elements of existing frameworks, it omits some detailed implementation planning steps that remain appropriate for large‐scale and/or well‐resourced implementation projects. The instrument is now used in all KPNC DARE support mechanisms. Its elements are included in regular work‐in‐progress and end‐of‐analysis reports, and its impact on successful implementation will be evaluated over time. This evaluation can include the instrument's acceptability, 25 completeness/consistency of use, speed of uptake, and project‐specific determinants of implementation success. We anticipate that it will substantially accelerate next‐step evidence‐based change planning across departments within a large learning health system. Its availability could yield similar benefits in other learning health system organizations that need simple, rapid‐cycle tools to guide time‐sensitive, time‐limited research teams through evidence‐based learning, innovation, implementation, and re‐evaluation. It is our hope that, by publishing the checklist, others will use, adapt, refine, further develop, and evaluate this tool's usability and effectiveness.
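As a purely illustrative sketch of how that over‐time evaluation might be tracked, one could record per‐project uptake and acceptability in a structure like the following; the field names and metrics are our assumptions, not the program's actual evaluation plan.

# Hypothetical per-project record for evaluating checklist use over time;
# metric names mirror the evaluation dimensions listed above.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class ChecklistUseRecord:
    project_id: str
    checklist_started: Optional[date] = None     # speed of uptake
    checklist_completed: Optional[date] = None   # completeness of use
    acceptability_score: Optional[float] = None  # e.g., survey-based, per ref 25
    implemented: Optional[bool] = None           # implementation success

    def days_to_uptake(self, project_start: date) -> Optional[int]:
        """Days from project start to first checklist use, if started."""
        if self.checklist_started is None:
            return None
        return (self.checklist_started - project_start).days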

CONFLICT OF INTEREST STATEMENT

The authors are partners and/or employees of The Permanente Medical Group and Kaiser Permanente and report no other conflicts of interest.

ACKNOWLEDGMENTS

The authors gratefully acknowledge Delivery Science and Applied Research program manager Jennifer Schneider, along with members of the Rapid Analytics Unit and the Physician Researcher Program, for their engagement in iteratively creating the Implementation Checklist instrument. This project was supported by The Permanente Medical Group's Delivery Science and Applied Research program.

Prausnitz S, Altschuler A, Herrinton LJ, Avins AL, Corley DA. The implementation checklist: A pragmatic instrument for accelerating research‐to‐implementation cycles. Learn Health Sys. 2023;7(3):e10359. doi: 10.1002/lrh2.10359

REFERENCES

1. Horwitz LI, Kuznetsova M, Jones SA. Creating a learning health system through rapid‐cycle, randomized testing. N Engl J Med. 2019;381(12):1175‐1179.
2. Huybrechts I, Declercq A, Verté E, Raeymaeckers P, Anthierens S. The building blocks of implementation frameworks and models in primary care: a narrative review. Front Public Health. 2021;9:675171.
3. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53.
4. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the exploration, preparation, implementation, sustainment (EPIS) framework. Implement Sci. 2019;14(1):1.
5. Jenkinson C, Layte R, Jenkinson D, et al. A shorter form health survey: can the SF‐12 replicate results from the SF‐36 in longitudinal studies? J Public Health Med. 1997;19(2):179‐186.
6. Cella D, Choi SW, Condon DM, et al. PROMIS® adult health profiles: efficient short‐form measures of seven health domains. Value Health. 2019;22(5):537‐544.
7. Gawande AA. The Checklist Manifesto—How to Get Things Right. New York: Metropolitan Books; 2009.
8. Haynes AB, Weiser TG, Berry WR, et al. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med. 2009;360(5):491‐499.
9. Schmittdiel JA, Dlott RS, Young JD, Rothman MB, Dyer W, Adams AS. The delivery science rapid analysis program: a research and operational partnership at Kaiser Permanente Northern California. Learn Health Syst. 2017;1(4):e10035.
10. Lieu TA, Madvig PR. Strategies for building delivery science in an integrated health care system. J Gen Intern Med. 2019;34(6):1043‐1047.
11. Harzstark AL, Altschuler A, Amsden LB, et al. Implementation of a multidisciplinary expert testicular cancer tumor board across a large integrated healthcare delivery system via early case ascertainment. JCO Clin Cancer Inform. 2021;5:187‐193.
12. Nguyen‐Huynh MN, Klingman JG, Avins AL, et al. Novel telestroke program improves thrombolysis for acute stroke across 21 hospitals of an integrated healthcare system. Stroke. 2018;49(1):133‐139.
13. Flint AC, Avins AL, Eaton A, et al. Risk of distal embolization from tPA (tissue‐type plasminogen activator) administration prior to endovascular stroke treatment. Stroke. 2020;51(9):2697‐2704.
14. Suh‐Burgmann E, Flanagan T, Osinski T, Alavi M, Herrinton L. Prospective validation of a standardized ultrasonography‐based ovarian cancer risk assessment system. Obstet Gynecol. 2018;132(5):1101‐1111.
15. Suh‐Burgmann EJ, Flanagan T, Lee N, et al. Large‐scale implementation of structured reporting of adnexal masses on ultrasound. J Am Coll Radiol. 2018;15(5):755‐761.
16. Birnbaum RA, Horton BH, Gidding SS, Brenman LM, Macapinlac BA, Avins AL. Closing the gap: identification and management of familial hypercholesterolemia in an integrated healthcare delivery system. J Clin Lipidol. 2021;15(2):347‐357.
17. Chong AJ, Fevrier HB, Herrinton LJ. Long‐term follow‐up of pediatric open and laparoscopic inguinal hernia repair. J Pediatr Surg. 2019;54(10):2138‐2144.
18. Altschuler A, Chong AJ, Alavi M, Herrinton LJ. Pediatric surgeons' adoption of an innovative laparoscopic technique for inguinal hernia repair: a mixed methods study. J Laparoendosc Adv Surg Tech A. 2021;31(8):947‐953.
19. Meyers DC, Durlak JA, Wandersman A. The quality implementation framework: a synthesis of critical steps in the implementation process. Am J Community Psychol. 2012;50(3‐4):462‐480.
20. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
21. Kirk MA, Kelley C, Yankey N, Birken SA, Abadie B, Damschroder L. A systematic review of the use of the consolidated framework for implementation research. Implement Sci. 2016;11:72.
22. Birken SA, Powell BJ, Presseau J, et al. Combined use of the consolidated framework for implementation research (CFIR) and the theoretical domains framework (TDF): a systematic review. Implement Sci. 2017;12(1):2.
23. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE‐AIM framework. Am J Public Health. 1999;89(9):1322‐1327.
24. Gaglio B, Shoup JA, Glasgow RE. The RE‐AIM framework: a systematic review of use over time. Am J Public Health. 2013;103(6):e38‐e46.
25. Sekhon M, Cartwright M, Francis JJ. Acceptability of healthcare interventions: an overview of reviews and development of a theoretical framework. BMC Health Serv Res. 2017;17(1):88.
