Critical Care Explorations
. 2026 Feb 17;8(2):e1372. doi: 10.1097/CCE.0000000000001372

A Framework and Method for Measuring the Implementation of Data Science in Critical Care

Daniel Woznica 1, Tamara Al-Hakim 1, Vishakha K Kumar 1, J Perren Cobb 2, Rishikesan Kamaleswaran 3, Ashish K Khanna 4,5, Krzysztof Laudanski 6, Jerry J Zimmerman 7, Karin Reuter-Rice 3,8
PMCID: PMC12915721  PMID: 41701949

Abstract

BACKGROUND:

The implementation of data science concepts, skills, and tools in critical care research and practice faces multiple, complex barriers.

METHODS:

We developed an implementation science-based framework and method for measuring the adoption, implementation, and sustainment of data science concepts, skills, and tools in critical care—the Society of Critical Care Medicine (SCCM) Discovery Data Science Campaign (DSC) Implementation Research Logic Model (IRLM). Our IRLM specifies constructs for: 1) key determinants (i.e., barriers and facilitators) influencing the implementation of data science concepts, skills, and tools in critical care; 2) implementation strategies deployed by the SCCM Discovery DSC to address these determinants; 3) theorized mechanisms of action by which these strategies affect outcomes; and 4) upstream and downstream implementation outcomes influenced by implementation strategies.

RESULTS AND CONCLUSIONS:

We believe that our model can facilitate more rigorous measurement of theoretically grounded, empirically assessable factors driving implementation of data science concepts, skills, and tools in critical care.

Keywords: critical care, data science, implementation science, methods, models, theoretical


KEY POINTS

Question: How can we measure implementation of data science concepts, skills, and tools in critical care?

Findings: An Implementation Research Logic Model is a useful framework and method for facilitating measurement of implementation determinants, strategies, mechanisms, and outcomes.

Meaning: Better measurement of processes driving the adoption, implementation, and sustainment of data science concepts, skills, and tools in critical care can improve outcomes for researchers, administrators, clinicians, and ultimately patients and families.

A researcher aims to examine the relationships between diagnostic and procedure codes and clinical outcomes across diverse ICU settings during a rapidly evolving pandemic. A healthcare administrator seeks to enhance the accuracy of sepsis diagnosis and treatment within a hospital system by implementing innovative, disease-specific predictive models. Meanwhile, a frontline clinician plans to leverage a cutting-edge, machine learning-enabled clinical decision support system to inform patient care.

For such researchers, administrators, and clinicians, data science concepts, skills, and tools provide powerful analytic resources and capabilities. Examples of the paradigmatic research and practice shift afforded by data science’s advent include: the extraction of real-world data from electronic medical records as part of embedded, pragmatic clinical trials; aggregation of patient data in quality improvement initiatives to create real-time learning healthcare systems; and integration and analysis of patient data across multiple “omics” (e.g., genomics, environmental exposomics) profiles to enable revolutionary precision health (1–4). Given that the ICU produces a larger volume, velocity, and variety of data—that is, the “3 Vs” of Big Data (5)—than any other healthcare setting, critical care is arguably positioned in the vanguard of this monumental health care advancement.

Yet despite these advances, the adoption, implementation, and sustainment of data science in critical care research and practice face multiple, complex barriers. These barriers are wide in scope, ranging from high variability in local data vocabularies, to incompatible information technology infrastructures, to an overall lack of expertise in the application of data science methods in intensive care clinical practice. Recognizing these challenges, Society of Critical Care Medicine (SCCM) Discovery, the Critical Care Research Network, launched its Data Science Campaign (DSC) in 2022. The mission of the DSC is to improve the care of critically ill patients by leveraging the use of Big Data for research capabilities, with the ultimate goal of data science application in clinical environments. The DSC’s implementation strategies comprise three main domains: 1) Establish guidance for large-scale data harmonization and data sharing (6–8); 2) Develop a data hub at SCCM; and 3) Conduct SCCM-sponsored datathons.

In this article, we present our development of an Implementation Research Logic Model (IRLM) (9) that specifies: 1) key determinants influencing the implementation of data science concepts, skills, and tools in critical care; 2) specific implementation strategies deployed by the DSC to address these determinants; 3) theorized mechanisms of action by which these strategies affect implementation outcomes; and 4) upstream and downstream outcomes influenced by DSC implementation strategies. Our goal is to enable a more rigorous, theory-driven approach to measuring the factors that drive successful implementation of data science concepts, skills, and tools in critical care (10). The previously described researcher, healthcare administrator, and clinician could, in principle, use this model to systematically evaluate the key barriers, strategies, mechanisms, and outcomes involved in applying data science concepts, skills, and tools to critical care research and practice.

METHODS

Our methods leverage the IRLM to synthesize the multiple complex factors inherent in innovation implementation (9). The IRLM, a frequently used tool in implementation science, frames a four-part logic model composed of implementation determinants, strategies, mechanisms, and outcomes. Importantly, the model is temporally agnostic, which means it can be developed and applied at any stage of the implementation process (i.e., pre-, during, and/or post), for planning, evaluation, and/or reporting. We iteratively developed a logic model throughout the course of the campaign using a combination of academic literature review, analysis of program documents (e.g., DSC business plan), and workshops and meetings among this article’s co-authors. Our project was reviewed and determined to be exempt from institutional review board (IRB) review because it did not meet the definition of research (WCG IRB, No. 1-1670047-1, June 14, 2023, “Discovery Data Science Campaign Implementation Science Logic Model”).

To begin, we specified “The Thing” being implemented, which is typically defined as an intervention, practice, or innovation (11). In the case of the DSC, we define “The Thing” as: the use of data science concepts, skills, and tools in critical care for research capabilities, with the ultimate goal of application in clinical environments. For short, we describe “The Thing” being implemented as the “use of data science in critical care.”

Determinants

Next, we populated the logic model’s first column, identifying the determinants influencing the use of data science in critical care. Determinants, often conceptualized as “barriers and facilitators,” are contextual factors that function as either independent variables or moderators to influence implementation processes and outcomes (12, 13). Several systematic literature reviews and empirical studies have identified implementation determinants for the use of data science across social sectors and within healthcare and critical care specifically (14–18). We derived the determinants in our logic model from Reddy’s “synthesized factors associated with ‘Data Science Strategy’ in organizations,” which in our appraisal achieves the highest level of comprehensiveness, relevance, applicability, simplicity, logic, clarity, usability, suitability, and usefulness of existing frameworks (19). We extracted each of this model’s determinants, mapped them to constructs from the (Updated) Consolidated Framework for Implementation Research (CFIR)—the most frequently cited implementation determinants framework in implementation science (20)—and tabulated all constructs (20). Guided by Reddy’s synthesized factors and the corresponding CFIR constructs, we developed a list of relevant areas for determinant measurement in critical care (Table 1).

TABLE 1.

Determinants of the Use of Data Science in Critical Care

Reddy’s Synthesized Factors (Updated) Consolidated Framework for Implementation Research Constructs Relevant Areas for Determinant Measurement in Critical Care
Data characteristics: volume, velocity, and variety of data; data quality Innovation complexity Nature and number of connections and steps involved in real-time analysis of granular data to inform changing clinical conditions in life-or-death treatment scenarios
Quality and level of integration of disparate data sources, especially: lifespan data (e.g., pedigree); systems data (e.g., environmental exposome)
Technology: Infrastructure (hardware and software) IT infrastructure Ability to port, access, merge, and analyze critical care clinical (and other) data sources, such as from ICU monitoring systems
Organizational structure/model; organizational agility: business process, realignment of work practices, intensity of learning; ambidexterity Work infrastructure Organizational architecture and staffing to optimize dynamic, agile learning processes, characterized in particular by:
  Flexibility to (re)create roles, teams, and administrative procedures to catalyze digital innovation
  Standardized training to drive organizational learning
Data-driven decisions; innovative; sustainable Culture Support for data-driven decision-making, and especially for open-source common data models (such as OMOP and FHIR)
Willingness to enact and sustain IT infrastructure necessary for data science implementation
Alignment with core strategy: distinct data strategy, (coordination of) stakeholder interests Mission alignment Presence of measurable goals and objectives related to the integration of data science into critical care research and practice
Resources to support novel data science initiatives amid multiple institutional priorities, in particular through strategic integration with existing priority initiatives (e.g., grants, quality registries, industry collaborations, etc)
Policies and regulations Policies and laws Data security and privacy policies to protect patient and organizational data
Governance: control (access), risk management, compliance, privacy, security, sharing, ownership Data governance processes with respect to the context of data reuse, accuracy, archival, curation, platforms, architecture, and effective sharing within and across ICUs and organizations
Business environment: competitive dynamics, industry structure, partner readiness, consumers (public) External pressure Urgency to operationalize real-world data to produce clinical knowledge (e.g., on COVID-19)
Market pressures influencing availability of resources for integration of critical care/data science
Incentive to align public and private strategic partnerships (e.g., electronic medical record/electronic health record industry partners) to facilitate alignment of proprietary data
Managerial willingness: top management support, trust and acceptance, breaking silos High-/mid-level leaders Buy-in from key administrative decision-makers about the acceptability, appropriateness, and feasibility of data science
Human-machine teaming trust
Talent (knowledge and skillset) Implementation facilitators (SMEs) Synergistic clinical and informatics expertise related to collecting, analyzing, and operationalizing large datasets to improve clinical practice and facilitate clinical trials

FHIR = Fast Healthcare Interoperability Resources; IT = information technology; OMOP = Observational Medical Outcomes Partnership; SMEs = subject matter experts.

Implementation Strategies

The second column of the logic model was then completed, detailing the DSC’s three core implementation strategies. Broadly speaking, implementation strategies may be defined as methods for increasing adoption, implementation, and sustainment of an intervention, innovation, or practice (21). In other words, they are the “stuff we do to try to help people/places do The Thing” (11). As with data science implementation determinants, previous literature reviews and empirical studies have named and defined implementation strategies related to data harmonization and sharing, use of data registries/warehouses/platforms, and datathons, within the health care sector and critical care specifically (2, 22–32). We mapped the DSC’s three domains onto the Expert Recommendations for Implementing Change (ERIC) compilation of implementation strategies (21)—the most frequently cited implementation strategy taxonomy—and tabulated implementation strategy names and definitions, excerpted from the ERIC compilation and subsequent refinements (21, 33) (Table 2). For each domain, we further specified implementation strategies’ actors, actions, temporalities, doses, and justifications (34).

TABLE 2.

Implementation Strategies for the Use of Data Science in Critical Care

Data Science Campaign Domain (Refined) ERIC Implementation Strategies (Refined) ERIC Implementation Strategy Definitions Actors, Actions, Temporalities, Doses, and Justifications
Data harmonization and sharing Develop resource-sharing agreements “Develop partnerships with organizations that have resources needed to implement the innovation” Actors: Clinicians, researchers, administrators, data scientists
Stage implementation scale-up “Phase implementation efforts by starting with small pilots or demonstration projects and gradually move to a system wide rollout”
Actions: Establish principles for data standardization and sharing, develop CDEs for critical care, and perform a pilot project utilizing standardized CDEs
Temporality: CDEs modified and expanded regularly
Dose: Pilot: 2 yr
Justification: Fragmentation of data elements (especially in United States) substantially impedes research and quality improvement efforts
Discovery Critical Care Datahub Use data warehousing techniques “Integrate clinical records across facilities and organizations to facilitate implementation across systems” Actors: Data scientists
Centralize technical assistance “Develop and use a centralized system to deliver technical assistance focused on implementation issues” Actions: Aggregate electronic medical record data on ICU admissions from across sites to develop a cloud-based multipurpose platform for analysis of both clinical and nonclinical information. Enable federated data approaches (i.e., noncentralized data locations usable as a unified whole)
Temporality: Once developed, always available
Dose: Continuously maintained datahub, with multiple pathways for further data ingestion over time
Justification: Current process of manual data extraction and transformation across sites requires unsustainable personnel effort
Society of Critical Care Medicine-sponsored datathons Use data experts “Involve, hire, and/or consult experts to acquire, structure, manage, report, and use data generated by implementation efforts” Actors: Clinicians, researchers, data scientists
Increase demand “Attempt to influence the market for the clinical innovation to increase competition intensity and to increase the maturity of the market for the clinical innovation”
Actions: Host recurring competitions focused on using data science techniques to solve real-world issues
Temporality: Annually
Dose: Two-d workshop
Justification: Pressing need for expedited application of data science in critical care research and practice, especially in emergency situations (e.g., natural disasters, emerging epidemics/pandemics)

CDE = common data element, ERIC = Expert Recommendations for Implementing Change.
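As a concrete illustration of the data harmonization strategy in Table 2, the following sketch shows how site-local variable names and units might be translated to common data elements before pooling data across ICUs. All names, units, and mappings here are hypothetical examples for illustration; they are not drawn from the actual SCCM Critical Care Data Dictionary.

```python
# Illustrative sketch only: harmonizing site-local EHR field names and units
# to hypothetical common data elements (CDEs) prior to cross-site pooling.
# The registry and mappings below are invented examples, not real CDEs.

# Hypothetical CDE registry: canonical name -> canonical unit
CDE_REGISTRY = {
    "heart_rate": "beats/min",
    "temperature": "degC",
}

# One site's local field names, each mapped to (CDE name, unit converter)
SITE_A_MAP = {
    "hr": ("heart_rate", lambda v: v),                      # already beats/min
    "temp_f": ("temperature", lambda v: (v - 32) * 5 / 9),  # Fahrenheit -> Celsius
}

def harmonize(record, site_map):
    """Translate one site-local record into a CDE-keyed canonical record."""
    out = {}
    for local_name, value in record.items():
        if local_name not in site_map:
            continue  # unmapped fields are dropped (or flagged for curation)
        cde, convert = site_map[local_name]
        out[cde] = round(convert(value), 1)
    return out

harmonized = harmonize({"hr": 92, "temp_f": 100.4}, SITE_A_MAP)
```

In practice, each participating site would maintain its own mapping into the shared CDE registry, so that downstream analyses operate on a single canonical vocabulary regardless of local conventions.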

Mechanisms

The third column of the logic model outlines the theorized mechanisms through which our implementation strategies influence outcomes. Implementation mechanisms are processes or events that operate as mediators between implementation strategies and outcomes (10, 35–38). They are often conceptualized via theory-driven, “realist” causal explanations for how and why implementation strategies influence (or fail to influence) context-specific outcomes (10, 35–38). Multiple literature reviews and empirical studies have named and defined mechanisms related to data science implementation across sectors and within health care and critical care (39–48). The mechanisms column of our logic model derives from Galetsi’s “path to value Big Data Analytics in Healthcare” (49, 50). In brief, Galetsi’s pathway may be summarized as: the assemblage of Big Data Analytics Resources (BDAR) to build Big Data Analytics Capabilities (BDAC) that create health care value. Galetsi’s pathway satisfies multiple criteria for establishment of mechanisms, including strong association, specificity, consistency, experimental manipulation, timeline, gradient, and plausibility or coherence (35, 51). We tabulated all constructs in Galetsi’s framework and developed relevant examples pertaining to data science in critical care (Table 3).

TABLE 3.

Mechanisms of Implementation for the Use of Data Science in Critical Care

Mechanisms Relevant Examples in Critical Care
Big Data Analytics Resources (per Groves et al [52], Waller and Fawcett [53], Chen and Zhang [54] as cited in Galetsi et al [49, 50])
 Data types Clinical data in the electronic medical record for hospitalization or pre-/post-ICU care; clinical device data (e.g., waveforms, IV pumps, ventilators, dialysis machines); questionnaires (e.g., family satisfaction, patient-reported outcomes); diagnostic and procedure codes; drug utilization and wastage data
 Analytical resources Society for Simulation in Healthcare; PhysioNet (e.g., Medical Information Mart for Intensive Care IV, eICU); National Clinical Cohort Collaborative
Phillips eICU Research Collaborative Database; Amsterdam University Medical Centers Database
Big Data Analytics Capabilities, per Groves et al (52) as cited in Galetsi et al (49, 50)
 Monitoring, prediction/simulation, data mining, evaluation, and reporting Monitoring of high-fidelity waveforms and processed data from bedside monitors
Prediction models: Disease-based (e.g., sepsis) (55); intervention-based (e.g., hemodynamic monitoring); prediction of changes in stability and acuity (56, 57)
Mining of ICU data lakes (i.e., centralized repositories for storage of large amounts of raw data)
Integrated evaluation and reporting of complex data streams and thresholds
Created values
 Better diagnosis for provision of more personalized health care Rapid diagnosis, predictive and prognostic enrichment (e.g., COVID-19) (58, 59)
 Supporting/replacing professionals’ decision-making with automated algorithms Machine learning-enabled clinical decision support (60)
 New business models, products, and services Prediction models
 Enabling experimentation, exposing variability, and improving performance Synthetic dataset modeling and experimentation where ethical considerations would not allow randomized trials
 Healthcare information sharing and coordination Mapping of clinical data to common data models (25)
 Creating data transparency Data cleaning and post-processing systems
 Identifying patient care risk ICU scoring systems (e.g., SOFA, pediatric SOFA) (61–63)
Early warning systems (64)
Prediction of ICU readmissions (65)
 Offering customized actions by segmenting populations Population enrichment for interventional trials
 Reducing expenditure while maintaining quality Improved efficiency in treatment of sepsis
 Protecting privacy Cyber security and privacy systems in the ICU

eICU = electronic ICU, SOFA = Sequential Organ Failure Assessment.
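To ground the “monitoring” capability in Table 3, the sketch below screens a stream of vital signs against simple threshold rules. The variable names and thresholds are illustrative assumptions only; they do not implement SOFA or any validated, published early warning score.

```python
# Toy sketch of a rule-based monitoring capability. The thresholds below are
# illustrative placeholders, NOT a validated clinical score (e.g., not SOFA
# or a published early-warning system).

ALERT_RULES = {
    "heart_rate": (40, 130),   # acceptable range, beats/min (illustrative)
    "systolic_bp": (90, 180),  # mm Hg (illustrative)
    "spo2": (92, 100),         # % (illustrative)
}

def screen(vitals):
    """Return the names of vital signs falling outside their illustrative ranges."""
    flags = []
    for name, value in vitals.items():
        lo, hi = ALERT_RULES.get(name, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            flags.append(name)
    return flags

flags = screen({"heart_rate": 142, "systolic_bp": 110, "spo2": 88})
```

A production system would instead encode validated, context-specific scoring logic and integrate with bedside monitoring infrastructure and clinician workflows.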

Outcomes

Finally, we completed the fourth column of the logic model, detailing the outcomes related to using data science in critical care. The overarching framework for these outcomes is the CFIR Outcomes Addendum, which is designed to be directly compatible with use of the CFIR’s categorization of determinants and conceptualizes outcomes in terms of three principal components (66). First, “antecedent assessments” are contextual phenomena that reliably predict implementation and innovation impact. Second, “implementation outcomes” are dimensions of implementation success or failure, whether anticipated (expected in future) or actual (occurring in present or occurred in past). Third, “innovation outcomes” are the equitable impacts of implementation on key decision-makers, deliverers, and recipients. We tabulated implementation outcome constructs and construct dimensions and selected measures applicable to data science in critical care (Table 4).

TABLE 4.

Implementation Outcomes for the Use of Data Science in Critical Care

CFIR Outcomes Addendum Implementation Outcome Domains CFIR Outcomes Addendum Implementation Outcome Constructs Suggested Measures
Antecedent assessments Acceptability Acceptability of intervention measure (67)
Appropriateness Intervention appropriateness measure (67)
Feasibility Feasibility of intervention measure (67)
Implementation readiness Various measures of implementation readiness (e.g., Texas Christian University Organizational Readiness for Change) (68) and implementation climate (e.g., adaptations of Klein and Sorra’s construct developed within the context of information technology implementation) (69)
Implementation climate
Implementation outcomes Anticipated and actual Penetration (70) of Critical Care Data Dictionary common data elements
 Adoptability and adoption Penetration (70) and maintenance of Discovery Critical Care Datahub
 Implementability and implementation Maintenance of annual Society of Critical Care Medicine datathons
 Sustainability and sustainment Publication of peer-reviewed articles related to data science in critical care
Innovation outcomes Equitable population impact on:
Decision-makers (e.g., administrators) Institute of Medicine service outcomes (70):
 Efficiency
 Safety
 Effectiveness
 Equity
 Patient-centeredness
 Timeliness
Deliverers (e.g., researchers, data scientists, clinicians) “Quadruple Aim” outcomes (71) pertaining to health care deliverers, including:
 Reduced burnout
 Improved work experience
Recipients (e.g., patients) Barr’s health/effectiveness outcomes (72):
 Family and surrogate well-being/quality of life
 Physical functioning
 Post-ICU trajectory
 Psychologic well-being
 Satisfaction

CFIR = Consolidated Framework for Implementation Research.

RESULTS AND CONCLUSIONS

Figure 1 presents a consolidated SCCM Discovery DSC IRLM for the use of data science concepts, skills, and tools in critical care research and practice. The model uses horizontal arrows to denote the left-to-right “flow” of the implementation “pipeline”—from determinants to strategies to mechanisms to outcomes—as well as vertical arrows to indicate progression within each of these “piping” components. We believe that our proposed SCCM Discovery DSC IRLM will facilitate more rigorous measurement of theoretically grounded, empirically assessable factors driving the adoption, implementation, and sustainment of data science in critical care (10).

Figure 1.


Society of Critical Care Medicine (SCCM) Discovery Data Science Campaign Implementation Research Logic Model for the use of data science concepts, skills, and tools in critical care research and practice. The model uses horizontal arrows to denote the left-to-right “flow” of the implementation “pipeline”—from determinants to strategies to mechanisms to outcomes—as well as vertical arrows to indicate progression within each of these “piping” components. BDA = Big Data Analytics, BDAC = Big Data Analytics Capabilities, SMEs = subject matter experts.

For example, the previously imagined researcher might use the logic model to comprehensively assess barriers and facilitators to extraction of diagnostic and procedure codes and clinical outcomes data across multiple differing ICUs. The healthcare administrator might use the model to select validated implementation outcomes—such as measures of acceptability, appropriateness, and feasibility—as part of planning integration of sepsis prediction models into quality improvement efforts. The frontline clinician intending to use machine learning-enabled clinical decision support can examine the data science implementation strategies and mechanisms driving this innovation. For researchers, healthcare administrators, and frontline clinicians interested in making further use of the model’s approach, the toolkit appended in this article’s supplemental digital content includes additional guidance on how to create a custom IRLM for a new investigation or new adoption/application in a clinical setting. With this framework and method in hand, we believe our model will enable the critical care data science community to better unite and synthesize knowledge about how to accelerate implementation of data science concepts, skills, and tools in the ICU.

We acknowledge that our framework and method have limitations. First, while we aimed to narrowly define “The Thing” being innovated as the use of data science concepts, skills, and tools, we recognize that the literature often conceptualizes this “Thing” ambiguously—for example, as “Big Data,” data science, BDAR, and/or BDAC. We nevertheless maintain that our operationalization of the use of data science concepts, skills, and tools as a concept in the model coheres intelligibly. Second, the logic model does not specify whether a given determinant operates as a barrier or facilitator, given the high variability of barrier and facilitator valences across contexts. We note that Reddy’s model does formulate each factor as either a barrier or enabler, and that later research by Reddy et al (73) analyzes empirical evidence from across multiple organizations to identify key barriers to data science implementation and the hierarchical relationships between them. Future research might synthesize barriers and facilitators from across multiple literature reviews and frameworks, including Reddy’s and others cited in this article. Third, although comprehensive, the model inevitably does not list all possible specifications of determinants, strategies, mechanisms, and outcomes, nor does it map potential causal pathways between them. We believe the model’s parsimonious selection of constructs reflects careful prioritization of relevant concepts rooted in the literature, and we encourage future mapping of causal pathway diagrams between specific components as part of “realist” implementation research. Fourth, our model may conflict with ascendant conceptualizations of causal pathway diagrams that place determinants “to the right” of implementation mechanisms, under the premise that determinants are factors that either constrain or enable implementation strategies’ effects (35, 36). We believe that our placement of determinants “to the left” of implementation strategies aligns more closely with current standard conceptualizations of the relationship between these concepts in implementation science, as well as with our own intuitive understandings. We invite future revisions to the model that conceive of the order of determinants, strategies, and mechanisms differently.

Future research should revise and expand on our model to account for theories and frameworks pertaining to large language models and agentic systems that have arisen since the conception of the SCCM Discovery DSC and initial creation of its implementation strategies. In particular, we highlight the theories and frameworks synthesized in Reddy’s implementation science-informed translational path on application, integration, and governance of generative artificial intelligence (AI), which in our estimation satisfy the stringent criteria with which our data science-specific theories and frameworks were evaluated for inclusion (74). Reddy’s components of AI implementation could be directly mapped onto our logic model: “acceptance and adoption” in the antecedent assessments and implementation outcomes; “data and resources” in the BDAR and BDAC mechanisms; and “technical integration” and “governance” in the implementation strategies. The extensibility of the IRLM in terms of its ability to continually incorporate updated theories and frameworks such as those related to AI is ultimately one of its principal strengths, given the complexity of implementation as new innovations arise over time.

Ultimately, we share with other critical care researchers, data scientists, and clinicians the goal of successfully leveraging data science in critical care to improve outcomes for the critically ill and injured. We present the SCCM Discovery DSC IRLM in service of that aim.

Footnotes

This project is part of the Society of Critical Care Medicine Discovery Data Science Campaign.

Dr. Cobb reports consulting for Akido Labs and Bauhealth. Dr. Kamaleswaran reports funding from the National Institutes of Health. Dr. Khanna reports consulting for Medtronic, Edwards Lifesciences, GE Healthcare, Philips Research North America, Sentinel Medical, Bayer Corporation, AOP Health, Innoviva Therapeutics, and Pharmazz. Dr. Zimmerman reports Elsevier textbook royalties and funding from the National Institutes of Health. Dr. Reuter-Rice reports Elsevier textbook royalties and funding from the National Institutes of Health. Drs. Khanna and Zimmerman are former chairs of the Society of Critical Care Medicine (SCCM) Discovery Network, Drs. Cobb and Reuter-Rice are former co-chairs of the SCCM Data Science Campaign (DSC) workgroup, and Drs. Kamaleswaran and Laudanski are current co-chairs of the SCCM DSC workgroup. Dr. Khanna is on the SCCM Council, and Dr. Reuter-Rice is Vice Chancellor of the American College of Critical Care Medicine Board of Regents. Drs. Al-Hakim and Kumar are co-authors on the SCCM DSC workgroup article. Dr. Woznica has disclosed that he does not have any potential conflicts of interest.

Contributor Information

Tamara Al-Hakim, Email: tal-hakim@sccm.org.

J. Perren Cobb, Email: jpcobb@med.usc.edu.

Rishikesan Kamaleswaran, Email: r.kamaleswaran@duke.edu.

Ashish K. Khanna, Email: Ashish.Khanna@Advocatehealth.org.

Krzysztof Laudanski, Email: Laudanski.krzysztof@mayo.edu.

Jerry J. Zimmerman, Email: jerry.zimmerman@seattlechildrens.org.

Karin Reuter-Rice, Email: karin.reuter-rice@duke.edu.

REFERENCES

  • 1.Chambers DA, Feero WG, Khoury MJ: Convergence of implementation science, precision medicine, and the learning health care system: A new model for biomedical research. JAMA 2016; 315:1941–1942 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Kaplan LJ, Cecconi M, Bailey H, et al. : Imagine…(a common language for ICU data inquiry and analysis). Intensive Care Med 2020; 46:531–533 [DOI] [PubMed] [Google Scholar]
  • 3.Sanchez-Pinto LN, Luo Y, Churpek MM: Big data and data science in critical care. Chest 2018; 154:1239–1248 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Dash S, Shakyawar SK, Sharma M, et al. : Big data in healthcare: Management, analysis and future prospects. J Big Data 2019; 6:1–25 [Google Scholar]
  • 5.Laney D: 3D data management: Controlling data volume, velocity and variety. META Group Research Note 2001; 6:1 [Google Scholar]
  • 6.Armaignac DL, Heavner SF, Rausen M, et al. : Guiding principles for data sharing and harmonization: Results of a systematic review and modified Delphi from the Society of Critical Care Medicine Data Science Campaign. Crit Care Med 2025; 53:e619–e631 [DOI] [PubMed] [Google Scholar]
  • 7.Heavner SF, Kumar VK, Anderson W, et al. ; Society of Critical Care Medicine (SCCM) Discovery Panel on Data Sharing and Harmonization: Critical data for critical care: A primer on leveraging electronic health record data for research from Society of Critical Care Medicine’s panel on data sharing and harmonization. Crit Care Explor 2024; 6:e1179. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Murphy DJ, Anderson W, Heavner SH, et al. : Development of a core critical care data dictionary with common data elements to characterize critical illness and injuries using a modified Delphi method. Crit Care Med 2025; 53:e1045–e1054 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Smith JD, Li DH, Rafferty MR: The implementation research logic model: A method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci 2020; 15:1–12 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Sarkies MN, Francis-Auton E, Long JC, et al. : Making implementation science more real. BMC Med Res Methodol 2022; 22:178. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Curran GM: Implementation science made too simple: A teaching tool. Implement Sci Commun 2020; 1:1–3 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Damschroder LJ: Clarity out of chaos: Use of theory in implementation research. Psychiatry Res 2020; 283:112461. [DOI] [PubMed] [Google Scholar]
  • 13.Nilsen P: Making sense of implementation theories, models, and frameworks. Implement Sci 2015; 10:53 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Mikalef P, Pappas IO, Krogstie J, et al. : Big data analytics capabilities: A systematic literature review and research agenda. Inf Syst E-Bus Manage 2018; 16:547–578 [Google Scholar]
  • 15.Mosch LK, Poncette A-S, Spies C, et al. : Creation of an evidence-based implementation framework for digital health technology in the intensive care unit: Qualitative study. JMIR Form Res 2022; 6:e22866. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Poncette AS, Meske C, Mosch L, et al. : How to overcome barriers for the implementation of new information technologies in intensive care medicine. In: Human Interface and the Management of Information. Information in Intelligent Systems. Yamamoto S Mori H (Eds). HCII 2019. Lecture Notes in Computer Science, vol 11570. Cham, Springer, 2019, pp 534–546 [Google Scholar]
  • 17.Reddy RC, Bhattacharjee B, Mishra D, et al. : A systematic literature review towards a conceptual framework for enablers and barriers of an enterprise data science strategy. Inf Syst E-Bus Manage 2022; 20:223–255 [Google Scholar]
  • 18.Al-Sai ZA, Abdullah R, Husin MH: Critical success factors for big data: A systematic literature review. IEEE Access 2020; 8:118940–118956 [Google Scholar]
  • 19.Flottorp SA, Oxman AD, Krause J, et al. : A checklist for identifying determinants of practice: A systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci 2013; 8:1–11 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Damschroder LJ, Reardon CM, Widerquist MAO, et al. : The updated consolidated framework for implementation research based on user feedback. Implement Sci 2022; 17:75. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Powell BJ, Waltz TJ, Chinman MJ, et al. : A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci 2015; 10:1–14 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Somerville M, Cassidy C, Curran JA, et al. : Implementation strategies and outcome measures for advancing learning health systems: A mixed methods systematic review. Health Res Policy Syst 2023; 21:120. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Allen C, Coleman K, Mettert K, et al. : A roadmap to operationalize and evaluate impact in a learning health system. Learn Health Syst 2021; 5:e10258. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Harris S, Shi S, Brealey D, et al. : Critical Care Health Informatics Collaborative (CCHIC): Data, tools and methods for reproducible research: A multi-centre UK intensive care database. Int J Med Inform 2018; 112:82–89 [DOI] [PubMed] [Google Scholar]
  • 25.Heavner SF, Anderson W, Kashyap R, et al. : A path to real-world evidence in critical care using open-source data harmonization tools. Crit Care Explor 2023; 5:e0893. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Thoral PJ, Peppink JM, Driessen RH, et al. ; Amsterdam University Medical Centers Database (AmsterdamUMCdb) Collaborators and the SCCM/ESICM Joint Data Science Task Force: Sharing ICU patient data responsibly under the Society of Critical Care Medicine/European Society of Intensive Care Medicine Joint Data Science Collaboration: The Amsterdam University Medical Centers Database (AmsterdamUMCdb) example. Crit Care Med 2021; 49:e563–e577 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Adibuzzaman M, DeLaurentis P, Hill J, et al. : Big data in healthcare—the promises, challenges and opportunities from a research perspective: A case study with a model database. AMIA Annu Symp Proc 2017:384 [PMC free article] [PubMed] [Google Scholar]
  • 28.Cosgriff CV, Celi LA, Stone DJ: Critical care, critical data. Biomed Eng Comput Biol 2019; 10:1179597219856564 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Walkey AJ, Kumar VK, Harhay MO, et al. : The viral infection and respiratory illness universal study (VIRUS): An international registry of coronavirus 2019-related critical illness. Crit Care Explor 2020; 2:e0113. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Yang S, Stansbury LG, Rock P, et al. : Linking big data and prediction strategies: Tools, pitfalls, and lessons learned. Crit Care Med 2019; 47:840–848 [DOI] [PubMed] [Google Scholar]
  • 31.Heller B, Amir A, Waxman R, et al. : Hack your organizational innovation: Literature review and integrative model for running hackathons. J Innov Entrep 2023; 12:1–24 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Mougan C, Plant R, Teng C, et al. : How to data in datathons. Adv Neural Inf Process Syst 2023; 36:10440–10456 [Google Scholar]
  • 33.Perry CK, Damschroder LJ, Hemler JR, et al. : Specifying and comparing implementation strategies across seven large implementation interventions: A practical application of theory. Implement Sci 2019; 14:1–13 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Proctor EK, Powell BJ, McMillen JC: Implementation strategies: Recommendations for specifying and reporting. Implement Sci 2013; 8:1–11 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Lewis CC, Boyd MR, Walsh-Bailey C, et al. : A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci 2020; 15:1–25 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Lewis CC, Klasnja P, Lyon AR, et al. : The mechanics of implementation strategies and measures: Advancing the study of implementation mechanisms. Implement Sci Commun 2022; 3:1–11 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Lewis CC, Klasnja P, Powell BJ, et al. : From classification to causality: Advancing understanding of mechanisms of change in implementation science. Front Public Health 2018; 6:136. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Wynn D, Jr, Williams CK: Principles for conducting critical realist case study research in information systems. MIS Quarterly 2012; 36:787–809 [Google Scholar]
  • 39.Dixon-Woods M, Campbell A, Chang T, et al. : A qualitative study of design stakeholders’ views of developing and implementing a registry-based learning health system. Implement Sci 2020; 15:1–11 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Keen J, Abdulwahid MA, King N, et al. : Effects of interorganisational information technology networks on patient safety: A realist synthesis. BMJ Open 2020; 10:e036608. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41.Mikalef P, Boura M, Lekakos G, et al. : Big data analytics capabilities and innovation: The mediating role of dynamic capabilities and moderating effect of the environment. Br J Manage 2019; 30:272–298 [Google Scholar]
  • 42.Mikalef P, Krogstie J, Pappas IO, et al. : Exploring the relationship between big data analytics capability and competitive performance: The mediating roles of dynamic and operational capabilities. Inf Manag 2020; 57:103169 [Google Scholar]
  • 43.Nolte A, Chounta I-A, Herbsleb JD: What happens to all these hackathon projects? Identifying factors to promote hackathon project continuation. Proc ACM Hum Comput Interact 2020; 4:1–26 [Google Scholar]
  • 44.Falk J, Kannabiran G, Hansen NB: What do hackathons do? Understanding participation in hackathons through program theory analysis. In: CHI Conference on Human Factors in Computing Systems. Yokohama, Japan. ACM, New York, NY, USA, 2021, pp 1–16 [Google Scholar]
  • 45.de Toledo Piza FM, Celi LA, Deliberato RO, et al. : Assessing team effectiveness and affective learning in a datathon. Int J Med Inform 2018; 112:40–44 [DOI] [PubMed] [Google Scholar]
  • 46.Randell R, Abdulwahid M, Greenhalgh J, et al. : How and in what contexts does networked health it improve patient safety? Elicitation of theories from the literature. In: MEDINFO 2019: Health and Wellbeing e-Networks for All. Amsterdam, IOS Press, 2019, pp 753–757 [DOI] [PubMed] [Google Scholar]
  • 47.Sabharwal R, Miah SJ: A new theoretical understanding of big data analytics capabilities in organizations: A thematic analysis. J Big Data 2021; 8:1–17 [Google Scholar]
  • 48.Brossard P-Y, Minvielle E, Sicotte C: The path from big data analytics capabilities to value in hospitals: A scoping review. BMC Health Serv Res 2022; 22:134. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Galetsi P, Katsaliaki K, Kumar S: Big data analytics in health sector: Theoretical framework, techniques and prospects. Int J Inf Manage 2020; 50:206–216 [Google Scholar]
  • 50.Galetsi P, Katsaliaki K, Kumar S: Values, challenges and future directions of big data analytics in healthcare: A systematic review. Soc Sci Med 2019; 241:112533. [DOI] [PubMed] [Google Scholar]
  • 51.Kazdin AE: Mediators and mechanisms of change in psychotherapy research. Annu Rev Clin Psychol 2007; 3:1–27 [DOI] [PubMed] [Google Scholar]
  • 52.Groves P, Kayyali B, Knott D, et al. : The ‘Big Data’ Revolution in Healthcare: Accelerating Value and Innovation. McKinsey & Company. 2013. Available at: https://www.mckinsey.com/~/media/mckinsey/industries/healthcare%20systems%20and%20services/our%20insights/the%20big%20data%20revolution%20in%20us%20health%20care/the_big_data_revolution_in_healthcare.pdf. Accessed January 20, 2026
  • 53.Waller MA, Fawcett SE: Data science, predictive analytics, and big data: A revolution that will transform supply chain design and management. J Bus Logist 2013; 34:77–84 [Google Scholar]
  • 54.Chen CP, Zhang C-Y: Data-intensive applications, challenges, techniques and technologies: A survey on big data. Inf Sci 2014; 275:314–347 [Google Scholar]
  • 55.Shashikumar SP, Wardi G, Malhotra A, et al. : Artificial intelligence sepsis prediction algorithm learns to say “I don’t know.” NPJ Digital Med 2021; 4:134. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56.Lim L, Gim U, Cho K, et al. : Real-time machine learning model to predict short-term mortality in critically ill patients: Development and international validation. Crit Care 2024; 28:76. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Al Dhoayan M, Alghamdi H, Arabi YM: Machine learning applications in critical care. Saudi Crit Care J 2019; 3:29–32 [Google Scholar]
  • 58.Wynants L, Van Calster B, Collins GS, et al. : Prediction models for diagnosis and prognosis of Covid-19: Systematic review and critical appraisal. BMJ 2020; 369:m1328. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59.Roberts M, Driggs D, Thorpe M, et al. : Common pitfalls and recommendations for using machine learning to detect and prognosticate for COVID-19 using chest radiographs and CT scans. Nat Mach Intell 2021; 3:199–217 [Google Scholar]
  • 60.Hong N, Liu C, Gao J, et al. : State of the art of machine learning–enabled clinical decision support in intensive care units: Literature review. JMIR Med Inform 2022; 10:e28781. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Reguera-Carrasco C, Barrientos-Trigo S: Instruments to measure complexity of care based on nursing workload in intensive care units: A systematic review. Intensive Crit Care Nurs 2024; 84:103672. [DOI] [PubMed] [Google Scholar]
  • 62.Moreno R, Rhodes A, Piquilloud L, et al. : The Sequential Organ Failure Assessment (SOFA) score: Has the time come for an update? Crit Care 2023; 27:15. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 63.Kashyap R, Sherani KM, Dutt T, et al. : Current utility of Sequential Organ Failure Assessment score: A literature review and future directions. Open Respir Med J 2021; 15:1–6 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 64.Henry KE, Giannini HM: Early warning systems for critical illness outside the intensive care unit. Crit Care Clin 2024; 40:561–581 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 65.Ruppert MM, Loftus TJ, Small C, et al. : Predictive modeling for readmission to intensive care: A systematic review. Crit Care Explor 2023; 5:e0848. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.Damschroder LJ, Reardon CM, Opra Widerquist MA, et al. : Conceptualizing outcomes for use with the Consolidated Framework for Implementation Research (CFIR): The CFIR outcomes addendum. Implement Sci 2022; 17:1–10 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 67.Weiner BJ, Lewis CC, Stanick C, et al. : Psychometric assessment of three newly developed implementation outcome measures. Implement Sci 2017; 12:1–12 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.Weiner BJ, Belden CM, Bergmire DM, et al. : The meaning and measurement of implementation climate. Implement Sci 2011; 6:1–12 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Weiner BJ, Mettert KD, Dorsey CN, et al. : Measuring readiness for implementation: A systematic review of measures’ psychometric and pragmatic properties. Implement Res Pract 2020; 1:2633489520933896 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70.Proctor E, Silmere H, Raghavan R, et al. : Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health 2011; 38:65–76 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71.Bodenheimer T, Sinsky C: From triple to quadruple aim: Care of the patient requires care of the provider. Ann Fam Med 2014; 12:573–576 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 72.Barr J, Paulson SS, Kamdar B, et al. : The coming of age of implementation science and research in critical care medicine. Crit Care Med 2021; 49:1254–1275 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.Reddy RC, Mishra D, Goyal D, et al. : A conceptual framework of barriers to data science implementation: A practitioners’ guideline. Benchmarking: An Int J 2023; 31:3459–3496 [Google Scholar]
  • 74.Reddy S: Generative AI in healthcare: An implementation science informed translational path on application, integration and governance. Implement Sci 2024; 19:27. [DOI] [PMC free article] [PubMed] [Google Scholar]
