Abstract
Objective:
Our objective was to develop an evaluation framework for electronic health record (EHR)-integrated innovations to support evaluation activities at each of four information technology (IT) life cycle phases: planning, development, implementation, and operation.
Methods:
The evaluation framework was developed based on a review of existing evaluation frameworks from health informatics and other domains (human factors engineering, software engineering, and social sciences); expert consensus; and real-world testing in multiple EHR-integrated innovation studies.
Results:
The resulting Evaluation in Life Cycle of IT (ELICIT) framework covers four IT life cycle phases and three measure levels (society, user, and IT). The ELICIT framework recommends 12 evaluation steps: 1) business case assessment; 2) user requirements gathering; 3) technical requirements gathering; 4) technical acceptability assessment; 5) user acceptability assessment; 6) social acceptability assessment; 7) social implementation assessment; 8) initial user satisfaction assessment; 9) technical implementation assessment; 10) technical portability assessment; 11) long-term user satisfaction assessment; and 12) social outcomes assessment.
Discussion:
Effective evaluation requires a shared understanding and collaboration across disciplines throughout the entire IT life cycle. In contrast with previous evaluation frameworks, the ELICIT framework focuses on all phases of the IT life cycle across the society, user, and IT levels. Institutions seeking to establish evaluation programs for EHR-integrated innovations could use our framework to create such shared understanding and justify the need to invest in evaluation.
Conclusion:
As health care undergoes a digital transformation, it will be critical for EHR-integrated innovations to be systematically evaluated. The ELICIT framework can facilitate these evaluations.
Keywords: Health technology assessment, implementation science, health information technology, clinical decision support, human factors engineering, evaluation framework
INTRODUCTION
As the adoption of electronic health records (EHRs) has expanded rapidly in recent years, EHRs have taken center stage in modern healthcare. Moreover, EHRs have become platforms for the delivery of various health interventions. For the purposes of this paper, we define EHR-integrated innovations as health interventions that 1) have a software component, including clinical decision support (CDS) tools (e.g., EHR-integrated alerts and reminders, order facilitators, integrated information displays), specific modules of an EHR system (e.g., for order entry or clinical documentation), and interoperable external apps; 2) are integrated with the EHR, which could be achieved through various mechanisms such as using EHR native functionality, proprietary interfaces, and/or standards-based interfaces; 3) address a specific healthcare domain and/or process, such as colorectal cancer screening, medication ordering, or neonatal bilirubin management; and 4) aim to improve healthcare outcomes such as quality of patient care, provider experience, or healthcare value. Examples of EHR-integrated innovations include provider and patient reminders for preventive care procedures, an evidence-based institutional order set for community-acquired pneumonia, and an interoperable external app supporting shared decision-making for lung cancer screening.
EHR-integrated innovations have been identified as having great potential for improving health and health care.1 However, to achieve their full potential, EHR-integrated innovations must meet multiple requirements. Ideally, EHR-integrated innovations must integrate into an existing EHR system; meet clinical needs as defined by organizations, national bodies, and/or professional societies; be acceptable to end users including clinicians and patients; integrate seamlessly into existing clinical workflows and information systems; alter clinician or patient behavior; do no harm; and be portable across EHR platforms. The need to meet these various requirements makes development, implementation, and evaluation of EHR-integrated innovations quite challenging. Recent publications show that EHR-integrated innovations do not always meet these requirements.2-4 Thus, it is important to ensure the evaluation of EHR-integrated innovations is supported by an evaluation framework that can help address these various requirements.
Given the complexity of EHR-integrated innovations, we have identified three main requirements for an evaluation framework for EHR-integrated innovations: 1) support the full information technology (IT) life cycle; 2) address measures at society, user, and IT levels; and 3) address issues specific to EHR integration. Justification for these requirements is shown in Table 1.
Table 1.
Evaluation Framework Requirements
| Requirement | Justification |
|---|---|
| 1. Support the full IT life cycle | It has been proposed that the full potential of any health information intervention can only be reached if its development has been accompanied by appropriate and rigorous evaluation throughout the entire IT life cycle,5,6 defined, for the purposes of this paper, as including four phases: 1) planning (software ideation and prioritization), 2) development (software design and development), 3) implementation (software introduction into clinical practice), and 4) operation (maintenance and dissemination to other healthcare systems).7 An IT life cycle-aligned evaluation framework would improve efficiency by enabling stakeholders grounded in different domains to coordinate ongoing efforts by creating a common mental model.8-10 Moreover, a life cycle-aligned framework can enable the knowledge gained at earlier phases to guide the decisions made in later phases, as well as to enable feedback loops that support continuous learning. |
| 2. Address measures at society, user, and IT levels | Both traditional health information interventions and EHR-integrated innovations operate on three intervention levels: social (or organizational/environmental), user (or human), and IT (or technical), and all three levels need to be assessed in evaluation studies.11 Evaluation of these levels should be informed by several scientific domains including social sciences, human factors engineering, and software engineering. |
| 3. Address issues specific to EHR integration | EHR-integrated innovations operate uniquely in the context of the EHR, adding extra complexity to the already complex health information systems. Three issues specific to the evaluation of EHR-integrated innovations make an evaluation framework especially important. These issues include: 1) the technical integration between the EHR and the innovation which is highly diverse across institutions; 2) the user interface integration between the EHR and the innovation which is still in the early stages of addressing the overall user experience of the clinician; and 3) the EHR workflow integration of the innovation with usual EHR workflows. EHR workflow is clinic and unit specific and there continues to be no universally accepted methodology to quantify EHR workflows reliably. |
EHR – electronic health record, IT – information technology.
While many comprehensive evaluation frameworks for health information systems have been described in literature reviews,12-17 none of them meet the three requirements summarized in Table 1. Most evaluation frameworks developed for health information interventions do not meet the first requirement (i.e., support the full IT life cycle).11,18-21 For example, the Health Information Technology Research-based Evaluation Framework (HITREF) developed by Sockolow et al. describes six evaluation concepts (e.g., structural quality, quality of information logistics, unintended consequences/benefits, effects on quality process, and barriers and facilitators to adoption).18,19 While HITREF acknowledges the importance of context, it does not address the "how" and "when" of evaluation throughout the IT life cycle phases.15
We identified a few evaluation frameworks aligned with the IT life cycle,22-29 but these frameworks did not meet the second requirement (i.e., address society, user, and IT-level measures). Among these evaluation frameworks, many focused mainly on one level such as society-level,25 user-level,22,23 or IT-level measures.24 Finally, the evaluation frameworks that did focus on multiple IT life cycle phases and multiple levels either omitted one of the IT life cycle phases or one of the measure levels.26-29
Moreover, all existing evaluation frameworks failed the third requirement (i.e., address EHR integration issues). More specifically, the existing frameworks do not address the quality of back-end EHR integration, user-interface EHR integration, and the changes in EHR workflows caused by the innovation.17 Hence, there is a need for a framework describing an appropriate evaluation for every phase of the EHR-integrated innovation life cycle on society, user, and IT levels.
To address this emergent need for a holistic approach to the evaluation of EHR-integrated innovations, we developed the Evaluation in Life Cycle of IT (ELICIT) framework by incorporating concepts from human factors engineering, software engineering, and social sciences domains across the entire IT life cycle. The purpose of the ELICIT framework is to guide evaluation efforts during the planning, development, implementation, and operation of EHR-integrated innovations. The intended users of the framework are innovation research and development teams such as those from academic health systems and industry.
OBJECTIVE
The objective of this paper is to describe an evaluation framework for supporting the whole life cycle of EHR-integrated innovations.
METHODS
Definitions
To enable shared understanding across stakeholders grounded in different domains with divergent terminologies, we define the following terms for the purpose of this paper.
Study Design
This work is presented as a case study of the development of a new evaluation framework derived from the review of theoretical frameworks, the convening of an expert panel, and multiple real-world evaluations. All included projects were approved or exempted as quality improvement by the University of Utah (UU) Institutional Review Board (IRB).
Setting
This work was conducted at UU Health, an academic medical center with five hospitals and 12 community clinics. UU Health has used the Epic® EHR in its primary care clinics since 1999 and enterprise-wide since 2016.
As interoperability standards and application programming interfaces (APIs) were increasingly required by government regulations and adopted by healthcare systems and EHR vendors,35 UU Health launched the ReImagine EHR initiative in 2016 to harness the promise of interoperable apps based on standards such as Substitutable Medical Applications, Reusable Technologies on Fast Healthcare Interoperability Resources (SMART on FHIR).36-39 The ReImagine EHR initiative developed, implemented and evaluated over ten EHR-integrated innovations (e.g., to support neonatal bilirubin management guidelines, lung cancer screening guidelines, smoking cessation, and chronic disease management guidelines), established productive corporate partnerships, and secured over $35 million in grant funding.40 ReImagine EHR’s innovations are seamlessly integrated with the Epic® EHR; are supported by native CDS components such as preventive care reminders; and are designed using interoperability standards to enable integration with other EHR products and dissemination across healthcare organizations. The initiative is led by the Associate Chief Medical Information Officer (KK). The technical development team consists of nine core team members and dozens of project-based contributors.40
Evaluation Infrastructure
When initially established in 2016, the ReImagine EHR initiative focused primarily on the design and development of EHR-integrated innovations. The Director of the UU Department of Biomedical Informatics Sociotechnical Core (CW) oversaw the evaluation aspects of this new initiative, with support provided by seed research grants. Given the number of requested innovations and resource constraints, only limited evaluation was performed. However, as more innovations entered clinical use, we recognized the need to invest further in evaluation and to establish a formal evaluation team. In 2018, the initiative hired an evaluation scientist (PVK) with training in biomedical informatics, biostatistics, health technology assessment, and data science to oversee and coordinate the data-focused aspects of evaluation with statistical support from the UU Study Design and Biostatistics Center. Other key members of the evaluation team include project leads, clinical informaticists, human-factors engineers, software engineers, clinical champions, and implementation scientists. The engagement of these stakeholders is supported by a combination of operational funding, training grants, research grants, and industry collaborations.
Framework Development
Overall Approach to Framework Development
The four-year iterative process of framework development is summarized in Figure 1. Our evaluation framework was iteratively developed through focused literature review, practical experience evaluating innovations, and expert consensus. Practical experience included the evaluation of traditional EHR-integrated innovations41-46 as well as interoperable EHR-integrated innovations40,47-51 including a neonatal bilirubin management app,47,48 a diabetes pharmacotherapy outcome prediction app,49 a chronic disease management app,50 a lung cancer screening shared decision app,51 and a clinical calculator app.52 These projects involved all of the clinical informaticists noted below, who worked together to develop and evaluate the applications. The expert consensus was based on this shared experience and was formally reached through five meetings in 2020, at each of which the framework draft went through discussion, review, and further development. Additional asynchronous review and feedback on the framework continued in 2021, involving 10 experts with training in biomedical informatics (PVK, CW, GDF, TT, TJR, HSK, KK), human factors engineering (TT, HSK, CW), software engineering (GDF, HSK, KK, CN), implementation science (GAA, CRR), and biostatistics/effectiveness research (PVK, CW). During the meetings, PVK presented the framework draft, asked specific questions about the model, and elicited experts' feedback regarding relevant outcomes, questions, methods, and theoretical perspectives. Based on the outcomes of each meeting, PVK revised the framework for further review by these experts. After five meetings and the achievement of initial consensus, we transitioned to individual meetings and asynchronous communication until we reached consensus on the final framework.
Figure 1.
Framework Development Method and Timeline
Approach to Defining the IT Life Cycle Phases
We defined the IT life cycle phases based on two sources: Ammenwerth et al.'s planning, development, implementation, and operation phases6 and the Exploration, Preparation, Implementation, Sustainment (EPIS) implementation science framework phases.7,32,53 The EPIS framework describes four life cycle phases for the implementation of health services interventions.53 In the Exploration phase, stakeholders consider emergent or existing health needs and identify the best guidelines or clinical interventions to address those needs.53 In the Preparation phase, the potential barriers and facilitators of implementation are identified for external (outer) and organizational (inner) contexts, adaptation needs are further assessed both for the innovation itself and for system or organizational structures or processes (e.g., reimbursement, changes in clinic workflows), and a detailed implementation plan is developed.53 In the Implementation phase, the intervention is rolled out into clinical practice. In the Sustainment phase, the intervention continues to be delivered, with or without adaptation as needed.53
Approach to Defining the Society, User, and IT-Level Measures
In creating our evaluation framework, we sought to incorporate key insights, theoretical perspectives, and practices from health informatics as well as inter-related domains of social sciences, human factors engineering, and software engineering (Figure 2). Relevant frameworks, models, theories and reporting guidelines are described below in the context of three levels of EHR-integrated innovations: society, user, and IT.
Figure 2.
Overlapping Domains Informing Evaluation Studies in Health Informatics
Society-Level Measures
To select exemplar society-level measures, we focused on two specific social sciences: effectiveness research and implementation science. Effectiveness research is the systematic evaluation of health technology to demonstrate the impact and inform stakeholders' decision making related to the dissemination of health technologies based on their effectiveness.54 Effectiveness research perspectives are a loose collection of evaluation models that include clinical trials, health technology assessment, comparative effectiveness research, health services research, and health economics and outcomes research methodologies.55 Effectiveness research is critical for identifying return on investment and summative assessments of process, health,21,54 economic,56 and safety outcomes.57-59 Reporting guidelines in effectiveness research include Consolidated Standards of Reporting Trials (CONSORT),60 Standards for QUality Improvement Reporting Excellence 2.0 (SQUIRE 2.0),61 STAtement on Reporting of Evaluation studies in Health Informatics (STARE-HI),62 and Consolidated Health Economic Evaluation Reporting Standards (CHEERS).63
A specific branch of social sciences called implementation science is especially useful for evaluating society-level measures. Implementation science is "the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services".64 Implementation science evaluation frameworks assess implementation strategies and outcomes such as adoption, reach, and implementation fidelity.7,31,32,53,65,66 The main evaluation frameworks in this domain include Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM),65 Nonadoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS),67 and the Implementation Outcomes Framework.66 Process-oriented frameworks such as EPIS are useful in describing processes such as implementation or development.7,32,53 Determinant frameworks such as the Consolidated Framework for Implementation Research (CFIR)31 describe multi-level factors which influence implementation outcomes.66 Reporting guidelines in the implementation science domain include Standards for Reporting Implementation Studies (StaRI)68 and the Framework for Reporting Adaptations and Modifications to Evidence-based interventions (FRAME).30 Traditionally, implementation science frameworks have focused on the implementation of social and health services interventions,65,69 with less emphasis on digital technologies. However, there is an increasing demand for the adoption of implementation science frameworks in innovation development and evaluation.70-72
User-Level Measures
To select exemplar user-level measures, we focused on human factors engineering research. Human factors engineering entails studying users' characteristics, needs, abilities, and limitations, and designing software around those needs. Commonly used human factors theories aiming to understand and modify end user behavior include the Social Cognitive Theory (SCT),73 the Situational Awareness (SA) framework,74 and the Unified Theory of Acceptance and Use of Technology (UTAUT).75 User-centered design includes three steps.22,23,76-79 First, users' needs, tasks, values, and workflows are identified using observations, interviews, or focus groups. These activities are grounded in models of contextual design,80 targeted Cognitive Task Analyses,81,82 and representational analysis to identify the informational display formats that are meaningful to users.83 Second, low- and high-fidelity prototypes are designed through an iterative process that involves journey mapping, storyboarding, or concept mapping. Third, usability testing is conducted on the prototypes.22,84,85
IT-Level Measures
To select exemplar IT-level measures, we focused on the software engineering domain. Software engineering entails conceiving, specifying, designing, programming, documenting, testing, and refining software. Software engineering evaluation activities in health informatics address the soundness of the software architecture,24 functionality,86 security,87 interoperability,48,88 and readiness for clinical use.89 Process-oriented frameworks in this domain include models for the software development life cycle.90 The DeLone and McLean Information Systems Success Model specifies system quality, information quality and service quality as the main indicators of success at the technology level.91
Example: Representative EHR-integrated Innovation Project
We illustrate use of the ELICIT framework by describing the evaluation of a representative innovation – the Neonatal Bilirubin Management App.47,48 This innovation focuses on implementation of the 2004 American Academy of Pediatrics guideline for identifying and treating newborns with elevated bilirubin levels, a common condition that can lead to permanent brain damage.92
RESULTS
ELICIT Framework
The resulting ELICIT framework can support the whole IT life cycle. The first element of the ELICIT framework is the four often-overlapping and iterative IT life cycle phases: I. Planning, II. Development, III. Implementation, and IV. Operation (Figure 3).7,32,93 In the Planning phase, the software idea is conceived and prioritized, and the software development decision is made. In the Development phase, the software is designed and developed, the software rollout decision is made, and governance approval is secured. In the Implementation phase, the implementation is planned and the software is gradually rolled out into clinical practice. In the Operation phase, the software is maintained, updated as needed and disseminated to other clinical sites.
Figure 3.
The IT Life Cycle Phases
The ELICIT framework describes three levels of measures: society, user, and IT, which can be remembered with the acronym SUIT. Examples of measures corresponding to different levels are displayed in Figure 4. Society-level measures include impact on patient health, organizations, business processes, and return on investment. User-level measures focus on user experience, IT usability, and EHR workflows. Finally, IT-level measures focus on system, information, and service quality.
Figure 4.
Examples of Measures at the Society, User, and IT Levels
The ELICIT framework is shown in Figure 5 and expanded in Tables 2-5. The ELICIT framework consists of 12 evaluation steps fully covering the four phases of the IT life cycle and society, user, and IT measure levels.
Figure 5.
ELICIT Framework Overview: IT Life Cycle Phases, Levels of Measures, and Evaluation Steps
Table 2.
Planning Phase: Evaluation Steps, Exemplar Questions, and Exemplar Methods
| Step 1. Business case assessment: Is the project worth pursuing for the organization? | |
|---|---|
| Exemplar Questions | Value proposition: Who are the stakeholders? What are the anticipated benefits such as improved patient care, provider experience, stakeholder finances, and/or development of grant proposals or partnerships? Will it advance science? Does the project address important care gaps and serve to bridge outer system policies and inner context operations?94 Risks and costs: What are the anticipated project costs? Are there risks to patients, providers, or the organization? Social feasibility: How difficult is the problem? Will a change in clinical and EHR workflows be required? Do relevant leaders support the project? Alternatives: What are existing clinical workflows? Does the EHR already do this sufficiently well? Will the EHR vendor tackle this problem soon? Are third-party solutions available? Health equity: Will the innovation produce desired impact across gender and minority groups? Stakeholder needs: Who are the other relevant stakeholders (e.g., healthcare system leaders, payers)? What needs do they have (e.g., clinical, reporting, regulatory, or financial needs)? |
| Exemplar Methods | Assessment of existing original papers and literature reviews to determine the magnitude of the problem and existing solutions. EHR chart audits and/or database queries to characterize the local prevalence and magnitude of the problem. Systematic input from relevant stakeholders. Market research and evaluation of existing and emerging alternative solutions. Workflow analysis interviews with relevant stakeholders to identify all relevant stakeholders, their decisions, information needs, communication patterns and pain points. Observations of relevant workflows. |
| Step 2. User requirements gathering: What user needs must be fulfilled for the project to be successful? | |
| Exemplar Questions | User needs: Who are the relevant end users (e.g., physicians, patients, medical assistants)? How many end-users will be affected? Decision and communication tasks: What are the tasks for which they require support? What decisions are made? What information is required for each decision? What are the opportunities for improvement? Link to implementation planning: What processes have to be linked to implementation strategies? EHR workflow analysis: What are the existing EHR workflows? Link to effectiveness assessment: What metrics need to be measured to assess whether stakeholder needs have been met? |
| Exemplar Methods | Observations of the users and their EHR workflows. Cognitive task analysis81,82 interviews to assess end user goals, tasks, and mental models, using the Critical Incident Technique, Stimulated Recall, screen capture, and/or eye tracking. Synthesis of the requirements to create a list of relevant measures, potential adverse events, and required data points for innovation summative evaluation. Building scenarios based on user stories. |
| Step 3. Technical requirements gathering: What technical needs must be fulfilled for the project to be successful? | |
| Exemplar Questions | Technical feasibility: Will a change in EHR workflows be required? Is it feasible to technically implement an innovation to adequately address this problem? Are the required data and APIs available? Can the innovation be scaled? Can software performance requirements be reasonably met given the available software infrastructure? Functional requirements: What are the functional requirements based on the user needs and use cases? Software performance requirements: How fast and scalable does the software need to be? How many concurrent users does the application need to support? Information quality: What information does the system need to provide? How accurate and relevant should the information provided by the software be? Technical support: What service level agreements are needed? What categories of issues are expected? How quickly will issues need to be resolved, based on their category? Installation and maintenance requirements: How much effort is acceptable for installation and maintenance? Interoperability requirements: What are EHR integration requirements? How portable does the innovation need to be? What interoperability standards need to be supported? Regulatory requirements: What are the security, privacy, and other regulatory requirements? |
| Exemplar Methods | Technical feasibility assessment including assessment of existing technical infrastructure, available data, APIs and knowledge resources. Systematic collection of input on software performance requirements from stakeholders including end-users and organizational stakeholders. Review of regulatory and security requirements. |
While evaluation is an iterative process and one in which different activities may occur in parallel, we provide in Figure 5 a typical progression across the 12 evaluation steps of the ELICIT framework. In the planning phase, evaluation starts with the problem being addressed and moves from social needs to user needs to more specific technical requirements. In the development phase, the direction of evaluation is generally reversed, beginning with verifying that the prototype meets minimum technical requirements, then progressing to user acceptance testing and evaluation of intention to use by the intended social group. The implementation phase then generally starts with the assessment of organizational readiness for the innovation, moves on to evaluating initial user satisfaction, and concludes with monitoring the technical implementation as it is deployed at a larger scale and enhancement requests are implemented. Finally, the operation phase includes the technical assessment of the innovation to determine whether it could be disseminated to other health systems, as well as the evaluation of long-term user satisfaction and the impact of the innovation on clinical and financial outcomes important to society.
ELICIT framework steps, potential evaluation questions, and methods for data acquisition and analysis are included in Tables 2-5. While depicted in a linear form, steps within phases can occur in parallel or in different orders, and projects may move back to earlier phases (e.g., to take an implemented innovation back to software design and development based on user feedback).
The proposed questions and methods are labeled as "Exemplar Questions" and "Exemplar Methods" because they are not comprehensive; they represent a subset of a much larger space of potential questions and methods, and many factors beyond those discussed in this paper should be considered when selecting among them. Actual questions and methods should be selected based on the societal and user needs discovered during the planning phase and adapted to available resources.
The planning phase involves innovation idea conception, prioritization, coordination with stakeholders, requirement gathering, and project governance review and approval. ELICIT framework steps 1, 2 and 3 fall into this phase (Table 2).
The development phase involves software design, software development, and governance review and organizational approval of the implementation. ELICIT framework steps 4, 5, and 6 fall into this phase (Table 3).
Table 3.
Development Phase: Evaluation Steps, Exemplar Questions, and Exemplar Methods
| Step 4. Technical acceptability assessment: Does the software meet technical requirements? | |
|---|---|
| Exemplar Questions | Data readiness: Are the required data available, accurate, complete, high quality, and sufficiently granular? Functional requirements: Is the test coverage complete? Software performance: Will users accept system load and response times for the value provided? Can the software support the required number of concurrent users? Information quality: Is the software logic internally consistent and accurate, and does it cover all anticipated patient patterns? Interoperability: What interoperability standards are used? Are required APIs supported? Are custom interfaces needed? Can the software handle partially compliant implementations of the interoperability standards? Regulatory compliance: Are legal requirements (e.g., HIPAA) met? Are the servers secure? Are sensitive data encrypted and backed up? Is access to sensitive data controlled? Does PHI leave the system? |
| Exemplar Methods | Automated performance testing of load and response times. Automated unit and integration testing. Manual user testing by beta testers (e.g., application response time for typical and edge use cases).95 Technical peer review of the innovation, particularly to safeguard security and privacy. Security vulnerability testing using vulnerability scanning tools. Documenting software development time, effort, and costs. Documenting barriers to development. |
| Step 5. User acceptability assessment: What designs are optimal to meet stakeholder needs? | |
| Exemplar Questions | Cognitive load: Does using the innovation require less cognitive effort compared to the existing EHR workflow? Usability: Does the innovation support identified cognitive tasks and make it easy to do the right thing (e.g., usability, content quality, aesthetics)? What usability issues still need to be addressed? Is the innovation being used to meet unanticipated needs (e.g., resident education)? EHR display integration: Does the innovation display work well with the EHR user interface? |
| Exemplar Methods | Design sessions, stakeholder meetings, and usability evaluation studies to assess users' experience, time to complete tasks, and accuracy with low- and high-fidelity prototypes and scenarios built in Step 2, including using the Think Aloud method.96 Heuristic design evaluation using Nielsen usability principles by experts and end users, including comparing the innovation user interface with the EHR user interface.84,97 |
| Step 6. Social acceptability assessment: Is it worth deploying the software? | |
| Exemplar Questions | Efficiency: Does the innovation require fewer keystrokes, mouse clicks, and screen changes and less time to complete a task while maintaining the same accuracy compared to the existing EHR workflow? Process efficacy: Does the innovation improve the completion of desired tasks compared to the existing EHR workflow? Social acceptability: Are other identified organizational needs addressed? For example, does the innovation adequately support identified billing or regulatory requirements? |
| Exemplar Methods | Predictions of intention to use could be informed by behavior change theories such as the UTAUT75 and SCT.73 Formative evaluation in simulated settings comparing the innovation with usual care including using factorial experimental designs and scenario-based simulations.50 End user cognitive load assessment including using the NASA TLX.98 Task-based efficiency assessment including time, mouse clicks, keystrokes, and screen changes.50 |
EHR – electronic health record, HIPAA – Health Insurance Portability and Accountability Act, NASA TLX – National Aeronautics and Space Administration Task Load Index, PHI – protected health information, SCT – Social Cognitive Theory, UTAUT – Unified Theory of Acceptance and Use of Technology.
The implementation phase involves implementation strategy design, staff communication, education and training, and pilot innovation rollout followed by wider rollout with iterative innovation improvements as needed. ELICIT framework steps 7, 8, and 9 fall into this phase (Table 4).
Table 4.
Implementation Phase: Evaluation Steps, Exemplar Questions, and Exemplar Methods
| Step 7. Social implementation assessment: What factors (e.g., barriers and facilitators) and outcomes do we need to consider or leverage to achieve a successful implementation? | |
|---|---|
| Exemplar Questions | Readiness for wide clinical use: Do providers have major concerns with the innovation software performance, algorithm and data accuracy, usability, or workflow integration? Innovation reach: How many providers and patients could be reached by the innovation? Innovation adoption: Which clinical sites adopted the innovation? Usage patterns: How many patients and providers are reached by the innovation? For what percentage of eligible patients is the innovation used? Innovation implementation fidelity: Is the underlying healthcare practice implemented with fidelity (i.e., as conceived and planned)? Adaptations over time: What adaptations to innovation components were required during the implementation phase, what motivated the adaptations, and how were they valued by different stakeholders?99 Implementation strategies effectiveness: Did the implementation strategies work? What were determinants (predictors) and mechanisms (mediators and moderators) of success? Which implementation strategies worked the best? Implementation cost: How much does it cost to implement and maintain this innovation?100 Long-term adoption, reach and implementation fidelity: Who is using the innovation, how often, and for what population? For what percentage of relevant encounters is the innovation being used? Is the underlying healthcare practice implemented with fidelity (i.e., as conceived and planned)? Innovation normalization: Is the organization committed to the innovation? Is the innovation education included in new staff training? Dissemination package: What dissemination resources (e.g., evaluation metrics, effective implementation strategies) should be included in the dissemination package? |
| Exemplar Methods | Systematic input from key stakeholders in planning the implementation. Data queries to assess the current state (e.g., affected patients, targeted providers, and patterns of current health services use). Data queries to identify providers to target for interviews. Interviews and/or surveys with users and other stakeholders to identify barriers and facilitators to innovation implementation, considering implementation determinants described in EPIS7,32 and CFIR,31 and guidance on documenting adaptations using FRAME.30 Systematic collection of feedback from pilot users through innovation advisory groups. Adoption and implementation monitoring through EHR and innovation data analysis including log analysis.101 Assessment of impact of innovation adaptations on reach and implementation fidelity. Assessment of impact of implementation strategies on reach and implementation fidelity. EHR and innovation data analysis including log analysis101 to compare usage to when the innovation could or ‘should’ have been used. Continued monitoring for innovation malfunctions.102 |
| Step 8. Initial user satisfaction assessment: Do users find the innovation enjoyable? | |
| Exemplar Questions | Usability, user acceptance and satisfaction: Do users find the innovation software acceptable, usable, and enjoyable (perceived enjoyment)? Does the innovation software contribute to app fatigue? Perceived effectiveness: Do providers feel that they are providing better care? Do providers feel more confident that they could provide needed care? EHR workflow analysis: Did EHR workflows become less fragmented? |
| Exemplar Methods | Assessment of usability and perceived effectiveness through semi-structured interviews. Analysis of EHR and software logs to determine changes in EHR workflows. |
| Step 9. Technical implementation assessment: Does the software meet technical requirements in the real-world setting? | |
| Exemplar Questions | Integration, installation, and maintenance: Who will be responsible for software integration? What effort is required to install and maintain the innovation? What configurations are required on the customer system? Can the software be configured for the customer system? Does the software offer a mechanism to customize and update configurations including value sets, mappings, and orderables? Technical support: Is the technical support infrastructure sufficient to support the service level agreement? Real-world technical performance: Were there any significant innovation downtimes? Is software performance stable? Adaptations over time: What further technical adaptations were required during the implementation phase? How do changes in EHR versions affect the software? |
| Exemplar Methods | Continuous quality assurance. Performance, scalability, and uptime analysis. Continuous operational monitoring. |
CFIR – Consolidated Framework for Implementation Research, EHR – electronic health record, EPIS – Exploration, Preparation, Implementation and Sustainment, FRAME – Framework for Reporting Adaptations and Modifications to Evidence-based interventions.
The operation phase involves conducting a clinical trial to assess effectiveness, maintaining and monitoring innovations after the changes in the innovation have stabilized, and disseminating the innovations to other sites. ELICIT framework steps 10, 11, and 12 fall into this phase (Table 5).
Example: Using ELICIT Framework for a Representative EHR-integrated Innovation
We describe below use of our evaluation framework for a representative innovation, the Neonatal Bilirubin Management App.47,48
I. Planning phase
Step 1. Business case assessment
The business case for this innovation was to improve the management of neonatal hyperbilirubinemia.92 The healthcare guideline to address neonatal hyperbilirubinemia has been available since 2004 and was adopted as the standard of care at the UU Health newborn nursery shortly thereafter.92 However, there was wide variability in guideline adherence among the providers. To reduce this variability, universal bilirubin screening was implemented at UU Health in 2016, and providers used an external standalone Web tool (BiliTool) to assess whether bilirubin levels required treatment, manually entering patient data abstracted from the EHR. The goal of the innovation project was to improve the fidelity of the guideline implementation and to save provider time through automation. Other motivators for pursuing this project included a prototype solution available through the SMART gallery app store,105 a manageable scope, the engagement of enthusiastic clinical champions (JHS and CHS), and the availability of key data requirements in the EHR. Moreover, no clear alternatives were available for fully meeting stakeholder needs within the EHR.
Step 2. User requirements gathering
The user requirements were gathered through Cognitive Task Analysis with the clinical champions, performed by JHS, CHS, and KK. The provider tasks and decisions included gathering relevant EHR data about the mother-infant dyad; assessing newborn bilirubin levels in relation to patient-specific thresholds; assessing the probability of rebound hyperbilirubinemia following phototherapy;106 and making guideline-based care decisions.92
Step 3: Technical requirements gathering
The technical requirements were identified through the analysis of business and user requirements by KK and other members of the ReImagine EHR software development team. Key identified requirements included keeping app loading times under 5-10 seconds and limiting the data elements used to those available through the US Core FHIR APIs as much as possible to improve interoperability.
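To make these requirements concrete, the sketch below illustrates how an app of this kind might retrieve a newborn's total bilirubin results through a US Core-style FHIR API. This is a minimal illustration rather than the ReImagine EHR implementation; the endpoint URL, the access token handling, and the LOINC code used in the query are assumptions for the example.

```python
"""Minimal sketch (not the production app): fetch a newborn's total bilirubin
Observations via a US Core-style FHIR API. Endpoint, token, and LOINC code
are illustrative assumptions."""
import requests

FHIR_BASE = "https://ehr.example.org/fhir"   # hypothetical FHIR endpoint
ACCESS_TOKEN = "..."                         # would be obtained via a SMART on FHIR launch
TOTAL_BILIRUBIN_LOINC = "1975-2"             # example code; confirm against local mappings


def get_bilirubin_observations(patient_id: str) -> list:
    """Return the patient's bilirubin Observation resources, newest first."""
    response = requests.get(
        f"{FHIR_BASE}/Observation",
        params={
            "patient": patient_id,
            "code": f"http://loinc.org|{TOTAL_BILIRUBIN_LOINC}",
            "_sort": "-date",
        },
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/fhir+json",
        },
        timeout=10,  # a hard timeout helps enforce the 5-10 second loading budget
    )
    response.raise_for_status()
    bundle = response.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]
```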
II. Development phase
Step 4. Technical acceptability assessment
Technical acceptability was ensured by KK and the software development team through automated regression testing and manual integration testing in the test EHR environment to assess accuracy and response time. As is the case with all of our innovations proceeding to clinical use, the security risks were formally reviewed by the Information Security Office, and these risks were deemed by the institutional leadership to be outweighed by the project benefits.
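As one hedged illustration of what such automated regression testing could look like, the sketch below compares an app's guideline recommendation against a clinician-validated table of expected outputs. The module name, function signature, and test-case file are hypothetical, and no actual guideline threshold values are embedded in the test itself.

```python
"""Sketch of a regression test for guideline logic. The bilirubin_app module,
recommend_treatment() signature, and validated_cases.csv file are hypothetical."""
import csv
import unittest

from bilirubin_app import recommend_treatment  # hypothetical module under test


class GuidelineRegressionTest(unittest.TestCase):
    def test_recommendations_match_clinician_adjudicated_cases(self):
        # validated_cases.csv columns: age_hours, bilirubin_mg_dl, risk_factors, expected
        with open("validated_cases.csv", newline="") as f:
            for case in csv.DictReader(f):
                actual = recommend_treatment(
                    age_hours=float(case["age_hours"]),
                    bilirubin_mg_dl=float(case["bilirubin_mg_dl"]),
                    risk_factors=case["risk_factors"].split(";"),
                )
                self.assertEqual(actual, case["expected"], msg=f"Mismatch for case: {case}")


if __name__ == "__main__":
    unittest.main()
```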
Step 5. User acceptability assessment
User acceptability assessment was conducted through multiple rounds of user feedback on low- and high-fidelity prototypes developed by the software development team. Heuristic evaluation based on Nielsen's design principles84 was also used by TT, PVK, HSK, and CW to identify and address usability issues.
Step 6. Social acceptability assessment
For the social acceptability assessment, we focused on determining the efficiency gains. The efficiency assessment was conducted by PVK, HSK, and CW in 2018 through a randomized, controlled, IRB-approved experiment in which 12 pediatric resident physicians were asked to manage bilirubin in 2-4 newborns, who were randomized to receive care supported by our EHR-integrated innovation or the previous standalone alternative (BiliTool). This study showed a time savings of 66 seconds per bilirubin assessment (95% CI, 53-79) compared with the usual time required for neonatal bilirubin management.47
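As a hedged sketch of how a time-savings estimate and its 95% confidence interval might be derived from per-assessment task times, the function below uses a simple two-sample approximation; the published analysis may have used a different model (e.g., one accounting for repeated assessments per resident), and the example values are placeholders.

```python
"""Sketch: mean per-assessment time saved with a 95% CI, using a simple
two-sample approximation (repeated measures per resident are ignored here)."""
import numpy as np
from scipy import stats


def time_savings_ci(app_times, comparator_times, alpha=0.05):
    """Return (time saved, CI lower, CI upper), in the units of the inputs."""
    app = np.asarray(app_times, dtype=float)
    comp = np.asarray(comparator_times, dtype=float)
    diff = comp.mean() - app.mean()  # positive = time saved by the app
    se = np.sqrt(app.var(ddof=1) / app.size + comp.var(ddof=1) / comp.size)
    dof = app.size + comp.size - 2
    half_width = stats.t.ppf(1 - alpha / 2, dof) * se
    return diff, diff - half_width, diff + half_width


# Placeholder seconds-per-assessment values, for illustration only:
saved, lo, hi = time_savings_ci([110, 95, 120, 105], [180, 160, 175, 190])
print(f"Time saved: {saved:.0f} s per assessment (95% CI, {lo:.0f}-{hi:.0f})")
```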
III. Implementation phase
Step 7. Social implementation assessment
Implementation factors assessment was done with clinical champions and the EHR team by JHS, CHS, and KK in 2017. Implementation facilitators included leadership engagement, the previous adoption of the underlying clinical guideline in the nursery, a positive implementation climate, a strong culture of quality improvement, and newborn nursery educational infrastructure supporting guideline implementation. The resulting implementation strategy had three parts: 1) an email communication to end users of the EHR by the institutional IT leadership with information on the innovation, 2) socialization among attending providers through clinical champions, and 3) attending physicians and senior residents teaching incoming residents to use the innovation.
Implementation outcomes assessment started with beta users (JHS, CHS, and other attending physicians added gradually) who provided continuous feedback to help improve innovation usability and accuracy through weekly meetings. The Neonatal Bilirubin Management App was rolled out in the nursery and 12 community clinics in 2017. Implementation outcomes, including innovation adoption, reach, and implementation fidelity, were assessed by PVK in 2019. According to analysis of the app logs, in 2018, the innovation was used at least once (adoption) in the nursery and 12 outpatient clinics. Innovation reach extended to all newborns born at UU Health because the app was available system-wide. Most uses (86%) occurred in the nursery, with physicians and nurses, as well as a variety of other stakeholders including medical students and medical assistants, using the innovation.27 Innovation implementation fidelity was high, with bilirubin levels clinically managed via the innovation for 90% of babies born at UU Health. We continue to monitor reach and implementation fidelity through the weekly analysis of the EHR logs.
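A minimal sketch of the kind of log analysis behind these adoption, setting, and fidelity estimates is shown below; the file names, column names, and the operational definition of fidelity are illustrative assumptions rather than the actual ReImagine EHR log schema.

```python
"""Sketch: adoption, usage setting mix, and implementation fidelity from app
usage logs joined with a birth cohort. File and column names are illustrative."""
import pandas as pd

logs = pd.read_csv("app_usage_logs.csv")   # assumed columns: patient_id, clinic_id, setting, timestamp
births = pd.read_csv("birth_cohort.csv")   # assumed columns: patient_id, birth_date

# Adoption: clinical sites where the app was used at least once
adopting_sites = logs["clinic_id"].nunique()

# Setting mix: share of uses occurring in the newborn nursery
nursery_share = 100 * (logs["setting"] == "nursery").mean()

# Implementation fidelity, operationalized here as the share of newborns in the
# birth cohort whose bilirubin was managed through the app at least once
fidelity = 100 * births["patient_id"].isin(logs["patient_id"]).mean()

print(
    f"{adopting_sites} adopting sites; {nursery_share:.0f}% of uses in the nursery; "
    f"app-based management for {fidelity:.0f}% of newborns"
)
```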
Step 8. Initial user satisfaction assessment
PVK, HSK, TT, and CW evaluated initial user satisfaction through qualitative analysis of 12 physician interviews. User response to the intervention was positive, and some unanticipated uses were uncovered, such as using the app for resident education.
Step 9. Technical implementation assessment
Technical implementation assessment involved performance, scalability, and uptime analysis. It was performed by the software development team.
IV. Operation phase
Step 10. Technical portability assessment
Technical portability was evaluated by PVK, KK, CN, and the software development team through a review of the FHIR APIs used by the app that were not yet universally supported through US Core FHIR APIs, including a formal analysis published in 2019.48
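One hedged way to operationalize such a portability review is to interrogate a candidate EHR's FHIR CapabilityStatement (available from the standard /metadata endpoint) for the resources the app depends on; the endpoint URL and the required resource list below are illustrative assumptions.

```python
"""Sketch: check whether a target EHR's FHIR CapabilityStatement advertises the
resource types an app depends on. Endpoint and resource list are examples."""
import requests

TARGET_FHIR_BASE = "https://target-ehr.example.org/fhir"              # hypothetical endpoint
REQUIRED_RESOURCES = {"Patient", "Observation", "MedicationRequest"}   # app-dependent example

capability = requests.get(
    f"{TARGET_FHIR_BASE}/metadata",
    headers={"Accept": "application/fhir+json"},
    timeout=10,
).json()

supported = {
    resource["type"]
    for rest in capability.get("rest", [])
    for resource in rest.get("resource", [])
}

missing = REQUIRED_RESOURCES - supported
if missing:
    print(f"Gaps likely requiring custom interfaces: {sorted(missing)}")
else:
    print("All required resource types are advertised by the target server.")
```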
Step 11. Long-term user satisfaction assessment
PVK, TT, and CW evaluated long-term user satisfaction through quantitative analysis of 109 System Usability Scale (SUS) surveys returned by the 208 recent innovation end users identified through system logs (52% response rate). The mean SUS score for attending providers was 91.05 (95% CI, 86.31-95.79), which corresponds to "best imaginable" usability.47
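The SUS scoring behind such an analysis follows the standard rule (odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum is multiplied by 2.5). The sketch below applies that rule and summarizes the mean with a 95% confidence interval; the response file and column names are assumptions.

```python
"""Sketch: score 10-item SUS surveys (standard 0-100 scoring) and summarize the
mean with a 95% CI. File and column names (q1..q10, responses 1-5) are assumed."""
import pandas as pd
from scipy import stats

surveys = pd.read_csv("sus_responses.csv")   # assumed columns: q1..q10 with 1-5 responses


def sus_score(row) -> float:
    odd = sum(row[f"q{i}"] - 1 for i in (1, 3, 5, 7, 9))
    even = sum(5 - row[f"q{i}"] for i in (2, 4, 6, 8, 10))
    return 2.5 * (odd + even)


scores = surveys.apply(sus_score, axis=1)
mean = scores.mean()
half_width = stats.t.ppf(0.975, len(scores) - 1) * scores.sem()
print(f"Mean SUS {mean:.2f} (95% CI, {mean - half_width:.2f}-{mean + half_width:.2f})")
```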
Step 12. Social outcomes assessment
Social outcomes assessment involved process, health, and economic outcomes analysis by PVK. Process impact assessment was conducted as a before-and-after retrospective clinical trial from April 1, 2016 to April 30, 2019. With regard to process outcomes, this study showed that orders for clinically appropriate phototherapy during hospitalization increased for newborns with bilirubin levels above the guideline-recommended threshold (odds ratio, 1.84; 95% CI, 1.16-2.90; P = .01).47 In the interviews with 12 users conducted by PVK and HSK, we identified that users were using the app for teaching. No negative unintended process consequences were found. We did not find a significant difference in measured health outcomes including length of stay, intensive care unit admissions, and readmissions.47 No negative unintended health consequences were found. A preliminary economic evaluation was conducted in 2019. If deployed across the entire United States, given over 3.4 million eligible births in 2018, 66 seconds saved per use, and 5 uses per newborn, the innovation could save over 300,000 hours of clinician time annually,47 which at an hourly rate of $50-100 would translate to roughly $15-30 million in savings. We are exploring the conduct of a more formal cost-effectiveness analysis in the future.
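The back-of-the-envelope arithmetic behind this preliminary estimate can be reproduced as follows; the inputs are those stated above, and the $50-100 hourly rate is the stated assumption rather than a measured cost.

```python
"""Sketch: national back-of-the-envelope savings estimate using the inputs
stated in the text (3.4 million eligible births, 5 uses per newborn, 66 seconds
saved per use, assumed $50-100 hourly clinician cost)."""
births_per_year = 3_400_000
uses_per_newborn = 5
seconds_saved_per_use = 66

hours_saved = births_per_year * uses_per_newborn * seconds_saved_per_use / 3600
print(f"Clinician time saved: ~{hours_saved:,.0f} hours per year")   # ~311,667 hours

for hourly_rate in (50, 100):
    millions = hours_saved * hourly_rate / 1e6
    print(f"At ${hourly_rate}/hour: ~${millions:.1f} million per year")
# Output is roughly consistent with the reported 300,000+ hours and $15-30 million.
```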
DISCUSSION
In this paper, we described the ELICIT framework, an evaluation framework that can be used to support all phases in the EHR-integrated innovation life cycle through a comprehensive, replicable process. The ELICIT framework provides potential evaluation questions and methods for each measure level (society, user, and IT). It incorporates constructs and methods from health informatics and three related domains: social sciences, human factors engineering, and software engineering. Leveraging these measures, the ELICIT framework builds on the EPIS implementation framework, with consideration given to outer and inner implementation contexts, bridging factors, intervention characteristics, and intervention adaptations after implementation.7,32,53 As a holistic evaluation framework, ELICIT may enable shared understanding and enhanced coordination among stakeholders – a necessary foundation for good team function. Moreover, it may promote evaluation continuity across the IT life cycle by explicitly linking the findings from the initial steps of gathering requirements (Table 2) to designing and developing the software (Table 3), to clinical implementation (Table 4), and to evaluating implementation, effectiveness, and economic outcomes (Table 5). The ELICIT framework also supports moving back to earlier phases as needed, which is important in the healthcare domain because EHR-integrated innovations and their implementation contexts are constantly co-evolving. Evaluators do not need to complete all 12 steps and should prioritize the ones that matter the most to them based on the detailed questions provided in Tables 2-5.
Table 5.
Operation Phase: Evaluation Steps, Exemplar Questions, and Exemplar Methods
| Step 10. Technical portability assessment: Can the software be deployed across health systems and EHR platforms? | |
|---|---|
| Exemplar Questions | Interoperability: What interoperability standards are used? Are the required interfaces supported across EHR platforms? Are custom interfaces needed? Integration requirements: What is required to integrate the innovation at new health systems and EHR platforms? |
| Exemplar Methods | Assessment of interoperability including portability to multiple EHR vendors.48 Evaluation of resources and time required for external implementations. |
| Step 11. Long-term user satisfaction assessment: Do users still find the system enjoyable? | |
| Exemplar Questions | Usability, user acceptance and satisfaction: Do users find the innovation software acceptable, usable, and enjoyable (perceived enjoyment)? Does the innovation software contribute to app fatigue? Perceived effectiveness: Do providers feel that they are providing better care? Do providers feel more confident that they could provide needed care? |
| Exemplar Methods | Assessment of user satisfaction using SUS questionnaire85 and Net Promoter Score. Assessment of usability and perceived effectiveness through semi-structured interviews. |
| Step 12. Social outcomes assessment: Did the innovation have the desired impact on process, clinical, and financial outcomes? | |
| Exemplar Questions | Process outcomes: Did the innovation improve the completion of desired tasks under real-world conditions? Unintended process consequences: Did the innovation produce unexpected results? Health outcomes: Did the innovation improve targeted health outcomes in the clinical setting? Did the innovation impact patient safety? Did the innovation improve access to care? Unintended health consequences: Did the innovation produce unexpected results? Health equity: Did the innovation produce desired impact across gender and minority groups? Economic outcomes: What did it cost to design, develop, implement, and maintain the innovation? What is the cost of each unit of outcome improvement (e.g., saved life-years)? Did the innovation reduce healthcare cost? What was the return on investment? Dissemination value: How much value can the innovation bring to another organization? How much effort would it take to enable innovation dissemination to other clinical sites? |
| Exemplar Methods | Systematic collection of feedback from clinical champions and other end users regarding their experience and perceived changes to clinical practice including using semi-structured interviews and surveys. Assessment of process and effectiveness outcomes using clinical trial designs such as before-after, interrupted time series, or cluster-randomized. EHR and innovation data analysis including log analysis101 to obtain process outcome data. Systematic collection of feedback from end users about safety concerns related to the innovation and perceived impact on outcomes. Unintended consequences could be identified through interviews. Assessment of innovation costs through stakeholder interviews and/or the logging of relevant costs including personnel time. Cost-effectiveness analysis using financial data from public databases (e.g., CMS Physician Fee Schedule103) or from institutional databases.104 Synthesis of evidence from evaluation steps 1-12 to create the dissemination value estimate. |
CMS – Centers for Medicare & Medicaid Services, EHR – electronic health record, SUS – System Usability Scale.
The ELICIT framework was specifically designed to support the evaluation of EHR-integrated innovations. ELICIT's IT-level measures address issues unique to EHR integration, including interoperability and portability across EHR platforms – a pervasive problem. Similarly, ELICIT's user-level measures address the challenges associated with integrating EHR-integrated innovations into existing EHR user interfaces and clinical workflows. Furthermore, ELICIT's society-level measures take into consideration how EHR-integrated innovations affect organizational culture, business processes, patient outcomes, and finances. These society-level measures are important for catalyzing further investments in, and wide dissemination of, EHR-integrated innovations.
Compared to other health informatics evaluation frameworks, the ELICIT framework adds 1) coverage of the entire IT life cycle, 2) description of measures specific to the society, user, and IT levels, 3) measures specific to EHR-integrated innovations, 4) attention to clinical context perspectives, 5) attention to implementation science, and 6) a checklist of relevant questions and methods that could be considered at each phase. This checklist of exemplar questions and methods makes the ELICIT framework more concrete than existing frameworks and can serve as a starting point for evaluation activities. While some aspects of the framework are similar to existing frameworks, none of the existing frameworks provide a matrix which covers every intersection between the four IT life cycle phases and the three levels of measures (i.e., society, user, and IT). For example, the ELICIT framework includes all the constructs defined in HITREF:18,19 structural quality and quality of information logistics are covered in the IT-level steps; unintended consequences/benefits, effects on quality process, and barriers and facilitators to adoption are covered at the society level. Moreover, the ELICIT framework incorporates elements from human factors engineering, software engineering, and implementation science which are not included in HITREF. In addition, we believe the ELICIT framework is easier to follow because evaluation steps are assigned to discrete IT life cycle phases.
To our knowledge, this is the first evaluation framework for EHR-integrated innovations that focuses on appropriate evaluation processes throughout the full IT life cycle and that specifically includes the implementation phase, a much-neglected aspect of information technology projects.72,107 Many evaluation frameworks focus on the evaluation of only one part of the IT life cycle, such as those grounded in human factors engineering.22,23 Even evaluation frameworks that take full-cycle approaches do not integrate all the different steps, especially the last few steps of clinical implementation.5,26-29 Yet, effective implementation strategies are likely to be as important for overall innovation success as design quality or clinical value. Thus, although existing innovation evaluation frameworks have commonalities with the ELICIT framework, such as a focus on accuracy, usability, safety, and effectiveness,5,26-29 the ELICIT framework adds to the existing research by explicitly focusing on clinical implementation strategies and outcomes.
Researchers in the field of implementation science also speak to the need to engage stakeholders, organize resources, and address user needs.7,31,32,53,65,66 However, they tend to underemphasize the IT aspects, EHR workflows, communication channels, and user mental models that are required for gathering functionality requirements, designing system components, and designing effective implementation strategies.34 Thus, the ELICIT framework complements existing implementation science research through its focus on software engineering, user requirements, human factors engineering, and value assessment.
Implications
Subject to further empirical testing, the ELICIT framework could be applied to evaluation facilitation, coordination, and budgeting. First and foremost, we believe that the ELICIT framework can facilitate effective, holistic, and scalable evaluations of EHR-integrated innovations, thereby increasing their collective capacity to improve patient care, the provider experience, and healthcare value. Second, the ELICIT framework can facilitate planning and coordination for the evaluation of EHR-integrated innovations by identifying evaluation needs across the IT life cycle. Third, we believe the ELICIT framework can help justify investment in evaluation efforts by explicitly mapping out the required evaluation steps and by supporting successful dissemination. Therefore, we hope that this framework will help innovative healthcare organizations and health IT companies establish evaluation programs, select and pursue research questions and methods, and report the results of their development and evaluation efforts.
We believe that this framework could potentially be generalized to all health IT innovations, including EHRs, consumer-facing mobile apps, and artificial intelligence, and we encourage others to adapt and test our framework in different settings. We plan to continue evaluating and validating this framework. If the ELICIT framework is successfully validated, it would suggest that our methodology for framework development, which combines literature review, expert consensus, and practical experience, might also be generalizable.
Limitations
A limitation of this study is that the ELICIT framework was developed predominantly at one institution and has not been tested by external investigators. However, the framework was developed based on our experiences with a wide range of projects, including operationally funded and grant-funded projects in a wide range of clinical domains and settings.41-48,50 We also collaborated with investigators from many other organizations and are familiar with their evaluation needs and approaches. As a second limitation, some aspects of our evaluation framework may be difficult to implement in settings with less access to relevant experts. Finally, the framework remains a work in progress; further enhancements may arise from our own experience and that of others.
CONCLUSION
While EHR-integrated innovations show great promise, inadequate evaluations may result in poor usability, adoption, safety, or impact. A 12-step, multi-level, multi-phase evaluation framework for EHR-integrated innovations was iteratively developed to integrate insights from relevant domains and to help bring impactful innovations to market in an efficient and safe manner. This comprehensive and integrated framework covers the entire IT life cycle across society-, user-, and IT-level measures, whereas previous frameworks focused only on specific aspects. We hope that this work will support evaluations undertaken by other digital health innovation teams while fostering discussion and further refinement of the framework.
Box 1. Definitions.
Adaptation is “a process of thoughtful and deliberate alteration to the design or delivery of an intervention, with the goal of improving its fit or effectiveness in a given context.”30
Context is the set of circumstances that form the setting for the innovation and enable it to be fully understood and assessed.31 Innovation context includes individual factors (e.g., individuals’ knowledge, beliefs, values, goals), organizational factors (e.g., organizational culture, climate and priorities), external factors (e.g., vendors, government regulations, competition, external funding, patient advocacy organizations, public opinion), and technical factors (e.g., system constraints and capabilities, implementation of interoperability standards in the organization).
Implementation is the rollout of innovations into clinical practice. Of note, this definition differs from how the term is used in software engineering, where it often also encompasses the software coding process.
Information technology (IT) is technology used to acquire, store, deliver, and/or analyze data.
An end user is an individual intended to use the software. For provider-facing innovations, end users may include physicians, pharmacists, nurses, advanced care providers, and medical assistants. For patient-facing innovations, end users are patients and caregivers.
Evidence-based practice (EBP) is an evidence-based intervention, treatment, innovation, recommendation, or guideline.32
Formative evaluation is the evaluation of an evolving innovation to give direction for the design, development, and implementation of innovation components.33
Implementation strategies are the actions taken to enhance the adoption, clinical rollout, and/or sustainability of innovations.34
Stakeholders are the individuals and groups affected by the innovation, including software developers, evaluators, end users, and organizational leaders.
Summative evaluation is describing the properties and/or impacts of the innovation.33
HIGHLIGHTS.
It is critical for EHR-integrated innovations to be systematically evaluated
Effective evaluation of EHR-integrated innovations requires a shared understanding and collaboration across disciplines throughout the full information technology (IT) life cycle
A novel Evaluation in Life Cycle of Information Technology (ELICIT) framework focuses on all phases of the IT life cycle
The ELICIT framework consists of 12 evaluation steps across the IT life cycle phases including society-, user-, and IT-level measures
ACKNOWLEDGMENTS
We would like to thank the ReImagine EHR software development team, and especially Dr. Salvador Rodriguez-Loya, for their contribution to the development of EHR-integrated innovations. We would also like to thank the UU Print and Mail Services team for their contribution to the evaluation framework visualization.
FUNDING STATEMENT
The work reported in this paper was supported in part by the University of Utah and by the Agency for Healthcare Research and Quality under Award Number R18HS026198. TJR and KLM were supported by the U.S. National Library of Medicine of the National Institutes of Health through grant T15LM007124.
The funding organizations had no role in the conceptualization, design, data collection, analysis, decision to publish, or paper preparation for this study. The content is solely the responsibility of the authors and does not necessarily represent the official views of the organizations involved.
COMPETING INTERESTS STATEMENT
KK reports honoraria, consulting, sponsored research, licensing, or co-development with McKesson InterQual, Hitachi, Pfizer, the Korean Society of Medical Informatics, NORC at the University of Chicago, Premier, Klesis Healthcare, RTI International, Mayo Clinic, Vanderbilt University, the University of Washington, Indiana University, the University of California at San Francisco, MD Aware, and the U.S. Office of the National Coordinator for Health IT (via ESAC, JBS International, A+ Government Solutions, Hausam Consulting, and Security Risk Solutions) in the area of health information technology. KK was also an unpaid board member of the non-profit Health Level Seven International health IT standard development organization, and he is an unpaid member of the U.S. Health Information Technology Advisory Committee. CJS reports sponsored research with Hitachi, Ltd. Other co-authors have no conflicts to report.
Contributor Information
Polina V. Kukhareva, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
Charlene Weir, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
Guilherme Del Fiol, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
Gregory A. Aarons, Department of Psychiatry, UC San Diego ACTRI Dissemination and Implementation Science Center, UC San Diego, La Jolla, CA, USA.
Teresa Y. Taft, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
Chelsey R. Schlechter, Department of Population Health Sciences, Center for Health Outcomes and Population Equity, Huntsman Cancer Institute, University of Utah, Salt Lake City, UT, USA.
Thomas J. Reese, Department of Biomedical Informatics, Vanderbilt University, Nashville, TN, USA.
Rebecca L. Curran, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
Claude Nanjo, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
Damian Borbolla, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
Catherine J Staes, College of Nursing, University of Utah, Salt Lake City, UT, USA.
Keaton L. Morgan, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
Heidi S. Kramer, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
Carole H. Stipelman, Department of Pediatrics, University of Utah, Salt Lake City, UT, USA.
Julie H. Shakib, Department of Pediatrics, University of Utah, Salt Lake City, UT, USA.
Michael C. Flynn, Department of Family & Preventive Medicine, University of Utah, Salt Lake City, UT, USA.
Kensaku Kawamoto, Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, USA.
REFERENCES
- 1. Mandl KD, Mandel JC, Kohane IS. Driving innovation in health systems through an apps-based information economy. Cell Syst. 2015;1(1):8–13. doi: 10.1016/j.cels.2015.05.001
- 2. Schulte F, Fry E. Death by a thousand clicks: where electronic health records went wrong. Kaiser Health News. https://khn.org/news/death-by-a-thousand-clicks/. Published 2019. Accessed September 23, 2020.
- 3. Wong A, Otles E, Donnelly JP, et al. External Validation of a Widely Implemented Proprietary Sepsis Prediction Model in Hospitalized Patients. JAMA Intern Med. Published online June 21, 2021. doi: 10.1001/JAMAINTERNMED.2021.2626
- 4. Koppel R, Metlay JP, Cohen A, et al. Role of computerized physician order entry systems in facilitating medication errors. J Am Med Assoc. 2005;293(10):1197–1203. doi: 10.1001/jama.293.10.1197
- 5. Stead WW, Haynes RB, Fuller S, et al. Designing medical informatics research and library-resource projects to increase what is learned. J Am Med Informatics Assoc. 1994;1(1):28–33. doi: 10.1136/jamia.1994.95236134
- 6. Ammenwerth E, Brender J, Nykänen P, Prokosch HU, Rigby M, Talmon J. Visions and strategies to improve evaluation of health information systems: Reflections and lessons based on the HIS-EVAL workshop in Innsbruck. Int J Med Inform. 2004;73(6):479–491. doi: 10.1016/J.IJMEDINF.2004.04.004
- 7. Aarons GA, Hurlburt M, Horwitz SMC. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Heal Ment Heal Serv Res. 2011;38(1):4–23. doi: 10.1007/s10488-010-0327-7
- 8. Kawamoto K, McDonald CJ. Designing, Conducting, and Reporting Clinical Decision Support Studies: Recommendations and Call to Action. Ann Intern Med. 2020;172(11):101–109. doi: 10.7326/M19-0875
- 9. Murray E, Hekler EB, Andersson G, et al. Evaluating Digital Health Interventions: Key Questions and Approaches. Am J Prev Med. 2016;51(5):843–851. doi: 10.1016/j.amepre.2016.06.008
- 10. Sloman S, Fernbach P. The Knowledge Illusion: Why We Never Think Alone. Riverhead Books; 2017.
- 11. Yusof MM, Kuljis J, Papazafeiropoulou A, Stergioulas LK. An evaluation framework for Health Information Systems: human, organization and technology-fit factors (HOT-fit). Int J Med Inform. 2008;77(6):386–398. doi: 10.1016/j.ijmedinf.2007.08.011
- 12. Currie LM. Evaluation frameworks for nursing informatics. Int J Med Inform. 2005;74(11-12):908–916. doi: 10.1016/j.ijmedinf.2005.07.007
- 13. Godinho MA, Ansari S, Guo GN, Liaw S-T. Toolkits for implementing and evaluating digital health: A systematic review of rigor and reporting. J Am Med Informatics Assoc. Published online February 23, 2021. doi: 10.1093/jamia/ocab010
- 14. Vis C, Bührmann L, Riper H, Ossebaard HC. Health technology assessment frameworks for eHealth: A systematic review. Int J Technol Assess Health Care. 2020;36(3):204–216. doi: 10.1017/S026646232000015X
- 15. Eslami Andargoli A, Scheepers H, Rajendran D, Sohal A. Health information systems evaluation frameworks: A systematic review. Int J Med Inform. 2017;97:195–209. doi: 10.1016/j.ijmedinf.2016.10.008
- 16. Yusof MM, Papazafeiropoulou A, Paul RJ, Stergioulas LK. Investigating evaluation frameworks for health information systems. Int J Med Inform. 2008;77(6):377–385. doi: 10.1016/J.IJMEDINF.2007.08.004
- 17. Neame MT, Sefton G, Roberts M, Harkness D, Sinha IP, Hawcutt DB. Evaluating health information technologies: A systematic review of framework recommendations. Int J Med Inform. 2020;142:104247.
- 18. Sockolow PS, Bowles KH, Rogers M. Health Information Technology Evaluation Framework (HITREF) Comprehensiveness as Assessed in Electronic Point-of-Care Documentation Systems Evaluations. In: Studies in Health Technology and Informatics. Vol 216. IOS Press; 2015:406–409. doi: 10.3233/978-1-61499-564-7-406
- 19. Sockolow PS, Crawford PR, Lehmann HP. Health services research evaluation principles: Broadening a general framework for evaluating health information technology. Methods Inf Med. 2012;51(2):122–130. doi: 10.3414/ME10-01-0066
- 20. Kidholm K, Ekeland AG, Jensen LK, et al. A model for assessment of telemedicine applications: Mast. Int J Technol Assess Health Care. 2012;28(1):44–51. doi: 10.1017/S0266462311000638
- 21. Nykänen P, Brender J, Talmon J, et al. Guideline for good evaluation practice in health informatics (GEP-HI). Int J Med Inform. 2011;80(12):815–827. doi: 10.1016/j.ijmedinf.2011.08.004
- 22. Zhang J, Walji MF. TURF: Toward a unified framework of EHR usability. J Biomed Inform. 2011;44(6):1056–1067. doi: 10.1016/j.jbi.2011.08.005
- 23. Johnson CM, Johnson TR, Zhang J. A user-centered framework for redesigning health care interfaces. J Biomed Inform. 2005;38(1):75–87. doi: 10.1016/j.jbi.2004.11.005
- 24. Wright A, Sittig DF. A framework and model for evaluating clinical decision support architectures. J Biomed Inform. 2008;41(6):982–990. doi: 10.1016/j.jbi.2008.03.009
- 25. Price M, Lau F. The clinical adoption meta-model: A temporal meta-model describing the clinical adoption of health information systems. BMC Med Inform Decis Mak. 2014;14(1):1–10. doi: 10.1186/1472-6947-14-43
- 26. Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. IT - Inf Technol. 2019;61(5-6):253–263. doi: 10.1515/itit-2019-0019
- 27. Larson DB, Harvey H, Rubin DL, Irani N, Tse JR, Langlotz CP. Regulatory Frameworks for Development and Evaluation of Artificial Intelligence–Based Diagnostic Imaging Algorithms: Summary and Recommendations. J Am Coll Radiol. 2021;18(3):413–424. doi: 10.1016/j.jacr.2020.09.060
- 28. Park Y, Jackson GP, Foreman MA, Gruen D, Hu J, Das AK. Evaluating Artificial Intelligence in Medicine: Phases of Clinical Research. Vol 3. Oxford University Press; 2020:326–331.
- 29. Mathews SC, McShea MJ, Hanley CL, Ravitz A, Labrique AB, Cohen AB. Digital health: a path to validation. npj Digit Med. 2019;2(1):1–9. doi: 10.1038/s41746-019-0111-3
- 30. Stirman SW, Baumann AA, Miller CJ. The FRAME: An expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14(1):58. doi: 10.1186/s13012-019-0898-y
- 31. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50. doi: 10.1186/1748-5908-4-50
- 32. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14(1). doi: 10.1186/s13012-018-0842-6
- 33. Brender J. Handbook of Evaluation Methods for Health Informatics. Elsevier; 2006.
- 34. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: Recommendations for specifying and reporting. Implement Sci. 2013;8(1):139. doi: 10.1186/1748-5908-8-139
- 35. ONC’s Cures Act Final Rule. Published 2020. Accessed September 23, 2020. https://www.healthit.gov/curesrule/
- 36. Mandl KD, Gottlieb D, Ellis A. Beyond One-Off Integrations: A Commercial, Substitutable, Reusable, Standards-Based, Electronic Health Record–Connected App. J Med Internet Res. 2019;21(2):e12902. doi: 10.2196/12902
- 37. Fast Healthcare Interoperability Resources (FHIR) standard. Health Level Seven International (HL7). Accessed September 23, 2019. https://www.hl7.org/fhir/
- 38. Mandel JC, Kreda DA, Mandl KD, Kohane IS, Ramoni RB. SMART on FHIR: A standards-based, interoperable apps platform for electronic health records. J Am Med Informatics Assoc. 2016;23(5):1–10. doi: 10.1093/jamia/ocv189
- 39. ReImagine EHR Initiative Products. University of Utah. Published 2020. Accessed March 17, 2020. reimagineehr.utah.edu
- 40. Kawamoto K, Kukhareva PV, Weir CR, et al. Establishing a multidisciplinary initiative for interoperable electronic health record innovations at an academic medical center. JAMIA Open. 2021;4(3):1–15. doi: 10.1093/JAMIAOPEN/OOAB041
- 41. Edholm K, Kukhareva PV, Ciarkowski C, et al. Decrease in inpatient telemetry utilization through a system-wide electronic health record change and a multifaceted hospitalist intervention. J Hosp Med. 2018;13(8):531–536. doi: 10.12788/jhm.2933
- 42. Horton DJ, Graves KK, Kukhareva PV, et al. Modified early warning score-based clinical decision support: cost impact and clinical outcomes in sepsis. JAMIA Open. Published online 2020. doi: 10.1093/jamiaopen/ooaa014
- 43. Yarbrough PM, Kukhareva PV, Spivak ES, et al. Evidence-based care pathway for cellulitis improves process, clinical, and cost outcomes. J Hosp Med. 2015;10(12):780–786. doi: 10.1002/jhm.2433
- 44. Yarbrough PM, Kukhareva PV, Horton D, Edholm K, Kawamoto K. Multifaceted intervention including education, rounding checklist implementation, cost feedback, and financial incentives reduces inpatient laboratory costs. J Hosp Med. 2016;11(5):348–354. doi: 10.1002/jhm.2552
- 45. Edholm K, Lappé K, Kukhareva PV, et al. Reducing Diabetic Ketoacidosis Intensive Care Unit Admissions Through an Electronic Health Record-Driven, Standardized Care Pathway. J Healthc Qual. Published online January 9, 2020:1. doi: 10.1097/jhq.0000000000000247
- 46. Ciarkowski CE, Timbrook TT, Kukhareva PV, et al. A Pathway for Community-Acquired Pneumonia with Rapid Conversion to Oral Therapy Improves Healthcare Value. Open Forum Infect Dis. Published online October 19, 2020. doi: 10.1093/ofid/ofaa497
- 47. Kawamoto K, Kukhareva PV, Shakib JH, et al. Association of an electronic health record add-on app for neonatal bilirubin management with physician efficiency and care quality. JAMA Netw Open. 2019;2(11):e1915343. doi: 10.1001/jamanetworkopen.2019.15343
- 48. Kukhareva PV, Warner P, Rodriguez S, et al. Balancing Functionality versus Portability for SMART on FHIR Applications: Case Study for a Neonatal Bilirubin Management Application. AMIA Annu Symp Proc. Published online January 1, 2019:562–571.
- 49. Tarumi S, Takeuchi W, Chalkidis G, et al. Leveraging Artificial Intelligence to Improve Chronic Disease Care: Methods and Application to Pharmacotherapy Decision Support for Type-2 Diabetes Mellitus. Methods Inf Med. Published online May 11, 2021. doi: 10.1055/s-0041-1728757
- 50. Curran RL, Kukhareva PV, Taft T, et al. Integrated displays to improve chronic disease management in ambulatory care: A SMART on FHIR application informed by mixed-methods user testing. J Am Med Informatics Assoc. 2020;27(8):1225–1234. doi: 10.1093/jamia/ocaa099
- 51. Reese T, Schlechter C, Kramer H, et al. Implementing lung cancer screening in primary care: needs assessment and implementation strategy design. Transl Behav Med. Published online August 23, 2021. doi: 10.1093/TBM/IBAB115
- 52. Abedin Z, Hoerner R, Habboushe J, et al. Implementation of a Fast Healthcare Interoperability Resources-Based Clinical Decision Support Tool for Calculating CHA2DS2-VASc Scores. Circ Cardiovasc Qual Outcomes. 2020;13(2):e006286. doi: 10.1161/CIRCOUTCOMES.119.006286
- 53. EPIS Framework. Accessed March 19, 2021. https://episframework.com/
- 54. Christopoulou S, Kotsilieris T, Anagnostopoulos I. Assessment of Health Information Technology Interventions in Evidence-Based Medicine: A Systematic Review by Adopting a Methodological Evaluation Framework. Healthcare. 2018;6(3):109. doi: 10.3390/healthcare6030109
- 55. Luce BR, Drummond M, Jönsson B, et al. EBM, HTA, and CER: Clearing the confusion. Milbank Q. 2010;88(2):256–276. doi: 10.1111/j.1468-0009.2010.00598.x
- 56. Drummond MF, Sculpher MJ, Claxton K, Stoddart GL, Torrance GW. Methods for the Economic Evaluation of Health Care Programmes. 4th ed. Oxford University Press; 2015.
- 57. Kushniruk A, Beuscart-Zéphir MC, Grzes A, Borycki E, Watbled L, Kannry J. Increasing the safety of healthcare information systems through improved procurement: toward a framework for selection of safe healthcare systems. Healthc Q. 2010;13 Spec No:53–58. doi: 10.12927/hcq.2010.21967
- 58. Coiera E, Westbrook JI, Wyatt JC. The safety and quality of decision support systems. Yearb Med Inform. Published online 2006:20–25.
- 59. Magrabi F, Ammenwerth E, Hyppönen H, et al. Improving Evaluation to Address the Unintended Consequences of Health Information Technology: a Position Paper from the Working Group on Technology Assessment & Quality Development. Yearb Med Inform. 2016;(1):61–69. doi: 10.15265/iy-2016-013
- 60. Begg C, Cho M, Eastwood S, et al. Improving the quality of reporting of randomized controlled trials: The CONSORT statement. J Am Med Assoc. 1996;276(8):637–639. doi: 10.1001/jama.276.8.637
- 61. Ogrinc G, Davies L, Goodman D, Batalden P, Davidoff F, Stevens D. Standards for quality improvement reporting excellence (SQUIRE 2.0) publication guidelines. BMJ Qual Saf. 2016;25:986–992. doi: 10.1136/bmjqs-2015-004411
- 62. Talmon J, Ammenwerth E, Brender J, de Keizer N, Nykänen P, Rigby M. STARE-HI-Statement on reporting of evaluation studies in Health Informatics. Int J Med Inform. 2009;78(1):1–9. doi: 10.1016/j.ijmedinf.2008.09.002
- 63. Husereau D, Drummond M, Petrou S, et al. Consolidated health economic evaluation reporting standards (CHEERS)-explanation and elaboration: A report of the ISPOR health economic evaluation publication guidelines good reporting practices task force. Value Heal. 2013;16(2):231–250. doi: 10.1016/j.jval.2013.02.002
- 64. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3(1):32. doi: 10.1186/S40359-015-0089-9
- 65. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–1327. doi: 10.2105/ajph.89.9.1322
- 66. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Heal Ment Heal Serv Res. 2011;38(2):65–76. doi: 10.1007/s10488-010-0319-7
- 67. Greenhalgh T, Wherton J, Papoutsi C, et al. Beyond adoption: A new framework for theorizing and evaluating nonadoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. J Med Internet Res. 2017;19(11). doi: 10.2196/jmir.8775
- 68. Pinnock H, Barwick M, Carpenter CR, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ. 2017;356. doi: 10.1136/bmj.i6795
- 69. Schloemer T, Schröder-Bäck P. Criteria for evaluating transferability of health interventions: A systematic review and thematic synthesis. Implement Sci. 2018;13(1). doi: 10.1186/s13012-018-0751-8
- 70. Camacho J, Zanoletti-Mannello M, Landis-Lewis Z, et al. A Conceptual Framework to Study the Implementation of Clinical Decision Support Systems (BEAR): Literature Review and Concept Mapping. Vol 22. JMIR Publications; 2020:e18388. doi: 10.2196/18388
- 71. Trinkley KE, Kahn MG, Bennett TD, et al. Integrating the Practical Robust Implementation and Sustainability Model With Best Practices in Clinical Decision Support Design: Implementation Science Approach. J Med Internet Res. 2020;22(10):e19676. doi: 10.2196/19676
- 72. Haynes RB, del Fiol G, Michelson M, Iorio A. Context and Approach in Reporting Evaluations of Electronic Health Record-Based Implementation Projects. Ann Intern Med. 2020;172(11_Supplement):S73–S78. doi: 10.7326/M19-0874
- 73. Tougas ME, Hayden JA, McGrath PJ, Huguet A, Rozario S. A systematic review exploring the social cognitive theory of self-regulation as a framework for chronic health condition interventions. PLoS One. 2015;10(8). doi: 10.1371/journal.pone.0134977
- 74. Endsley MR, Garland DJ, eds. Situation Awareness Analysis and Measurement. Lawrence Erlbaum Associates, Inc; 2000.
- 75. Venkatesh V, Morris MG, Davis GB, Davis FD. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003;27(3):425–478. https://www.jstor.org/stable/30036540
- 76. Patel VL, Arocha JF, Diermeier M, Greenes RA, Shortliffe EH. Methods of cognitive analysis to support the design and evaluation of biomedical systems: The case of clinical practice guidelines. J Biomed Inform. 2001;34(1):52–66. doi: 10.1006/jbin.2001.1002
- 77. Kaplan B. Evaluating informatics applications - Some alternative approaches: Theory, social interactionism, and call for methodological pluralism. Int J Med Inform. 2001;64(1):39–56. doi: 10.1016/S1386-5056(01)00184-8
- 78. Unertl KM, Abraham J, Bakken S. Building on Diana Forsythe’s legacy: the value of human experience and context in biomedical and health informatics. J Am Med Informatics Assoc. 2021;28(2):197–208. doi: 10.1093/jamia/ocaa337
- 79. Westbrook JI, Braithwaite J, Iedema R, Coiera EW. Evaluating the impact of information communication technologies on complex organizational systems: a multi-disciplinary, multi-method framework. Medinfo. 2004;11(Pt 2):1323–1327. doi: 10.3233/978-1-60750-949-3-1323
- 80. Holtzblatt K, Wendell J, Wood S. Rapid Contextual Design. 1st ed. Morgan Kaufmann; 2004.
- 81. Crandall B, Klein GA, Hoffman RR. Working Minds: A Practitioner’s Guide to Cognitive Task Analysis. 1st ed. MIT Press; 2006.
- 82. Kushniruk AW, Patel VL. Cognitive and Usability Engineering Methods for the Evaluation of Clinical Information Systems. Vol 37. J Biomed Inform; 2004:56–76. doi: 10.1016/j.jbi.2004.01.003
- 83. Zhang J, Norman DA. Representations in distributed cognitive tasks. Cogn Sci. 1994;18(1):87–122. doi: 10.1016/0364-0213(94)90021-3
- 84. Nielsen J. 10 Heuristics for User Interface Design. Nielsen Norman Group. Published 1994. Accessed February 12, 2020. https://www.nngroup.com/articles/ten-usability-heuristics/
- 85. Brooke J. A “quick and dirty” usability scale. In: Jordan PW, Thomas B, Weerdmeester BA, McClelland IL, eds. Usability Evaluation in Industry. 1st ed. Taylor & Francis; 1996:189–195. https://cui.unige.ch/isi/icle-wiki/_media/ipm:test-suschapt.pdf
- 86. McCoy AB, Thomas EJ, Krousel-Wood M, Sittig DF. Clinical decision support alert appropriateness: a review and proposal for improvement. Ochsner J. 2014;14(2):195–202.
- 87. Fox J, Thomson R. Clinical decision support systems: a discussion of quality, safety and legal liability issues. Proc AMIA Symp. Published online 2002:265–269. Accessed October 8, 2020. https://pubmed.ncbi.nlm.nih.gov/12463828/
- 88. Ranade-Kharkar P, Narus SP, Anderson GL, Conway T, Del Fiol G. Data standards for interoperability of care team information to support care coordination of complex pediatric patients. J Biomed Inform. 2018;85:1–9. doi: 10.1016/j.jbi.2018.07.009
- 89. Soares A, Schilling LM. Is my SMART on FHIR app ready for prime time? A review guideline for building and evaluating apps from proof of concept to production. In: AMIA Informatics Summit; 2020:846–847.
- 90. Ruparelia NB. Software development lifecycle models. ACM SIGSOFT Softw Eng Notes. 2010;35(3):8–13. doi: 10.1145/1764810.1764814
- 91. DeLone WH, McLean ER. The DeLone and McLean model of information systems success: A ten-year update. In: Journal of Management Information Systems. Vol 19. M.E. Sharpe Inc.; 2003:9–30. doi: 10.1080/07421222.2003.11045748
- 92. American Academy of Pediatrics Subcommittee on Hyperbilirubinemia. Management of hyperbilirubinemia in the newborn infant 35 or more weeks of gestation. Pediatrics. 2004;114(4):1138. doi: 10.1542/PEDS.114.1.297
- 93. Ammenwerth E, Gräber S, Herrmann G, Bürkle T, König J. Evaluation of health information systems - Problems and challenges. In: International Journal of Medical Informatics. Vol 71. Elsevier Ireland Ltd; 2003:125–135. doi: 10.1016/S1386-5056(03)00131-X
- 94. Lengnick-Hall R, Stadnick NA, Dickson KS, Moullin JC, Aarons GA. Forms and functions of bridging factors: specifying the dynamic links between outer and inner contexts during implementation and sustainment. Implement Sci. 2021;16(1):34. doi: 10.1186/s13012-021-01099-y
- 95. Bates DW, Kuperman GJ, Wang S, et al. Ten Commandments for Effective Clinical Decision Support: Making the Practice of Evidence-based Medicine a Reality. J Am Med Informatics Assoc. 2003;10(6):523–530. doi: 10.1197/jamia.m1370
- 96. Jaspers MWM, Steen T, van den Bos C, Geenen M. The think aloud method: A guide to user interface design. Int J Med Inform. 2004;73(11-12):781–795. doi: 10.1016/j.ijmedinf.2004.08.003
- 97. Taft T, Staes C, Slager S, Weir C. Adapting Nielsen’s Design Heuristics to Dual Processing for Clinical Decision Support. AMIA Annu Symp Proc. 2016;2016:1179–1188.
- 98. Hart SG, Staveland LE. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. Adv Psychol. 1988;52(C):139–183. doi: 10.1016/S0166-4115(08)62386-9
- 99. Von Thiele Schwarz U, Aarons GA, Hasson H. The Value Equation: Three complementary propositions for reconciling fidelity and adaptation in evidence-based practice implementation. doi: 10.1186/s12913-019-4668-y
- 100. Saldana L, Chamberlain P, Bradford WD, Campbell M, Landsverk J. The cost of implementing new strategies (COINS): A method for mapping implementation resources using the stages of implementation completion. Child Youth Serv Rev. 2014;39:177–182. doi: 10.1016/j.childyouth.2013.10.006
- 101. Melnick ER, Sinsky CA, Krumholz HM. Implementing Measurement Science for Electronic Health Record Use. JAMA - J Am Med Assoc. Published online 2021. doi: 10.1001/jama.2021.5487
- 102. Wright A, Ai A, Ash J, et al. Clinical decision support alert malfunctions: analysis and empirically derived taxonomy. J Am Med Informatics Assoc. 2018;25(5):496–506. doi: 10.1093/jamia/ocx106
- 103. CMS Physician Fee Schedule. Accessed February 26, 2021. https://www.cms.gov/medicare/physician-fee-schedule/search?
- 104. Kawamoto K, Martin CJ, Williams K, et al. Value Driven Outcomes (VDO): a pragmatic, modular, and extensible software framework for understanding and improving health care costs and outcomes. J Am Med Informatics Assoc. 2014;22(1):223–235. doi: 10.1136/amiajnl-2013-002511
- 105. SMART App Gallery: Intermountain Healthcare Bilirubin Chart. Accessed April 22, 2021. https://apps.smarthealthit.org/app/bilirubin-chart
- 106. Chang PW, Kuzniewicz MW, McCulloch CE, Newman TB. A clinical prediction rule for rebound hyperbilirubinemia following inpatient phototherapy. Pediatrics. 2017;139(3):e20162896. doi: 10.1542/peds.2016-2896
- 107. Jones SS, Rudin RS, Perry T, Shekelle PG. Health information technology: An updated systematic review with a focus on meaningful use. Ann Intern Med. 2014;160(1):48–54. doi: 10.7326/m13-1531






