Abstract
Objectives
To develop indices of US hospital interoperability to capture the current state and assess progress over time.
Materials and Methods
A Technical Expert Panel (TEP) informed selection of items from the American Hospital Association Health IT Supplement survey, which were aggregated into interoperability concepts (components) and then further combined into indices. Indices were refined through psychometric analysis and additional TEP input. Final indices included a “Core Index” measuring adoption of foundational interoperability capabilities, a “Pathfinder Index” representing adoption of advanced interoperability technologies and auxiliary exchange activities, and a “Friction Index” quantifying barriers. The first 2 indices were scored from 0 (no interoperability) to 100 (full interoperability); the Friction Index was scored 0 (no friction) to 100 (maximum friction). We calculated indices annually from 2021 to 2023, stratifying by hospital characteristics.
Results
Items within components created reliable and meaningful measures, and associations between components within indices followed the TEP’s expectations. Weighted mean scores for the Core (2023), Pathfinder (2022), and Friction (2023) Indices were 61, 57, and 30, respectively. Hospitals with 500+ beds (large), not designated as critical access, in metropolitan areas, and using market-leading electronic health records had statistically significantly higher mean scores on all indices. Index values also improved modestly over time.
Discussion
Hospitals performed best on the Core Index. Given recent policy and programmatic initiatives, we anticipate continued improvement across all indices.
Conclusion
Ongoing index tracking can inform policy impact evaluations and highlight persistent interoperability disparities across hospitals.
Keywords: hospital interoperability, data exchange, public health, application programming interface, health information exchange
Background and significance
For over a decade, US health information technology (health IT) policy has been focused on various ways to support interoperability, defined in the 21st Century Cures Act of 2016 as technology that “enables the secure exchange of electronic health information” without special effort, “allows for complete access, exchange, and use of all electronically accessible health information for authorized use,” and “does not constitute information blocking.”1 The Cures Act pushed interoperability forward by prohibiting practices likely to interfere with the access, exchange, and use of electronic health information, creating requirements for developers and providers to support easy patient access to their health information, and authorizing the creation of a Trusted Exchange Framework and Common Agreement (TEFCA) to enable nationwide exchange of health information. In tandem, the Centers for Medicare & Medicaid Services (CMS) adjusted the Medicare and Medicaid EHR Incentive Programs under the 2015 Medicare Access and CHIP Reauthorization Act (MACRA) to further prioritize interoperability, and, in 2018, renamed its initiatives the Medicare and Medicaid Promoting Interoperability (PI) Programs.2,3 The Office of the Assistant Secretary for Technology Policy/Office of the National Coordinator for Health Information Technology (ASTP)’s 2020 Cures Act Final Rule subsequently implemented many of the provisions of the Cures Act by establishing new interoperability requirements and updated electronic health record (EHR) certification criteria for health IT developers to support data exchange.4
Despite this focus on interoperability, there are few widely used comprehensive measures available to assess interoperability progress by healthcare delivery organizations. Measuring the state of hospital interoperability is necessary to evaluate the effectiveness of existing policies and identify needs for new or revised policy. Hospitals’ progress in enabling interoperability has been examined by the Assistant Secretary for Technology Policy/Office of the National Coordinator for Health Information Technology (hereafter, ASTP) and in the literature based on engagement with 4 major exchange activities: finding, sending, receiving, and integrating health information.5–7 However, this assessment does not capture the full breadth of high-value interoperability capabilities, does not address all components of the Cures Act definition of interoperability, and may not reflect areas targeted by recent policy efforts. For example, exchange of data representing social determinants of health (SDOHs) and support for electronic public health reporting were only recently accelerated during the COVID-19 pandemic.8–10 Furthermore, existing measures do not reflect the continued concern that, while hospital capabilities may exist to enable exchange, beneficial exchange without “special effort” remains uncommon.11,12
Comprehensive indices, like those widely used to track basic and comprehensive EHR adoption over the past decade and those used in other sectors (eg, the Consumer Confidence Index, the Dow Jones Industrial Average), serve an important role in high-level assessments of the current state and trajectory of progress.13–15 In the context of hospital interoperability, which is complex and includes many components, headline index numbers provide a straightforward way to describe the state of interoperability, and tracking indices over time allows for a more granular understanding of progress (or lack thereof). Tracking indices for hospitals of varying sizes, resources, and affiliations further supports examination of whether progress is occurring equitably across organizations.
Objective
We therefore sought to develop a set of indices, analogous to the consumer price index, stock market indices, and other measures of value over time, that can accurately and simply communicate a holistic sense of hospital interoperability. We focused on hospitals because they provide critical services, they are the largest facilities in health care, they often serve as the anchor for large health systems, and data on hospital interoperability are readily available from existing national surveys. Our goal was to develop a single index or small number of indices that would meet the following criteria: encompass a breadth of dimensions of interoperability in a logical and hierarchical structure; capture incremental progress in nationwide hospital interoperability through a continuous scale; be easily updated to reflect new technologies, interoperability needs, and policy priorities; and support assessment of the effects of policies, including disparities among hospitals. This work also serves as a model for the development of similar interoperability indices among other types of healthcare delivery organizations (eg, physician offices, behavioral health providers) as well as other relevant entities in the health IT ecosystem (eg, digital health companies).
Materials and methods
Selection of data source, index framework development, and technical expert feedback
Given our objective to develop and implement indices using existing data, the study team assessed sources that captured a national sample of hospitals and broad dimensions of interoperability and that were collected at least annually. We selected the American Hospital Association (AHA) Health IT Supplement survey, which is distributed annually to the CEOs of all US hospitals, regardless of AHA membership, and contains questions developed with input from ASTP and intended to measure the adoption and use of health IT in US hospitals. The respondent is asked to complete the survey, online or via mail, or to delegate completion to the most knowledgeable person in the organization. The survey regularly receives a high response rate, including 54% in 2022 and 50% in 2023. Study team members reviewed the 2014-2023 AHA Health IT Supplement surveys and identified candidate survey items relevant to interoperability. The study team (primarily J.A.M. and A.H.) then developed an initial structure to organize these items into broader groupings.
Next, we convened a Technical Expert Panel (TEP) consisting of 6 experts on interoperability and measurement. Experts were identified by the study team, by consultation with knowledgeable parties, and through a snowball-style approach. The final list of experts represented a cross-section of knowledgeable parties with the expertise to speak to the measurement process. Individuals were recruited from health information organizations, health systems, trade groups, and technology companies to achieve multiple vantage points on the index content and minimize bias from any one party; participants were offered an honorarium (see Acknowledgments for list of TEP members). Over a series of 5 videoconference meetings between May and August 2023, the TEP reviewed existing survey instruments to develop and confirm a set of related indices. Two study team members (J.A.M. and A.H.) facilitated each TEP meeting by presenting the survey content, facilitating discussion, and seeking consensus.
In the first meeting, the TEP discussed and agreed upon the conceptual design of 3 mutually exclusive indices: (1) foundational interoperability technologies and practices (the Core Index); (2) the adoption of novel interoperability technologies, including practices relevant to engaging patients, the use of application programming interfaces (APIs), and information exchange with public health (the Pathfinder Index); and (3) challenges experienced with interoperability (the Friction Index) (Figure 1). In the 3 subsequent meetings, TEP members reviewed survey items from each index and decided: (1) whether the item should be included (ie, it is relevant to interoperability); (2) if so, in which index it fit best; and (3) how response options for the item should be scored. For example, 1 survey item, originally slated for the Pathfinder Index, related to hospitals’ use of automated reporting, manual reporting, or a mix of both for public health data. The TEP felt that this item reflected aspects of both the Pathfinder Index and Friction Index and that it provided only modest insight beyond items relating to the technology used to submit public health data; the item was ultimately omitted from the index. After each TEP meeting, the study team revised the index design based on the feedback received. Final approval of the index design was received during the fifth TEP meeting. Additional detail regarding TEP recruitment and the content of TEP sessions is available in Appendix S1.
Figure 1.
Hospital interoperability indices conceptual model. API, application programming interfaces; EHR, electronic health record; FHIR, Fast Healthcare Interoperability Resources; HIE, health information exchange; SDOH, social determinants of health.
The TEP also identified important concepts missing from the indices (for which the AHA IT Supplement did not have an associated item). Of note, the TEP identified information security, data accuracy, data standardization, data quality, workforce challenges, and the state of exchange with social service organizations as important concepts that were not reflected in the indices due to limitations in availability of questions in the survey instruments. These topics will inform future measurement efforts.
While the TEP discussed the possibility of applying weights to different items within an index (ie, to give certain items more importance), TEP participants ultimately decided that there was no systematic basis on which to justify weights. Therefore, we applied equal weights to each item, and created components such that hospitals with minimum performance on each item received a score of zero, and hospitals with the maximum performance on all items received a score of 100. Each component was then also assigned equal weight and aggregated to produce the final score for each index, which also ranged from 0 to 100. A score of 100 on the Core or Pathfinder Index represents perfect interoperability according to the respective measures, while the same score indicates the worst possible experience with friction (the greatest number of major challenges to interoperability) on the Friction Index.
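The equal-weighting scheme described above can be sketched in code. The item names, maximum values, and data below are hypothetical illustrations, not the study's actual survey items: each item is rescaled by its maximum possible value, items are averaged with equal weight into a 0-100 component score, and component scores are averaged with equal weight into the index.

```python
import pandas as pd

def score_component(df, item_cols, item_max):
    """Rescale each item to [0, 1] by its maximum possible value,
    then average items with equal weight and scale to 0-100."""
    rescaled = df[item_cols].div(pd.Series(item_max)[item_cols])
    return rescaled.mean(axis=1) * 100

def score_index(component_scores):
    """Equal-weight average of 0-100 component scores."""
    return pd.concat(component_scores, axis=1).mean(axis=1)

# Hypothetical toy data: 3 hospitals, 2 components of 2 items each
df = pd.DataFrame({
    "send_freq": [0, 2, 4],     # frequency of sending records, scored 0-4
    "receive_freq": [0, 4, 4],  # frequency of receiving records, scored 0-4
    "partners": [0, 4, 8],      # count of exchange partner types, 0-8
    "integrates": [0, 1, 1],    # integrates outside records, yes/no
})
item_max = {"send_freq": 4, "receive_freq": 4, "partners": 8, "integrates": 1}
exchange = score_component(df, ["send_freq", "receive_freq"], item_max)
breadth = score_component(df, ["partners", "integrates"], item_max)
core = score_index([exchange, breadth])  # 0 = no interoperability, 100 = full
```

A hospital at the minimum on every item scores 0 and one at the maximum on every item scores 100, matching the scoring rules above.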
We then used data from the 2022 IT Supplement survey to create the Pathfinder Index and 2023 survey data to construct the Core and Friction Indices. Different years of data were used due to differences in the availability of survey items used to inform each index. Two members of the study team (J.E. and C.S.) then evaluated the psychometric reliability and validity of the initial indices, using the respective years of data from which each index was created. To establish internal consistency reliability, we calculated item-rest correlations and Cronbach’s alpha, using the “psych” package in R (4.2.2), to determine whether items within each component were related to one another such that they appear to empirically measure a shared concept.16,17 To assess construct validity, we calculated Spearman correlation coefficients between components within and across indices. These correlations were used to evaluate whether the components of each index are (1) closely related such that quantitative results indicate that they represent a shared higher-level concept, supporting convergent validity, or (2) are not closely related quantitatively, indicating that they represent distinct aspects of the overarching index, supporting discriminant validity.18
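The internal consistency checks were run with the “psych” package in R; as a rough, self-contained Python analogue (a sketch, not the study’s code), Cronbach’s alpha and item-rest correlations can be computed directly from an observations-by-items matrix:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_observations, k_items) matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

def item_rest_correlations(items):
    """Pearson correlation of each item with the sum of the other items."""
    items = np.asarray(items, dtype=float)
    corrs = []
    for j in range(items.shape[1]):
        rest = np.delete(items, j, axis=1).sum(axis=1)
        corrs.append(np.corrcoef(items[:, j], rest)[0, 1])
    return corrs
```

An alpha near 1 and high item-rest correlations indicate that the items in a component move together, ie, appear to empirically measure a shared concept.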
Each index was finalized through an iterative process. The study team identified items with unexpected psychometrics, eg, very low correlation of components within the same index and presented issues to the TEP. The TEP and study team then discussed the acceptability of these results in light of conceptual expectations for the relationship between the items and components and identified strategies to address each item. We then generated a final set of psychometrics that are reported in the results below.
Final index construction
The final set of indices included a Core Index measuring levels of adoption of foundational interoperability capabilities, a Pathfinder Index representing the extent to which hospitals have adopted more advanced technologies for interoperability and engage with auxiliary exchange, and a Friction Index that quantifies the extent to which hospitals face barriers to interoperability (Figure 1). Each index comprises components that represent more specific interoperability dimensions, and these components each comprise several survey items. For example, the Core Index accounts for the frequency with which hospitals use a variety of methods to send summary of care records, an item contributing to the “Clinical Interoperable Exchange” component. As part of its “Social Determinants of Health (SDOH)” component, the Pathfinder Index includes a survey item representing the types of organizations from which hospitals receive data on patients’ social needs. Finally, an example of items included in the Friction Index is a survey question related to which issues hospitals experience when sending, receiving, or querying information to/from other hospitals (eg, difficulty matching patients, data formatting concerns), which is aggregated into the “Barriers to Exchange” component. Appendix S2 provides a more detailed breakdown of the specific AHA IT Supplement questions informing individual items and index components, including descriptive statistics of survey responses to each of the items, as well as further details about our approach to constructing the index and updating it over time.
Analysis
We focused our analysis on non-federal acute care hospitals because those hospitals were eligible for Federal EHR incentives, because a broader population might suppress measures of reliability and validity by introducing variation from different types of hospitals, and because these hospitals are often the focus of analysis of hospital interoperability. We then created non-response weights in each year of AHA IT Supplement data (2021-2023) using a logistic regression to predict the likelihood that a hospital in the full AHA Annual Survey responded to the survey based on the hospital’s size, ownership, teaching status, system membership, availability of a cardiac intensive care unit, urban status, and region. Hospital weights were the inverse of these response probabilities. These weights were then integrated into all analyses described below to generate nationally representative results.
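The inverse-probability weighting step can be illustrated with a minimal sketch. The single hypothetical predictor and the plain gradient-ascent logistic fit below are stand-ins for the study's multi-predictor model (size, ownership, teaching status, and the other characteristics listed above); the key idea is that each responding hospital is weighted by the inverse of its fitted response probability, so under-represented hospital types count more.

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, n_iter=20000):
    """Plain gradient-ascent logistic regression with an intercept."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return w

def nonresponse_weights(X, responded):
    """Inverse of the predicted response probability, for responders only."""
    w = fit_logistic(X, responded)
    Xb = np.column_stack([np.ones(len(X)), X])
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    return 1.0 / p[responded == 1]

# Hypothetical: one binary characteristic (eg, urban status);
# the urban group responds at 75%, the other group at 25%
X = np.array([[0], [0], [0], [0], [1], [1], [1], [1]])
responded = np.array([0, 0, 0, 1, 1, 1, 1, 0])
weights = nonresponse_weights(X, responded)
```

Here the lone responder from the low-response group receives roughly 3 times the weight of responders from the high-response group, which is what makes the weighted results nationally representative.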
Using 2022 IT Supplement data for the Pathfinder Index and 2023 data for the Core and Friction Indices, we constructed histograms and calculated means, medians, 25th percentile values, and 75th percentile values for the 3 indices and their respective components to depict the spread of hospitals’ scores. To determine whether indices varied across hospitals of different types, we constructed histograms and calculated the mean value for each index across stratifications of hospital size (small, medium, large), critical access hospital (CAH) status (yes vs no), core-based statistical area (CBSA) type (metropolitan, micropolitan, rural), and primary EHR used (market-leading: Epic, Cerner, or Meditech vs non-market-leading).5 We selected these variables for stratification because they offer a representation of hospitals’ resource availability.
Lastly, to capture longitudinal progress in performance, we calculated mean scores for each index and its respective components from 2021 through 2023, as data availability allowed. However, differences in survey questions over time made it infeasible to assess trends in the Pathfinder and Friction Indices (see survey items available by year in Table S1). When items were not included in an earlier year, we imputed the value, setting it equal to the first year that the data were observed. We also developed an approach to smooth indices that included new items over time, as detailed in Part 5 of Appendix S2.
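The carry-back imputation for items absent in earlier survey years amounts to a backward fill over the year series. A minimal sketch with hypothetical values:

```python
import pandas as pd

def backfill_component(values_by_year):
    """Map year -> component score (None where the item was not yet fielded);
    earlier missing years take the value from the first observed year."""
    return pd.Series(values_by_year, dtype="float64").sort_index().bfill()

# Hypothetical component first fielded in 2023
scores = backfill_component({2021: None, 2022: None, 2023: 42.0})
```

Setting earlier years equal to the first observed value holds the missing component flat, so index trends are not distorted by an item's introduction.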
Results
Three indices
Core Index
The Core Index measures the level of adoption of foundational interoperability capabilities. It comprises 3 components representing (1) hospitals’ interoperable exchange of patient information with other healthcare organizations (measured through 4 survey items), (2) the availability and use of exchanged information to inform patient care (2 items), and (3) breadth of exchange partners, including long-term care and behavioral health providers (8 items).
Pathfinder Index
The Pathfinder Index quantifies hospitals’ implementation of more advanced technologies and adoption of auxiliary interoperability activities, aligning with newer policy relating to the standardization of advanced exchange technologies (ie, APIs) and a greater emphasis on public health following the onset of the COVID-19 pandemic. The index includes 4 components: (1) hospital support for APIs used by applications serving clinicians and the health system (measured through 3 survey items); (2) patient engagement, including support for APIs to enable use of data by patients and enabling submission of patient-generated health data (6 items); (3) the exchange and use of SDOH information (4 items); and (4) submission of public health data for 7 activities (eg, syndromic surveillance) using the EHR or a health information exchange (HIE) (7 items).
Friction Index
The Friction Index serves as a numeric representation of the extent and severity of challenges faced by hospitals engaging in health data exchange. The index includes 3 components: (1) barriers a hospital experiences to exchange health information (measured through 3 survey items), (2) a hospital’s need to use numerous methods (eg, HIE, national network, and point-to-point interfaces) to exchange health information (3 items), and (3) a hospital’s experience of information blocking from various actors (3 items).
Psychometric properties of index construction framework
In the final indices, Cronbach’s alpha within components varied from 0.39 for the experience of information blocking component of the Friction Index to 0.92 for the breadth of exchange partners component of the Core Index (Table S1). Item-rest correlations generally fell within the range of 0.40-0.80, indicating moderate to high correlations between individual items and other items in a component. There were 3 exceptions: the item-rest correlation for the hospital capacity reporting item in the public health component of the Pathfinder Index was 0.39. The item-rest correlations for hospitals’ experiences with information blocking by HIEs and by healthcare providers in the information blocking component of the Friction Index were 0.37 and 0.27, respectively.
Spearman correlations (ρ) for components within indices varied (Figure 2). Core Index components were positively correlated (ρ ranged from 0.44 to 0.59). Components of the Pathfinder Index were less correlated with one another, with ρ ranging from 0.23 to 0.35. These correlations indicate that both the Core and Pathfinder Index were moderately to well correlated but not duplicative. The Friction Index components were not closely correlated with one another, with ρ ranging from −0.08 to 0.14, indicating little relationship between components of the Friction Index.
Figure 2.
Spearman correlations between hospital interoperability indices components. Notes: Consistent with the color coding in Figure 1, components in green text correspond to the Core Index (2023), components in blue text correspond to the Pathfinder Index (2022), and components in purple text correspond to the Friction Index (2023). API, application programming interfaces; SDOH, social determinants of health.
Several components were correlated with components in different indices. The correlations between components of the Pathfinder Index and components of the Core Index ranged from 0.21 to 0.44. The Core Index components were also correlated with the methods of exchange component of the Friction Index (ρ ranged from 0.50 to 0.73).
Index scores
The mean scores for the Core, Pathfinder, and Friction Indices were 61 in 2023, 57 in 2022, and 30 in 2023, respectively (Figure 3). However, because the Core Index distribution was strongly left-skewed, its median (71) was substantially higher than its mean.
Figure 3.
Hospital performance on the hospital interoperability indices and components. Notes: Calculated means and percentiles reflect survey weights. API, application programming interfaces; EHR, electronic health record; HIE, health information exchange.
Values on individual components of each index varied widely. Among the Core Index components, hospitals performed best with respect to the clinical interoperable exchange component (mean = 77; 95% CI: 76-78), followed by the clinical information availability and use component (mean = 65; 95% CI: 64-67), and then had much lower scores on the breadth of exchange partners component (mean = 42; 95% CI: 41-43).
On the Pathfinder Index, hospitals performed similarly on the clinical/health system APIs (mean = 66; 95% CI: 65-68), patient engagement (mean = 64; 95% CI: 63-65), and public health data submission (mean = 57; 95% CI: 56-58) components; but the mean score was lower for the SDOH component (mean = 42; 95% CI: 41-44).
Scores on the Friction Index components were similar for the barriers to exchange and methods of exchange components (mean = 42 and 95% CI: 41-43, and mean = 37 and 95% CI: 36-38, respectively), but the mean score was much lower for the experience of information blocking component (mean = 11; 95% CI: 10-12).
Index scores by hospital type and year
Index scores varied significantly between hospitals with differing characteristics (Figure 4). Hospitals that are larger in size, are not CAHs, are in metropolitan areas, and use a market-leading EHR had statistically significantly higher mean scores on the Core and Pathfinder Indices compared to their counterparts. These differences were driven by a substantially larger proportion of hospitals with fewer resources scoring near the bottom of the distribution of each index, rather than by a shift in the modal response. Hospitals that are larger in size, are not CAHs, are in metropolitan areas, and use a market-leading EHR also had higher mean scores on the Friction Index, indicating that they experienced greater friction.
Figure 4.
Hospital performance on the hospital interoperability indices by hospital characteristic. Notes: All weighted means (represented by red vertical lines) were significantly different from that of their respective reference groups. Market-leading EHRs include Cerner, Epic, and MEDITECH’s EHRs. CAH, critical access hospital; CBSA, core-based statistical area; EHR, electronic health record.
Hospitals’ performance on these measures also varied across time (Table 1).
Table 1.
Trends in weighted means of hospital interoperability indices, 2021-2023.
|  | 2021 (n = 2364) | 2022 (n = 2541) | 2023 (n = 2539) | Expected 2024 | Expected 2025 |
|---|---|---|---|---|---|
| Core (smoothed)^a | 56 | 60 | 61 |  | New data available |
| Clinical interop functions | 69 | 75 | 77 |  |  |
| Data availability and use—all items included in 2021 | 58 | 62 | 65 |  |  |
| Breadth of exchange partners | No items included | No items included | 42 |  |  |
| Pathfinder |  | 57 |  | New data available |  |
| Clinician/health system APIs | No items included | 66 | No items included | Update |  |
| Patient engagement (smoothed)^b | 58 | 64 | 67 | Update |  |
| Social determinants of health | No items included | 42^c | 42 | Update |  |
| Public health | 50^c | 57 | 61 | Update |  |
| Friction |  |  | 30 |  | New data available |
| Barriers to exchange | 45^c | 50^c | 42 |  |  |
| Methods of exchange | 33 | 36 | 37 |  |  |
| Information blocking | 18 | 12 | 11 |  |  |
^a The “Breadth of Exchange Partners” component was not included in 2021 or 2022. To calculate the Core Index in those years, we assumed that “Breadth of Exchange Partners” had the same value in 2021 and 2022 as in 2023.
^b Five of 6 items included on the 2022 and 2023 AHA IT Supplement were also included in 2021. To calculate the patient engagement component in 2021, we assumed that the sixth item, which captured whether the hospital supported patients’ ability to “Submit patient-generated data (eg, blood glucose, weight) through apps configured to meet Fast Healthcare Interoperability Resources (FHIR) specifications,” had the same value in 2021 as in 2022.
^c Substantially different response options or question phrasing from 2023; see Appendix S2 for information on the difference.
The mean score on the Core Index increased from 56 to 61 between 2021 and 2023. Where we could track change in components of the Pathfinder and Friction Indices over time, they exhibited improvement. For example, the information blocking component of the Friction Index declined from 18 in 2021 to 11 in 2023 (Table 1).
Discussion
The 3 indices developed in this study—Core, Pathfinder, and Friction—represent holistic and meaningful measures of hospital interoperability. These indices are intended to capture and simply convey progress in interoperability, with grounding in expert deliberation to identify and group interoperability concepts into a logical hierarchical structure and psychometric analyses to characterize their reliability and validity.
Advantages of index construction
This approach builds on existing measures of hospital interoperability, including commonly used measures focused on engagement in 4 domains of interoperability.5 In contrast to those measures, the indices we developed capture diverse concepts that represent the breadth of interoperability, including the use of health data; the number and type of partners with which data are exchanged; the use of APIs (including standards-based APIs); patient engagement with data; public health reporting; and challenges to exchange. Furthermore, these indices are continuous measures, whereas existing metrics are binary. Continuous measures are better able to reflect incremental progress, such as increasingly frequent use of interoperability, which previous measures may not have captured.5 Another important advantage of these indices is the use of psychometric analysis to closely examine the relationships between included items. Components within the Core Index, which primarily capture interoperability between healthcare delivery organizations, were highly correlated, aligning with expectations. In contrast, correlations between the components of the Pathfinder Index were lower, which was expected because the Pathfinder Index represents newer and more diverse areas of policy and technology development. Finally, components of the Friction Index were not well correlated and likely represent distinct facets of friction. Given that these components are not well correlated in cross-sectional data, it will be important to monitor whether they improve in parallel or at varied rates in the future.
Hospital performance
Hospitals’ performance on the Pathfinder Index in 2022 was modestly lower than on the Core Index in 2023. Lower scores on the Pathfinder Index relative to the Core Index likely reflect the legacy of the HITECH Act, by far the largest public financial investment in health IT in the United States, which incentivized adoption specifically for acute care treatment purposes. However, public policy and recent events have galvanized progress on technologies captured by the Pathfinder Index. Monitoring the relative rate of improvement on these 2 indices could inform where further policy interventions are needed or whether progress continues organically under the existing policy regime.1
The Friction Index represents a useful complement to the Core and Pathfinder Indices by capturing the extent of challenges experienced when engaging in interoperability; for this measure, a larger score represents a greater extent of challenges. In the cross-sectional data, hospitals with higher scores on the Core and Pathfinder Indices also had higher (ie, worse) scores on the Friction Index. If this pattern continues over time, we might expect increases in all indices: as interoperability becomes more common, so does friction. In contrast, a positive outcome over time would be to observe increases in the Core and Pathfinder Indices accompanied by decreases in the Friction Index. Furthermore, because the components of the Friction Index exhibit low convergent validity (ie, are not well correlated), it will be important to monitor both changes in the overall Friction Index and its specific components over time. This is particularly important because specific policy interventions are likely to affect components differently. For instance, while mean scores on the information blocking component have declined (an improvement) since the effective date of information blocking regulations in 2021, increases in scores on the methods of exchange component indicate progressively greater use of multiple methods to exchange information, and barriers to exchange remain common. By facilitating connectivity across networks, TEFCA may reduce the need to use these multiple methods, lowering this component of friction in the future.19 Our team intends to track scores on the methods of exchange component to evaluate how they change over time given the establishment of this new infrastructure for nationwide HIE.
Disparities in index scores
These data indicate that hospitals with fewer resources—as captured by hospital size, critical access status, location, and use of a market-leading EHR—had statistically significantly lower scores on the Core and Pathfinder Indices (representing worse performance). These findings parallel recent work, reiterate important disparities between hospitals, and indicate the validity of these aggregate indices relative to prior work.5 These disparities reinforce a need for targeted policy to ensure that hospitals with limited resources and that disproportionately care for groups that have been marginalized can apply interoperable health IT to quantify and target upstream and preventable causes of health crises.20–22
Limitations
Index construction may not capture all relevant aspects of, and challenges related to, interoperability. First, component themes were limited to those already represented among the survey items in the Health IT Supplement. Because of ASTP's involvement in developing the Health IT Supplement, survey items reflecting new interoperability themes can be added in the future, but there will be a delay before new themes appear in the indices, given the time demands of developing (eg, cognitively testing) and fielding new questions.
Additionally, the themes included in the indices reflect the subject matter expertise of the research team and TEP members. The concepts included may therefore be shaped by the experiences of those involved in index design, which may have affected the level of importance placed on each interoperability topic area or whether a topic was reflected in the indices at all. Including TEP members in the index design process was intended to minimize the impact of the study team's biases on item selection by incorporating a wider variety of perspectives, although we acknowledge that this approach may not have eliminated bias completely.
Future work
The indices are designed so that additional concepts can be added over time as new interoperability technologies proliferate. In the coming years, we intend to update the items informing these indices to reflect hospitals' adoption of novel technologies and implementation of new processes, and to incorporate important topics identified by the TEP. In consultation with the survey developers, the research team will develop, test, and field questions on these concepts in future iterations of the AHA Health IT Supplement survey and will reconvene a TEP to inform question development and inclusion in the indices. Each of the 3 index scores will be recalculated every other year (Core and Friction in odd years, Pathfinder in even years) for individual hospitals and aggregated into mean scores representing overall nationwide performance. Evaluating hospitals' index scores can also highlight performance disparities as they persist or shift over time.
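For concreteness, the aggregation underlying these scores (survey items rolled up into component scores, components averaged into a 0-100 index, and hospital-level indices combined into survey-weighted national means) might be sketched as follows. This is a minimal illustration under assumed scoring rules; the item responses, equal component weighting, and survey weights shown here are hypothetical and are not the study's actual specification.

```python
def component_score(item_responses):
    """Score one component as the share of yes/no items adopted, scaled 0-100."""
    return 100 * sum(item_responses) / len(item_responses)

def index_score(components):
    """Combine component scores into a single 0-100 index (equal weights assumed)."""
    return sum(components) / len(components)

def weighted_mean(scores, weights):
    """Survey-weighted national mean of hospital-level index scores."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Two illustrative hospitals, each with two components of binary (yes/no) items
h1 = index_score([component_score([1, 1, 0, 1]), component_score([1, 0])])
h2 = index_score([component_score([0, 1, 0, 0]), component_score([0, 0])])
national = weighted_mean([h1, h2], weights=[1.5, 0.5])
print(h1, h2, national)  # → 62.5 12.5 50.0
```

A score of 0 indicates no interoperability (or, for the Friction Index, no friction) and 100 full interoperability (or maximum friction); the survey weights stand in for the AHA weights used to produce nationally representative means.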
This work focused on hospitals given the longstanding efforts to survey these organizations. The process may serve as a model for developing additional indices that use other data sources to measure interoperability among other types of healthcare delivery organizations (eg, provider organizations). However, we anticipate that other indices may vary in structure.
Conclusion
Through TEP guidance and psychometric analysis, we developed a set of comprehensive national indices representing the state of US hospital interoperability. The final indices capture progress on foundational interoperability capabilities (the Core Index), newer and more diverse interoperability capabilities (the Pathfinder Index), and difficulty encountered when engaging in interoperable exchange (the Friction Index). On average, hospitals performed better on the Core Index than on the Pathfinder Index. Better-resourced hospitals tended to have higher scores on all 3 indices than their counterparts: better performance on the Core and Pathfinder Indices, but worse performance (ie, more friction) on the Friction Index. Between 2021 and 2023, hospitals' performance on the Core Index, as well as on the components of the Pathfinder and Friction Indices, generally improved. Continued tracking of these index scores over time and across hospital characteristics will offer opportunities to highlight progress in the widespread use of interoperable technologies, track the impact of policies as they are implemented, and target new policies.
Acknowledgments
We would like to acknowledge the following Technical Expert Panel (TEP) participants, who contributed subject matter expertise in the development of 3 indices representing US hospital interoperability: (1) Jeff Chin—Director, Data Collaboratives & Governance, Michigan Medicine; (2) Mari Savickis—Vice President, Public Policy, CHIME; (3) Craig Behm—President & CEO, CRISP; (4) Ries Robinson—CEO, Rodin Scientific, LLC (formerly CEO, Graphite Health); (5) Chantal Worzala—Principal, Alazro Consulting; and (6) Lorren Pettit—Vice President, Digital Health Analytics, CHIME.
Contributor Information
Catherine E Strawley, Office of the Assistant Secretary for Technology Policy/Office of the National Coordinator for Health Information Technology, Washington, DC 20201, United States.
Julia Adler-Milstein, Division of Clinical Informatics & Digital Transformation, Department of Medicine, University of California San Francisco, San Francisco, CA 94143, United States.
A Jay Holmgren, Division of Clinical Informatics & Digital Transformation, Department of Medicine, University of California San Francisco, San Francisco, CA 94143, United States.
Jordan Everson, Office of the Assistant Secretary for Technology Policy/Office of the National Coordinator for Health Information Technology, Washington, DC 20201, United States.
Author contributions
Catherine E. Strawley contributed to the conception of the manuscript, analysis, visualization, and interpretation of the data, and drafting and critical revision of the manuscript. Julia Adler-Milstein contributed to project administration, conception and design of the indices, interpretation of the data, and critical revision of the manuscript. A Jay Holmgren contributed to project administration, conception and design of the indices, interpretation of the data, and critical revision of the manuscript. Jordan Everson contributed to project supervision, conception and design of the indices and manuscript, analysis and interpretation of the data, and drafting and critical revision of the manuscript. All authors approved the final manuscript and agree to be accountable for all aspects of the work.
Supplementary material
Supplementary material is available at Journal of the American Medical Informatics Association online.
Funding
This work was supported through a contract funded by the Office of the Assistant Secretary for Technology Policy/Office of the National Coordinator for Health Information Technology (ASTP).
Conflicts of interest
The authors have no conflicts of interest.
Data availability
The AHA data used in this study are available for purchase from the AHA at https://www.ahadata.com/aha-data-resources.
References
- 1. United States of America. H.R.34–21st Century Cures Act. Congress.gov; 2016. Accessed December 27, 2023. https://www.congress.gov/bill/114th-congress/house-bill/34
- 2. Centers for Medicare and Medicaid Services. The Medicare Promoting Interoperability Program vs MIPS Promoting Interoperability Performance Category. Centers for Medicare and Medicaid Services. Accessed December 27, 2023. https://www.cms.gov/files/document/infographic-pi-program-vs-mips-pi-perf-category.pdf
- 3. United States of America. H.R.2—Medicare Access and CHIP Reauthorization Act of 2015. Congress.gov; 2015. Accessed December 27, 2023. https://www.congress.gov/bill/114th-congress/house-bill/2/text
- 4. United States of America. 21st Century Cures Act: Interoperability, Information Blocking, and the ONC Health IT Certification Program. Federal Register; 2020. Accessed December 27, 2023. https://www.federalregister.gov/documents/2020/05/01/2020-07419/21st-century-cures-act-interoperability-information-blocking-and-the-onc-health-it-certification
- 5. Pylypchuk Y, Everson J. Interoperability and Methods of Exchange among Hospitals in 2021. Data Brief No. 64. Office of the Assistant Secretary for Technology Policy/Office of the National Coordinator for Health Information Technology; 2023. Accessed December 27, 2023. https://www.healthit.gov/data/data-briefs/interoperability-and-methods-exchange-among-hospitals-2021
- 6. Holmgren AJ, Patel V, Adler-Milstein J. Progress in interoperability: measuring US hospitals’ engagement in sharing patient data. Health Aff (Millwood). 2017;36:1820-1827.
- 7. Holmgren AJ, Everson J, Adler-Milstein J. Association of hospital interoperable data sharing with alternative payment model participation. JAMA Health Forum. 2022;3:e215199.
- 8. Budd J, Miller BS, Manning EM, et al. Digital technologies in the public-health response to COVID-19. Nat Med. 2020;26:1183-1192.
- 9. Dixon BE, Grannis SJ, McAndrews C, et al. Leveraging data visualization and a statewide health information exchange to support COVID-19 surveillance and response: application of public health informatics. J Am Med Inform Assoc. 2021;28:1363-1373.
- 10. Greene DN, McClintock DS, Durant TJS. Interoperability: COVID-19 as an impetus for change. Clin Chem. 2021;67:592-595.
- 11. Everson J, Hendrix N, Phillips RL, et al. Primary care physicians’ satisfaction with interoperable health information technology. JAMA Netw Open. 2024;7:e243793.
- 12. Gabriel MH, Richwine C, Strawley C, Barker W, Everson J. Interoperable Exchange of Patient Health Information among U.S. Hospitals: 2023. Data Brief No. 71. Office of the Assistant Secretary for Technology Policy/Office of the National Coordinator for Health Information Technology; 2024. Accessed May 31, 2024. https://www.healthit.gov/data/data-briefs/interoperable-exchange-patient-health-information-among-us-hospitals-2023
- 13. Everson J, Rubin JC, Friedman CP. Reconsidering hospital EHR adoption at the dawn of HITECH: implications of the reported 9% adoption of a “basic” EHR. J Am Med Inform Assoc. 2020;27:1198-1205. 10.1093/jamia/ocab213
- 14. Jha AK, DesRoches CM, Campbell EG, et al. Use of electronic health records in U.S. hospitals. N Engl J Med. 2009;360:1628-1638. 10.1056/NEJMsa0900592
- 15. DesRoches CM, Campbell EG, Rao SR, et al. Electronic health records in ambulatory care—a national survey of physicians. N Engl J Med. 2008;359:50-60. 10.1056/NEJMsa0802005
- 16. Allen MJ, Yen WM. Introduction to Measurement Theory. Waveland Press; 2001.
- 17. Revelle W. psych: Procedures for Personality and Psychological Research. Northwestern University; 2023. Accessed December 27, 2023. https://cran.r-project.org/web/packages/psych/psych.pdf
- 18. O’Leary-Kelly SW, Vokurka RJ. The empirical assessment of construct validity. J Oper Manage. 1998;16:387-405.
- 19. Office of the National Coordinator for Health Information Technology. The Trusted Exchange Framework and Common Agreement: Highlights for Health Information Networks, Exchanges, and Health IT. Office of the National Coordinator for Health Information Technology; 2019. Accessed December 27, 2023. https://www.healthit.gov/sites/default/files/page/2019-04/ONC-TEFCA_FINAL_InfoSheets_Patients.pdf
- 20. Apathy NC, Holmgren AJ, Adler-Milstein J. A decade post-HITECH: critical access hospitals have electronic health records but struggle to keep up with other advanced functions. J Am Med Inform Assoc. 2021;28:1947-1954.
- 21. Everson J, Patel V, Bazemore AW, et al. Interoperability among hospitals treating populations that have been marginalized. Health Serv Res. 2023;58:853-864.
- 22. Office of the National Coordinator for Health Information Technology. Advancing Health Equity by Design and Health Information Technology: Proposed Approach, Invitation for Public Input, and Call to Action. Washington, DC: Office of the National Coordinator for Health Information Technology; 2024. Accessed May 31, 2024. https://www.healthit.gov/sites/default/files/2024-04/ONC-HEBD-Concept-Paper_508.pdf