Abstract
Objectives
We describe the development of a nursing home information technology (IT) maturity model designed to capture stages of IT maturity.
Materials and Methods
This study had 2 phases. The purpose of phase I was to develop a preliminary nursing home IT maturity model. Phase II involved 3 rounds of questionnaires administered to a Delphi panel of expert nursing home administrators to evaluate the validity of the nursing home IT maturity model proposed in phase I.
Results
All participants (n = 31) completed Delphi rounds 1-3. Over the 3 Delphi rounds, the nursing home IT maturity staging model evolved from a preliminary, 5-stage model (stages 1-5) to a 7-stage model (stages 0-6).
Discussion
Using innovative IT to improve patient outcomes has become a broad goal across healthcare settings, including nursing homes. Understanding the relationship between IT sophistication and quality performance in nursing homes relies on recognizing the spectrum of nursing home IT maturity that exists and how IT matures over time. Currently, no universally accepted nursing home IT maturity model exists to trend IT adoption and determine the impact of increasing IT maturity on quality.
Conclusions
A 7-stage nursing home IT maturity staging model was successfully developed with input from a nationally representative sample of US-based nursing home experts. The model incorporates 7 stages of IT maturity, ranging from stage 0 (nonexistent IT solutions or electronic medical record) to stage 6 (use of data by the resident or resident representative to generate clinical data and drive self-management).
Keywords: nursing homes, information technology, informatics, technology assessment, healthcare surveys
INTRODUCTION
Technology’s growth is pervasive across all industries. Growth in the healthcare sector has been spurred on in part by the realization that innovative information technology (IT) can help facilitate the effective management of a person’s health.1,2 This potential to improve patient outcomes with technology has captured the attention of cross-cutting stakeholder groups involved in improving quality of care, including federal agencies, nursing homes (NHs), and hospitals.3 For instance, one crucial cross-cutting IT resource gaining national attention is health information exchange, used to share secure, encrypted clinical patient data between stakeholder groups.4 Health information exchange enables timely data transmission, which can lead to more fully informed care providers and eventually better patient outcomes. However, understanding the full impact of IT resources, such as health information exchange, on quality requires that we identify how quality changes as all types of IT mature, and which technologies make a difference in our health system.
As the result of the Health Information Technology for Economic and Clinical Health (HITECH) Act, the United States has paid more than $35 billion in incentive payments to eligible hospitals and physicians for adoption and “meaningful use” of electronic health records (EHRs).5 The EHR incentive program has resulted in widespread technology adoption in hospitals and ambulatory settings; however, adoption has been much slower in settings (eg, NHs) ineligible to participate in the incentive program.6
Despite these funding exclusions, cross-cutting partnerships are emerging in healthcare marketplaces that encourage use of technology to enable sharing of patient data, improvements in timeliness of clinical decisions, and patient safety and quality. Monitoring impacts of emerging technologies on quality of care and safety of residents is critical for the large sector of healthcare including NHs, which have 1.4 million residents nationally in nearly 16 000 facilities.7 For example, in a recent study assessing national trends in NH IT sophistication and quality performance over a 3-year period, an inverse relationship was found between IT adoption and select quality measures, for instance, the percentages of long-stay residents with a urinary tract infection and of residents receiving antipsychotic medications.8 The impact of increasing IT sophistication on clinical outcomes like these could be due to enhanced communication, improved clinical workflow, or better administrative oversight of NH residents susceptible to these conditions. IT sophistication is a valid and reliable measure, scored on a scale (minimum = 0, maximum = 100) across 9 dimensions and domains related to IT capabilities, extent of IT use, and degree of IT integration within resident care, clinical support (pharmacy, radiology, and laboratory), and administrative activities.9
The first step toward understanding the relationship between IT sophistication and quality performance in NHs is recognizing the spectrum of NH IT maturity that exists and how IT matures over time. Currently, no universally accepted NH IT maturity model exists to trend IT adoption and to determine the impact of increasing IT maturity on NH quality. Situating NH IT adoption within a maturity model will inform NHs stakeholders about technological capabilities and encourage progression to a higher stage of IT maturity. This assumes that higher stages of IT maturity help facilitate the realization of improved quality performance outcomes. If true, then determining staging criteria and trends in IT maturity level is an important pathway to understand what technologies make a difference in NH quality performance. This type of evidence will help NH leaders make IT adoption decisions based on the potential to improve resident outcomes. Also, understanding IT adoption may be helpful when negotiating contracts and establishing relationships for value-based care.
IT maturity models offer a pathway by which provider organizations can transform their IT capabilities over time. One aspect of organizational transformation is based on the premise that IT systems implemented in organizations, such as NHs, evolve through a process of growth and development through a number of distinct stages.10 Maturity models assume there are predicted patterns, conceptualized in terms of stages that are sequential and cumulative in nature, and involve a wide range of structures and activities.11 Nolan12 is credited as the principal architect of an early model of IT maturity: the stages of growth model. Nolan proposed 6 stages of growth, beginning with the initiation of technological components into the organization and culminating in technological maturity and system stability. According to Nolan, after the initial introduction of technology (stage 1), rapid proliferation of systems occurs, often resulting in disparate systems lacking appropriate controls (stage 2). In response to the chaos brought about in stage 2, formalized controls are put in place (stage 3). As users become more adept at using technology and begin to perceive its value, the organization is able to improve integration (stage 4) by consolidating systems and optimizing data management. Data administration (stage 5) addresses issues of data ownership and protection. Finally, steady growth produces maturity (stage 6), wherein systems are developed to their optimum state and are aligned with the organization’s strategic plan.
Since publication of Nolan’s model in 1973, researchers have worked to develop new models and apply them to various types of organizations. A recent literature review concluded that 14 health IT maturity models have been developed using a variety of methods, including surveys, interviews, focus groups, and observation.13 The models were developed to assess the general healthcare environment, mobile health, electronic medical records (EMRs), interoperability, telemedicine, and usability. No models in this latest review addressed NH IT maturity. Furthermore, most models analyzed in the review did not disclose the design process or research methods used for development and validation, a critical component necessary for using the model to inform future interventions. The design process matters because a rigorous development process allows the criteria for model content and the specific uses of the model to be validated for future use, increasing generalizability, in this case toward a national model for NH IT maturity. In this article, we describe the development of a NH IT maturity model designed to capture stages of IT adoption needed to evaluate the link between IT maturity and quality measures, a link that has not been established or tested in prior IT maturity models. Intended users of this model include anyone seeking nursing home placement, such as patients, families, and care navigators. Other users might be state surveyors conducting annual audits of NHs, who need to know before an audit what level of IT maturity exists in a facility. Furthermore, hospitals or vendors might be interested in knowing which NHs in their catchment regions have advanced IT systems, including EHRs or health information exchange capabilities.
MATERIALS AND METHODS
Study design
We designed this study in 2 phases. Phase I of the study (October 2016 to December 2016) involved an in-depth review of the literature to identify preliminary stages, descriptions, and definitions used in previously published IT maturity staging models in acute care or other settings. The purpose of phase I was to develop a preliminary NH IT maturity model identified from other maturity model resources found in the literature.13 Maturity models found in the literature were reviewed by the research team, and preliminary content for the phase I model was agreed on by the team. Phase II (December 2016 to January 2017) involved 3 rounds of questionnaires administered to a panel of expert NH administrators using a Delphi technique14 to evaluate the validity of the NH IT maturity model proposed in phase I. During Delphi rounds, experts were asked to comment on and recommend revisions to the stages, descriptions, and definitions included in the proposed IT maturity model emerging from the phase I literature review. All procedures were approved through the University of Missouri institutional review board.
Sample
A panel of 31 US-based expert NH administrators with core knowledge of NH care processes and health IT was recruited to participate in the Delphi study. These administrators act as managing officers in planning, organizing, directing, and controlling day-to-day functions in their facilities, including responsibilities for IT systems. We recruited NH administrators who met the following inclusion criteria: (1) at least 120 hours of experience, (2) a bachelor’s degree, and (3) completed preceptor training as a NH administrator. Because our goal was to develop a NH IT maturity staging model, we sought participants who had experience in NHs that had implemented health IT. We used data from a previous study9 to identify NH administrators who met inclusion criteria and who worked in NHs with total IT sophistication scores above the 75th percentile in a nationally representative sample. The preliminary national study had 198 NHs that met these criteria. After NHs were identified, an internal expert advisory panel was used to nominate a diverse sample of 31 NH administrators for the Delphi panel based on location, bed size, and ownership from the homes in the national study.
Data collection and analysis
Data from phase II were collected by conducting a 3-round Delphi study (see Figure 1). The Delphi method is a way of gathering opinions and making judgments through an iterative process.14 This approach allows researchers to elicit views from people who have expert knowledge about a particular topic as well as a means for prioritization and consensus building.15,16 To ensure fidelity between the members of the Delphi panel, we invited them to participate in instructional webinars before beginning round 1. During the webinars, participants received background information about the state of NH technology adoption and its relationship to quality measures found in a preliminary national survey study conducted by GLA,17 an overview of the Delphi method and the activities anticipated in each round, a description of the initial IT maturity staging model developed from the literature, and a brief overview of the NH IT survey developed by GLA to assess adoption nationally. Additionally, each Delphi panel member was provided a consent form approved by the university’s Institutional Review Board (Project # 2009109) for participation in the research study that discussed the research purpose, methods, benefits, participant rights, etc.
Figure 1.
Summary of activities in each round of the Delphi study.
In Delphi round 1, participants were given a preliminary staging model and asked to individually review the stage numbers, stage descriptions, and stage definitions. After their review, Delphi panel members were asked to indicate whether they agreed or disagreed with each staging criterion and were offered the opportunity to edit the stage number, description, or definition. If they disagreed, they were provided a text box to make suggested revisions or comments. After each round, members of the expert internal advisory panel reviewed the suggestions and incorporated them into the staging model until consensus was reached. The Delphi panel consensus goal was 70% agreement among members; if panel members did not reach 70% agreement, a separate expert internal advisory panel of 9 members composed of gerontology and informatics experts discussed Delphi panel member comments, rated the items, and recommended changes until 88% agreement was reached among advisory members.
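The two-tier consensus rule described above can be expressed as a small decision function. The sketch below is purely illustrative and not part of the study's actual analysis; only the thresholds (70% for the 31-member Delphi panel, 88% for the 9-member advisory panel) come from the text, and the example vote counts are hypothetical.

```python
def consensus_reached(agree: int, total: int, threshold: float = 0.70) -> bool:
    """Return True if the proportion of members agreeing meets the threshold."""
    if total <= 0:
        raise ValueError("panel must have at least one member")
    return agree / total >= threshold

# Hypothetical Delphi panel vote: 24 of 31 agree (~77%), meets the 70% goal
panel_ok = consensus_reached(24, 31)
# If the Delphi panel falls short, the 9-member advisory panel applies 88%
advisory_ok = consensus_reached(8, 9, threshold=0.88)
```

The escalation itself (advisory panel deliberating on comments when the Delphi panel misses 70%) is a human process; the function only captures the numeric cutoffs.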
In round 2, expert Delphi panel members evaluated a revised staging model. At the end of round 2, the internal advisory panel again evaluated comments and suggestions made by Delphi panel members and revised the staging model using the same consensus approach.
Last, Delphi round 3 was conducted with an additional caveat. During round 3 Delphi panel members were asked to use the NH IT maturity staging model to stage a specified set of NH IT items taken from a validated IT sophistication survey used in a national study conducted by the primary author (see Table 1).8 Following this staging activity, Delphi panel members were asked to comment on any recommended revisions to the IT maturity model stages, descriptions, or definitions after applying the staging model to NH IT items within the survey.
Table 1.
Examples of NH IT items on the IT sophistication survey used in round 3. IT: information technology; NH: nursing home.
| Domain of Care | NH IT Item |
|---|---|
| Resident care | |
| Clinical support (pharmacy, laboratory, radiology) | |
| Administrative activities | |
We used the electronic survey software Qualtrics to administer the questionnaires during the Delphi rounds and to collect feedback from Delphi panel members. Frequencies of agreement or disagreement for each stage number, description, and definition were calculated for each round. Comments from Delphi panel members were reviewed after each round of questionnaires and grouped according to stage. For example, all comments related to stage 1 were grouped together and then categorized according to whether the comment was related to the stage description or definition.
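As a sketch of the tallying and grouping steps described above, per-item agreement frequencies and stage-grouped comments could be computed along these lines. This is an illustrative reconstruction, not the study's actual analysis code; the record layout (dicts with `stage`, `field`, `agree`/`text` keys) is a hypothetical stand-in for a Qualtrics export.

```python
from collections import defaultdict

def agreement_frequencies(responses):
    """responses: dicts like {"stage": 1, "field": "definition", "agree": True}.
    Returns {(stage, field): percent agreement} for each stage element."""
    counts = defaultdict(lambda: [0, 0])  # (stage, field) -> [n_agree, n_total]
    for r in responses:
        key = (r["stage"], r["field"])
        counts[key][0] += 1 if r["agree"] else 0
        counts[key][1] += 1
    return {k: 100 * a / t for k, (a, t) in counts.items()}

def group_comments_by_stage(comments):
    """comments: dicts like {"stage": 1, "field": "description", "text": "..."}.
    Groups comment text first by stage, then by description vs definition."""
    grouped = defaultdict(lambda: defaultdict(list))
    for c in comments:
        grouped[c["stage"]][c["field"]].append(c["text"])
    return grouped
```

For example, three responses on the stage 1 definition with two agreements would yield roughly 67% agreement for the key `(1, "definition")`, mirroring how per-stage percentages were reported after each round.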
Delphi panel members were also invited to make general comments pertaining to the staging model apart from the stage numbers, definitions, and descriptions. The general comments were organized, analyzed, and interpreted using conventional content analysis, a qualitative descriptive approach suited to identifying trends and how often problems were discussed.18 Using this analytical method, meaning is inductively derived from the data by allowing categories and names for categories to flow directly from the data. The expected outcome of this analysis is a descriptive summary of information contained in the data.19 Definitions for categories were agreed upon by the expert internal advisory panel through consensus (>88% agreement). Coding was initially completed by GLA and KP independently. Other members of the internal advisory panel were asked to review the comments and categories independently. The internal advisory panel members met frequently and compared data analysis results through an iterative process of reviewing emerging codes and categories. We used a variety of methods to achieve trustworthiness of findings. Peer debriefing was used to establish credibility as codes and categories were formed. Confirmability was established by maintaining an audit trail as data were coded and organized into emerging themes. The internal advisory panel used Trello, an online collaboration tool, to organize and analyze the qualitative data.
RESULTS
All 31 participants completed Delphi rounds 1-3 (100% response rate). The NH administrators participating in the study reported total IT sophistication scores in their facilities ranging from 391 to 711 (of 800). The majority of panel members were from for-profit (n = 16), medium-sized facilities (n = 22) in metropolitan areas (n = 18). Delphi panel members represented 24 U.S. states from varied parts of the country: Northeast (n = 7), South (n = 5), Upper Midwest (n = 4), Midwest (n = 10), and West/Southwest (n = 5).
At the completion of all rounds, Delphi panel members reached >70% consensus on all stages, descriptions, and definitions. A threshold of consensus was reached among the 31 Delphi panel members following round 2: stage 1 = 90%, stage 2 = 87%, stage 3 = 97%, stage 4 = 87%, stage 5 = 97%, and stage 6 = 77%. During round 3, stage 1 was re-evaluated and a recommendation was made to exclude facilities with nonexistent IT solutions or EMRs from stage 1 and to add a stage 0 allowing NH leadership to report no IT or EMR capability in their facilities; 100% agreement was reached on this change. Table 2 illustrates the initial phase I model developed after the literature review. Over the 3 iterations of the Delphi, the NH IT maturity staging model evolved from a preliminary, 5-stage model (stages 1-5) to a 7-stage model (stages 0-6) (see Table 2).
Table 2.
Progression of NH IT maturity model stages, descriptions, and definitions across 3 Delphi rounds.
| Stage | Phase I: Description | Phase I: Definition | After Round 1: Description | After Round 1: Definition | After Round 2: Description | After Round 2: Definition | Final (Post Round 3): Description | Final (Post Round 3): Definition |
|---|---|---|---|---|---|---|---|---|
0 | N/A | N/A | N/A | N/A | NA | NA | Nonexistent IT solutions or EMR | EMR not used. No overarching IT governance. |
1 | Disparate fragmented solutions | Different incongruous IT systems that have distinct functionality, isolated systems, use of standardized terminology (eg, clinical diagnosis, nursing interventions) | Disparate fragmented solutions | Different incongruous IT systems that have distinct functionality, with limited integration, isolated systems, may use some standardized terminology (eg, clinical diagnosis, nursing interventions). Systems do not share data without manual interface. EMR not used. | Nonexistent incomplete or disparate fragmented IT solutions | EMR not used. No overarching IT governance. Systems do not share data. Different incongruous IT systems that have distinct functionality, with no integration, isolated systems, may use some standardized terminology (eg, clinical diagnosis, nursing interventions). | Incomplete or disparate fragmented IT solutions | Different incongruous IT systems that have distinct functionality, with no integration, isolated systems, may use some standardized terminology in documentation (eg, clinical diagnosis, nursing interventions). |
2 | Established IT governance structures, procedures, and processes | IT leadership with specific duties and functions; implementing IT management and data stewardship processes (eg, data analytics); master data sources for identifying residents and data content areas are available | Established IT leadership that governs, coordinates structures, procedures, processes, and policies | IT leadership with specific duties and functions; implementing IT management and data stewardship processes (eg, data analytics), master data sources for identifying residents and data content areas are available. | Established IT leadership that governs and coordinates structures, procedures, processes, and policies | IT leadership with specific duties and functions; incorporates superusers (eg, staff knowledgeable about IT use) to assist in building, troubleshooting, implementing, and supporting frontline staff with IT tasks. Implementing IT governance and data stewardship processes (eg, ensuring data quality, capturing appropriate information for each data element). Some techniques are available to join data across disparate systems and are used for data analytics and reporting. | Established IT leadership that governs and coordinates structures, procedures, processes, and policies | IT leadership with specific duties and functions; incorporates superusers (eg, staff knowledgeable about IT use) to assist in building, troubleshooting, implementing, and supporting frontline staff with IT tasks. Implementing IT governance and data stewardship processes (eg, ensuring data quality, capturing appropriate information for each data element). Some techniques are available to join data across disparate systems and are used for data analytics and reporting. |
3 | Automated internal connectivity and reporting | More consistent and efficient data use and reporting internally | Automated internal connectivity and reporting | Utilizes common interfaces that permit secure sharing of data across multiple internal applications. Implementation of new applications requires adherence to standardized protocols for internal reporting. | Automated internal connectivity and reporting | Utilizes common interfaces that permit secure sharing of data across multiple internal applications. Uses master data sources and classifications to establish data relationships between systems. Implementation of new applications requires adherence to standards for connectivity and internal reporting. | Automated internal connectivity and reporting | Utilizes common interfaces that permit secure sharing of data across multiple internal applications. Uses master data sources and classifications to establish data relationships between systems. Implementation of new applications requires adherence to standards for connectivity and internal reporting. |
4 | Automated external connectivity and reporting | More consistent and efficient data use and reporting externally | Automated external connectivity and mandated reporting. Includes care management and nursing documentation for claims and billing purposes | Utilizes common interfaces that permit secure sharing of data across multiple external applications (eg, interface pharmacy, labs, x-ray). Interface with IT vendors. Incorporates HIE. Implementation of new applications requires adherence to standardized protocols for external reporting. | Automated external connectivity and reporting. | Utilizes standard interfaces that permit secure data sharing across external applications (eg, interface with pharmacy, labs, radiology) for treatment related purposes. Interface with third parties for revenue cycle or quality management. Incorporates HIE. Implementation of new systems requires adherence to standards for external connectivity. Includes nursing and ancillary services documentation used in care management, also for claims and billing purposes. | Automated external connectivity and reporting | Utilizes standard interfaces that permit secure data sharing across external applications (eg, interface pharmacy, labs, radiology) for treatment related purposes. Interface with third parties for revenue cycle or quality management. Incorporates HIE. Implementation of new systems requires adherence to standards for external connectivity. Includes nursing and ancillary services documentation used in care management, also for claims and billing purposes. |
5 | Clinical risk intervention and predictive analytics | Established analytic environment for understanding clinical outcomes | Clinical risk intervention and predictive analytics | Established analytical environment for understanding and deciphering clinical outcomes for given clinical approaches, interventions, and risks. Analytics guide timely intervention to improve outcomes. Enables association of external and internal data to predict outcomes and provide benchmarks. Interface should allow access to all data for an all-inclusive comprehensive clinical report. | Clinical risk intervention and predictive analytics | System-driven tools that influence the development of treatments and care plans while minimizing risk. Includes clinical decision support. Analytics guide timely intervention to improve clinical outcomes. Interfaces allow delivery of all-inclusive clinical reporting using virtually all relevant data from internal and external systems. Enables association of external and internal data to predict outcomes and provide benchmarks. | Clinical risk intervention and predictive analytics | System driven tools that influence the development of treatments and care plans, while minimizing risk. Includes clinical decision support. Analytics guide timely intervention to improve clinical outcomes. Interfaces allow delivery of all-inclusive clinical reporting using virtually all relevant data from internal and external systems. Enables association of external and internal data to predict outcomes and provide benchmarks. |
6 | N/A | N/A | Accessibility of data by customer and/or personal representative | Establish environment for customer and/or personal representative to have access to data. Increases transparency of their analytical data in a format that makes sense to these types of end users. | Accessibility of data by resident or resident representative to drive self-management | A secure and protected means for resident and/or resident representative to generate and access clinical data. Increases transparency of their clinical data in a format that is easily understood by end users. Resident data are accessible electronically. | Use of data by resident and/or resident representative to generate clinical data and drive self-management | A secure and protected means for resident and/or resident representative to generate and access clinical data. Increases transparency of their clinical data in a format that is easily understood by end users. Resident data are accessible electronically. |
EMR: electronic medical record; HIE: health information exchange; IT: information technology; NH: nursing home.
Delphi round 1
Delphi panel members began round 1 by reviewing the preliminary staging model and providing feedback regarding the number of stages, stage descriptions, and stage definitions. Changes to stage description or definition were made during this round (see online Supplementary Appendix). Changes to stages 1 and 2 included a change to the definition of stage 1 and to the description of stage 2. The stage 1 definition was changed to include limited integration and no automatic sharing of data based on a Delphi panel member’s recommendation. In addition, a new stage, stage 6, was added to the model.
The definitions for stages 3 and 4 were changed significantly during round 1. Four Delphi panel members (13%) commented that consistent and efficient data use did not adequately describe automated internal or automated external connectivity and reporting. In both cases (stages 3 and 4), panel members suggested providing examples in the definition. For example, in stage 3 (automated internal connectivity and reporting), one Delphi member recommended adding “utilizes common interfaces that permit sharing of data across multiple internal applications. Implementation of new applications requires adherence to standardized protocols.” Similarly, other Delphi members recommended edits to the stage 4 definition to create consistency with the stage 3 definition and to clarify “consistent and efficient data use.” Examples of external applications and IT consultants were also added to the stage 4 definition.
The new stage 6 was described as accessibility of data by customer and/or personal representatives. Definition: establish an environment for the resident and/or personal representative to have full access to, and ultimately transparency of, their analytical data in a format that makes sense to more casual end users.
Delphi round 2
Comments regarding the stage 1 description and definitions led to several changes to stage 1 in round 2. Three Delphi members (10%) made recommendations regarding the inability of systems to share data without doing so manually. For example, one Delphi member commented: “Systems may not contain all data, or do not share data without manual interface.” Two additional Delphi participants commented on the need to add incomplete to the definition of stage 1. There were no recommended changes to the stage 2 description in round 2; however, there were several recommended changes to the stage 2 definition. One Delphi participant recommended including superusers in stage 2. The participant commented, “In addition to IT leadership, utilize front line staff (Super-users) to assist in building, troubleshooting, implementing and supporting staff with IT tasks.” Recommendations were made to address the ability to join data across disparate systems, a capability thought to emerge in the second stage of IT maturity. Furthermore, one Delphi participant recommended moving master data sources to stage 3, “Because it is core to automated reporting.”
Changes were made to the definition of stage 4, automated external connectivity and reporting. Delphi panel members suggested including additional examples of external applications and including the ability to interface with third parties to engage in functions such as quality management and health information exchange. While only 1 member of the expert Delphi panel recommended changes to the stage 5 definition, this comment resulted in a great deal of wordsmithing by the internal expert advisory panel to achieve a clear and concise definition. Clinical decision support was added as an example of the type of clinical risk intervention and predictive analytics present in the fifth stage of IT maturity.
Round 2 was the first round in which Delphi participants had the opportunity to comment on the newly added sixth stage of IT maturity: accessibility of data by customer or personal representative. Consensus on definitions and descriptions was above 70% in this round. However, 3 Delphi participants recommended minor changes, for instance to the word “customer” in the stage 6 definition. For example, one participant commented:
In a nursing home setting, the customer is typically referred to as resident or (if receiving short-term in-patient services with the intent to return to their home in the community) patient. The new term for personal representative in nursing homes is resident representative. We consider many people our customers including family members of our residents who aren’t necessarily a resident representative.
This resulted in a change to both the definition and description of stage 6. In addition, Delphi panel members recommended including an example (ie, patient portal) and addressing secure access to data by the resident or resident representative.
Delphi round 3
The application of the staging model to a validated set of IT capabilities enabled the Delphi panel members to practice using the staging model as they staged specific IT capabilities. As a result, recommendations were made to create a new stage 0, to exclude facilities with nonexistent IT solutions or EMRs from stage 1, and to have stage 1 include only incomplete or disparate fragmented IT or EMR solutions. The internal advisory team agreed (100%) that this change enabled greater mutual exclusivity between the 2 stages (stages 0 and 1). Finally, some comments expressed concern about the requirements of a specific stage; for instance, one Delphi panel member made the following statement about stage 6 (use of data by resident or resident representative to generate clinical data and drive self-management): “I’m not sure if the 6th stage will work. It requires a lot of training, background education/knowledge, and could pose a risk to the integrity of the healthcare information.”
General comments
During each round, Delphi panel members had the opportunity to make general comments regarding the staging model. Seven categories emerged from the content analysis of the general comments: IT expertise, need for examples, comprehension of stage descriptions and definitions, comprehensiveness of the model, logical sequencing, ease of use, and potential benefits. The categories are thought to build on each other sequentially. For example, the first 3 categories are thought to be antecedents to implementing the model and include: IT expertise, need for examples, and comprehension of stage descriptions and definitions. The next 3 categories (comprehensiveness of the model, logical sequencing, and ease of use) contribute to the functionality of the model. The final category, potential benefits, summarizes consequences of implementing the model.
IT expertise
The first category, IT expertise, refers to comments (n = 8) pertaining to the NH administrators’ self-perceived lack of IT expertise. For example, one Delphi member wrote, “I will admit I am not an IT expert when it comes to terminology, but I am an expert at using the technology in my center.” Similarly, another Delphi member commented, “I don’t consider myself an IT expert so the simpler the definitions the better. I want to be sure I understand all the terminology.” Despite this self-perceived lack of IT expertise, some Delphi panel members recognized the importance of the “real-world” experience they brought to developing the NH IT maturity model.
Need for examples
This category refers to comments (n = 10), made by 10 Delphi participants (36%), about the perceived value of including examples in the staging model. For example, one participant suggested adding “advanced clinical decision support and structured messaging” to the stage 6 definition. Others made comments such as, “I would like to see more examples listed in the definition because it helps me apply the description more effectively” and “please continue to simplify and give examples that relate, practically speaking, to our work.”
Comprehension of stage descriptions and definitions
Incorporating easy-to-understand stage descriptions and definitions made up the next category of comments. This category includes comments (n = 12) pertaining to how easy, or in some cases, how difficult the definitions and descriptions used in the model were to understand. Nine Delphi participants (29%) found the stage descriptions and definitions easy to understand: “I liked the very easy to understand definitions and descriptions to describe processes and inter connectivity.” In addition, Delphi participants said the definitions “better explain the stages” and “allows the participant to have a thorough understanding of the process…” On the contrary, 3 participants (10%) found the stage descriptions “somewhat difficult to understand” and “a little complicated.” One participant wrote, “My only concern is the definitions seem overly complex. Not inaccurate, just a bit difficult to process.”
Comprehensiveness of the model
Four Delphi panel members (13%) stated the staging model was comprehensive in addressing the IT maturity spectrum in NHs. Both the language included in the model and the sequencing of stages were found to be comprehensive, “encompassing the whole realm of need in healthcare from an IT system.”
Logical sequencing
Because the IT maturity model is cumulative in nature, many comments (n = 19) were made with regard to the logical sequencing of stages. Participants found the stages to be “in order and properly defined.” One participant described the logical sequencing using specific examples from the model:
The process, from my review during this survey, appears to be not only appropriate in regards to the actual process of each stage but also the order of the stages. I completely agree that through stage 1, stage 2 becomes the most obvious next step as establishing governance is necessary for the completion of the following stages. I also agree that the consistency of data being reported internally will be necessary before addressing the external reporting of data.
Ease of use
This category was established based on comments (n = 7) related to how easy Delphi members found the IT maturity staging model to use. Words such as simple, straightforward, and easy were used to describe the process of creating the staging model and using it to describe the maturity of NH IT systems.
Potential benefits
Comments (n = 5) regarding potential benefits can be split into 2 subcategories: (1) benefits of using this (Delphi) process to develop the model and (2) potential benefits in future applications of the model. Delphi panel members commented as to the potential benefits of using the Delphi process to develop the IT maturity staging model. Panel members made comments like, “the overall process was excellent…” and “very straightforward.” In addition, Delphi members commented that seeing the suggestions of others in the panel gave them ideas and exposure to technologies not yet adopted in their individual facility.
Delphi participants described potential benefits of understanding how IT systems may be integrated to coordinate care. The NH administrators who made up this Delphi panel identified potential financial, regulatory, and quality improvement benefits from exploring system integration using the model. One participant shared an example of the potential benefit of integrative systems:
In thinking with all of the many programs (that don’t often interface) that I work in daily, it would be incredible to have a parent system that coordinated all of these specific functions as well as the costs that could be saved in staffing from having to manually integrate these systems into a healthcare campus.
Another Delphi participant went beyond the potential financial benefits and described how the model might be used to inform and improve quality of care: “I am wondering how this [model] may impact the overall spend. If we can improve data analytics and have a better sense of what is medically necessary at any given time during a patient's journey…” Comments were also made about the potential of the model to guide administrative decision making by creating a “roadmap to implementation of NH IT.”
DISCUSSION
In this study, an expert Delphi panel composed of NH IT administrators, a unique part of the NH workforce, designed a NH IT maturity model. Experts recommended 7 stages, which range from facilities with nonexistent NH IT solutions or EMR systems (stage 0) to NHs that provide access for residents or resident representatives to IT systems, enabling them to generate and access clinical data (stage 6). Each stage was iteratively modified during 3 Delphi rounds in which experts reached consensus (>70%) on the number of stages and on the stage descriptions and definitions in the IT maturity staging model. Similar models exist to measure EHR adoption, including the LeadingAge CAST 7-Stage EHR Adoption model, which is described as a “generalized model that provides a framework to assess the level of adoption and sophistication of use, as opposed to just the overall rate of adoption, of EHRs in long-term and post-acute care.”20 The CAST model has 7 stages, ranging from stage 1 (basic information systems) to stage 7 (interoperability and health information exchange). Some similarities exist between the NH IT maturity model in this study and the CAST model, including 2 distinct stages relating to IT use (1) internal to the organization and (2) external to the organization. One difference between the 2 models is that the CAST model is intended primarily to help long-term care organizations choose an EHR, whereas the NH IT maturity staging model in this study is designed primarily for trending and reporting of many types of IT resources, specifically in NHs. This is an important distinction because the IT maturity staging model developed in this study was designed to inform a diverse set of intended users, such as state NH surveyors, policymakers, NH administrators, and potentially others (eg, patients/family members), as they make informed decisions about IT capabilities, extent of use, and integration in NH settings.
An IT maturity staging model is needed to create an opportunity to better understand how IT is maturing in the nearly 16 000 NHs across the United States. Currently, IT maturity assessments in the NH sector are not conducted on a routine basis. Developing an IT maturity model with specified stages may provide a mechanism for interested stakeholders to explore critical information about technology trends in the NH sector. Uncovering these trends benefits healthcare stakeholders, such as hospital providers and staff interested in working with NH staff to improve IT connectivity and smooth communication and workflow during transitions of acute care patients to or from the NH. For example, a hospital administrator responsible for care transitions might use the staging model to identify NHs in their catchment area with EHRs capable of supporting health information exchange about resident care. By identifying NHs that can support health information exchange, hospital leaders are better positioned to extend their IT services to NHs within a designated catchment area, allowing for testing and implementation of health information exchange using EHR platforms and thus enabling the automatic exchange of continuity of care documents during transitions of care (see IT maturity stage 4 in Table 2). Currently, no routine assessment of NH IT maturity staging is completed or reported nationally, and there is no mechanism to report IT maturity to stakeholders in disparate systems (eg, hospitals and NHs); thus, there is no way to identify which organizations have advanced IT systems. Gaps in these critical linkages need to be filled if we are to achieve effective health information exchange systems across all healthcare organizations.
Another benefit of describing how NHs progress from nonexistent IT systems (stage 0) to mature IT systems (stage 6) is that variability in IT maturity stage across organizations may be used to demonstrate how specific IT capabilities, extent of IT use, or IT integration (dimensions of IT sophistication measured in IT maturity21) influence quality, safety, and efficiencies of care. Many completed studies assess the quality, safety, and efficiencies realized through use of IT systems; however, most of these studies are not in NHs. In the case of NHs, national samples have not been recruited and tested routinely, or evaluations have included only a specific type of technology, such as health information exchange,22 which has limited generalizability to overall IT system impact. Another opportunity for improvement relates to current national quality reporting models (eg, Centers for Medicare and Medicaid Services Nursing Home Compare), which contain no clear link between technology trends and quality outcomes that would illustrate how IT systems affect care delivery. Without this information, stakeholders participating in IT implementation are making decisions without clear evidence that IT systems make a difference in care, which can lead to costly choices about which IT systems to implement and wasted time if IT does not work as expected. With staging information in hand, and initial data suggesting relationships to quality and safety, stakeholders considering IT implementation can create targeted strategies that align specific IT capabilities, use, and integration with known benefits to maximize quality, safety, and efficiency.
The next step in this model development process is to conduct a rigorous psychometric evaluation of the NH IT maturity staging model, using an IT sophistication survey administered to NH IT experts who have sophisticated NH IT systems implemented in their facilities.
LIMITATIONS
Currently, there is no gold standard for reaching consensus using Delphi methods; some have viewed this as a limitation of using the method to reach decisions in model development activities. A range of consensus values has been used across Delphi studies. We elected to require 70% agreement among our Delphi panel members; when this was not reached, we required stricter agreement (88%) among expert internal advisory panel members. Another limitation is that the Delphi panel reviewers were U.S.-based; therefore, the model reflects a U.S. understanding of NH IT. Research is currently underway to determine how the model applies to countries outside the United States, specifically Australia.
CONCLUSION
A 7-stage NH IT maturity staging model was successfully developed using a Delphi method with input from a nationally representative sample of U.S.-based NH experts. The model incorporates 7 stages of IT maturity ranging from stage 0 (nonexistent IT solutions or EMR) to stage 6 (use of data by resident or resident representative to generate clinical data and drive self-management). The model was successfully applied to a validated set of NH IT capabilities by experts who provided valuable feedback, which was subsequently used to refine and define the stages. The final NH IT maturity model can be used to assess stages of NH IT maturity nationally and may ultimately improve NH quality and safety. Reporting IT maturity stages will provide important information to the many types of stakeholders who are interested in extending IT capabilities, use, and integration among nursing homes.
FUNDING
This project was supported by Agency for Healthcare Research and Quality grant number R01HS022497. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Healthcare Research and Quality.
AUTHOR CONTRIBUTIONS
GLA was the Principal Investigator and led this study team and was responsible for all aspects of research design, methods, analysis, and dissemination. KP substantially contributed to the research methods, acquisition of data, analysis, and interpretation, and also participated in all aspects of writing the manuscript. CBD contributed to the research design, methods, analysis, and interpretation of data, and also participated in all aspects of writing the manuscript. LPo, ASMM, and RK contributed to research design, methods, analysis, and interpretation of findings; these co-investigators participated in all aspects of writing the manuscript. LPe and MD were consultants on this study; each of them contributed to research design, methods, and interpretation of findings, and participated in all aspects of writing the manuscript.
Supplementary Material
ACKNOWLEDGMENTS
The authors would like to thank Keely Wise, Project Coordinator, who helped keep us all organized, moving forward through this project, and on target to accomplish our goals. Further, this study would not have been possible without the contributions of many nursing home leaders across the country who were consistent partners in our research.
Conflict of interest statement. None declared.
REFERENCES
- 1. DeSalvo K. Health Information Technology: Where We Stand and Where We Need to Go. HealthITBuzz; 2015. https://www.healthit.gov/buzz-blog/2015/04. Accessed March 4, 2019.
- 2. Guillemette MG, Pare G. Toward a new theory of the contribution of the IT function in organizations. MIS Quarterly 2012; 36 (2): 529–51.
- 3. Alvarado CS, Zook K, Henry J. Electronic Health Record Adoption and Interoperability among U.S. Skilled Nursing Facilities in 2016. ONC Data Brief No. 39. Washington, DC: Office of the National Coordinator for Health Information Technology; 2017.
- 4. Office of the National Coordinator for Health Information Technology. HIT and Health Information Exchange Basics. https://www.healthit.gov/topic/health-it-and-health-information-exchange-basics/health-it-and-health-information-exchange. Accessed March 4, 2019.
- 5. Thune J, Alexander L, Roberts P, Burr R, Enzi M. Where is HITECH’s $35 billion dollar investment going? Health Affairs. https://www.healthaffairs.org/do/10.1377/hblog20150304.045199/full/. Accessed March 4, 2019.
- 6. Walker D, Mora A, Demosthenidy MM, Menachemi N, Diana ML. Meaningful use of EHRs among hospitals ineligible for incentives lags behind that of other hospitals, 2009-13. Health Aff (Millwood) 2016; 35 (3): 495–501.
- 7. National Center for Health Statistics. Centers for Disease Control and Prevention FastStats Homepage. https://www.cdc.gov/nchs/fastats/nursing-home-care.htm. Accessed April 30, 2018.
- 8. Alexander GL, Madsen D. A national report of nursing home quality and information technology: two-year trends. J Nurs Care Qual 2018; 33 (3): 200–7.
- 9. Alexander GL, Madsen RW, Miller EL, et al. A national report of nursing home information technology: year 1 results. J Am Med Inform Assoc 2017; 24 (1): 67–73.
- 10. Nolan R. Managing the computer resource: a stage hypothesis. Commun ACM 1973; 16 (7): 399–405.
- 11. Rocha Á. Evolution of information systems and technologies maturity in healthcare. Int J Healthc Inf Syst Inform 2011; 6 (2): 28–36.
- 12. Nolan RL. Managing the crises in data processing. Harv Bus Rev 1979; 57: 115–26.
- 13. Carvalho J, Rocha Á, Abreu A. Maturity models of healthcare information systems and technologies: a literature review. J Med Syst 2016; 40 (6): 131.
- 14. McKenna H. The Delphi technique: a worthwhile approach for nursing? J Adv Nurs 1994; 19 (6): 1221–5.
- 15. Keeney S, Hasson F, McKenna H. A critical review of the Delphi technique as a research methodology for nursing. Int J Nurs Stud 2001; 38 (2): 195–200.
- 16. Hasson F, Keeney S. Enhancing rigour in the Delphi technique research. Technol Forecast Soc Change 2011; 78 (9): 1695–704.
- 17. Alexander G, Madsen R, Miller E, Wise K. A national report of nursing home information technology adoption and quality measures. J Nurs Care Qual 2016; 31 (3): 201–6.
- 18. Sandelowski M. What’s in a name? Qualitative description revisited. Res Nurs Health 2010; 33 (1): 77–84.
- 19. Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res 2005; 15 (9): 1277–88.
- 20. Electronic health records (EHRs) for long-term and post-acute care: a primer on planning and vendor selection. Section 3.3.1.3, CAST 7-stage EHR adoption model for long-term and post-acute care (LTPAC). http://www.leadingage.org/white-papers/electronic-health-records-ehrs-long-term-and-post-acute-care-primer-planning-and-vendor#1. Accessed October 24, 2018.
- 21. Alexander GL, Wakefield DS. IT sophistication in nursing homes. J Am Med Dir Assoc 2009; 10 (6): 398–407.
- 22. Kruse CS, Marquez G, Nelson D, Palomares O. The use of health information exchange to augment patient handoff in long-term care: a systematic review. Appl Clin Inform 2018; 9 (4): 752–71.