BMJ Health & Care Informatics. 2025 Sep 17;32(1):e101427. doi: 10.1136/bmjhci-2024-101427

Feasibility of semiautomated surveillance of healthcare-associated Staphylococcus aureus bloodstream infections using hospital electronic health records in Victoria, Australia

Lyn-li Lim 1,2, Stephanie K Tanamas 2, Ann Bull 2, Daniel Capurro 3,4, Kylie Snook 2, Vivian K Y Leung 1,5, N Deborah Friedman 1,2, Caroline Marshall 4,6, Roland Laguitan 2, Judy Brett 2, Leon J Worth 2
PMCID: PMC12458672  PMID: 40967670

Abstract

Objective

Many hospitals struggle to transform electronic health record (EHR) data to support performance, continuous improvement and patient safety. Our study aimed to explore the feasibility of semiautomated surveillance for healthcare-associated infections (HAIs) in Australian hospitals, focussing on Staphylococcus aureus bloodstream infection (SABSI) surveillance.

Method

National surveillance case definitions were reviewed with an inventory list of data elements created to identify high-probability healthcare-associated SABSI events. An interview schedule was developed to assess the availability, characteristics and quality of EHR data for data elements. Interviews were conducted with hospital infection prevention and control (IPC) staff.

Results

12 IPC staff representing 12 hospitals and 11 healthcare organisations were interviewed. EHRs were in place at nine (75%) sites, supplied by six different vendors. Heterogeneity was observed in EHR functionalities, data capture methods for routine care and local approaches to using electronic systems to reduce HAI surveillance workload. None reported using automated surveillance systems. Most core data elements for the SABSI algorithm were present in EHRs, suggesting only minor modification to the SABSI definitions may be needed for automation, but issues with data quality were also described.

Discussion

We propose that modification of the national SABSI definitions is needed for automation. While many Victorian hospitals have adopted EHRs, data quality and interoperability issues limit the leveraging of EHR data for secondary purposes.

Conclusions

We have taken the initial steps of evaluating the feasibility of semiautomated HAI surveillance in Victorian hospitals. With further development, this offers the promise of enhanced efficiency and reduced human resources required for HAI surveillance.

Keywords: Electronic Health Records, Delivery of Health Care, Hospitals


WHAT IS ALREADY KNOWN ON THIS TOPIC

  • Australian hospitals are increasingly adopting electronic health records (EHRs), and there is interest in their use to support automation of healthcare-associated infection surveillance.

WHAT THIS STUDY ADDS

  • Semiautomated surveillance for Staphylococcus aureus bloodstream infection (SABSI) using EHR data is feasible in Australian hospitals. Minor modifications to national SABSI surveillance definitions and improvements in hospital data quality would be required.

HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY

  • Developing a standardised, externally validated algorithm for semiautomated SABSI surveillance can reduce the implementation effort required by hospitals.

Background

In Australia, around 9.9% of hospitalised adult patients are affected by a healthcare-associated infection (HAI).1 Performing HAI surveillance is crucial to ensure patient safety and has been shown to reduce HAI rates.2,4 Surveillance that accurately detects events as close to real time as possible, using standardised case definitions, is important for effective infection prevention and control programmes.3

HAI surveillance in Australian healthcare organisations remains predominantly a manual process for hospital infection prevention and control (IPC) teams.5 6 With the growing adoption of electronic health records (EHRs) in hospitals, the key emerging challenge is the meaningful use of EHR data to support monitoring of organisational performance, continuous improvement and patient safety.7 Ultimately, EHR-based monitoring systems could potentially provide an automated solution to currently employed manual processes.

Healthcare-associated Staphylococcus aureus bloodstream infections (SABSIs) are associated with significant morbidity and are potentially preventable,8 and monitoring of SABSI using standardised national surveillance definitions is a mandatory requirement in all Australian hospitals.9 SABSI monitoring is a plausible early target for development and piloting of automated surveillance (AS) in Australia.

HAI AS is the process of obtaining useful information from infection control data through the systematic application of medical informatics and computer science technologies. Automation of HAI surveillance involves electronic extraction of routinely available clinical data and then applying algorithms to identify whether an HAI has occurred. In fully automated surveillance, algorithms search through multiple types of EHR data according to specified rules (eg, diagnostic codes, microbiology, medication data) to identify events. With semi-AS, the algorithm identifies patients with a high probability of HAI, which then prompts manual confirmation of the event. Patients identified as low probability are classified as ‘no HAI’ and do not require manual case review. It has been widely reported that using algorithms to analyse routinely collected EHR data can yield similar results with less effort than manual surveillance.10 11
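As a minimal illustrative sketch (not the approach implemented in this study), a semi-AS triage rule might route positive S. aureus blood cultures to manual review when collection occurs well after admission. The 48-hour threshold and all names below are assumptions for illustration; the national SABSI definition includes additional criteria not modelled here.

```python
from datetime import datetime, timedelta

# Illustrative threshold only: the national SABSI definition includes
# criteria (eg, device association, recent procedures) not modelled here.
HA_THRESHOLD = timedelta(hours=48)

def classify_event(admission: datetime, collection: datetime) -> str:
    """Triage a positive S. aureus blood culture for semi-AS review.

    Returns 'manual-review' for high-probability healthcare-associated
    events (IPC staff confirm manually) and 'no-HAI' for low-probability
    events, mirroring the semi-AS workflow described above.
    """
    if collection - admission > HA_THRESHOLD:
        return "manual-review"
    return "no-HAI"

# Culture collected 3 days after admission -> flagged for manual review.
print(classify_event(datetime(2024, 4, 1, 9, 0), datetime(2024, 4, 4, 10, 0)))
```

Only the high-probability stream requires chart review; the low-probability stream is closed automatically, which is where the workload saving of semi-AS arises.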

In Australia, public hospitals have increasingly implemented EHRs, with each jurisdiction adopting separate approaches to implementation.12 Few Australian hospitals have achieved the highest stage of maturity validation as measured by the Healthcare Information and Management Systems Society (HIMSS) Electronic Medical Record Adoption Model (ie, EMRAM Level 7).13 14

In developing a framework for electronic infection detection rules, the initial question to address is whether the necessary data sets and fields are fully available and accessible in hospital EHRs.15 This information is required to assess the feasibility of applying the developed rules. Our study aimed to explore the feasibility of semi-AS for HAI in Australian hospitals, focussing on SABSI surveillance aligned with national case definitions.16 We assessed the availability, characteristics and quality of EHR data related to the required data elements through interviews with IPC staff in Victorian healthcare facilities.

Methods

Developing a core dataset of EHR source data required for SABSI surveillance

To evaluate the availability and quality of relevant source data in EHRs, a working group of IPC experts (L-lL, LJW and JB) familiar with the SABSI implementation guide8 and HAI surveillance collaborated to interpret the guide and case definitions using hypothetical scenarios. Consensus on the source data required to identify a healthcare-associated SABSI event was achieved through discussion (online supplemental file 1). The list of data elements became the ‘core dataset’, an agreed set of variables required to identify SABSI events that met the case definition.
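For illustration only, an agreed core dataset of this kind can be represented as a typed record so that surveillance rules operate over named, validated fields. The class and field names below are hypothetical, drawn loosely from the elements discussed in the Results; the study's actual core dataset is specified in online supplemental file 1.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class SabsiCoreRecord:
    """Hypothetical core dataset record; the study's actual core dataset
    is defined in online supplemental file 1, not reproduced here."""
    admission_date: datetime
    discharge_date: Optional[datetime]
    blood_culture_collected: datetime
    blood_culture_site: Optional[str]     # eg, peripheral stab, central line
    device_insertion: Optional[datetime]  # eg, PIVC or central line inserted
    device_removal: Optional[datetime]

# A record with every element in a structured, typed field can be
# consumed directly by a surveillance algorithm.
record = SabsiCoreRecord(
    admission_date=datetime(2024, 4, 1, 9, 0),
    discharge_date=None,
    blood_culture_collected=datetime(2024, 4, 4, 10, 0),
    blood_culture_site="central line",
    device_insertion=datetime(2024, 4, 1, 11, 0),
    device_removal=None,
)
print(record.blood_culture_site)
```

Optional fields make explicit which elements may be missing in practice, a recurring data quality issue in the interview findings.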

Interviews

To understand the availability of source data for SABSI surveillance in hospital EHRs, we undertook one-on-one online interviews with Victorian IPC nurses involved with SABSI surveillance in April and May 2024. At recruitment, to optimise representativeness of hospital IPC programmes and EHR resources, participant sampling was stratified by:

  • Geographical location: metropolitan versus regional.

  • Size of health service: large (>2 hospitals) versus small (≤2 hospitals).

Consolidated criteria for Reporting Qualitative research guidelines17 were considered in study design, conduct and reporting (online supplemental file 2). Semistructured interviews were conducted via Microsoft Teams using an interview schedule (online supplemental file 3), with one interviewer (L-lL, LJW, KS or AB) and an observer (SKT). Interviewers were VICNISS IPC experts familiar with the task of HAI surveillance. IPC staff responsible for SABSI surveillance at >1 hospital were requested to focus their responses on a single site. Participants were asked to describe the electronic data systems used to support SABSI surveillance, pathology and radiology, medication management, clinical notes, observation and device monitoring charts. Where an EHR was available, participants were asked which locations (eg, ED, theatre, wards) it covered and their workflow in identifying and investigating a SABSI event. Questions were asked to assess the availability, characteristics (eg, found in one or multiple locations, structured or unstructured) and quality of information in EHRs related to the proposed core dataset. Follow-up questions were asked to clarify and provide detail using clinical anecdotes. The interviewer and observer undertook a debrief after each session to ensure completeness of field notes. The schedule was reviewed after each interview to ensure that it was performing as expected and changed iteratively as required.

Data analysis

Interviews were audio recorded and transcribed verbatim using Microsoft Teams speech to text transcription, then manually checked for accuracy and corrected against the recording. Data were collated and coded in Microsoft Excel. All interviews were open coded by two researchers (L-lL and SKT). Inductive and deductive approaches were applied to the analysis. Codebooks were compared, discussed to achieve consensus agreement and merged into one codebook. Exemplar codes from transcripts representing themes were identified by L-lL.

Participant responses were analysed to understand which data in EHRs could be used as algorithm elements. Assessments of data quality were based on frequently used data quality dimensions,18 where applicable, noting that not all individual dimensions are relevant for all data types.

The dimensions are:

Availability—the extent to which the data are available and the ease of user access.

Accuracy—the degree of agreement between data and the real-life object, reliability and validity are synonyms.

Completeness—the extent to which all expected data are present.

Consistency—the degree to which the data are free from contradiction with itself, follow an established rule or are provided in the same format.

Currency—the extent to which the data are reasonably up to date for the intended task.

Relevancy—the degree that the data meets the expectations and requirements of the user.

Timeliness—the extent to which the age of the data is suitable for intended use.

We characterised the format of how the data were recorded in the EHR by whether they were structured or unstructured.19 Structured data are in a clearly defined format, with content organised into small, searchable units, and are computable. Each element is assigned a unique data type (eg, admission date), enabling algorithms to easily and accurately collect, interpret and validate the information. Unstructured data have no predefined format or organisation and are commonly found as free text (eg, clinician notes). Analysis usually requires transformation of the data into a structured format.
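A minimal sketch of the distinction, assuming hypothetical field names and note wording: a structured device entry can be queried directly, whereas the same fact in a free-text note must first be extracted, for example with a regular expression.

```python
import re

# Structured entry (hypothetical field names): directly computable.
device_chart = {"device": "PIVC", "inserted": "2024-04-01T09:00"}
inserted = device_chart["inserted"]  # no transformation needed

# Unstructured entry: the same fact buried in free text must be parsed
# into structured form before an algorithm can use it.
note = "PIVC inserted L) forearm 01/04/2024, site clean and dry."
match = re.search(r"PIVC inserted.*?(\d{2}/\d{2}/\d{4})", note)
extracted = match.group(1) if match else None

print(inserted, extracted)
```

The fragility of the free-text path (wording variations, missing dates) is one reason unstructured documentation complicates automated surveillance.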

Results

Interviews were undertaken with 12 IPC staff representing 11.5% of Victorian healthcare organisations. Participants self-identified as being the primary person responsible for SABSI surveillance in their organisation or for specific sites within their organisation. Of 12 hospitals, 9 (75%) had implemented EHR systems, using 6 different EHR software systems (table 1). All public hospital sites (n=9) had EHRs in place while none of the private hospitals (n=3) had EHRs. Variation in scope of implementation of EHRs was reported. One site had not implemented electronic medication management. Three sites with electronic medication management systems still used paper-based charts in theatre. One site did not have electronic clinical observation tools, instead relying on paper charts, while others had electronic observations rolled out in higher-risk areas such as Intensive Care Units but not in general wards and theatre. Eight of nine sites had electronic clinical monitoring of peripheral intravenous cannula (PIVC) devices.

Table 1. General and electronic health record (EHR) system characteristics of healthcare organisations employing IPC staff who participated in the interviews (n=12).

Site | Geographical location | Ownership | Size of health organisation | EHR system in place | Electronic systems used to support SABSI surveillance | EHR system used for SABSI reporting (eg, dashboards)
Site 1 | Metropolitan | Public | >2 hospitals | Yes | EHR | No
Site 2 | Metropolitan | Public | >2 hospitals | Yes | EHR, infection monitoring software from commercial provider, pathology provider reporting (daily, monthly reports), in-house database (data from pathology portal uploaded to database), patient administrative records. Note: EHR organism reports available but not used. | No
Site 3 | Regional | Public | ≤2 hospitals | Yes | EHR, bespoke software purchased from software provider | No
Site 4 | Regional | Public | ≤2 hospitals | Yes | EHR, pathology provider reporting, patient administrative records | Yes—partial (organism reports available)
Site 5 | Metropolitan | Public | >2 hospitals | Yes | EHR, infection monitoring software purchased from commercial provider | Yes—dashboard reports with direct feed from EMR
Site 6 | Metropolitan | Public | >2 hospitals | Yes | EHR | No (in progress)
Site 7 | Metropolitan | Public | >2 hospitals | Yes | EHR, in-house database, REDCap | No
Site 8 | Regional | Public | ≤2 hospitals | Yes | EHR, pathology provider report (weekly) | No
Site 9 | Regional | Public | ≤2 hospitals | Yes | Pathology provider reporting (weekly), patient administrative records | No
Site 10 | Regional | Private | >2 hospitals | No | Scanned medical records, bespoke software purchased from software provider | No EHR
Site 11 | Metropolitan | Private | >2 hospitals | No | Pathology provider reporting (weekly) | No EHR
Site 12 | Regional | Private | >2 hospitals | No | Scanned medical records, bespoke software purchased from software provider | No EHR

EMR, electronic medical record; IPC, infection prevention and control; SABSI, Staphylococcus aureus bloodstream infection.

Sources of information for SABSI surveillance

A minority of hospitals with EHRs described relying solely on the EHR for SABSI surveillance. Most also used bespoke infection monitoring commercial software (n=3), external pathology provider microbiology reports (n=4) and in-house databases (n=2). Hospitals without EHRs described using patient administrative records, external pathology provider reports and bespoke infection monitoring commercial software.

Organisational SABSI reporting was not commonly described as supported by EHR functionalities. One participant described having access to an infection pathology view in their EHR, available to support IPC surveillance, but this was not used for reporting. Another described that their organisation had recently purchased an infection control add-on from their EHR software vendor to support local infection surveillance and reporting but was uncertain if or when this would be implemented.

Existing SABSI surveillance processes

Participants described processes as commencing with a laboratory report of patients with positive S. aureus blood cultures during a specified period. Commonly, this report was made available to the IPC staff on the pathology provider’s portal. The report was then uploaded onto purchased software or in-house databases and presented alongside patient demographics (eg, admission date, location in hospital). IPC staff then used the EHR as a data source to access patient demographics, admission and discharge dates, clinical information, pathology and radiology reports and medication charts. IPC staff would then investigate whether this blood culture result was related to a new or pre-existing event, whether the infection was healthcare-associated and whether it was preventable (eg, attributable to a device).

The majority described receiving bespoke reports generated from pathology portal systems separate from the EHR. Only one participant described using EHR-generated blood culture reports in their workflows; at another site, this was available but not preferred by the IPC staff. No participant described the use of algorithms applied to electronic data to identify probable HAI events. None described performing, documenting and reporting the case investigations within the EHR.

Quality of information for SABSI surveillance from EHR

The perceived quality of EHR source data in relation to core dataset elements is presented in table 2. The majority of data elements were available in EHRs, although issues with data quality and unstructured data were described.

Table 2. Core dataset of EHR source data required for Staphylococcus aureus bloodstream infection surveillance and IPC staff (n=9) perception of data quality on their organisational EHR systems.

Data element/s Quality of data element/s and potential data quality issues
Found in all EHRs
Admission/discharge dates Quality: Structured, rules for restriction to data entered applied for conformity. Data accurate, consistent, timely. Issues: Organisations apply varying definitions of discharge, for example, statistical discharge for admissions >30 days with change in unit
Admission diagnosis Quality: Data found in multiple EHR locations including emergency department and medical admission clinical notes (as a field in a customised template). Data are usually accurate, usually consistent, not objective and timely. In some hospitals, also captured as structured data with allocated data field for episode of care, the data field is free text. Data described as accurate, not objective, timely. Issues: Data are found in multiple locations, often unstructured. ICD-10 AM codes are not visible.
Discharge status (alive, deceased) Quality: Data accurate, consistent and usually timely.
Discharge location Quality: Data are structured and retrievable from customised templates (medical discharge summaries), a structured data field (episode of care data) or unstructured free text (clinical notes). Users were uncertain of accuracy; the field may be incomplete in the EHR as it is completed in the patient administrative system. Issues: Data may not be easily extractable or may be missing.
Location prior to admission Quality: Most commonly available as unstructured data (admission notes) and may be missing. Where structured, it is part of demographic data collected by ward clerk or included question in clinician admission template. Issues: Data may not be easily extractable or missing.
Pathology results—internal Quality: Pathology type (specimen type/date/time of specimen collection) data were accurate, consistent and usually timely.
Blood culture collection date, time and site—internal Quality: Data on sample collection date and time are structured, accurate, consistent and timely. Data on where the sample was collected from (eg, from a line) have an allocated data field in the pathology report. These data are presented in a structured way with free-text additions allowed. Where available, they are described as accurate and usually consistent. Sometimes, these data were unavailable (ie, if they had not been documented by the pathology collector). Issues: Data may lag in being available on EHR compared with pathology portals (for organisations where different vendors are used).
Radiology results—internal Quality: Procedure type (name of procedure/date/time) for internal procedures is accurate, consistent and usually timely.
Device insertion and removal date and time (eg, peripheral intravenous cannula (PIVC), central lines, drain tubes). Quality: Data captured on electronic care charts as structured data with restrictions in the data field and minimal free text allowed. Also captured as unstructured data in clinical progress notes. Issues: Data can be missing, inaccurate and data in electronic charts may not be consistent with clinical notes. This occurs where the device was inserted or removed but the clinician did not document it in the electronic chart. Missing, inaccurate data are more common with device removal than insertion.
Device monitoring Quality: Data on devices are documented in electronic device charts (structured data), on paper forms scanned into EHR postdischarge and in clinical notes. There may be multiple sources for this data (eg, clinical notes vs electronic or paper chart). The electronic chart can be inaccurate, inconsistent with clinical notes and incomplete. Issues: Some hospitals have hybrid systems of documentation. Some hospitals have electronic charts for higher-risk clinical areas (eg, Intensive Care Unit) and paper charts on general wards and theatre.
Invasive procedure outside theatre Quality: Where surgical procedure module is available on EMR, sites used surgical module to also capture data from cardiac catheterisation and endoscopy procedures. Where the procedure is performed in radiology, this is captured under radiology results. Bedside procedures performed in ICU are captured in ICU procedure documentation. Data for these locations were collected using a customised template. Bedside procedures on the ward are captured in clinical notes. Data described as accurate, consistent and timely. Issues: Data found in multiple EHR locations.
Found in most EHRs
Clinical observations Quality: In EHR at 8 of 9 sites. Data on observations (eg, temperature, heart rate, blood pressure, O2 saturation, respiratory rate) are structured with restrictions on data entry and minimal free text for conformity. Data described as accurate, consistent and timely. Issues: Some sites still employ paper charts scanned into EHR records postdischarge.
PIVC site monitoring and evidence of infection Quality: In EHR at 8 of 9 sites. Structured data are entered into a tool with restriction in data field and minimal free text for conformity. At one site, data were entered into a paper-based tool and scanned into EHR.
Surgical procedure type and date Quality: In EHR at 6 of 9 sites. Most commonly, procedure type is found in the surgeon’s operative report (free-text entry in customised template), or specific data field for theatre admission (structured). The procedure date is found as structured data in the theatre admission tab. Data on surgical procedure date and time were accurate, consistent and timely. Of the sites that do not use EMR for theatre episodes of care, one used paper-based documentation and two used the patient administration system.
Inpatient medications Quality: In EHR at 8 of 9 sites. Prescribing and administration data are structured with restriction in data field and minimal free text for conformity.
Inconsistently found in EHR
Admission location Quality: Data structured or unstructured or missing. Data described as usually accurate, not consistent, may not be timely. Issues: Admission or patient encounter location is not as accurate as date/time on EHR and patient administrative databases are considered more reliable. There may be issues with discharge dates of interhospital transfers.
Pathology results—external Quality: Pathology type (specimen type/date/time of specimen collection) data were often unavailable or available only as scanned documents accessible after patient discharge.
Radiology results—external Quality: Procedure type (name of procedure/date/time) for external procedures was usually unavailable.
Medications prior to admission Quality: Data can be found in admission notes in an unstructured format or in a pharmacist medication reconciliation template completed on admission. The data were described as accurate when entered by the pharmacist, otherwise inaccurate, inconsistent or missing.

EHR, electronic health record; EMR, electronic medical record; ICD-10 AM, International Statistical Classification of Diseases and Related Health Problems, 10th edition, Australian Modification; ICU, Intensive Care Unit; IPC, infection prevention and control.

Availability

Data most commonly described as unavailable were pathology and radiology reports from external providers. Data elements most commonly described as found in multiple EHR locations included admission diagnosis and medications prior to admission.

Accuracy and reliability

Data most commonly described as inaccurate were patient encounter data, which present information on the patient’s visit to the hospital, including the encounter type (eg, admission, appointment), status of encounter (eg, discharged, completed), encounter date and time and location (eg, in hospital, hospital in the home, discharged). The field that identifies the patient’s current location and discharge status was described as unreliable, while other electronic sources, primarily administrative databases, were considered more reliable. Participants noted that this can impact discharge dates on systems, with interhospital transfers sometimes not being updated on the system on the same day of transfer. Several participants mentioned that they had experienced a lag in microbiology data being available on their EHRs compared with other electronic sources, such as directly from the laboratory portal, as well as more limited access to the range of organism susceptibilities.

Data on device insertion and removal were described as unreliable: they could be missing, inaccurate and inconsistent between electronic charts and clinical notes. Users stated that missing and inaccurate data are more common for device removal than insertion. Several participants mentioned that local quality improvement had been undertaken to improve clinician documentation of PIVC insertion, removal and monitoring in the EHR.

While blood cultures flagging positive for S. aureus were reliably found in EHRs, documentation of the site of sample collection (eg, from a peripheral stab, peripheral intravenous line or central line lumen) was sometimes missing. This was likely dependent on whether the information was entered on the pathology collection slip by the person collecting the sample.

Data structure

Data elements described as missing or not easily extractable because of unstructured data entry included discharge location and location prior to admission.

IPC staff experiences with EHR systems for surveillance

Three main themes emerged from analysis of the qualitative data: (1) transitioning to EHR, (2) EHR source data quality and (3) IPC staff performing infection surveillance selectively prefer to use other sources of electronic data rather than EHR. A description of the findings for each theme, identified subthemes and illustrative quotes is presented in table 3.

Table 3. IPC staff experiences with electronic health records (EHRs) systems for surveillance, themes and subthemes and illustrative quotes.

Theme and subthemes Quote
Theme 1: Transitioning to EHR/EMR*
Incorporating EMR into clinical work “We find the nurses are, nine times out of 10, documenting their VIPs (Visual Infusion Phlebitis score) in progress notes (rather than on the electronic tool). They rarely use the (electronic tool tab) because it’s not intuitive. (The organisation) are (developing) a nurses toolbox where basic things of care should be completed that aren't being completed, they go to that toolbox and make sure they've ticked off those things for the day.” (IPC participant A)
Duplication of workflows “I think that we see the value in actually just having the one system. (At the moment) you’ve got a case (in isolation/ transmission-based precautions) in the EMR and then you've got it duplicated in (IPC team bespoke infection surveillance system).” (IPC participant B)
Using EHR reports for IPC patient safety “(The IPC team) are able to run a daily…report that tells us if we’ve got cannulas that are overdue or due for removal. Often times we will be out on our (IPC) ward round, or we’ll call the ward and say the cannula for bed whoever is due for removal, and (the ward) will say, well, that came out yesterday. And you’re like, OK, well, it’s not documented.” (IPC participant C)
“I’m really lucky that one of my team is IT focused and is working with (EMR team) to develop better tools for us.” (IPC participant D)
Theme 2: EHR/EMR source data quality
Missing clinical information “Some of (IPC team) will actually go and speak with the Infections Diseases doctors or the treating team… we primarily use all of that as part of our source of truth for investigations.” (IPC participant D)
Electronic compared with paper documentation “The thing that I found really interesting with our recent audits is that about a third of all devices…were not documented at all in the (EHR) interactive view, and this is the one place where they're supposed to. It’s about the same number based on our experience with paper records.” (IPC participant E)
Information on patient movements “I’ve actually just applied for access back for our (administrative database) which gives us a lot more accurate data on exactly where someone’s been in the time frames they've been in, whereas in this new EMR’s difficult.” (IPC participant A)
Documentation regarding blood culture sample “Unless it’s been taken from a device, they might actually commit to CVC or something like that. But quite often it’s not and not labeled very clearly. I don't believe it’s a mandatory field (whether sample collected from peripheral stab or line) when the nurses are having to actually document it in the in the blood culture ordering section.” (IPC participant E)
Information on invasive device insertion documented in multiple locations “(Lines inserted in theatre are documented) in the (EMR) anaesthetic theatre module… but it generally it doesn’t talk with adult lines and devices for lines inserted outside of theatre.” (IPC participant C)
“(PIVC lines) is currently in two places… So you can document it in assessments and cares, but you can also use clinical pathway, but that has IV cannula and stockings, your TED stockings also.” (IPC participant F)
Theme 3: IPC staff performing infection surveillance selectively prefer to use other sources of electronic data rather than EHR/EMR
Reports to support surveillance not set up in EMR “Our pathology system has reports that are designed by pathology IT (Information Technology) for infection prevention. So they’re custom reports that pull results from the pathology system into a platform that we access.” (IPC participant G)
Familiarity with other systems “Blood culture reports on EMR are not as clear as a result on the (pathology portal) where I can look at it and know exactly what I’m looking at, whereas it takes me just that little bit longer in EMR.” (IPC participant H)
Searching for information in EHR is not intuitive “The problem is that the (EHR interactive interface) needs to address different clinical concerns so what I need and what a ward nurse needs are quite different and everything gets lumped together. It’s a very clunky, awful interface that is not user friendly or adaptable… It’s not intuitive at all. It’s very difficult doing the (SABSI) investigation. All lines are meant to be documented in adult lines and devices… it presets to the last 72 hours. So I go in there and amend these fields for the date range I'm looking for… then scroll back quite far looking for the information, every assessment, every entry… The other issue is that quite often our staff are not documenting in (EHR) interactive view, so they might be doing a free text note… You also have search all the clinical notes… I would set it up to the right time frame, and then I might type in words like PIVC, cannula, catheter.” (IPC participant E)
*EMR is the common term used for EHR systems in Australian healthcare organisations.

EMR, electronic medical record; IPC, infection prevention and control.

Discussion

Our study provides a current snapshot of the extent to which hospitals in Victoria, Australia are engaged in the use of EHRs and the extent to which EHR data are used to support HAI surveillance. Vast heterogeneity was described in EHR functionalities, how routine care data were captured, and local approaches to reduce the workload of HAI surveillance. None of the sites described using AS of HAI. While hospital IPC staff involved in local SABSI surveillance identified the EHR as an important source of data for clinical care and diagnostic results, the use of EHR reports to identify events was not described. Participants described hybrid solutions to access clinical information during staged roll-outs of EHRs within healthcare organisations. These roll-outs resulted in multiple clinical documentation processes and definitive sources of information, data quality issues related to EHR system design, disrupted clinical workflows and a lack of intuitiveness when trying to find data in the EHR. Potential solutions to address poor data quality related to missing, incomplete or inaccurate documentation are to improve user interfaces for clinicians and reduce excessive data entry requirements.14

Few Australian hospitals have achieved the highest stage of digital maturity as measured by the HIMSS EMRAM.13 14 In Victoria, healthcare organisations have the flexibility to choose their EHR vendor.20 As seen in our study, EHR systems, level of adoption and maturity of systems vary between hospitals. Implementation of EHR is performed within healthcare organisations, and while organisations can implement the same EHR functionalities, each may use this resource differently, which creates value that can be described as ‘non-imitable’ and ‘non-mobile’.21 The heterogeneity between local sites in use, reporting formats and structures, and non-standardisation of data elements reported in fixed fields complicates data extraction and limits interoperability between EHR systems.3 22 This limits the ability to share surveillance methods across networks of hospitals.11

None of our study participants described algorithms applied to their EHR data to reduce the manual workload of HAI surveillance. The main disadvantage of manual chart review and case ascertainment is that a large number of SABSIs must be reviewed to identify a relatively small number of HAIs, making the process inefficient and resource and time intensive.22 23 AS offers the promise of more accurate measurement and standardised processes. Gains in efficiency have the potential to support IPC staff and alleviate current workload, expand the breadth of surveillance, and focus efforts on quality improvement and IPC initiatives.3 As Australian healthcare organisations consider implementing AS for HAIs, one concern is that locally translating surveillance definitions into electronic rules may produce algorithms that vary between hospitals and, consequently, unexpected discrepancies in the data reported by different hospitals. No currently available solution escapes the need for local integration and validation.

We propose that offering Australian hospitals an externally validated and standardised surveillance approach, with consensus case definitions and standardised EHR data elements, could partially reduce the work required to implement AS of HAI locally. This approach has been successful overseas.24 It would allow hospitals to focus on integrating the data elements into their local systems and on piloting the performance of these rules with test data sets.

Our evaluation confirmed that the key core data elements of our proposed SABSI model algorithm are available in EHRs, suggesting that only modest modification of the SABSI definitions may be needed to make AS feasible. Further steps would be to incorporate these data elements into an algorithm and perform validation. Since algorithm-based HAI surveillance was first described in 2004,25 most studies have applied a semiautomated AS approach to the identification of surgical site infections, central-line-associated bloodstream infections and other HAIs, generally reporting high sensitivity (>0.8) but variable specificity (0.4–1.0).26 There are, however, increasingly promising reports on the feasibility and performance of AS of bloodstream infections, such as the hospital-onset bacteraemia detection algorithm developed by the PRAISE (Providing a Roadmap for Automated Infection Surveillance in Europe) network.22 24 27 Semiautomated AS, which allows case review before finalisation, is reported to achieve higher acceptance among clinicians unfamiliar with HAI AS than fully automated surveillance.11 Given the heterogeneity of EHRs used in Victorian hospitals, we consider a semiautomated approach more likely to be feasible for local EHR teams to implement, as it leaves room for adaptation.
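To illustrate what translating a surveillance definition into an electronic rule might involve, the following sketch flags blood cultures for manual IPC review. This is a hypothetical, minimal example only; the field names, the single-organism check and the 48-hour hospital-onset threshold are assumptions reflecting common conventions, not the national SABSI definitions or the authors' model algorithm.

```python
from datetime import datetime, timedelta

# Assumed threshold: cultures collected >= 48 h after admission are treated
# as hospital-onset. This mirrors a common hospital-onset convention and is
# NOT the national SABSI definition verbatim.
ONSET_THRESHOLD = timedelta(hours=48)


def flag_for_review(admission_time: datetime,
                    culture_time: datetime,
                    organism: str) -> bool:
    """Return True if a positive blood culture should be queued for manual
    IPC review as a possible healthcare-associated SABSI event."""
    if organism.strip().lower() != "staphylococcus aureus":
        return False
    return culture_time - admission_time >= ONSET_THRESHOLD


# Example: S. aureus cultured on day 3 of admission is flagged for review.
admitted = datetime(2024, 3, 1, 9, 0)
cultured = datetime(2024, 3, 4, 10, 30)
print(flag_for_review(admitted, cultured, "Staphylococcus aureus"))  # True
```

In a semiautomated workflow of the kind discussed above, events flagged by such a rule would be passed to IPC staff for chart review before a case is finalised, rather than being reported automatically.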

Our findings highlighted that some required data may be missing or not easily extractable from Victorian hospital EHRs; these are recognised barriers to AS implementation in healthcare organisations.28 We also acknowledge that our research focused on using the EHR to support case identification; however, the utility of the EHR extends to providing data on risk factors. To ensure the availability of such high-quality data, hospitals would need to review how data are currently captured in their EHRs and invest in engaging clinicians to improve clinical documentation, despite the challenges of working in a time-poor clinical environment.14

Encouraging Australian healthcare organisations to transition to AS of HAI requires commitment to a national surveillance approach, which does not currently exist in Australia.29 In addition, a roadmap to large-scale implementation of AS at a national level is needed. Such a roadmap can guide future steps towards implementation, including designing solutions for AS and practical guidance checklists.22 Surveillance definitions will need to be revised to align with available structured EHR data elements.

Limitations

Our perspectives on the availability and quality of EHR data for SABSI surveillance were informed by Victorian IPC staff and hence may not be generalisable to all Australian settings. We acknowledge that, as clinician end-users, these stakeholders may have limited expertise regarding how EHR data are stored and extracted for use in an algorithm. Looking ahead, these insights may be gained through engagement of EHR vendors or platform experts.

Conclusions

The increasing adoption of EHRs in Australian hospitals is promising, but the challenge remains in using meaningful EHR data to support continuous improvement processes for patient safety, quality of care, efficiency and reduction of health disparities. Our feasibility study identified that, despite wide adoption of EHRs by Victorian hospitals, there is variation in EHR digital maturity and interoperability, and a lag in investment in using EHR data for secondary purposes. Current solutions to reduce IPC staff workload in HAI surveillance, including commercial products, are heterogeneous and site-specific, and require local validation, integration and maintenance. Given these findings, we propose that the next steps to understand the readiness of Australian hospitals to use AS for HAI are to test and validate this approach in a hospital setting, followed by targeted pilot implementation in a healthcare setting with advanced digital maturity. We plan to explore our findings further with healthcare staff familiar with the availability and extraction of EHR data for reporting, to gain additional insights into the feasibility of using EHR data to support HAI surveillance and into the development of the model SABSI algorithm.

Supplementary material

online supplemental file 1
bmjhci-32-1-s001.docx (24KB, docx)
DOI: 10.1136/bmjhci-2024-101427
online supplemental file 2
bmjhci-32-1-s002.docx (22.8KB, docx)
DOI: 10.1136/bmjhci-2024-101427
online supplemental file 3
bmjhci-32-1-s003.docx (395.5KB, docx)
DOI: 10.1136/bmjhci-2024-101427

Acknowledgements

We wish to thank the interview participants from Victorian health service Infection Prevention and Control teams.

Footnotes

Funding: The University of Melbourne and Royal Melbourne Hospital provided funding through the Innovation Acceleration Program (no award number).

Provenance and peer review: Not commissioned; externally peer reviewed.

Patient consent for publication: Not applicable.

Ethics approval: This study involves human participants and was approved by the Royal Melbourne Hospital Human Research Ethics Committee (100762-MH2023). Participants gave informed consent to participate in the study before taking part.

Data availability statement

All data relevant to the study are included in the article or uploaded as supplementary information.

References

  • 1. Russo PL, Stewardson AJ, Cheng AC, et al. The prevalence of healthcare associated infections among adult inpatients at nineteen large Australian acute-care public hospitals: a point prevalence survey. Antimicrob Resist Infect Control. 2019;8:114. doi:10.1186/s13756-019-0570-y
  • 2. Abbas M, de Kraker MEA, Aghayev E, et al. Impact of participation in a surgical site infection surveillance network: results from a large international cohort study. J Hosp Infect. 2019;102:267–76. doi:10.1016/j.jhin.2018.12.003
  • 3. Shenoy ES, Branch-Elliman W. Automating surveillance for healthcare-associated infections: rationale and current realities (Part I/III). Antimicrob Steward Healthc Epidemiol. 2023;3:e25. doi:10.1017/ash.2022.312
  • 4. Worth LJ, Bull AL, Spelman T, et al. Diminishing surgical site infections in Australia: time trends in infection rates, pathogens and antimicrobial resistance using a comprehensive Victorian surveillance program, 2002–2013. Infect Control Hosp Epidemiol. 2015;36:409–16. doi:10.1017/ice.2014.70
  • 5. Schults J, Henderson B, Hall L, et al. Designing for transparency and trust: next steps for healthcare associated infection surveillance in Queensland. Infect Dis Health. 2024;29:243–5. doi:10.1016/j.idh.2024.05.002
  • 6. Sommerville M, Lim L, Bradford J, et al. An evaluation of current data handling systems for the surveillance of healthcare-associated infection in Victoria [abstract]. 9th International Australasian College of Infection Prevention and Control Conference, 8–10 November 2021. Adelaide, South Australia: Infect Dis Health; 2023.
  • 7. Adler-Milstein J, DesRoches CM, Kralovec P, et al. Electronic health record adoption in US hospitals: progress continues, but challenges persist. Health Aff (Millwood). 2015;34:2174–80. doi:10.1377/hlthaff.2015.0992
  • 8. Staphylococcus aureus bloodstream infection (SABSI) prevention resources. Available: https://www.safetyandquality.gov.au/our-work/infection-prevention-and-control/staphylococcus-aureus-bloodstream-infection-sabsi-prevention-resources Accessed 3 September 2024.
  • 9. Australian Institute of Health and Welfare. National Staphylococcus aureus bacteraemia data collection. Available: https://www.aihw.gov.au/about-our-data/our-data-collections/national-staphylococcus-aureus-bacteraemia Accessed 18 July 2023.
  • 10. Klompas M, Yokoe DS. Automated surveillance of health care-associated infections. Clin Infect Dis. 2009;48:1268–75. doi:10.1086/597591
  • 11. van Mourik MSM, Perencevich EN, Gastmeier P, et al. Designing surveillance of healthcare-associated infections in the era of automation and reporting mandates. Clin Infect Dis. 2018;66:970–6. doi:10.1093/cid/cix835
  • 12. Digital Health Institute Summit: 2020 state of the EMR nation. 2020. Available: https://www.pulseitmagazine.com.au/australian-ehealth/5820-digital-health-institutesummit-2020-state-of-the-emr-nation
  • 13. Electronic medical record adoption model. Available: https://www.himss.org/what-we-do-solutions/maturity-models-emram Accessed 3 September 2024.
  • 14. Lloyd S, Long K, Probst Y, et al. Medical and nursing clinician perspectives on the usability of the hospital electronic medical record: a qualitative analysis. Health Inf Manag. 2024;53:189–97. doi:10.1177/18333583231154624
  • 15. Hota B, Lin M, Doherty JA, et al. Formulation of a model for automating infection surveillance: algorithmic detection of central-line associated bloodstream infection. J Am Med Inform Assoc. 2010;17:42–8. doi:10.1197/jamia.M3196
  • 16. HAI surveillance. Available: https://www.safetyandquality.gov.au/our-work/infection-prevention-and-control/hai-surveillance
  • 17. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19:349–57. doi:10.1093/intqhc/mzm042
  • 18. Hassenstein MJ, Vanella P. Data quality—concepts and problems. Encyclopedia. 2022;2:498–510. doi:10.3390/encyclopedia2010032
  • 19. Behnke M, Valik JK, Gubbels S, et al. Information technology aspects of large-scale implementation of automated surveillance of healthcare-associated infections. Clin Microbiol Infect. 2021;27 Suppl 1:S29–39. doi:10.1016/j.cmi.2021.02.027
  • 20. Digital health. Available: https://www.health.vic.gov.au/quality-safety-service/digital-health Accessed 3 September 2024.
  • 21. Upadhyay S, Opoku-Agyeman W. Implementation levels of electronic health records and their influence on quality and safety. Online J Nurs Inform. 2023;26.
  • 22. van Mourik MSM, van Rooden SM, Abbas M, et al. PRAISE: providing a roadmap for automated infection surveillance in Europe. Clin Microbiol Infect. 2021;27 Suppl 1:S3–19. doi:10.1016/j.cmi.2021.02.028
  • 23. Mitchell BG, Hall L, Halton K, et al. Time spent by infection control professionals undertaking healthcare associated infection surveillance: a multi-centred cross sectional study. Infect Dis Health. 2016;21:36–40. doi:10.1016/j.idh.2016.03.003
  • 24. Brekelmans M, Hopmans T, van Mourik M, et al. Evaluation of a multifaceted implementation strategy for semi-automated surveillance of surgical site infections after total hip or knee arthroplasty: a multicentre pilot study in the Netherlands. Antimicrob Resist Infect Control. 2024;13:63. doi:10.1186/s13756-024-01418-0
  • 25. Trick WE, Zagorski BM, Tokars JI, et al. Computer algorithms to detect bloodstream infections. Emerg Infect Dis. 2004;10:1612–20. doi:10.3201/eid1009.030978
  • 26. Streefkerk HRA, Verkooijen RP, Bramer WM, et al. Electronically assisted surveillance systems of healthcare-associated infections: a systematic review. Euro Surveill. 2020;25:1900321. doi:10.2807/1560-7917.ES.2020.25.2.1900321
  • 27. Aghdassi SJS, Rüther FD, Geffers C, et al. Advancing hospital infection surveillance: automated detection of hospital-onset bacteremia in a large university hospital. Antimicrob Steward Healthc Epidemiol. 2024;4:s146. doi:10.1017/ash.2024.321
  • 28. Verberk JDM, Aghdassi SJS, Abbas M, et al. Automated surveillance systems for healthcare-associated infections: results from a European survey and experiences from real-life utilization. J Hosp Infect. 2022;122:35–43. doi:10.1016/j.jhin.2021.12.021
  • 29. Russo PL, Cheng AC, Richards M, et al. Healthcare-associated infections in Australia: time for national surveillance. Aust Health Rev. 2015;39:37–43. doi:10.1071/AH14037



Articles from BMJ Health & Care Informatics are provided here courtesy of BMJ Publishing Group
