Author manuscript; available in PMC: 2016 Jun 1.
Published in final edited form as: Am J Infect Control. 2015 Jun 1;43(6):600–605. doi: 10.1016/j.ajic.2015.02.006

Data Elements and Validation Methods Used for Electronic Surveillance of Healthcare-Associated Infections: A Systematic Review

Kenrick D Cato 1, Bevin Cohen 1,2, Elaine Larson 1,2
PMCID: PMC4456686  NIHMSID: NIHMS686003  PMID: 26042848

Abstract

Objective

This study describes the primary data sources, data elements, and validation methods currently used in electronic surveillance systems (ESS) for identification and surveillance of healthcare-associated infections (HAIs), and compares these data elements and validation methods with recommended standards.

Methods

Using PRISMA guidelines, a PubMed and manual search was conducted to identify research articles describing ESS for identification and surveillance of HAIs published from January 1, 2009 through August 31, 2014. Selected articles were evaluated to determine what data elements and validation methods were included.

Results

Among the 509 articles identified in the original literature search, 30 met the inclusion criteria. While the majority of studies used recommended data sources (83%) and validated the numerator (80%), only 10% validated the denominator, and none performed external validation. In addition, the data formats used by the ESS varied.

Conclusions

The findings of this review suggest that the majority of ESS for HAI surveillance use standard definitions, but the lack of widespread internal, denominator, and external validation in these systems reduces the reliability of their findings. Additionally, advanced programming skills are required to create, implement, and maintain these systems and to cope with the variability in data formats.

Keywords: Electronic Surveillance System, Healthcare Associated Infection, Automated Surveillance

INTRODUCTION

For the past three decades, surveillance has been recognized as the cornerstone of effective infection prevention and control programs1, but traditional manual surveillance methods are labor intensive and limited by inter-observer variability2. To address these issues, the infection prevention community has pursued the development of automated electronic surveillance systems (ESS). While ESS using electronically available patient data have been found to be accurate and potentially time saving3–5, their performance is not consistent across settings6 and often depends on implementation issues related to data sources and data capture7. This review uses an adapted framework8 to 1) describe the primary data sources, data elements, and validation methods currently used in ESS for identification and surveillance of healthcare-associated infections (HAIs), and 2) compare these data elements and validation methods with recommended standards.

METHODS

Search strategies and information sources

Using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Statement (http://www.prisma-statement.org/) as a guide, we conducted a systematic search of published literature that evaluated ESS for HAIs. The PubMed system was used to search for publications indexed by Medline from January 1, 2009 through August 31, 2014. Manual searches were also performed by scanning the bibliographies of eligible original research papers and systematic reviews.

Eligibility criteria and study selection

Selected articles had to describe an automated system that performed electronic HAI surveillance, the relevant data sources used in the system, and any system validation performed. Studies that used the electronic health record solely as a means of conducting chart review were excluded, as were those that investigated predictive risk modeling for HAI. We applied the filters ‘human’, ‘abstract’, and ‘English language’ to all searches. Table 1 summarizes the PubMed search query.

Table 1.

Search term used for PubMed query

Search Term
((“cross infection”[All Fields] OR “Algorithms”[All Fields] OR “automated surveillance”[All Fields] OR “Automatic Data Processing”[All Fields] OR “Bacteremia/classification”[All Fields] OR “Infection Control”[All Fields] OR “Sentinel Surveillance”[All Fields] OR “Electronic Surveillance System”[All Fields] OR “Surgical Wound Infection”[All Fields] OR “Surgical site infection”[All Fields] OR “Population Surveillance/methods”[All Fields] OR “Hospital Information Systems”[All Fields] OR “Diagnosis, Computer-Assisted”[All Fields] OR “Data Collection/methods”[All Fields] OR “hospital acquired infection”[All Fields] OR “hospital associated infection”[All Fields] OR “healthcare associated infection”[All Fields] OR “patient to patient infection”[All Fields] OR “nosocomial infection”[All Fields] OR “catheter related infection”[All Fields] OR “CLABSI”[All Fields] OR “BSI”[All Fields] OR “Urinary Tract Infections”[All Fields] OR “central line associated bloodstream infection”[All Fields]) AND (“electronic health record”[All Fields] OR “EHR”[All Fields] OR “EMR”[All Fields] OR “electronic medical record”[All Fields] OR “computerized medical record”[All Fields]) AND “2009/01/01”[PDAT] : “2014/09/01”[PDAT]))
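As an illustration only (this is not code from the study), the boolean structure of the Table 1 query can be reproduced by assembling `[All Fields]` clauses programmatically; the term lists below are abbreviated samples from the table, not the full published query:

```python
# Sketch: assembling a PubMed boolean query in the style of Table 1.
# The term lists are abbreviated samples, not the full published query.
condition_terms = ["cross infection", "automated surveillance", "CLABSI"]
record_terms = ["electronic health record", "EHR", "EMR"]

def all_fields(terms):
    """Join terms into an OR-ed group of [All Fields] clauses."""
    return " OR ".join(f'"{t}"[All Fields]' for t in terms)

# Conditions AND record-system terms, restricted by publication date.
query = (
    f"(({all_fields(condition_terms)}) "
    f"AND ({all_fields(record_terms)}) "
    f'AND "2009/01/01"[PDAT] : "2014/09/01"[PDAT])'
)
print(query)
```

A string built this way can be pasted into the PubMed search box or submitted through NCBI's E-utilities search endpoint.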

Assessment of studies

To ensure articles matched the eligibility criteria, titles and abstracts were evaluated independently by each author, and discrepant cases were settled by consensus. The full text of the remaining articles was then reviewed independently by each author to verify that they met the inclusion criteria. Articles that met the inclusion criteria were assessed using a modified framework originally developed by Woeltje and colleagues8. The first three articles were independently assessed and then discussed by all three authors, and any discrepancies were resolved by consensus. The remaining articles were abstracted by one of the three authors using the assessment framework.

The assessment framework developed by Woeltje et al.8 had two main components. First, for each type of infection, recommended data elements for ESS were outlined based on National Healthcare Safety Network definitions for HAI surveillance (http://www.cdc.gov/nhsn/pdfs/pscmanual/17pscnosinfdef_current.pdf). These included central line-associated bloodstream infection (CLABSI), catheter-associated urinary tract infection (CAUTI), surgical site infection (SSI), ventilator-associated events (VAC, IVAC), the MDRO module, and the Clostridium difficile module. We added bloodstream infection (BSI), urinary tract infection (UTI), ventilator-associated pneumonia (VAP), and pneumonia (PNU) to the list of HAIs because these were also investigated in the articles we reviewed. Second, four key concepts for describing data validation were recommended: internal and external validation, and validation of the numerator and denominator8. Based on this framework, we evaluated each article to determine whether all recommended data elements were included and whether the recommended validations were performed. The Woeltje et al. framework was modified only for surgical site infections: because there is considerable overlap between procedure and diagnostic codes, an ESS was not required to use both, and this data element was considered present if either type of code was used.
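The element-by-element check described above amounts to a set comparison, with a special rule for SSI. A minimal sketch follows; the recommended-element sets are illustrative subsets, not the framework's full NHSN-based lists:

```python
# Sketch of the assessment check, including the SSI modification:
# either procedure or diagnosis codes satisfy that requirement.
# Element names and sets are illustrative, not the framework's exact lists.
RECOMMENDED = {
    "CLABSI": {"microbiology cultures", "ADT", "central venous catheter presence"},
    "SSI": {"microbiology cultures", "ADT"},  # plus procedure OR diagnosis codes
}

def uses_recommended(hai, elements):
    """Return True if a study's data elements cover the recommended set."""
    if hai == "SSI":
        # Modified rule: one of the two code types is sufficient.
        if not ({"hospital procedure codes", "hospital diagnosis codes"} & elements):
            return False
    return RECOMMENDED[hai] <= elements

print(uses_recommended("SSI", {"microbiology cultures", "ADT",
                               "hospital procedure codes"}))  # prints True
```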

RESULTS

As Figure 1 illustrates, 509 articles were initially identified (Table 1 lists the full search text). After removing duplicate citations and limiting articles to those with available abstracts, 383 abstracts were screened. An additional 77 were excluded during title and abstract review, primarily because they did not pertain to automated ESS. Full text assessment of 35 articles resulted in 30 final studies that met inclusion criteria.

Figure 1.

PubMed search query for automated HAI surveillance systems

Table 2 provides a summary of each study reviewed; the studies covered an array of HAIs: BSI=10 and CLABSI=5; UTI=7 and CAUTI=7; SSI=5; MDRO=3; ventilator-associated events=1 and PNU=2; and C. difficile=3. The majority of studies (83%) used the recommended HAI-specific data sources in their ESS.

Table 2.

Data elements and validation used by ESS studies

Group Article Study setting and size HAI types Data sources used Used recommended data sources8 Validation performed
Benoit et al 201122 Thirty-four hospitals, 4,585 patients, 12 states C. difficile Microbiology cultures, ADT Yes None
Branch-Elliman 201420 Five acute care VA hospitals MDRO Microbiology cultures No None
Choudhuri et al 201424 Teaching hospital 136 patients CAUTI Microbiology cultures, ADT, vital signs, urinalysis Yes Internal, Numerator
De Bus 201426 University hospital 14 bed MICU, 22 bed SICU, 876 admissions UTI, BSI Microbiology cultures, ADT, vital signs, urinalysis Yes Internal, Denominator, Numerator
Dubberke et al 20123 Four hospitals C. difficile Microbiology cultures, ADT Yes Numerator
FitzHenry et al 201321 Six VA hospitals, 33,565 patients SSI Microbiology cultures, ADT Yes Numerator
Harron et al 201327 109,654 pediatric inpatient records BSI Microbiology cultures, ADT Yes Internal, Numerator
Herasevich et al 201038 Academic medical center, 204 bed ICU BSI Microbiology cultures, ADT Yes Internal, Numerator
Inacio et al 201115 Large HMO SSI Microbiology cultures, ADT, hospital procedure codes, hospital diagnosis codes Yes Numerator, Denominator
Klein et al 201416 Tertiary referral centers 2,080 patients VAC, VAE Microbiology cultures, ADT, vital signs, presence of endotracheal intubation device, laboratory (white blood cell count), ventilator settings (PEEP, FiO2), antimicrobial use Yes Numerator
Knepper et al 20139 Academic safety-net hospital, 2,449 surgical procedures SSI Microbiology cultures, ADT, antibiotic administration, hospital procedure codes, hospital diagnosis codes Yes Numerator
Leal et al 20104 Regional hospital, 306 patients BSI Microbiology cultures, ADT Yes Numerator
Leth et al 201018 1504 C-Section cases UTI Microbiology cultures, ADT, vital signs, urinalysis Yes Numerator
Lo et al 201339 Tertiary care teaching hospital, 730-bed CAUTI Microbiology cultures, ADT, vital signs, urinary catheter present, urinalysis Yes Numerator
Peterson et al 201228 Health system, over 300,000 patients MDRO Microbiology cultures, ADT Yes Numerator
Schmiedeskamp et al 200919 23,920 inpatient discharges C. difficile Microbiology cultures, ADT Yes Numerator
Shepard et al 201412 A 583-bed adult tertiary care center, 6,379 urine cultures CAUTI Microbiology cultures, ADT, vital signs, urinary catheter present, urinalysis Yes Numerator
Venable et al 201340 A 420-bed teaching hospital CLABSI, CAUTI Microbiology cultures, ADT, vital signs, urinary catheter present, urinalysis, central venous catheter presence Yes Numerator
Wald et al 201429 A 425-bed university hospital, 1,695 patient visits CAUTI Microbiology cultures, ADT, vital signs, urinary catheter present Yes Internal, Numerator
Woeltje et al 201130 A 1,250-bed tertiary care teaching hospital CLABSI Microbiology cultures, ADT No Internal, Numerator
National Taiwan U. Tseng et al 201214 A 2200-bed teaching hospital MDRO Microbiology cultures, ADT Yes Numerator
National Taiwan U. Tseng et al 201313 A 2200-bed teaching hospital BSI Microbiology cultures, ADT Yes Internal, Numerator
National Taiwan U. Wu et al 200941 A 2200-bed teaching hospital MDRO Microbiology cultures, ADT Yes Internal, Numerator
Columbia Apte, Landers et al 201110 Academic medical center 28,956 hospital admissions SSI Microbiology cultures, ADT, antibiotic administration, hospital procedure codes No Internal, Numerator, Denominator
Columbia Apte, Neidell et al 201111 Academic medical center 320,000 inpatient discharges BSI, PNU, SSI, UTI Microbiology cultures, ADT, urinary catheter present, urinalysis, central venous catheter presence, antibiotic administration No None
Columbia Landers et al 201017 Academic medical center 33,834 hospital admissions UTI Microbiology cultures, ADT, urinalysis No None
MONI Adlassnig et al 200923 Teaching hospital 96 ICU beds UTI, CAUTI, BSI, CLABSI Microbiology cultures, ADT, vital signs, urinary catheter present, urinalysis, central venous catheter presence Yes None
MONI Adlassnig et al 20145 Teaching hospital, 87 adult ICU beds, 46 NICU beds UTI, CAUTI, BSI, CLABSI Microbiology cultures, ADT, vital signs, urinary catheter present, urinalysis, central venous catheter presence Yes None
MONI de Bruin et al 201325 Teaching hospital 75 ICU beds BSI Microbiology cultures, ADT, vital signs, urinalysis, biochemistry laboratory Yes Internal
MONI Blacky 201142 Teaching hospital 96 ICU patients in 12 ICUs BSI, CLABSI, PNU, UTI Microbiology cultures, ADT, vital signs, urinary catheter present, urinalysis, central venous catheter presence, antibiotic administration Yes Numerator

ICU, intensive care unit; MICU, medical intensive care unit; SICU, surgical intensive care unit; ADT, admission discharge transfer information; BSI, bloodstream infection; UTI, urinary tract infection; CAUTI, catheter-associated urinary tract infection; MDRO, multi-drug resistant organisms; SSI, surgical site infection; VAE, ventilator-associated events; VAP, ventilator-associated pneumonia; PNU, pneumonia

The articles reviewed did not always report how clinical facts (e.g., laboratory results, diagnoses, medications administered) were annotated or which vocabularies were used to format the related data. Among the studies that did provide a detailed description of the data used by their ESS, however, formats varied from unstructured, non-coded, and institution-specific coded data to nationally and internationally adopted formats such as ICD-9. Antibiotics administered were determined from textual medication names9–11 and institution-specific codes12–14. ICD-94,9–11,15–19, SNOMED-CT20–22, and free text from notes10,11,17,21 were used to determine hospital billing diagnoses and procedures. Microbiology results were formatted in institution-specific codes10,11,13,14,17, textual results3–5,9,12,16,18,22–30, and LOINC22 codes.
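A practical consequence of this variation is that each ESS must maintain its own translation layer between local codes and a shared vocabulary. A minimal sketch of such a layer follows; every code and mapping below is invented for illustration and is not a real local or LOINC identifier:

```python
# Hypothetical mapping from institution-specific microbiology result
# codes to a shared vocabulary. All codes here are invented for
# illustration; they are not real local or LOINC identifiers.
LOCAL_TO_SHARED = {
    "MICRO_BC_POS": "shared:blood-culture-positive",
    "MICRO_UC_POS": "shared:urine-culture-positive",
}

def normalize(local_code):
    """Translate a local result code, surfacing unmapped codes loudly
    rather than silently dropping them."""
    try:
        return LOCAL_TO_SHARED[local_code]
    except KeyError:
        raise ValueError(f"no shared-vocabulary mapping for {local_code!r}")
```

Surfacing unmapped codes as errors, instead of skipping them, is what lets an ESS detect when a source system starts emitting formats the surveillance logic has never seen.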

Validation performed

Validation of the numerator was performed most often (80%, 24/30 studies). Checking of the actual data with an independent data source, also referred to as internal validation, was done in 33% (10/30) of the studies. External validation, e.g., having an external organization validate the ESS findings, was not used in any of the studies in our sample. Ten percent of the studies (3/30) reported having validated the denominator.
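The proportions above follow directly from the study counts (n = 30) and can be recomputed as a check:

```python
# Recomputing the validation proportions reported in this section
# from the underlying study counts (n = 30).
N_STUDIES = 30
validated = {"numerator": 24, "internal": 10, "denominator": 3, "external": 0}

# Rounded percentage of studies performing each kind of validation:
# numerator 80%, internal 33%, denominator 10%, external 0%.
percent = {kind: round(100 * count / N_STUDIES) for kind, count in validated.items()}
print(percent)
```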

DISCUSSION

The ideal ESS would be fully automated and accurately identify infections without human input. The goal of our literature review was to assess the state of the science with regard to electronic surveillance of HAI (i.e., how close we are to full automation). A number of themes emerged from the review relating to data availability, the lack of standardized data sources, the complexity of ESS, and the lack of validation of the surveillance systems.

The increased number of data sources used in ESS reflects the fact that more electronic clinical data continues to become available6. However, the fact that 17% of the studies in this review did not use the recommended data suggests that availability of relevant data is still one gap that must be filled before fully automated surveillance of HAI is possible.

The fact that a majority of studies utilized the recommended data elements to define an HAI is encouraging. Still, it is important to note that, for example, even though all of the ESS utilized microbiology laboratory results, there was great variability in the structure of the actual results. This lack of standardization of data is an impediment to having ESS that can be implemented uniformly across settings. Furthermore, lack of uniformity in data input increases the complexity of the systems and the required resources to implement and maintain these systems.

In our review, most of the studies were conducted in academic medical centers, Veterans Administration hospitals, and one large health maintenance organization (HMO). This finding reflects the fact that only institutions with considerable financial resources can afford to implement these systems. Moreover, although ESS range in sophistication, in general they are complex and require high-level programming and technologic support, and the specialized workforce required to support these systems is sorely lacking31,32. The creation, implementation, and management of ESS require individuals who understand the nuances of clinical data and its analytic techniques and who have the ability to extract and transform standard and non-standard data sources.

None of the studies in this review met the criteria of having internal, external, denominator, and numerator validation. It is important to note that denominator validation is often neither applicable nor feasible for HAIs that apply to an entire inpatient population. While the logistics of validating the data are daunting, validation is vital and must be performed on these systems33. Research has indicated that ESS studies that performed the requisite numerator, denominator, and/or external validation found high variability in sensitivity and specificity7,34. In addition, studies have highlighted the incompleteness35 and bias36,37 of electronic patient data.

This literature review was limited by the inclusion of only articles written in English with abstracts, and use of a single database with a limited number of search terms. Therefore, some articles pertaining to the topic could have been missed. Nonetheless, our findings present the state of the science in ESS research and point to important future directions for continued investigation. In summary, we recommend that future ESS HAI surveillance focus on obtaining high-quality data, employing dedicated programmers with advanced skills, and performing more thorough validation.

Acknowledgments

Financial support. This study was funded by a grant from The National Institute of Nursing Research, 3R01NR010822.

Footnotes

Potential conflicts of interest. All authors report no conflicts of interest relevant to this article.

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

References

  • 1.Haley RW, Culver DH, White JW, et al. The efficacy of infection surveillance and control programs in preventing nosocomial infections in US hospitals. American journal of epidemiology. 1985;121:182–205. doi: 10.1093/oxfordjournals.aje.a113990. [DOI] [PubMed] [Google Scholar]
  • 2.Lin MY, Hota B, Khan YM, et al. Quality of traditional surveillance for public reporting of nosocomial bloodstream infection rates. JAMA: the journal of the American Medical Association. 2010;304:2035–41. doi: 10.1001/jama.2010.1637. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Dubberke ER, Nyazee HA, Yokoe DS, et al. Implementing automated surveillance for tracking Clostridium difficile infection at multiple healthcare facilities. Infection control and hospital epidemiology: the official journal of the Society of Hospital Epidemiologists of America. 2012;33:305–8. doi: 10.1086/664052. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Leal J, Gregson DB, Ross T, Flemons WW, Church DL, Laupland KB. Development of a novel electronic surveillance system for monitoring of bloodstream infections. Infection control and hospital epidemiology: the official journal of the Society of Hospital Epidemiologists of America. 2010;31:740–7. doi: 10.1086/653207. [DOI] [PubMed] [Google Scholar]
  • 5.Adlassnig KP, Berger A, Koller W, et al. Healthcare-associated infection surveillance and bedside alerts. Studies in health technology and informatics. 2014;198:71–8. [PubMed] [Google Scholar]
  • 6.Kashiouris M, O’Horo JC, Pickering BW, Herasevich V. Diagnostic performance of electronic syndromic surveillance systems in acute care: a systematic review. Applied clinical informatics. 2013;4:212–24. doi: 10.4338/ACI-2012-12-RA-0053. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.de Bruin JS, Seeling W, Schuh C. Data use and effectiveness in electronic surveillance of healthcare associated infections in the 21st century: a systematic review. Journal of the American Medical Informatics Association. 2014;21:942–51. doi: 10.1136/amiajnl-2013-002089. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Woeltje KF, Lin MY, Klompas M, Wright MO, Zuccotti G, Trick WE. Data requirements for electronic surveillance of healthcare-associated infections. Infection control and hospital epidemiology: the official journal of the Society of Hospital Epidemiologists of America. 2014;35:1083–91. doi: 10.1086/677623. [DOI] [PubMed] [Google Scholar]
  • 9.Knepper BC, Young H, Jenkins TC, Price CS. Time-saving impact of an algorithm to identify potential surgical site infections. Infection Control & Hospital Epidemiology. 2013;34:1094–8. doi: 10.1086/673154. [DOI] [PubMed] [Google Scholar]
  • 10.Apte M, Landers T, Furuya Y, Hyman S, Larson E. Comparison of two computer algorithms to identify surgical site infections. Surgical infections. 2011;12:459–64. doi: 10.1089/sur.2010.109. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Apte M, Neidell M, Furuya EY, Caplan D, Glied S, Larson E. Using electronically available inpatient hospital data for research. Clinical and translational science. 2011;4:338–45. doi: 10.1111/j.1752-8062.2011.00353.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Shepard J, Hadhazy E, Frederick J, et al. Using electronic medical records to increase the efficiency of catheter-associated urinary tract infection surveillance for National Health and Safety Network reporting. American journal of infection control. 2014;42:e33–6. doi: 10.1016/j.ajic.2013.12.005. [DOI] [PubMed] [Google Scholar]
  • 13.Tseng YJ, Wu JH, Lin HC, et al. Rule-based healthcare-associated bloodstream infection classification and surveillance system. Studies in Health Technology & Informatics. 2013;186:145–9. [PubMed] [Google Scholar]
  • 14.Tseng YJ, Wu JH, Ping XO, et al. A Web-based multidrug-resistant organisms surveillance and outbreak detection system with rule-based classification and clustering. Journal of Medical Internet Research. 2012;14:e131. doi: 10.2196/jmir.2056. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Inacio MC, Paxton EW, Chen Y, et al. Leveraging electronic medical records for surveillance of surgical site infection in a total joint replacement population. Infection Control & Hospital Epidemiology. 2011;32:351–9. doi: 10.1086/658942. [DOI] [PubMed] [Google Scholar]
  • 16.Klein Klouwenberg PM, van Mourik MS, Ong DS, et al. Electronic implementation of a novel surveillance paradigm for ventilator-associated events. Feasibility and validation American Journal of Respiratory & Critical Care Medicine. 2014;189:947–55. doi: 10.1164/rccm.201307-1376OC. [DOI] [PubMed] [Google Scholar]
  • 17.Landers T, Apte M, Hyman S, Furuya Y, Glied S, Larson E. A comparison of methods to detect urinary tract infections using electronic data. Joint Commission journal on quality and patient safety/Joint Commission Resources. 2010;36:411–7. doi: 10.1016/s1553-7250(10)36060-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Leth RA, Norgaard M, Uldbjerg N, Thomsen RW, Moller JK. Surveillance of selected post-caesarean infections based on electronic registries: validation study including post-discharge infections. The Journal of hospital infection. 2010;75:200–4. doi: 10.1016/j.jhin.2009.11.018. [DOI] [PubMed] [Google Scholar]
  • 19.Schmiedeskamp M, Harpe S, Polk R, Oinonen M, Pakyz A. Use of International Classification of Diseases, Ninth Revision, Clinical Modification codes and medication use data to identify nosocomial Clostridium difficile infection. Infection control and hospital epidemiology: the official journal of the Society of Hospital Epidemiologists of America. 2009;30:1070–6. doi: 10.1086/606164. [DOI] [PubMed] [Google Scholar]
  • 20.Branch-Elliman W, Strymish J, Gupta K. Development and validation of a simple and easy-to-employ electronic algorithm for identifying clinical methicillin-resistant Staphylococcus aureus infection. Infection Control & Hospital Epidemiology. 2014;35:692–8. doi: 10.1086/676437. [DOI] [PubMed] [Google Scholar]
  • 21.FitzHenry F, Murff HJ, Matheny ME, et al. Exploring the frontier of electronic health record surveillance: the case of postoperative complications. Medical care. 2013;51:509–16. doi: 10.1097/MLR.0b013e31828d1210. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Benoit SR, McDonald LC, English R, Tokars JI. Automated surveillance of Clostridium difficile infections using BioSense. Infection Control & Hospital Epidemiology. 2011;32:26–33. doi: 10.1086/657633. [DOI] [PubMed] [Google Scholar]
  • 23.Adlassnig KP, Blacky A, Koller W. Artificial-intelligence-based hospital-acquired infection control. Studies in health technology and informatics. 2009;149:103–10. [PubMed] [Google Scholar]
  • 24.Choudhuri JA, Pergamit RF, Chan JD, et al. An electronic catheter-associated urinary tract infection surveillance tool. Infection control and hospital epidemiology: the official journal of the Society of Hospital Epidemiologists of America. 2011;32:757–62. doi: 10.1086/661103. [DOI] [PubMed] [Google Scholar]
  • 25.de Bruin JS, Blacky A, Koller W, Adlassnig KP. Validation of fuzzy sets in an automated detection system for intensive-care-unit-acquired central-venous-catheter-related infections. Studies in Health Technology & Informatics. 2013;192:215–8. [PubMed] [Google Scholar]
  • 26.De Bus L, Diet G, Gadeyne B, et al. Validity analysis of a unique infection surveillance system in the intensive care unit by analysis of a data warehouse built through a workflow-integrated software application. Journal of Hospital Infection. 2014;87:159–64. doi: 10.1016/j.jhin.2014.03.010. [DOI] [PubMed] [Google Scholar]
  • 27.Harron K, Goldstein H, Wade A, Muller-Pebody B, Parslow R, Gilbert R. Linkage, evaluation and analysis of national electronic healthcare data: application to providing enhanced blood-stream infection surveillance in paediatric intensive care. PLoS ONE [Electronic Resource] 2013;8:e85278. doi: 10.1371/journal.pone.0085278. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Peterson KE, Hacek DM, Robicsek A, Thomson RB, Jr, Peterson LR. Electronic surveillance for infectious disease trend analysis following a quality improvement intervention. Infection control and hospital epidemiology: the official journal of the Society of Hospital Epidemiologists of America. 2012;33:790–5. doi: 10.1086/666625. [DOI] [PubMed] [Google Scholar]
  • 29.Wald HL, Bandle B, Richard A, Min S. Accuracy of electronic surveillance of catheter-associated urinary tract infection at an academic medical center. Infection control and hospital epidemiology: the official journal of the Society of Hospital Epidemiologists of America. 2014;35:685–91. doi: 10.1086/676429. [DOI] [PubMed] [Google Scholar]
  • 30.Woeltje KF, McMullen KM, Butler AM, Goris AJ, Doherty JA. Electronic surveillance for healthcare-associated central line-associated bloodstream infections outside the intensive care unit. Infection control and hospital epidemiology: the official journal of the Society of Hospital Epidemiologists of America. 2011;32:1086–90. doi: 10.1086/662181. [DOI] [PubMed] [Google Scholar]
  • 31.Hersh W, Wright A. What Workforce is Needed to Implement the Health Information Technology Agenda?. Analysis from the HIMSS Analytics™ Database; AMIA Annual Symposium Proceedings; 2008; American Medical Informatics Association; p. 303. [PMC free article] [PubMed] [Google Scholar]
  • 32.Hersh WR, Weiner MG, Embi PJ, et al. Caveats for the use of operational electronic health record data in comparative effectiveness research. Medical care. 2013;51:S30–S7. doi: 10.1097/MLR.0b013e31829b1dbd. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Arnold KE, Thompson ND. Building Data Quality and Confidence in Data Reported to the National Healthcare Safety Network. Infection Control and Hospital Epidemiology. 2012;33:446–8. doi: 10.1086/665311. [DOI] [PubMed] [Google Scholar]
  • 34.Haut ER, Pronovost PJ. Surveillance bias in outcomes reporting. JAMA: the journal of the American Medical Association. 2011;305:2462–3. doi: 10.1001/jama.2011.822. [DOI] [PubMed] [Google Scholar]
  • 35.Weiskopf NG, Hripcsak G, Swaminathan S, Weng C. Defining and measuring completeness of electronic health records for secondary use. Journal of biomedical informatics. 2013;46:830–6. doi: 10.1016/j.jbi.2013.06.010. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Hripcsak G, Knirsch C, Zhou L, Wilcox A, Melton G. Bias associated with mining electronic health records. Journal of biomedical discovery and collaboration. 2011;6:48–52. doi: 10.5210/disco.v6i0.3581. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.van Mourik MSM, Troelstra A, van Solinge WW, Moons KGM, Bonten MJM. Automated Surveillance for Healthcare-Associated Infections: Opportunities for Improvement. Clinical Infectious Diseases. 2013;57:85–93. doi: 10.1093/cid/cit185. [DOI] [PubMed] [Google Scholar]
  • 38.Herasevich V, Pickering BW, Dong Y, Peters SG, Gajic O. Informatics Infrastructure for Syndrome Surveillance, Decision Support, Reporting, and Modeling of Critical Illness. Mayo Clinic Proceedings. 2010;85:247–54. doi: 10.4065/mcp.2009.0479. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Lo YS, Lee WS, Liu CT. Utilization of electronic medical records to build a detection model for surveillance of healthcare-associated urinary tract infections. Journal of Medical Systems. 2013;37:9923. doi: 10.1007/s10916-012-9923-2. [DOI] [PubMed] [Google Scholar]
  • 40.Venable A, Dissanaike S. Is automated electronic surveillance for healthcare-associated infections accurate in the burn unit? Journal of Burn Care & Research. 2013;34:591–7. doi: 10.1097/BCR.0b013e3182a2aa0f. [DOI] [PubMed] [Google Scholar]
  • 41.Wu J-H, Chen Y-C, Hsieh S-h, et al. Real-time Automated MDRO Surveillance System. BIOCOMP. 2009:764–9. [Google Scholar]
  • 42.Blacky A, Mandl H, Adlassnig KP, Koller W. Fully Automated Surveillance of Healthcare-Associated Infections with MONI-ICU. A Breakthrough in Clinical Infection Surveillance. Applied Clinical Informatics. 2011;2:365–72. doi: 10.4338/ACI-2011-03-RA-0022. [DOI] [PMC free article] [PubMed] [Google Scholar]
