Abstract
Background
The use of electronic surveillance systems (ESSs) is gradually increasing in infection prevention and control programs. Little is known about the characteristics of hospitals that have an ESS, user satisfaction with ESSs, and organizational support for implementation of ESSs.
Methods
A total of 350 acute care hospitals in California were invited to participate in a Web-based survey; 207 hospitals (59%) agreed to participate. The survey included a description of infection prevention and control department staff, where and how they spent their time, a measure of organizational support for infection prevention and control, and reported experience with ESSs.
Results
Only 23% (44/192) of responding infection prevention and control departments had an ESS. There was no statistically significant difference in how and where infection preventionists (IPs) spent their time between hospitals with and without an ESS. The 2 significant predictors of whether an ESS was present were score on the Organizational Support Scale (odds ratio [OR], 1.10; 95% confidence interval [CI], 1.02-1.18) and hospital bed size (OR, 1.004; 95% CI, 1.00-1.007). Organizational support was also positively correlated with IP satisfaction with the ESS, as measured on the Computer Usability Scale (P = .02).
Conclusion
Despite evidence that such systems may improve the efficiency of data collection and potentially improve patient outcomes, ESSs remain relatively uncommon in infection prevention and control programs. Based on our findings, organizational support appears to be a major predictor of the presence of, use of, and satisfaction with ESSs in infection prevention and control programs.
Keywords: Surveillance, infection prevention programs, organizational support
Electronic surveillance systems (ESSs), designed to support decision making in infection prevention and control programs by improving surveillance capabilities, have been evolving rapidly over the past 15 years.1 As the role of the infection preventionist (IP) continues to expand in scope and responsibility, surveillance remains one of the IP’s most time-consuming activities, with an average of 45% of total time in infection prevention spent on surveillance and analysis.2 With the increasing availability and use of electronic medical records, information technology tools have created opportunities to automate data collection3 and the potential to decrease the time spent conducting manual surveillance. In a systematic literature review, Leal and Laupland4 reported that the use of an ESS decreased the time spent conducting surveillance by up to 61%. Infection prevention and control experts have predicted that the use of an ESS would allow them more time to provide leadership to clinicians in implementing evidence-based processes aimed at reducing infections.5
Information is sparse on the effectiveness of ESSs in infection prevention and control programs, with studies limited to assessments of whether ESSs increase the sensitivity and specificity of surveillance over traditional methods.6-8 The purpose of the present study was to examine the utilization of ESSs in acute care hospitals in California. This study was one component of a larger parent project, the Changing Role of Infection Preventionists, funded by the Blue Shield of California Foundation to evaluate the California Healthcare-Associated Infection Prevention Initiative (CHAIPI). The goal of CHAIPI is to generate knowledge to inform evidence-based decision making for health policy makers, hospital administrators, epidemiologists, and IPs. As part of their participation in CHAIPI, some hospitals received funding to purchase an ESS. The analyses reported herein were designed to meet 3 aims: (1) to compare the differences in where and how IPs spend their time in hospitals with and without an ESS, (2) to describe characteristics of hospitals with and without an ESS, and (3) to assess IP satisfaction with ESSs.
METHODS
Setting and sample
All nonspecialty acute care hospitals in California were eligible to participate. Psychiatric facilities, drug/alcohol rehabilitation centers, nursing homes, outpatient facilities, and children’s hospitals were excluded. Of the 350 eligible hospitals, 207 volunteered to participate (59% recruitment rate).
Survey instrument
The data collection instrument for this study was modified from a Web-based survey previously used in the Prevention of Nosocomial Infections and Cost Effectiveness study (National Institutes of Health, National Institute of Nursing Research, R01NR010107). Test-retest reliability was assessed, with a mean κ = 0.88 for each item (standard deviation [SD] ± 0.024).2 The modified instrument comprised sections addressing demographics, infection prevention and control department staff, IP time use, and hospital and unit-based policies for infection prevention and control. For this survey, 2 additional sections were added to measure organizational support and the use of an ESS.
For this analysis, 3 sections of the survey were used: description of infection prevention and control department staffing, organizational support for infection prevention and control, and ESS use and usability. The section on organizational support included three 5-point Likert scales: Institutional Organization and Support, Senior Management Engagement, and Leadership on Patient Safety. These scales were adapted from the Patient Safety Climate in Healthcare Organizations instrument9 and were combined to create a 16-item composite Organizational Support Scale, with scores ranging from 16 to 80 and a Cronbach’s α of 0.92. Higher scores represented a higher level of management involvement in infection prevention.
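For reference, the Cronbach’s α reported for this scale (and for the usability scale below) is the standard internal consistency coefficient for a k-item composite; here, k = 16:

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right),$$

where $\sigma^{2}_{Y_i}$ is the variance of item $i$ and $\sigma^{2}_{X}$ is the variance of the total scale score.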
Satisfaction with the ESS was evaluated using a Computer Usability Scale comprising 23 items in 5 sections, with scores ranging from 23 to 115. The scale was created by modifying and merging items from 3 different instruments designed to measure computer usability.10-12 Sections included (1) advantages and disadvantages, (2) workflow, (3) user–system interface, (4) accuracy, and (5) outcome perceptions. Some questions were reverse-coded, so that a higher number always represented a positive view of the ESS. The scale’s internal consistency, using Cronbach’s α, was 0.83.
In addition to the survey content on organizational support and satisfaction with the ESS, a section was included to describe IP activities, that is, how and where IPs spend their time. The categorization of activities was based on the practice analysis published by the Certification Board of Infection Control and Epidemiology.13 Respondents were asked to estimate the proportion of their work time spent in each of 8 categories of activities: (1) surveillance, defined as collecting, analyzing, and interpreting data on the occurrence of infections; (2) teaching infection prevention and control policies and procedures; (3) activities related to outbreaks; (4) daily isolation issues; (5) policy development and meetings; (6) other (eg, product evaluation, workman’s compensation, research); (7) employee/occupational health; and (8) emergency preparedness. Respondents also were asked to estimate the proportion of their time spent in the IP office, inpatient areas, outpatient areas, and long-term care areas.
Procedure
The Columbia University Medical Center’s Institutional Research Review Board approved this study. The survey was conducted between October and December 2008. One staff member from each infection prevention department was asked to complete the survey. A modified Dillman technique14 was used, including an initial invitation letter, 5 reminders, and a last-chance letter. Hospitals were contacted by mail, e-mail, and/or fax once each week. Weekly lotteries for membership in the Association for Professionals in Infection Control and Epidemiology (APIC) or book prizes were held to encourage participation. In addition, respondents with incomplete surveys were contacted by phone and encouraged to complete the survey.
Statistical analysis
Data were imported into SPSS version 15 (SPSS Inc, Chicago, IL) for analysis. Distributions and descriptive statistics (ie, proportions, means, and SDs) were computed for all variables. Single missing values per case on the Computer Usability Scale were imputed: for each such case, the mean of that case’s responses on the remaining scale items was calculated and inserted for the missing value, to maximize the denominator. For other variables, the denominator varied because of missing values.
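As an illustration of this single-value imputation rule, the sketch below (Python with pandas; the original analysis was conducted in SPSS, and the column names here are hypothetical) fills a case’s one missing Computer Usability Scale item with the mean of that case’s remaining item responses:

```python
import pandas as pd

# Hypothetical column names for the 23 Computer Usability Scale items.
ITEM_COLS = [f"usability_{i}" for i in range(1, 24)]

def impute_single_missing(df: pd.DataFrame) -> pd.DataFrame:
    """Fill a case's single missing scale item with the mean of that
    case's remaining item responses; cases missing more than one item
    are left unchanged."""
    df = df.copy()
    items = df[ITEM_COLS]
    one_missing = items.isna().sum(axis=1) == 1  # exactly 1 item missing
    row_means = items.mean(axis=1)               # skips NaN by default
    for col in ITEM_COLS:
        needs_fill = one_missing & items[col].isna()
        df.loc[needs_fill, col] = row_means[needs_fill]
    return df
```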
Aim 1: To compare the differences in where and how IPs spend their time in hospitals with and without an ESS
Student t tests or Mann-Whitney tests were used to examine mean differences in the proportions of how and where IPs spent their time. Factors that differed significantly between hospitals with and without an ESS in these bivariate analyses were entered into a logistic regression model that included other potential predictors, including hospital bed size, organizational support, presence of a hospital epidemiologist, and medical school affiliation.
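A minimal sketch of the bivariate step, assuming the time-use proportions for the two hospital groups are held in simple arrays (the values below are illustrative, not study data):

```python
from scipy import stats

# Illustrative proportions of time spent on one activity (eg, surveillance)
# for hospitals with and without an ESS.
ess = [41.0, 38.5, 44.2, 36.0, 42.3]
no_ess = [35.0, 37.5, 30.2, 40.1, 33.8]

# Student t test where a parametric comparison is appropriate ...
t_stat, p_t = stats.ttest_ind(ess, no_ess)
# ... or the Mann-Whitney U test as the nonparametric alternative.
u_stat, p_u = stats.mannwhitneyu(ess, no_ess, alternative="two-sided")
```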
Aim 2: To describe characteristics of hospitals with and without an ESS
Bivariate analyses were conducted initially using appropriate nonparametric or parametric statistics (χ2 test, t test, Mann-Whitney test, or analysis of variance [ANOVA]) to examine relationships between demographic and institutional factors and whether or not a hospital had an ESS. Variables examined included bed size, medical school affiliation, whether or not the hospital was a member of CHAIPI, to whom the IP reported (medical director, nurse executive, quality management, patient safety, or other), number of professional and support staff in the infection prevention and control department (ie, full-time equivalent positions) per 100 hospital beds, the proportion of IPs certified in infection control (CIC), the proportion of infection control directors who were members of professional organizations (Society for Healthcare Epidemiology of America [SHEA] and/or APIC), and the presence of a full- or part-time hospital epidemiologist. Those variables with a P value <.1 in bivariate analyses were entered into forward-entry logistic regression models to examine potential predictors of hospitals with an ESS.
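A sketch of the regression step under stated assumptions: the predictors retained from the bivariate screen are entered into a logistic model of ESS presence, and the coefficients are exponentiated to give ORs with 95% CIs, the format reported in the Results. The data and variable names below are synthetic stand-ins (the study used SPSS forward-entry selection; here the already-selected predictors are simply entered together):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the analysis data set (names are hypothetical).
rng = np.random.default_rng(0)
n = 192
df = pd.DataFrame({
    "org_support_score": rng.normal(65, 10, n),          # possible range 16-80
    "bed_size": rng.normal(235, 165, n).clip(25, 900),   # hospital beds
})
true_logit = -9.0 + 0.10 * df["org_support_score"] + 0.004 * df["bed_size"]
df["has_ess"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

# Fit the logistic model and convert coefficients to odds ratios with CIs.
X = sm.add_constant(df[["org_support_score", "bed_size"]])
fit = sm.Logit(df["has_ess"], X).fit(disp=False)
table = pd.concat(
    [np.exp(fit.params).rename("OR"),
     np.exp(fit.conf_int()).rename(columns={0: "95% CI low", 1: "95% CI high"})],
    axis=1,
)
print(table)
```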
Aim 3: To assess IP satisfaction with the ESS
Among hospitals with an ESS, the correlations between satisfaction scores and other factors were assessed using the Pearson product-moment correlation for parametric measures and the Spearman rank-order correlation for nonparametric measures. Factors examined included daily hours of ESS use, years since ESS implementation, Organizational Support Scale score, and total years of infection prevention director experience. ANOVA was used to compare mean satisfaction scores by infection prevention director experience (<10 years or ≥10 years).
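A brief sketch of these tests with SciPy (illustrative values only, restricted to the hospitals with an ESS):

```python
from scipy import stats

# Illustrative per-hospital values among ESS hospitals.
org_support = [70, 62, 55, 68, 74, 60]      # Organizational Support Scale
usability   = [82, 75, 60, 79, 88, 70]      # Computer Usability Scale
hours_daily = [2.0, 4.5, 1.0, 3.0, 6.0, 2.5]

# Pearson product-moment correlation for parametric measures ...
r, p_r = stats.pearsonr(org_support, usability)
# ... and Spearman rank-order correlation for nonparametric measures.
rho, p_rho = stats.spearmanr(hours_daily, usability)

# One-way ANOVA comparing mean satisfaction across experience groups.
f, p_f = stats.f_oneway([75, 80, 78], [70, 66, 72])
```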
RESULTS
The average bed size of the 207 participating facilities was 237 ± 169, significantly larger than the average bed size of 187 in nonparticipating facilities in California (P = .006). The participating hospitals included 44 CHAIPI hospitals (21%) and 163 non-CHAIPI hospitals (79%). There were no significant differences in any factor analyzed, including use of an ESS, between the participating CHAIPI and non-CHAIPI hospitals.
Utilization of ESS
Among the 192 facilities that provided data on use of an ESS, 44 (23%) had an ESS. Of those, 52% reported using a customized system or did not specify the system used; the other 48% used 1 of 4 commercial products (AICE, 32%; MedMined, 20%; Safety Surveillor, 9%; TheraDoc, 2%). Respondents reported that the ESS was used in their hospitals primarily to create reports and data summaries from built-in templates (77%), to provide automatic alerts (57%), to integrate infection data with Centers for Disease Control and Prevention definitions and/or reporting requirements (43%), to mine data (36%), and to share reports with committees and administration (61%).
Staffing
About half of the facilities (53%) had at least 1 hospital epidemiologist, who worked an average of 13.4 hours per week in infection prevention. Eighty-two percent of facilities had an infection control department director, most of whom were nurses (69%); other directors were physicians (9%), master’s-prepared epidemiologists (4%), medical technologists (2%), or other (14%). Half of the infection prevention and control department directors were certified in infection prevention, and 85% were members of either APIC or SHEA. Fifty-six percent of the department directors and 34% of the IPs had >10 years of experience in infection prevention and control.
Aim 1: To compare the differences in where and how IPs spend time in hospitals with and without an ESS
IPs spent the greatest proportion of their time on surveillance, policy development and meetings, and consultation, regardless of whether their hospital had an ESS. Respondents reported spending the largest proportion of their time in their offices. None of these proportions differed significantly between hospitals with and without an ESS. Although IPs in hospitals without an ESS reported spending significantly more time on employee/occupational health and in long-term care units in bivariate analyses, regression analysis revealed no significant differences in how and where they spent their time when controlling for bed size (Table 1).
Table 1. Proportion of time (±SD) spent by IPs on each type and location of activity, in hospitals with and without an ESS

| | ESS (n = 40)* | No ESS (n = 147)* | P value |
|---|---|---|---|
| Type of activity | | | |
| Surveillance: collecting, analyzing, and interpreting data | 40.6 (16.6) | 35.9 (15.9) | .13 |
| Policy development and meetings | 13.1 (11.6) | 13.9 (9.7) | .43 |
| Consultation and unit rounds | 12.2 (7.4) | 12.2 (8.1) | .8 |
| Teaching: infection control policies and procedures | 9.9 (6.6) | 10.9 (6.4) | .66 |
| Product evaluation, workman’s compensation, research | 7.9 (6.7) | 8.7 (13) | .94 |
| Daily isolation issues | 5.7 (9.6) | 7.6 (6.7) | .68 |
| Activities related to outbreaks | 4.3 (5.0) | 4.3 (3.3) | .12 |
| Employee/occupational health | 3.7 (3.8) | 7.1 (8.3) | .03† |
| Emergency preparedness | 3.3 (2.5) | 4.3 (3.3) | .11 |
| Location of activity | | | |
| Activities in infection prevention office | 51 (20) | 47 (24) | .32 |
| Inpatient | 30.9 (18.2) | 31.9 (22.7) | .76 |
| Outpatient | 7.1 (8.7) | 5.9 (9.7) | .47 |
| Long-term care | 2.5 (6.4) | 6.5 (12) | .05† |
| Other | 8.5 (12.7) | 8.7 (12.5) | .94 |

*Not all respondents completed this section of the survey.
†No significant difference in logistic regression model when controlled for bed size.
Aim 2: To describe characteristics of hospitals with and without an ESS
The bivariate analyses found no significant differences between hospitals with and without an ESS in the location of the facility (urban, suburban, or rural), hospital affiliation with a medical school, or the number of infection prevention and control staff per 100 beds (all P ≥ .10). The proportion of IPs with CIC, the person to whom the infection prevention and control department reported, having an independent budget, infection control director membership in SHEA and/or APIC, and the number of support staff per 100 beds each differed significantly between hospitals with an ESS and those without an ESS (P < .10); for example, hospitals with an ESS reported 0.20 support staff per 100 beds, compared with 0.11 in those without an ESS (P ≤ .001). Thus, these variables were entered into the logistic regression model. In the regression analysis, however, only 2 factors remained significant predictors of having an ESS: a higher score on the Organizational Support Scale (odds ratio [OR], 1.10; 95% confidence interval [CI], 1.02-1.18) and more hospital beds (OR, 1.004; 95% CI, 1.00-1.007) (Table 2).
Table 2. Characteristics of infection prevention programs in hospitals with and without an ESS*

| Infection prevention program characteristic | ESS | No ESS |
|---|---|---|
| Presence of at least 1 hospital epidemiologist (part-time or full-time), % (n) | 57% (25/44) | 41% (61/148) |
| Medical school affiliation, % (n) | 27% (12/44) | 20% (29/148) |
| Location of facility, % (n) | | |
| Urban | 45% (20/44) | 43% (64/148) |
| Suburban | 39% (17/44) | 29% (42/148) |
| Rural | 16% (7/44) | 28% (42/148) |
| Person to whom infection prevention programs report, % (n) | | |
| MD/chief medical officer only | 23% (10/44) | 38% (31/81) |
| Nurse executive only | 0% (0/44) | 0% (0/81) |
| Other (quality management, patient safety, other) | 21% (9/44) | 58% (47/81) |
| More than one supervisor | 57% (25/44) | 4% (3/81) |
| Infection control director is a member of SHEA/APIC, % (n) | 61% (27/44) | 62% (92/148) |
| Infection prevention program has an independent budget, % (n) | 60% (26/43) | 44% (63/143) |
| Proportion of IPs with infection prevention certification, mean ± SD | 56% ± 0.41% | 44% ± 0.42% |
| Number of support staff in infection prevention program per 100 beds, mean ± SD | 0.20 ± 0.22 | 0.11 ± 0.22 |
| Number of infection prevention staff per 100 beds, mean ± SD | 0.89 ± 0.56 | 1 ± 0.96 |
| Organizational Support Scale score (possible range, 16-80), mean ± SD | 66.5 ± 10.5 | 64.8 ± 10.1† |
| Number of hospital beds, mean ± SD | 302 ± 188.5 | 216.7 ± 152.8† |

MD, medical doctor.
*Denominators vary because of missing values.
†Statistically significant in final logistic regression model.
Aim 3: To assess IP satisfaction with ESSs
The mean IP satisfaction score in hospitals with an ESS was 64.2 ± 9 (possible range, 23-115). The only factor associated with increased IP satisfaction with the ESS was a higher score on the Organizational Support Scale (r = 0.48; P = .02). Years of experience of the infection control director, total number of hours the ESS was used daily, and total number of years since the ESS was implemented were not significantly associated with level of satisfaction (Table 3).
Table 3. Factors examined for association with IP satisfaction with the ESS

| Variable | Value | P value |
|---|---|---|
| Total number of hours ESS used daily, mean ± SD | 3.81 ± 3.7 | .16* |
| Total number of years since ESS was implemented, mean ± SD | 4 ± 4 | .15* |
| Score on Organizational Support Scale (possible range, 16-80), mean ± SD | 64.2 ± 10.8 | .02† |
| Years of experience of infection prevention director, % (n) | | .09‡ |
| Less than 1 year | 4.5 (2/44) | |
| 1-3 years | 2.3 (1/44) | |
| 4-9 years | 6.8 (3/44) | |
| More than 9 years | 47.7 (21/44) | |
| Missing values | 38.6 (17/44) | |

*Spearman correlation.
†Pearson correlation.
‡ANOVA.
DISCUSSION
This study provides a comprehensive profile of infection prevention and control programs in California acute care hospitals with and without an ESS. It was hypothesized that facilities with an ESS would spend less time in surveillance as automated surveillance replaced manual surveillance. In fact, a reported benefit of automation is improved resource utilization, saving about 10 weeks of infection control time annually and requiring only 1/6 to 1/3 of the time required for standard manual surveillance.5,15 However, we did not find that IPs in hospitals with an ESS devoted less time to surveillance activities or spent more time on education or with patients. Although ESS technology is increasingly used by IPs for data mining (ie, reporting trends in organisms and/or infections), such systems have been used only minimally by infection prevention staff to help make decisions about necessary interventions. The IP using an ESS should be able to spend less time on data mining and more time on data analysis and patient care.16
Professional staffing did not differ between hospitals with and without an ESS. In this analysis, professional staffing included staff IPs and department directors. With all professional staff considered, these ratios met those recommended in a Delphi project conducted in 1999.17 Indeed, these ratios exceeded those reported in other, more recent studies, including a mean of 0.69 per 100 beds in a group of National Healthcare Safety Network hospitals and a mean of 0.56 per 100 beds in Massachusetts hospitals.2,18 However, in both of those studies, the ratio was based on IP staff and did not include department directors. The appropriate staffing level, given the changing role of the IP, increased responsibilities related to mandatory reporting, and the availability of technology such as ESSs, is not known.
Although the ORs were only slightly greater than 1, larger facilities with stronger organizational support were clearly more likely to have an ESS. This finding is consistent with a recent report from Kadlec Medical Center, a 172-bed regional hospital in Washington, where the Vice President of Finance promoted the purchase of an ESS; the hospital reported a 13% reduction in the rate of health care–associated infections and a savings of $576,000 after implementing the ESS.7
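Compounding the per-bed OR makes the bed-size effect easier to read; for example, a 100-bed difference corresponds to

$$1.004^{100} = e^{100\ln(1.004)} \approx 1.49,$$

that is, roughly a 49% increase in the odds of having an ESS for every additional 100 beds.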
Several reports have acknowledged that ESSs may not necessarily save time.16,19,20 Rather, data mining can detect new and unexpected patterns and may require additional human resources to analyze findings and develop interventions. Implementing an ESS involves a large investment in both dollars and human resources and requires the support of an organization that understands and values patient safety and the potential cost savings through prevention of infections.19 An estimated 4000 hours is required for the development of an internal clinical data warehouse for infection control, and costs do not end with development and implementation.21 Although ESSs have been shown to be efficient and effective in screening for potential outbreaks and identifying endemic health care–associated infections, the data generated electronically must be interpreted and translated into knowledge, drawing on the critical thinking skills of the IP. This paradigm shift to automated surveillance should allow the IP to minimize the time required to identify infections and maximize time spent on prevention, focusing on education and interventions aimed at reducing health care expenditures and adverse outcomes.5 The IP could then focus on leading intervention teams to reduce infections through implementation of evidence-based infection prevention practices, such as central line–associated bloodstream infection and ventilator-associated pneumonia prevention bundles. Cost effectiveness and improvements in clinical outcomes are the highest priorities in this paradigm shift.19,22 Hospitals with strong organizational support are a step ahead in the commitment to an ESS and its benefits for patient care.
Organizational support also was highly correlated with IP satisfaction with an ESS (r = 0.48; P = .02). The Computer Usability Scale measured such items as satisfaction with surveillance features, access to technical support, ease of use, accuracy, and perceptions of improved patient care. Although we hypothesized that the number of years since implementation of an ESS, the number of hours an ESS is used daily, or the years of experience of the infection prevention director might be associated with satisfaction with an ESS, this was not the case. This may imply that IP satisfaction in general is higher when the organization is supportive of the IP’s mission and goals.
This is one of only a few studies to examine the relationship between organizational support and the presence and use of an ESS. We also found that the proportion of time spent on surveillance did not vary significantly between hospitals with and without an ESS, suggesting that promoting time savings on surveillance as a marketing point may be misleading to administrators. Rather, ESSs provide the opportunity to spend less time on data collection and more time on decision making and implementing preventive interventions. These data suggest that in hospitals with strong leadership and engagement in patient safety, IPs may feel more supported in implementing ESSs and overcoming barriers, leading to greater satisfaction. This study clearly suggests that the need for organizational support does not end once an ESS is purchased; rather, both human and informatics resources must be committed to implementing and maintaining the program.
Although this study provides data to inform infection prevention and control programs regarding characteristics of hospitals with an ESS, the findings may not be generalizable to hospitals in states other than California. In addition, the number of missing values might have affected the interpretation of the findings. Finally, we may not have had sufficient power to detect some differences between hospitals, given that only 44 hospitals had an ESS.
ESSs are in their infancy, and more research is needed to determine the impact of ESSs on the efficiency and effectiveness of infection prevention programs. A recent APIC position paper describes the benefits and essential components of automated electronic surveillance and notes that ESSs may ease the burden of data management in infection prevention and control programs.23 As the responsibilities of infection prevention and control programs continue to expand into such areas as mandatory reporting, behavioral interventions, emergency preparedness, and construction, increasingly sophisticated data mining and informatics have the potential to improve resource utilization and cost-effectiveness while reducing infections through prevention activities. This study suggests that organizational support is important to the success of an ESS.
Acknowledgments
The authors thank all of the participating California hospitals. This study was funded by the Blue Shield of California Foundation (Grant BSCAFND 2490932) and conducted in collaboration with APIC.
Footnotes
Conflicts of interest: None to report.
References
1. Wright M. Automated surveillance and infection control: toward a better tomorrow. Am J Infect Control. 2008;36:S1-6.
2. Stone PW, Dick A, Pogorzelska M, Horan TC, Furuya Y, Larson E. Staffing and structure of infection prevention and control programs. Am J Infect Control. 2009;37:351-7.
3. Hota B, Jones RC, Schwartz DN. Informatics and infectious diseases: what is the connection and efficacy of information technology tools for therapy and health care epidemiology? Am J Infect Control. 2008;36:S47-56.
4. Leal J, Laupland KB. Validity of electronic surveillance systems: a systematic review. J Hosp Infect. 2008;69:220-9.
5. Atreja A, Gordon SM, Pollock DA, Olmsted RN, Brennan PJ, HICPAC. Opportunities and challenges in utilizing electronic hospital records for infection surveillance, prevention, and control. Am J Infect Control. 2008;36(3 Suppl):S37-46.
6. Furuno JP, Schweizer ML, McGregor JC, Perencevich EN. Economics of infection control surveillance. Am J Infect Control. 2008;36:S12-7.
7. Meek J, Tinney SM. Computerize your infection surveillance for improved patient care—and savings. Healthc Financ Manage. 2006;60:108-12.
8. Wright MO, Perencevich EN, Novak C, Hebden JN, Standiford HC, Harris AD. Preliminary assessment of an automated surveillance system for infection control. Infect Control Hosp Epidemiol. 2004;25:325-32.
9. Singer S, Meterko M, Baker L, Gaba D, Falwell A, Rosen A. Workforce perceptions of hospital safety culture: development and validation of the Patient Safety Climate in Healthcare Organizations survey. Health Serv Res. 2007;42:1999-2022.
10. Dykes PC, Hurley A, Cashen M, Bakken S, Duffy ME. Development and psychometric evaluation of the Impact of Health Information Technology (I-HIT) scale. J Am Med Inform Assoc. 2007;14:507-14.
11. Lee F, Teich JM, Spurr CD, Bates DW. Implementation of physician order entry: user satisfaction and self-reported usage patterns. J Am Med Inform Assoc. 1996;3:42-55.
12. Doll WJ, Xia W, Torkzadeh G. A confirmatory factor analysis of the end-user computing satisfaction instrument. MIS Quart. 1994;18:453-61.
13. Curchoe R, Fabrey L, LeBlanc M. The changing role of infection prevention practice as documented by the Certification Board of Infection Control and Epidemiology practice analysis survey. Am J Infect Control. 2008;36:241-9.
14. Dillman D. Mail and telephone surveys: the total design method. New York: Wiley; 1978.
15. Hebden JN, Wright M, Fuss EP, Standiford HC. Leveraging surveillance technology to benefit the practice and profession of infection control. Am J Infect Control. 2008;36:S7-11.
16. Boyd JP. Surveillance, reporting, automation, and interventional epidemiology. Infect Control Hosp Epidemiol. 2003;24:10-2.
17. O’Boyle C, Jackson M, Henly SJ. Staffing requirements for infection control programs in US health care facilities: Delphi project. Am J Infect Control. 2002;30:321-33.
18. Betsy Lehman Center for Patient Safety and Medical Error Reduction, JSI Research and Training Institute Inc, Massachusetts Department of Public Health. Prevention and control of health care–associated infections in Massachusetts. Part 2: findings from complementary research activities. Available at: http://www.mass.gov/Eeohhs2/docs/dph/patient_safety/haipcp_final_report.pdf. Accessed August 21, 2008.
19. Furuno JP, Schweizer ML, McGregor JC, Perencevich EN. Economics of infection control surveillance technology: cost-effective or just cost? Am J Infect Control. 2008;36(3 Suppl):S12-7.
20. Murphy DM. From expert data collectors to interventionists: changing the focus for infection control professionals. Am J Infect Control. 2002;30:120-32.
21. Wisniewski MF, Kieszkowski P, Zagorski BM, Trick WE, Sommer M, Weinstein RA. Development of a clinical data warehouse for hospital infection control. J Am Med Inform Assoc. 2003;10:454-62.
22. Aboelela SW, Stone PW, Larson EL. Effectiveness of bundled behavioral interventions to control healthcare-associated infections: a systematic review of the literature. J Hosp Infect. 2007;66:101-8.
23. Greene LR, Cain TA, Khoury R, Krystofiak SP, Patrick M, Streed S. APIC position paper: the importance of surveillance technologies in the prevention of health care–associated infections. Am J Infect Control. 2009;37:510-3.