J Public Health Manag Pract. 2019 Dec 11;25(1):E11–E16. doi: 10.1097/PHH.0000000000000768

Electronic Health Record Implementation Findings at a Large, Suburban Health and Human Services Department

Kenyon Crowley,1 Anubhuti Mishra,1 Raul Cruz-Cano,1 Robert Gold,1 Dushanka Kleinman,1 Ritu Agarwal1
PMCID: PMC7329137  PMID: 29324567

Objective:

Evaluate an electronic health record (EHR) implementation across a large public health department to better understand and improve implementation effectiveness of EHRs in public health departments.

Design:

A survey based on Consolidated Framework for Implementation Research constructs was administered to staff before and after implementation of an EHR.

Setting:

Large suburban county department of health and human services that provides clinical, behavioral, social, and oral health services.

Participants:

Staff across 4 program areas completed the survey prior to EHR implementation (n = 331, June 2014) and 3 months post-EHR final implementation (n = 229, December 2015).

Intervention:

Electronic health record

Main Outcome Measures:

Constructs were validated using confirmatory factor analysis and included information strengths and information gaps in the current environment; EHR impacts; ease of use; future use intentions; usefulness; knowledge of system; and training. Paired t tests and Wilcoxon signed rank tests of a matched sample were performed to compare the pre-/postrespondent scores.

Results:

A majority of user perceptions and expectations showed a significant (P < .05) decline 3 months postimplementation as compared with the baseline with variation by service area and construct. Staff perceived the EHR to be less useful and more complex, provide fewer benefits, and reduce information access shortly after implementation.

Conclusions:

Electronic health records can benefit public health practices in many ways; however, public health departments will face significant challenges incorporating EHRs, which are typically designed for non–public health settings, into the public health workflow. Electronic health record implementation recommendations for health departments are provided. When implementing an EHR in a public health setting, health departments should provide extensive preimplementation training opportunities, including EHR training tailored to job roles, competencies, and tasks; assess usability and specific capabilities at a more granular level as part of procurement processes and consider using contracting language to facilitate usability, patient safety, and related evaluations to enhance effectiveness and efficiencies and make results public; apply standard terminologies, processes, and data structures across different health department service areas using common public health terminologies; and craft workforce communication campaigns that balance potential expected benefits with realistic expectations.

Keywords: electronic health records, implementation research, public health information technology


A strong public health system is crucially dependent on the availability of high-quality information.1,2 However, many health departments across the United States struggle in their efforts to collect, use, and share essential information.3 Public health delivery systems face multiple and distinct information management challenges.3 The potential for overcoming these challenges through the application of information technologies has been widely acknowledged among stakeholders in the health care ecosystem in general4 and in public health in particular.5–7

Public health information technology encompasses the array of information systems used to store, share, and analyze information supporting the public health mission.8 Electronic health records (EHRs), in particular, have been identified as foundational information technology infrastructure for advancing health care quality and reducing delivery costs.9,10 EHR adoption within and across health departments is needed as widely as it has occurred across other health system channels to ensure a strong public health system that promotes individual and population health.10 Yet, EHR adoption in health departments remains strikingly low, with recent estimates approximating adoption rates of 37% to 42%.11,12

To better understand and improve EHR implementation at health departments, we provide a detailed field evaluation of an EHR system implementation in a large (>500 000 population served) suburban public health and human services department located near Washington, District of Columbia.

Methods

Study design

A two-phase cross-sectional survey was administered to health department staff that would have access to the EHR. Health department supervisors informed their staff about the study; staff were then sent e-mail invitations from the research team to participate in the survey. The first survey was conducted prior to EHR implementation with employees (clinical, case management, and administrative staff) across 4 service areas (behavioral health and crises services, public health services, children youth and family services, and office of the director) situated in 7 facilities. The second survey was administered at the same facilities approximately 3 to 4 months after EHR implementation. The University of Maryland and Maryland Department of Health Institutional Review Boards approved the protocol and informed consent was obtained for all participants.

Measures

Drawing on the Consolidated Framework for Implementation Research (CFIR)13 as a theoretical framework, we designed the survey to measure constructs related to “individuals involved” and the “inner setting.” For the individuals involved construct, we measured several aspects of the “knowledge and beliefs about the intervention,” including information strengths and information gaps in the current environment; EHR impacts; ease of use; future use intentions; usefulness; and knowledge of EHR system. For the inner setting, we included the key construct of training,14 a critical factor in complex change processes. Items were scored on a 7-point Likert scale15 from 1 (Strongly Disagree) to 7 (Strongly Agree). Gender, age, education, experience with computers, experience with EHR systems, organizational role, and service area were captured and were used as controls in the analysis.

Construct conceptual meanings follow. Information strengths measured the characteristics of the information currently available in the system in terms of its comprehensiveness, quality, and accessibility; information gaps measured the challenges perceived by users related to the process of acquiring and using information with the current system(s). EHR impacts measured the potential influence and benefits that EHR usage would deliver. Usefulness measured the perception that system use would aid in accomplishing tasks in an efficient and effective way and contribute to users' productivity. Ease of use assessed the degree to which a person believed that using a particular system would be easy to learn and that he or she could perform tasks with the system with little effort. Future use intentions measured the willingness of a person to adopt, increase use of, and explore the system. Knowledge of system measured the extent to which users perceived that they knew how and why to use the system and received adequate system support. Training was measured by the respondent's satisfaction with the training program.

Measurement validation

Confirmatory factor analysis techniques were used to validate the pre- and postimplementation survey measures. As expected, 8 factors were obtained from both data sets. Reliability analysis revealed high Cronbach α for all the measures: information strengths (α = 0.91); information gaps (α = 0.90); EHR impact (α = 0.94); perceived usefulness (α = 0.97); perceived ease of use (α = 0.94); future use intentions (α = 0.74); and knowledge of the system (α = 0.81).
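Cronbach α can be computed directly from item-level responses. A minimal sketch in Python (illustrative only, not the study's analysis code; the 2-item example data are hypothetical):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Perfectly consistent items yield alpha = 1
scores = np.array([[7, 7], [1, 1], [4, 4], [6, 6]])
print(round(cronbach_alpha(scores), 2))  # → 1.0
```

Values above roughly 0.70, like those reported here, are conventionally taken to indicate acceptable internal consistency.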

Analysis strategy

Users were matched across the 2 phases on the basis of demographic characteristics shown to be associated with technology use,16 namely, age and gender, and their service area, using the greedy algorithm (implemented in SAS v9.4).17 The data do not allow us to link an individual's pre- and postimplementation responses. The greedy algorithm measures the Euclidean distance between observations based on the standardized values of the matching variables. We used the most conservative approach and set this distance equal to zero, thereby requiring a perfect match on all variables. Only the 159 individuals for whom the algorithm found a match were included in the statistical analyses.*
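The matching step can be sketched as follows. This is an illustrative Python reimplementation, not the SAS v9.4 procedure the study used, and the respondent codes below are hypothetical:

```python
import numpy as np

def greedy_match(pre: np.ndarray, post: np.ndarray, max_dist: float = 0.0):
    """Greedily pair pre- and post-survey respondents on standardized
    matching variables (e.g., age group, gender, service area).
    The study's most conservative setting (max_dist = 0) requires a
    perfect match on all variables."""
    both = np.vstack([pre, post]).astype(float)
    mu, sd = both.mean(axis=0), both.std(axis=0)
    sd[sd == 0] = 1.0                      # guard against constant columns
    zpre, zpost = (pre - mu) / sd, (post - mu) / sd
    # All pairwise Euclidean distances, considered smallest first
    dist = np.linalg.norm(zpre[:, None, :] - zpost[None, :, :], axis=2)
    pairs, used_pre, used_post = [], set(), set()
    for i, j in sorted(np.ndindex(*dist.shape), key=lambda ij: dist[ij]):
        if i not in used_pre and j not in used_post and dist[i, j] <= max_dist:
            pairs.append((i, j))
            used_pre.add(i)
            used_post.add(j)
    return pairs

pre = np.array([[3, 0, 1], [5, 1, 2]])   # hypothetical age/gender/area codes
post = np.array([[5, 1, 2], [2, 0, 0]])
print(greedy_match(pre, post))           # only the exact match survives
```

Because each matched pre/post respondent is treated as a pair, unmatched respondents (like the second pre-survey row above) drop out of the paired analyses.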

Paired tests across pre- and postresponses were performed to understand differences in respondent perceptions based on the 8 CFIR constructs; P values of less than .05 were considered statistically significant. A Shapiro-Wilk test was used to determine whether the assumption of data normality was reasonable. Paired t tests were performed to compare the pre-/postscores of the matched individuals when normality was reasonable; Wilcoxon signed rank tests were used otherwise.
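This test-selection logic can be sketched with SciPy. The score vectors below are simulated stand-ins, not study data:

```python
import numpy as np
from scipy import stats

def pre_post_test(pre, post, alpha=0.05):
    """Shapiro-Wilk on the paired differences decides between the
    paired t test (normality reasonable) and the Wilcoxon signed rank test."""
    diffs = np.asarray(post, float) - np.asarray(pre, float)
    if stats.shapiro(diffs).pvalue > alpha:
        return "paired t", stats.ttest_rel(pre, post).pvalue
    return "wilcoxon", stats.wilcoxon(pre, post).pvalue

rng = np.random.default_rng(0)
pre = rng.normal(5.5, 0.8, size=159)         # simulated preimplementation scores
post = pre - rng.normal(0.9, 0.4, size=159)  # simulated postimplementation decline
test_name, p = pre_post_test(pre, post)
print(test_name, p < 0.05)
```

Note that the choice of test applies per construct: a single nonnormal construct (flagged with a superscript "a" in Table 2) is analyzed with the signed rank test while the rest use the paired t test.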

We also conducted regression analyses after pooling data from both surveys. A total of 461 of 560 respondents had complete, clean records for all relevant variables and were thus included in the regression model. We used each research variable as the dependent variable and included Wave (0 for preimplementation, 1 for postimplementation) as the independent variable of interest, with all the demographic variables as controls (age, gender, computer experience, EHR experience, role, and service area).
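A sketch of this pooled regression using ordinary least squares via NumPy; the Wave indicator and the single age control shown are simulated stand-ins for the study's variables:

```python
import numpy as np

def wave_effect(y, wave, controls):
    """OLS of a construct score on a Wave dummy (0 = pre, 1 = post)
    plus demographic controls; returns the Wave coefficient."""
    X = np.column_stack([np.ones_like(wave, dtype=float), wave, controls])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]                  # coefficient on Wave

rng = np.random.default_rng(1)
n = 461                             # pooled complete records
wave = rng.integers(0, 2, n)        # simulated: 0 pre, 1 post
age = rng.integers(1, 8, n)         # simulated age-band control
# Simulated usefulness score that drops by about 1 point postimplementation
y = 5.5 - 1.0 * wave + 0.05 * age + rng.normal(0, 0.5, n)
print(round(wave_effect(y, wave, np.column_stack([age])), 1))
```

A negative Wave coefficient corresponds to the pre-to-post decline reported in the Results.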

Results

Descriptive statistics

We obtained a total of 331 responses in the preimplementation survey (response rate = 71%) and 229 responses in the postimplementation survey (response rate = 42%), with an overall response rate of 55%. Survey respondent characteristics are summarized in Table 1.

TABLE 1. Pre– and Post–Electronic Health Record Implementation Survey Respondent Attributes (Pre: June 2014, Post: December 2015).

Respondents, %
Preimplementation Postimplementation
Age, y
 21-30 2.4 4.4
 31-40 8.6 13.2
 41-50 22.3 20.6
 51-60 41.6 44.1
 61-70 22.3 16.7
 71-80 2.1 1.0
 >80 0.7 0.0
Sex
 Women 76.1 78.5
 Men 23.9 21.5
Education
 Less than high school/GED 0.3 0.0
 High school/GED 2.8 2.5
 Associate degree 5.6 5.9
 Some college 8.7 6.9
 Undergraduate/Bachelor's degree 23.3 21.2
 Master's degree 50.3 56.7
 MD/PhD/JD 9.0 6.9
Computer experience
 Extensive 22.1 31.9
 Good 44.8 46.1
 Average 31.0 20.9
 Use them very infrequently 2.1 1.0
Electronic health record experience
 Extensive 7.2 12.4
 Good 19.9 33.0
 Average 14.0 24.7
 Know about them but limited use 24.0 17.0
 Never used them 34.9 12.9
Service area
 Behavioral health and crisis services 47.6 40.2
 Public health services 37.8 46.4
 Other 14.6 13.4

More than 80% of respondents were between 41 and 70 years of age, with most between 51 and 60 years of age. More than 80% of respondents had attained a bachelor's degree or higher, and more than 50% had attained a master's or terminal degree. Respondents generally had prior experience with computers, but most had limited experience with or had never used EHR systems. The majority of survey respondents were from behavioral health and crisis services (pre = 47.6%, post = 40.2%) or public health services (pre = 37.8%, post = 46.4%).

Matched pairs pre-/postanalysis

Mean CFIR construct scores of survey respondents by service area, based on the matched (age group, gender, and service area) sample (N = 159) responses at the pre- and postimplementation time frames, are plotted in the Figure. P values based on the paired t test or signed rank test results of matched samples CFIR constructs are provided in Table 2. Matching on additional demographic variables yielded a similar pattern of results. The results indicate a statistically significant (P < .05) decline in users' perceptions about the EHR system from their preimplementation expectations to the postimplementation experience. Information gaps increased from pre- to postimplementation. Statistically significant differences in pre- and postimplementation respondent mean scores were found for every CFIR construct variable, except for knowledge of system (P > .05).

TABLE 2. Matched Sample of Pre– and Post–Electronic Health Record Implementation Survey Respondents, P Values for Survey Constructs.

Matching Variables: Age Group, Gender, and Service Area All (N = 159) Service Area = Behavioral Health and Crisis Services (N = 73) Service Area = Public Health Services (N = 72)
Future use intentions <.001 <.001 .005
Information gaps <.001 .009 .008
Information strengths <.001a .002a .007a
Electronic health record impacts <.001 <.001 <.001
Perceived usefulness <.001 <.001a <.001
Perceived ease of use <.001 <.001 <.001
Knowledge of system .016 .181 .086
Training <.001 <.001 <.001

All values in the table are P values. aP value is based on the signed rank test instead of the paired t test.

FIGURE.

Pre– and Post–EHR Implementation Survey Responses by Construct and Service Area (Pre: June 2014, Post: December 2015)

Abbreviation: EHR, electronic health record.

Multivariate regression analysis

Certain respondent attributes were significantly (P < .05) correlated with respondent CFIR construct score variance (ie, individual differences influenced their perceptions of the EHR), including age and information gaps (+), ease of use (−); gender (female) and information strengths (+); EHR experience and EHR impacts (−), usefulness (−), ease of use (−), future use intentions (−), information strengths (−), and knowledge of system (−). These results indicate that users with previous EHR experience had a less favorable assessment of the new EHR than users without prior EHR experience. Computer experience, role, and service area attributes were not significantly correlated with respondent's CFIR construct score variance.

Discussion

Results indicate an overall discontent with the new technology at approximately 3 months postimplementation relative to preimplementation perceptions. While these findings are generally consistent with prior work describing the challenges health services organization staff report early in EHR implementation, such as functionality gaps,18 training deficits,19 and usability challenges,20 this study highlights specific nuances that public health department EHR implementers should consider.

The behavioral health and crisis services area reported greater dissatisfaction with the EHR than the public health services area. The added complexity of using and sharing behavioral data may be a causal factor. Discussions with health department staff suggested that increased documentation burden and interface navigation challenges yielded frustration. Variability in processes and terminology across service areas and groups within the health department created issues in learning. Staff remarked that extracting useful information for reporting in the new EHR was especially challenging.

Results suggest that this public health EHR implementation needed to be further contextualized to the complex health department environment. Users believed preimplementation that the system would be useful and easy to use but reported negative beliefs postimplementation. One reason for this decline may be that the specialized data processing, policy, and information management needs of a health department, which may include medical, behavioral, dental, and specialized programmatic data, are not readily met by the majority of commercial off-the-shelf software. Electronic health records designed for inpatient or ambulatory settings will, in general, need a great deal of customization to operate effectively in a health department.

The usability and usefulness issues survey respondents report suggest that procurement processes should pay more attention to the usability of products and include rigorous requirements for specific information management needs (eg, data reporting, interoperability). Stakeholders have highlighted the effects of poor EHR usability on patient safety21 and efficient use,22 while health care organizations and researchers23 have decried the difficulty of rigorously assessing EHR usability due to restrictive vendor license agreements. The Office of the National Coordinator for Health IT has recently released a contracts guide24 that provides recommended language for EHR purchasers, including for usability testing. We recommend that health departments review this language, including negotiating a “carve out”(p13) to permit certain types of information sharing that may support usability studies. Ideally, usability assessments should be completed before selecting a vendor.

The EHR system was anticipated to be far more impactful than postimplementation respondents found it was in practice. When an information system does not meet preimplementation expectations, the perceived benefit is negatively impacted and downstream effects on integration processes can occur. These results underscore the need to carefully craft communication campaigns that balance potential expected benefits with realistic expectations of the challenges that will be faced when implementing public health EHRs.

These study results further reinforce the importance of carefully crafted training. The most effective training is tailored to the distinct roles of users and closely simulates actual work tasks, without being so constrained that it limits an understanding of the full system's function.25 The trainers in this study, who had not previously implemented a public health–focused EHR, did not customize training for all roles. Health departments may benefit from EHR implementation resources tailored to public health's unique needs.

Limitations

The results represent a single health department's EHR implementation. Health departments, as well as EHR vendors, can vary along many dimensions. Even so, there are similarities between the information management tasks of a health department and those supported by most EHRs that provide generalizability. The data do not allow identifying a specific individual's pre- and postimplementation responses, so a matched sample approach was used. Also, the response rate in the postimplementation phase dropped precipitously, perhaps due to EHR fatigue, potentially introducing response bias. Finally, these results are based on approximately 3 months post-EHR implementation, whereas it typically takes years for the benefits of an EHR system to be fully realized.17,26 Health department staff involved in the current study have indicated that utility and satisfaction have significantly improved with time, even facilitating new initiatives requiring data coordination and management. Evaluation of public health EHRs longer into the implementation cycle (>3 months) is needed.

Implications for Policy & Practice

Our analysis demonstrated the pre– and post–early implementation factors a large, suburban health department experienced when implementing an EHR. The findings suggest that health departments should conduct the following activities when implementing EHR systems in a public health setting.

  • Provide extensive preimplementation training opportunities, including EHR training tailored to job roles, competencies, and tasks.

  • Assess usability and specific capabilities at a more granular level as part of procurement processes and consider using contracting language to facilitate usability, patient safety, and related evaluations to enhance effectiveness and efficiencies and make results public.

  • Apply standard terminologies, processes, and data structures across different health department service areas using common public health terminologies.

  • Craft workforce communication campaigns that balance potential expected benefits with realistic expectations.

*

We repeated the matching using additional demographic variables: EHR Experience and Computer Experience. Although these additional variables reduced the size of the sample (N = 152 when only the first variable was added, and N = 126 when both were added), results were similar to the main analysis.

The authors declare no conflicts of interest.

References

  • 1.Honoré PA, Wright D, Berwick DM, et al. Creating a framework for getting quality into the public health system. Health Aff (Millwood). 2011;30(4):737–745.
  • 2.Lumpkin JR, Magnuson JA. History and significance of information systems and public health. In: Magnuson JA, Lumpkin JR, eds. Public Health Informatics and Information Systems. London, England: Springer; 2013:19–36.
  • 3.McCullough JM, Goodin K. Clinical data systems to support public health practice: a national survey of software and storage systems among local health departments. J Public Health Manag Pract. 2016;22(suppl 6):S18–S26.
  • 4.Detmer D. Building the national health information infrastructure for personal health, health care services, public health, and research [published online ahead of print January 6, 2003]. BMC Med Inform Decis Mak. 2003;3:1.
  • 5.Calman N, Hauser D, Lurio J, Wu WY, Pichardo M. Strengthening public health and primary care collaboration through electronic health records. Am J Public Health. 2012;102(11):e13–e18.
  • 6.Birkhead GS, Klompas M, Shah NR. Public health surveillance using electronic health records: rising potential to advance public health. Front Public Health Serv Syst Res. 2015;4(5):25–32.
  • 7.Massoudi BL, Goodman KW, Gotham IJ, et al. An informatics agenda for public health: summarized recommendations from the 2011 AMIA PHI Conference. J Am Med Inform Assoc. 2012;19(5):688–695.
  • 8.Crowley K, Gold RS, Bandi S, Agarwal R. The public health information technology maturity index: an approach to evaluating the adoption and use of public health information technology. Front Public Health Serv Syst Res. 2016;5(2):26–33.
  • 9.Adler-Milstein J, Everson J, Lee S. EHR adoption and hospital performance: time-related effects. Health Serv Res. 2015;50(6):1751–1771.
  • 10.Friedman DJ, Parrish RG, Ross DA. Electronic health records and US public health: current realities and future promise. Am J Public Health. 2013;103(9):1560–1567.
  • 11.Williams KS, Shah GH. Electronic health records and meaningful use in local health departments: updates from the 2015 NACCHO informatics assessment survey. J Public Health Manag Pract. 2016;22(suppl 6):S27–S33.
  • 12.NACCHO. 2016 national profile of local health departments. http://nacchoprofilestudy.org/. Accessed February 20, 2017.
  • 13.Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
  • 14.Wisdom JP, Chor KH, Hoagwood KE, Horwitz SM. Innovation adoption: a review of theories and constructs. Adm Policy Ment Health. 2013;41(4):480–502.
  • 15.Likert R. The Human Organization: Its Management and Values. New York, NY: McGraw-Hill; 1967.
  • 16.Talukder M. Factors affecting the adoption of technological innovation by individual employees: an Australian study. Procedia Soc Behav Sci. 2014;40(2):52–57.
  • 17.Black PE. Greedy algorithm. In: Dictionary of Algorithms and Data Structures. US National Institute of Standards and Technology website. https://xlinux.nist.gov/dads/HTML/greedyalgo.html. Published 2010. Accessed December 10, 2016.
  • 18.Masselink L, Erikson C. Perceptions of Electronic Health Records Effects on Staffing, Workflow, & Productivity in Community Health Centers. Washington, DC: GW Health Workforce Research Center; 2016.
  • 19.Rathert C, Porter TH, Mittler JN, Fleig-Palmer M. Seven years after meaningful use: physicians' and nurses' experiences with electronic health records [published online ahead of print June 13, 2017]. Health Care Manage Rev. 2017. doi:10.1097/HMR.0000000000000168.
  • 20.Friedberg MW, Chen PG, Van Busum KR, et al. Factors affecting physician professional satisfaction and their implications for patient care, health systems, and health policy. Rand Health Q. 2014;3(4):1.
  • 21.Sittig DF, Singh H. Electronic health records and national patient-safety goals. N Engl J Med. 2012;367(19):1854–1860.
  • 22.Blijleven V, Koelemeijer K, Wetzels M, Jaspers M. Workarounds emerging from electronic health record system usage: consequences for patient safety, effectiveness of care, and efficiency of care. JMIR Hum Factors. 2017;4(4):e27.
  • 23.Shneiderman B. Tragic errors. Interactions. 2011;18(6):60.
  • 24.The Office of the National Coordinator for Health Information Technology. EHR Contracts Untangled: Selecting Wisely, Negotiating Terms, and Understanding the Fine Print. Washington, DC: The Office of the National Coordinator for Health Information Technology; 2016.
  • 25.Cresswell KM, Bates DW, Sheikh A. Ten key considerations for the successful implementation and adoption of large-scale health information technology. J Am Med Inform Assoc. 2013;20(e1):e9–e13.
  • 26.King J, Patel V, Jamoom EW, Furukawa MF. Clinical benefits of electronic health record use: national findings. Health Serv Res. 2014;49(1, pt 2):392–404.

Articles from Journal of Public Health Management and Practice are provided here courtesy of Wolters Kluwer Health
