Health Services Research and Managerial Epidemiology
2021 Apr 30;8:23333928211012226. doi: 10.1177/23333928211012226

Public Health Informatics, Human Factors and the End-Users

Sarah D Matthews 1, Michael D Proctor 1
PMCID: PMC8107934  PMID: 33997119

There is an unspoken assumption in public health informatics that, “If you build it, they will come.” In this commentary, we argue that building it is not enough. Without an end-user focus in which human factors issues are identified and resolved prior to implementation, “they may come, but they won’t stay!” We argue that to engage public health professionals with innovative new technology, one must continually ask during the development process, “Who are we building this product for, and do we have the right information to back up our theories on implementation and use?” With the myriad of public health informatics tools introduced amid the COVID-19 pandemic, there are many choices. For those languishing, we note that this question may not have been sufficiently pursued, resulting in situations where “they may come, but they won’t stay!”

Public Health Informatics Evolution and COVID-19

Over 2 decades ago, Yasnoff et al.1 defined the discipline of public health informatics (PHI) as the “systematic application of information and computer science and technology to public health practice, research and learning.”1 The prevailing impression at that time was that all stakeholders must be engaged in coordinated activities related to PHI, but the public health workforce was deemed not to have the training and experience to make decisions about IT.2 If public health professionals were left out of PHI decision making, is it any surprise that this practice resulted in a high risk of technology failure in some cases and slow adoption in others?2 Besides bringing public health professionals into the decision-making circle, what can be done to help PHI flourish?

In the last 2 decades we have seen a myriad of innovative technology applied to public health practice in areas such as disease surveillance, immunization registries, electronic health record integration, and vital statistics. Public health organizations such as the Association of State and Territorial Health Officials (ASTHO), the Public Health Informatics Institute (PHII) and the Centers for Disease Control and Prevention (CDC) have recognized that the decentralized United States public health system is enormous in scale and immense in diversity, making it difficult to implement innovative technology successfully.3-5 Complicating successful implementation are informatics factors and organizational culture barriers that prevent evidence-based public health from being implemented in practice.2,6

To help public health leaders understand these digital technologies and make informed decisions on technology integration, ASTHO, PHII and CDC have created several reports and frameworks.7-9 These documents are filled with standards; discussion of security, confidentiality and privacy; system architectures and infrastructure; training; and workforce development. They detail the technologies and their implementations, with recommendations such as overhauling computer systems, changing operability, and upgrading hardware and software. These guides and frameworks primarily define the workforce in the domain of technical skills. One example is CDC’s roadmap for public health informatics and data modernization, which identifies the need for a future workforce with stronger skills in data science, analytics, modeling, and informatics.

Clearly improved technical skills are a necessary condition, but we argue they are insufficient to the challenge at hand. Kaplan and Harris-Salamone10 report that across industries (including healthcare) generic IT projects have a failure rate of at least 40%. These failures are largely attributed to budget overruns, timeline overruns, under-delivery of value, and termination of projects before completion. They also cite the 3 major reasons for project success: user involvement, proactive executive management support, and a clear requirements statement.10 Reviewing the CDC’s roadmap reveals insufficient emphasis on, if not outright omission of, important human factors concepts such as the public health workforce’s perceptions, attitudes, and motivation for accepting and using new technology.7

Using a transdisciplinary approach, we studied public health professionals across the United States to understand their technology use behavior and their health behavior toward an agent-based, online, personalized intelligent tutoring system.11 We believe our findings can be extrapolated and, applied alongside prior evidence-based interventions, can increase the success and retention of innovative technology projects in public health practice.

Human Factors Research in Public Health Practice

In our study, we reaffirmed prior findings that the biggest barriers for users were time and technology issues such as firewalls blocking cloud-based applications, slow loading, system compatibility, specific state requirements, and lack of interoperability across devices.10,11 But by combining the theoretical frameworks of the Public Health Services Health Belief Model (HBM)12 and Davis’ Technology Acceptance Model (TAM)13 we also discovered less emphasized insights.

HBM hypothesizes that health-related action depends upon 3 factors occurring simultaneously: 1. the existence of sufficient motivation to make the health issue relevant, 2. the belief that one is susceptible to a serious health problem or the sequelae of that illness or condition (i.e., perceived threat), and 3. the belief that following the health recommendation/regimen would be beneficial in reducing the perceived threat.14,15 HBM is composed of 4 constructs: perceived susceptibility, perceived severity/seriousness, perceived benefits of taking action, and perceived barriers to taking action. These constructs are applied to the individual’s cues to action.14,15 Our results revealed that public health professionals were sufficiently motivated by the health-related threat posed; they believed that their community was susceptible to a serious health problem and its sequelae, and that using the technology would be beneficial in reducing the threat of illness to the community. But the most influential construct in the HBM was cues to action. Thus, when developing new technology, developers must ensure that public health professionals believe that use of the new technology will improve their confidence in the work they do. Technology influencers were others from the public health domain, including colleagues. Finally, technology must be taught in a self-paced environment to achieve success.

TAM is another important technique, widely used in industries outside health care, that explains roughly 30%-40% of the variance in IT acceptance.16,17 TAM is composed of 4 concepts: attitude, perceived ease of use, perceived usefulness, and intention to use. Of these, attitude and perceived ease of use were found to be the most influential for actual use. End users’ attitude is measured by whether they think it is a good idea to use the technology, like the idea of using it, and find using it a pleasant experience. The most influential factor toward use of new technology in our study was perceived ease of use. Perceived ease of use is measured by whether the technology is easy to operate, does what is expected, is clear and understandable, is flexible to interact with, and is easy to become skillful at using.
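As a concrete illustration of how TAM constructs might be quantified from survey data, the sketch below averages Likert-scale item responses into per-construct scores. This is a hypothetical example: the item groupings, the 1-7 scale, and the simple averaging are illustrative assumptions, not the instrument or scoring used in our study.

```python
# Hypothetical sketch: averaging Likert-scale survey items into TAM
# construct scores. Item groupings and responses are illustrative only.

from statistics import mean

# 1-7 Likert responses from one respondent, grouped by TAM construct
responses = {
    "perceived_ease_of_use": [6, 5, 6, 7],  # easy to operate, clear, flexible, easy to master
    "perceived_usefulness": [5, 4, 5],
    "attitude": [6, 6, 5],
    "intention_to_use": [5, 5],
}

def construct_scores(items: dict[str, list[int]]) -> dict[str, float]:
    """Average each construct's items into a single score."""
    return {construct: round(mean(vals), 2) for construct, vals in items.items()}

scores = construct_scores(responses)
print(scores["perceived_ease_of_use"])  # 6.0
```

In practice, validated instruments also check internal consistency of each construct's items before averaging; this sketch shows only the scoring step.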

What does this mean, and how do we apply it? Consider 2 COVID-19 case management tool examples: one using the MITRE Sara Alert product and the other Microsoft ARIAS/Dynamics. MITRE touts the tool’s development in partnership with key public health organizations, with much focus on its technical and functional aspects.18 Currently, 8 States have implemented the system to help with contact tracing efforts for COVID-19, reducing the staffing and resources needed to conduct active monitoring.9 But there is a considerable workload burden in enrolling contacts, directly monitoring non-participatory contacts, and following up on non-responders, as well as duplicative data entry into existing State data systems.19 Customization is limited, which creates operational issues across States. Cases are purged 2 weeks after isolation and quarantine orders are closed, forcing States to develop a process to export data to retain for historical metrics.18

Microsoft’s ARIAS/Dynamics has been implemented by 9 States.9 Oregon Health acknowledges in its contact tracing training that the software requires technical skills and access to equipment. Additionally, because the system is offered in English only, Oregon is cocreating a system that serves other demographics in the State.20 The system also requires the Firefox or Chrome browser, as it is not fully supported by Internet Explorer or Safari, the 2 browsers most frequently used in governmental public health.20 In the limited documentation on these systems, there is no mention of technology acceptance, usability, or ease of use. There is no published literature on how suitable end-users feel the technology is for their jobs. This lack of human factors research leads one to believe that the implementation of these novel technologies is reactionary, and that after the COVID-19 response the investment in these automated systems will be left to waste like so many other IT projects.

IT projects in the public health domain cannot continue to slight human factors. Project teams should be proactive, focusing not only on the technology aspects of the project but also applying the aforementioned techniques to guide human factors implementation. Public health informatics leaders cannot continue to account for public health professionals only in the workforce development sections of their implementation agendas. These end-users must be included in structured research prior to implementation. Human factors research theories and concepts must be included in the frameworks and guides; otherwise these innovative approaches will likely continue the abysmally high percentage of technology failures.

Although we critique the current process of public health informatics implementation, we believe that the myriad of projects introduced amid the COVID-19 response can be sustained and accepted after the response. We recommend: 1. collaboration among developers, public health informatics leaders, scientists (including human factors researchers), and the public health end-user; 2. collecting data with theoretically informed and empirically validated tools on end-users’ perceptions, attitudes, and motivation for using the new technology and their acceptance of it;11 3. documentation of enhancements, fixes, barriers, and best practices for use of the technology during implementation; and 4. review and analysis of data to help create clear technology requirements statements for future projects.

Author Biographies

Sarah D. Matthews, PhD, is a graduate of the UCF Interdisciplinary Modeling and Simulation Program and a seasoned applied epidemiologist in governmental public health. Dr. Matthews leads her consulting agency, Health Communications Consultants, Inc. which focuses on finding innovative solutions to complex health service problems that improve lives, influence change and advocate for equity. Email: sarah.matthews@healthcommunicationsconsultants.com

Michael D. Proctor, LTC (Retired), PhD, IE, CMSP, is an associate professor in the University of Central Florida College of Engineering and Computer Science, Industrial Engineering and Management Systems Department. Dr. Proctor is also a co-founding faculty member of the UCF Interdisciplinary Modeling and Simulation program as well as Director of the Lockheed-Martin Synthetic Environments Learning Laboratory. His research interests include education and decision support through ITS, gaming, and live, virtual, and constructive models and simulations. Email: Michael.Proctor@ucf.edu

Footnotes

Declaration of Conflicting Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: Article processing charges were provided in part by the UCF College of Graduate Studies Open Access Publishing Fund.

ORCID iD: Sarah D. Matthews https://orcid.org/0000-0001-8372-1185

References

1. Yasnoff WA, O’Carroll PW, Koo D, Linkins RW, Kilbourne EM. Public health informatics: improving and transforming public health in the information age. J Public Health Manag Pract. 2000;6(6):67–75.
2. Yasnoff WA, Overhage JM, Humphreys BL, LaVenture M. A national agenda for public health informatics: summarized recommendations from the 2001 AMIA spring congress. J Am Med Inform Assoc. 2001;8(6):535–545.
3. Beck AJ, Boulton ML, Coronado F. Enumeration of the governmental public health workforce, 2014. Am J Prev Med. 2014;47(5 Suppl 3):S306–S313.
4. Hilliard TM, Boulton ML. Public health workforce research in review: a 25-year retrospective. Am J Prev Med. 2012;42(5):S17–S28.
5. Tao D, Evashwick CJ, Grivna M, Harrison R. Educating the public health workforce: a scoping review. Front Public Health. 2018;6:27.
6. Novick LF. JPHMP and The Guide to Community Preventive Services. LWW; 2020.
7. Data modernization initiative. CDC.gov. https://www.cdc.gov/surveillance/surveillance-data-strategies/data-ITtransformation.Html. Published 2020. Updated April 9, 2021. Accessed August 28, 2020.
8. ASTHO. Issue Guide: COVID-19 Case Investigation and Contact Tracing Considerations for Using Digital Technologies. Association of State and Territorial Health Officials B.Next; 2020.
9. PHII. Digital Tools to Support Contact Tracing: Tool Assessment Report. Public Health Informatics Institute; 2020.
10. Kaplan B, Harris-Salamone KD. Health IT success and failure: recommendations from literature and an AMIA workshop. J Am Med Inform Assoc. 2009;16(3):291–299.
11. Matthews SD, Proctor MD. Can public health workforce competency and capacity be built through an agent-based online, personalized intelligent tutoring system? Educ Technol Soc. 2021;24(1):29–43.
12. Rosenstock IM. Historical origins of the health belief model. Health Educ Behav. 1974;2(4):328–335.
13. Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly. 1989;13(3):319–340.
14. Rosenstock I, Strecher V, Becker M. Social learning theory and the health belief model. Health Educ Q. 1988;15(2):175–183.
15. Communication Studies Theories: the Health Belief Model. University of Twente. https://www.utwente.nl/en/bms/communication-theories/. Published 2017. Updated January 5, 2019. Accessed September 19, 2018.
16. Holden R, Karsh B-T. The technology acceptance model: its past and its future in health care. J Biomed Inform. 2008;43(1):159–172.
17. Legris P, Ingham J, Collerette P. Why do people use information technology? A critical review of the technology acceptance model. Inf Manag. 2003;40(3):191–204.
18. Secure monitoring and reporting for public health: Sara Alert. MITRE. https://saraalert.org/. Published 2020. Updated 2021. Accessed September 11, 2020.
19. Krueger A, Gunn JK, Watson J, et al. Characteristics and outcomes of contacts of COVID-19 patients monitored using an automated symptom monitoring tool, Maine, May-June 2020. MMWR Morb Mortal Wkly Rep. 2020;69(31):1026.
20. Oregon Health. ARIAS system training. https://www.oregon.gov/oha/covid19/ContactTracing/ARIAS-Training-TrainingInstructions.pdf. Published August 24, 2020. Accessed September 11, 2020.

Articles from Health Services Research and Managerial Epidemiology are provided here courtesy of SAGE Publications
