SYNOPSIS
Objectives
Epidemiologists play critical roles in public health. However, until recently, no formal standards existed for epidemiology practice. In 2005, the Centers for Disease Control and Prevention and Council of State and Territorial Epidemiologists drafted Competencies for Applied Epidemiologists in Governmental Public Health Agencies (AECs) that provide a foundation for expectations and training programs for three tiers of practice. We characterized the Virginia Department of Health (VDH) epidemiology workforce and assessed its baseline applied epidemiology competency by using these competencies.
Methods
Epidemiologists representing multiple divisions developed an Internet survey based on the AECs. Staff who met the definition of an epidemiologist were requested to complete the survey. Within eight skill domains, specific competencies were listed. For each competency, frequency of performance, confidence in performing, and need for training were measured using Likert scales. Differences among tier levels were assessed using analysis of variance.
Results
Eighty-eight people from 10 program areas responded and were included in the analysis. Median epidemiology experience was four years, with 52% having completed formal training. Respondents self-identified as Tier 1/entry-level (38%), Tier 2/mid-level (47%), or Tier 3/senior-level (15%) epidemiologists. Compared with lower tiers, Tier 3 epidemiologists more frequently performed financial or operational planning and management (p=0.023) and communication activities (p=0.018) and had higher confidence in assessment and analysis (p<0.001). Overall, training needs were highest for assessment/analysis and basic public health sciences skills.
Conclusions
VDH has a robust epidemiology workforce with varying levels of experience. Frequency and confidence in performing competencies varied by tier of practice. VDH plans to use these results and the AECs to target staff training activities.
Epidemiology, considered the core science of public health, has been practiced since the field's inception. Key responsibilities of epidemiologists include public health surveillance, investigation, research, planning, and evaluation with respect to the distribution of disease and health-related events and conditions among populations. The assumption is often made that those who conduct activities associated with epidemiology possess the skills of an epidemiologist. Until recently, however, no standards existed to measure the skills of those working as field epidemiologists. Epidemiology competencies should be established and used to assess the competency of those who work in governmental public health agencies. Training aimed at maximizing the epidemiology competency of the applied public health workforce can then be delivered.
The topic of competence among the public health workforce has been addressed by multiple authors and organizations during the past decade, with the majority of studies focusing on the workforce as a whole.1,2 Job titles are frequently an inaccurate reflection of the educational preparation or the work performed by an epidemiologist.3,4 The Epidemiology Capacity Assessment Survey conducted by the Council of State and Territorial Epidemiologists (CSTE) revealed that as of 2006, a total of 46% of epidemiologists working in state and territorial health departments were not academically trained in epidemiology.3 Although competencies are critical for the whole workforce, development of function-specific competencies and their use in assessments will help ensure that the people performing those functions possess the necessary skills and not just the title.
In 2006, the Centers for Disease Control and Prevention (CDC) and CSTE released the Competencies for Applied Epidemiologists in Governmental Public Health Agencies (AECs), a report partly designed to define the expected skills of applied epidemiology practitioners.4 The CDC/CSTE working group established competency and subcompetency statements centered on eight skill domains first identified by the Council on Linkages Between Academia and Public Health Practice as core competencies for the effective practice of epidemiology in a government public health setting.5 The eight skill domains are: (1) assessment and analysis, (2) basic public health sciences, (3) communication, (4) community dimensions of practice, (5) cultural competency, (6) financial and operational planning and management, (7) leadership and systems thinking skills, and (8) policy development skills. For each skill domain or core competency, multiple competency statements were subsequently developed to clarify specific skills and knowledge.
The AECs provide public health entities (e.g., state, local, and tribal health departments) with a foundation for assessing the competencies of their epidemiology workforce and for identifying where gaps exist. The results of these assessments should be viewed as an opportunity to close gaps, not merely highlight them. Other authors have reinforced the value of competency through their definition, “a cluster of related knowledge, skills, and attitudes that affects a major part of one's job (a role or responsibility), that correlates with performance on the job, that can be measured against well-accepted standards, and that can be improved via training and development.”6,7 As such, the use of these epidemiology-specific competencies should increase the practitioners' skills when combined with data-driven training.
The Virginia Department of Health (VDH) embraced the release of the AECs as an opportunity to conduct a baseline assessment of its epidemiology workforce. The primary objectives of the assessment were to (1) characterize the VDH epidemiology workforce in terms of capacity, experience, education, frequency of performing, and confidence in core competencies and (2) use the results to plan education and training opportunities based on the self-identified needs of the workforce relative to this standardized skill set.
METHODS
Survey development
In April 2006, eight epidemiologists met to develop objectives and a survey tool for an assessment of the AECs within the VDH workforce. The epidemiologists represented multiple VDH program areas (Division of Surveillance and Investigation, Division of Disease Prevention, Division of Family Health Services, and Emergency Preparedness and Response) and positions (division or program director, regional epidemiologist, district epidemiologist, and CDC fellow).
The survey instrument was closely based on skill domains and competency statements detailed in the CDC/CSTE AECs draft report.4 Each competency was reviewed, and those most relevant to work activities at VDH were selected for inclusion. For every competency within each of the eight skill domains, four questions were asked. Participants were first asked whether their position involved the activity stated in the competency. If yes, the respondent was asked to answer the remaining three questions, beginning with the frequency with which they conducted the activity on a Likert scale of 1 to 5, with 5 being “very frequent.” Next, respondents were asked, “What is your confidence level in performing this activity?” They self-assessed confidence on a similar scale of 1 (low confidence) to 5 (high confidence). Lastly, respondents were asked to rate their interest in additional training for the activity on a scale of 1 (low interest) to 5 (high interest).
Additional survey questions asked for the respondents' highest level of education, clinical degrees and licensure, highest level of epidemiology training, years of experience in epidemiology, program area, and tier. Four tiers of epidemiologists were identified by CDC/CSTE (Figure 1). Participants selected the tier that was most appropriate to their level of practice after being provided with examples of functional responsibility and educational and experiential criteria.
Figure 1.
Centers for Disease Control and Prevention and Council of State and Territorial Epidemiologists applied epidemiology tier levels
Centers for Disease Control and Prevention (US) and Council of State and Territorial Epidemiologists. Competencies for applied epidemiologists in governmental public health agencies (AECs) [cited 2007 Jan 31]. Available from: URL: http://www.cdc.gov/od/owcd/cdd/aec or http://www.cste.org/competencies.asp
The survey was pilot-tested in June 2006 with a sample (n=10) of CDC Epidemic Intelligence Service Officers and CSTE fellows. Results from the pilot were used to refine survey questions.
Survey administration
Support for the assessment was garnered through announcements at Virginia's Emergency Preparedness and Response (EP&R) annual meeting, a monthly statewide epidemiology conference call, and at a meeting of district health directors. Additionally, the state epidemiologist sent an e-mail describing the survey to four distribution lists that included central office division directors, nurse managers, EP&R epidemiologists, and health directors for Virginia's 35 health districts. In August 2006, the survey was posted on an Internet survey site, and participants were provided with an online address. Division directors, health directors, and nurse managers were asked to forward information about the survey to colleagues under their supervision who act as epidemiologists. An epidemiologist was defined as “an investigator who studies the occurrence of disease or other health-related conditions or events in defined populations.”8 The survey was available to respondents from August 14–25, 2006. Personal identifiers such as name and position were not included in the survey.
Analysis
Data from the online survey were downloaded into a Microsoft® Excel® spreadsheet. All analyses were conducted using SPSS9 and SAS.10 Descriptive analyses (e.g., frequencies and median values) were performed. For each of the eight skill domains, competency statements were listed that allowed for an assessment of specific activities related to the overarching skill domain. Respondents who performed ≥75% of all competencies within a single skill domain were classified as performing that skill.
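A minimal sketch of this classification rule is shown below, assuming a hypothetical long-format extract of the survey data (one row per respondent and competency); the column names, sample values, and pandas workflow are illustrative assumptions, not the authors' actual dataset or code.

```python
import pandas as pd

# Hypothetical long-format survey extract: one row per respondent and competency.
# 'performs' is 1 if the respondent reported the activity as part of their position.
responses = pd.DataFrame({
    "respondent": [1, 1, 1, 2, 2, 2],
    "domain": ["assessment_analysis"] * 6,
    "competency": ["c1", "c2", "c3", "c1", "c2", "c3"],
    "performs": [1, 1, 0, 1, 1, 1],
})

# Share of competencies performed within each skill domain, per respondent.
share = responses.groupby(["respondent", "domain"])["performs"].mean()

# Classify a respondent as performing the skill domain if they perform
# at least 75% of its listed competencies.
performs_domain = (share >= 0.75).rename("performs_domain").reset_index()
print(performs_domain)
```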
Because distributions were normal, a mean score was calculated for each of the skill domains by taking the arithmetic mean of the five-point Likert scales for respondents who performed that skill. Mean scores were calculated for the frequency of performing, confidence in, and training need for, the total sample and for each tier of epidemiology practice within each skill domain. Differences between mean scores were assessed using analysis of variance. Differences between the median number of years of experience by tier group were assessed using the Brown-Mood test.11
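The scoring and comparisons described above might be reproduced along the following lines. This is a sketch under assumed column names and fabricated illustrative values, using SciPy's one-way ANOVA and Mood's median test (the Brown-Mood procedure) in place of the authors' SPSS and SAS analyses.

```python
import pandas as pd
from scipy import stats

# Hypothetical per-respondent summary: mean 5-point Likert score for one skill
# domain (among those who perform it), tier of practice, and years of experience.
df = pd.DataFrame({
    "tier": ["1", "1", "2", "2", "2", "3a", "3a"],
    "mean_frequency": [2.8, 3.1, 3.4, 3.9, 3.6, 4.2, 4.5],
    "years_experience": [0.5, 1.0, 3.0, 5.0, 6.0, 9.0, 12.0],
})

# One-way ANOVA: do mean domain scores differ across tiers?
score_groups = [g["mean_frequency"].values for _, g in df.groupby("tier")]
f_stat, p_anova = stats.f_oneway(*score_groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")

# Brown-Mood (median) test: do median years of experience differ across tiers?
exp_groups = [g["years_experience"].values for _, g in df.groupby("tier")]
chi2, p_median, grand_median, table = stats.median_test(*exp_groups)
print(f"Median test: chi2 = {chi2:.2f}, p = {p_median:.3f}, grand median = {grand_median}")
```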
RESULTS
Respondent characteristics
A total of 89 people responded to the survey. One person was excluded from analysis because of a discrepancy between self-reported experience and self-selected Tier 3b. Eighty-eight people were included in the final analysis.
Characteristics of survey respondents are presented in Table 1. Approximately 69% of respondents held a master's degree or higher; however, only 44% held at least a master's degree in epidemiology. Although the majority of respondents (65%) did not have any clinical licensure, 21% were registered nurses, 7% held a medical degree, and 1% held a veterinary degree. Approximately 58% of respondents had <5 years of work experience in the field of epidemiology, with a median of four years for the cohort. Self-reported Tier 1 or entry-level epidemiologists comprised 38% of the cohort, Tier 2 or mid-level epidemiologists comprised 47%, and Tier 3a or senior-level supervisor/manager epidemiologists comprised 15%. No Tier 3b or senior scientist epidemiologists were included in the sample. Respondents represented 10 different VDH program areas, with the majority of respondents from local or district health departments (38%), the Office of Epidemiology Division of Surveillance and Investigation (23%), the Office of Epidemiology Division of Disease Prevention (19%), and Family Health Services (10%).
Table 1.
Self-identified characteristics of Virginia Department of Health epidemiology workforce, August 2006 (n=88 survey participants)
Includes Divisions of Immunization, Zoonotic and Environmental Epidemiology, and Public Health Toxicology
GED = General Educational Development
The level of epidemiology training by self-reported tier of practice is presented in Figure 2. The majority of Tier 2 (59%) and Tier 3a (62%) respondents had completed at least a master's degree in epidemiology or public health. Self-reported Tier 1 epidemiologists more frequently had received epidemiology training on the job (39%) or through coursework in epidemiology (27%). Nurses who performed epidemiologic activities most frequently had received their training on the job (67%).
Figure 2.
Epidemiology training by tier, Virginia Department of Health epidemiology workforce, August 2006
Figure 3 presents self-reported years of epidemiology experience by tier of practice. Approximately 39% of Tier 1 or entry-level epidemiologists reported <1 year of work experience in epidemiology. Forty-six percent of these epidemiologists were registered nurses. Six respondents who classified themselves as Tier 1 epidemiologists reported ≥5 years of epidemiology experience. The majority of respondents (76%) who self-classified as Tier 2 epidemiologists had one to nine years of epidemiology experience, which is consistent with the Tier 2 definition. Tier 2 epidemiologists primarily worked in local health departments (45%). Eighty-five percent of Tier 3a, or senior-level, epidemiologists worked within a division of the state office. The median number of years of experience by tier level was 1.0, 4.5, and 9.0 for Tiers 1–3a, respectively, a difference that was statistically significant (p<0.05).
Figure 3.
Self-reported years of epidemiology experience by tier, Virginia Department of Health epidemiology workforce, August 2006
Competency assessment
For each skill domain, the number of people who performed ≥75% of all associated subcompetencies in their current position varied. These numbers are presented in the second column of Tables 2–4. The skill domains that were most frequently a part of an epidemiologist's position included assessment and analysis, basic public health sciences, and communication. Approximately 50% of Tier 3a epidemiologists performed all eight skill domains. However, the only skill domain that was performed by >50% of Tier 1 epidemiologists was assessment and analysis.
Table 2.
Mean self-rated frequency (range = 1 to 5) of performing skill domain, by tier level, Virginia Department of Health epidemiology workforce, August 2006 (n=88)
Number responding that they performed ≥75% of the competencies in the skill domain
Significant
Table 3.
Mean self-rated confidence (range = 1 to 5) of performing skill domain, by tier level, Virginia Department of Health epidemiology workforce, August 2006 (n=88)
Number responding that they performed ≥75% of the competencies in the skill domain
Significant
Table 4.
Mean self-rated need for additional training (range = 1 to 5) in skill domain, by tier level, Virginia Department of Health epidemiology workforce, August 2006 (n=88)
Number responding that they performed ≥75% of the competencies in the skill domain
For those respondents who reported performing a skill as part of their position, the frequency of performing it is described in Table 2 by mean score and tier. Tier 3a epidemiologists reported more frequently performing activities associated with financial and operational planning and management (p=0.023) and communication (p=0.018) as compared with people in lower tiers. Self-reported confidence in performing skills varied by tier level as well. As listed in Table 3, self-reported senior-level epidemiologists had higher mean confidence in assessment and analysis (p=0.001) than Tier 1 and 2 epidemiologists.
For all tiers, training needs were highest for assessment and analysis and basic public health sciences. Specific activities within assessment and analysis where >50% of respondents performing the activity requested more training included using data to identify public health problems pertinent to the population, conducting surveillance, investigating conditions among the population, analyzing data from epidemiologic investigations by using appropriate analytical techniques, using computer software to create visual displays of data, and using statistical software packages to perform analytical functions and improve data quality. Training needs within specific activities of basic public health sciences included using knowledge of causes of disease in practicing epidemiology, using knowledge of environmental or behavioral sciences in practicing epidemiology, and interpreting laboratory results of screening and diagnostic tests.
DISCUSSION
The AECs have provided a framework for public health agencies to assess the skills possessed by their epidemiology workforce in a standardized manner. Using these competencies, we carried out a baseline assessment of the VDH epidemiology workforce.
The VDH workforce was determined to be robust in terms of number, educational and professional background, and years of experience. However, gaps were identified regarding the frequency and confidence in performing core skills within the workforce as a whole and within each tier of practice. The findings of this assessment will be discussed in terms of the differences in competency level by workforce characteristics, identified training needs, and utility of the AECs in providing a standard for epidemiology practice within a centralized state health department.
Epidemiology workforce characteristics
In implementing this assessment, one of our first challenges was identifying Virginia's epidemiology workforce. A single epidemiology job series has not been established in Virginia. Epidemiologists at the central office perform a range of epidemiologic activities and are classified in multiple job series, ranging from statistical analyst to human services program manager. In 2002, federal biologic terrorism and EP&R funds were used to establish epidemiologist positions primarily within each of VDH's 35 health districts and five regional offices. Although epidemiologists funded by EP&R funds are more easily identified, they are not alone in performing epidemiologic activities at the local level. Public health nurses play critical roles in epidemiologic investigations and communicable disease surveillance at the local level. This was especially evident before establishment of district epidemiologist positions.
As expected, Virginia's epidemiology workforce, as determined by this survey cohort, was diverse in terms of professional background and level of epidemiology training and on-the-job experience. This is typical of the public health workforce as a whole.2 Approximately half of the workforce, including at least 84% of registered nurses, did not have formal epidemiology training through an academic or established training program. This is consistent with findings from CSTE's 2006 Epidemiology Capacity Assessment (CSTE ECA).3 The median number of years of experience for the workforce is consistent with the influx of EP&R funding, demonstrating that the majority of the current epidemiology workforce was hired to fill EP&R positions. VDH, like many other state health departments, continues to rely heavily on federal funds to support epidemiologic activities.3
On the basis of their level of epidemiology training and years of experience, respondents self-selected their tier of practice. Our analysis indicated that respondents with similar training backgrounds and years of experience were not consistent in their tier selection. Thus, differences among tiers must be interpreted with caution, and the utility of establishing training programs specific to tier groups is questionable.
Competency assessment and training needs
Competencies were performed with varying frequency, with the majority of epidemiologists performing activities associated with the assessment and analysis and basic public health sciences skill domains, which are critical skills for people serving in even a limited epidemiology capacity. Skill domains commonly associated with supervisory or managerial job functions (e.g., financial and operational planning and management, leadership and systems thinking, and policy development) were performed more frequently by Tier 3a epidemiologists. Our results revealed that Tier 2 epidemiologists had higher mean confidence in performing competencies within five of the eight skill domains as compared with Tier 1 and Tier 3a epidemiologists. This can be interpreted in different ways. Tier 2 epidemiologists might have more experience in a skill domain that then translates into self-perceived higher confidence, or Tier 3a epidemiologists might have fewer opportunities to apply their basic epidemiology skills as they devote more time to policy, administration, and supervision. The results might also reflect an inherent bias of self-reporting.
Despite differences in how competency was assessed via the CSTE ECA and our VDH-specific survey, comparisons can be made. With respect to applying privacy laws to protect confidentiality and using knowledge of environmental and behavioral sciences in epidemiology practice, the proportion of respondents indicating that they were competent was nearly equivalent in both surveys. Compared with the perspective of state epidemiologists/senior health officials, the Virginia workforce reported less comfort performing capacities such as creating and managing databases, applying knowledge of causes of disease, and communicating risk. Additionally, at times the workforce perceived a greater need for training when senior epidemiologists or health officers reported less need, and vice versa.
Training needs varied by practice tier, professional background, years of experience, and level of formal epidemiology training. Overall, the need for additional training was highest for activities comprising the assessment and analysis and basic public health sciences skill domains, the two skill domains in which epidemiologists most frequently performed activities and had the highest confidence. Initial supplemental training to meet this need might focus on specific activities (e.g., basic epidemiologic analyses), use of different statistical software packages, and an overview of common communicable diseases frequently investigated by public health. Development of additional targeted training will probably come after a thorough assessment of job activities and prior epidemiology training.
CONCLUSIONS AND LIMITATIONS
The results of this assessment demonstrate that, although the workforce performs certain epidemiologic activities with high frequency and confidence, its desire for sustained training is high. Continuing to provide a variety of training programs on a frequent basis will help close gaps in the applied epidemiology competency of the workforce, as well as likely disparities between staff with academic and those with applied epidemiology training.12
The AECs were useful in providing a foundation of expected skills for epidemiologists working in a state health department. Virginia is one of the first states to use these competencies to establish a baseline assessment of skills and training needs within its epidemiology workforce. Ideally, all epidemiologists would be encouraged to work toward meeting this set of competencies specific to their profession. However, from our assessment it was evident that epidemiologists within a state health department performed multiple roles that in certain instances did not incorporate all competencies. Because epidemiology positions are so broadly defined, the competencies can serve a more practical purpose in providing guidance on more focused job functions. Assessments implemented across time can aid in determining a person's comfort level with different job duties and associated training needs. Because this was a baseline assessment, we plan to use lessons learned from this experience to develop follow-up assessments. This will allow for an evaluation of the efficacy of training programs and incorporation of AECs among the VDH workforce.
This study was subject to certain limitations. In an effort to be inclusive rather than exclusive, we sought to survey all people who identified with the definition of epidemiologist. The most straightforward way to do this was to identify the survey cohort by self-selection; hence, the true denominator was unknown. Certain epidemiologists might not have responded to the survey, leading to response bias. Self-report was also a limitation because it introduced potential misclassification and response biases. Although including job title might have aided in tier classification and the development of subsequent training programs, we did not include it on our survey because it had the potential to breach anonymity. Lastly, our assessment focused on specific activities performed on the job and did not assess the workforce's abilities and experience in other epidemiology competencies not relevant to or required for their current position.
Through use of the AECs and our own assessment tool, we were able to verify that Virginia has a diverse workforce. The AECs will help guide planning for training among those who practice epidemiology in different positions across the state. VDH will continue to use the competencies in an effort to close existing gaps in its workforce and set a standard for professional practice.
Acknowledgments
The authors acknowledge additional members of the Virginia Applied Epidemiology Competency working group: John Ambrose, MPH, Derek Chapman, PhD, MS, Candace Hamm, MPH, Jeff Stover, MPH, and Betty Rouse.
Footnotes
The findings and conclusions in this article are those of the author(s) and do not necessarily represent the views of the Centers for Disease Control and Prevention.
REFERENCES
1. Public Health Foundation. Core competencies for public health professionals. Washington: Public Health Foundation; 2006 [cited 2007 Jan 23]. Available from: URL: http://www.phf.org/competencies.htm
2. Gebbie K, Merrill J, Tilson HH. The public health workforce. Health Aff (Millwood) 2002;21:57–67. doi: 10.1377/hlthaff.21.6.57.
3. Council of State and Territorial Epidemiologists. 2006 national assessment of epidemiologic capacity: findings and recommendations. Atlanta: CSTE; 2006 [cited 2007 Feb 8]. Available from: URL: http://www.cste.org/pdffiles/2007/2006CSTEECAFINALFullDocument.pdf
4. Centers for Disease Control and Prevention (US) and Council of State and Territorial Epidemiologists. Competencies for applied epidemiologists in governmental public health agencies (AECs) [cited 2007 Feb 8]. Available from: URL: http://www.cdc.gov/od/owcd/cdd/aec or http://www.cste.org/competencies.asp
5. Council on Linkages Between Academia and Public Health Practice. Core competencies for public health professionals [cited 2007 Feb 8]. Available from: URL: http://www.phf.org/Link.htm
6. Lucia AD, Lepsinger R. The art and science of competency models: pinpointing critical success factors in organizations. San Francisco: Jossey-Bass; 1999.
7. Miner KR, Childers WK, Alperin M, Cioffi J, Hunt N. The MACH model: from competencies to instruction and performance of the public health workforce. Public Health Rep. 2005;120(Suppl 1):9–15. doi: 10.1177/00333549051200S104.
8. Last JM. A dictionary of epidemiology. 2nd ed. New York: Oxford University Press; 1988.
9. SPSS, Inc. SPSS: Version 12.0. Chicago: SPSS, Inc.; 2003.
10. SAS Institute, Inc. SAS: Version 9.1. Cary (NC): SAS Institute, Inc.; 2003.
11. Brown GW, Mood AM. On median tests for linear hypotheses. In: Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability. Berkeley (CA): University of California Press; 1951. p. 159–66.
12. Thacker SB, Buffington J. Applied epidemiology for the 21st century. Int J Epidemiol. 2001;30:320–5. doi: 10.1093/ije/30.2.320.