Abstract
Frequent outbreaks and the COVID-19 pandemic underscore the need to deliver hands-on outbreak investigation content to learners. This work assessed the effectiveness of a combined experiential, competency-based, and team-based learning activity for teaching outbreak investigations to first-year medical students (M1). Two prospective cohorts of 84 M1 students each, in 2019 and 2020, participated in an interactive learning activity. This project evaluated the competencies gained as portrayed in a team presentation, students’ perception of those competencies, and the activity’s utility. Students gained most competencies, particularly those linked to their role as clinicians. There is still room for improvement in detecting an outbreak, labeling the epidemic curve type, and designing a study suitable for testing the hypothesis. Based on 55 and 43 responders (65% and 51%), most of the groups agreed that the learning activity was useful in providing the necessary skills to conduct an outbreak investigation. Facilitating experiential learning opportunities in which students can practice their recently acquired medical skills (i.e., recognize symptoms, elaborate a differential diagnosis) engaged them in the non-clinical components. Such opportunities can also gauge, in lieu of a formal evaluation, the level of mastery achieved and deficiencies not only in specific but also in related competencies.
Supplementary Information
The online version contains supplementary material available at 10.1007/s40670-023-01756-5.
Introduction
The COVID-19 pandemic and other recent outbreaks of vaccine-preventable diseases underscore the relevance of outbreak investigations in medical education. However, there is limited guidance on how to deliver such content to medical students in a hands-on format. Hands-on activities are part of the experiential learning theory (ELT) developed by David Kolb in 1984 [1]. The ELT consists of four steps:
Concrete learning (the learner has a new experience or interprets a past experience in a new way)
Reflective observation (the learner reflects on their own experience)
Abstract conceptualization (the learner forms new ideas or adjusts thinking based on the experience and the reflection on it)
Active experimentation (the learner applies the new ideas)
These four steps cover different modes of acquiring and transforming experience into knowledge. The process can start at any of the four steps and can contain any number of them. However, the more steps a learning activity covers, and thus the more ways of working with the learning material, the more it will optimize learning [2].
Medical educators consider experiential learning one of the most effective tools for adult education, particularly of medical students [3] and of educators themselves during faculty development activities [4]. A recent meta-analysis surveying 89 studies spanning 43 years of work and a range of disciplines found that experiential learning, compared with traditional learning, produces a moderate positive effect size (i.e., standardized mean difference) of 0.57 when the learning activity dealt with discussing complex social issues, and a large effect size of 0.89 for activities focused on enhancing cognitive development (e.g., development of tasks or skills, academic achievement) [5].
Public health schools and government programs organize related learning activities for their own personnel, which are not necessarily tailored to medical students. Some of these programs are as follows:
A simulation exercise within the Moroccan Field Epidemiology Training Program [6]. The instructor assigns this exercise after the classroom course and before the students’ internship. The instructor and actors allow students to conduct an outbreak investigation over four days.
The Ghana Field Epidemiology and Laboratory Program organized for frontline health workers in Ghana [7].
A web-based tool developed by the Rocky Mountain Public Health Training Center [8].
Conducting outbreak investigations requires synthesizing many epidemiologic concepts. Thus, participating in this activity would allow students to solidify concepts covered within the semester-long section of the longitudinal thread Professions of Medicine – Principles. It would also offer instructors the opportunity to evaluate competencies gained by students specific to an outbreak investigation, the epidemiology of infectious diseases, and study designs, all evaluated in the United States Medical Licensing Examination (USMLE). The competency-based framework in medical education has its roots in the 1970s, but it was not until the start of the twenty-first century that it gained momentum. It was conceived as a way to enhance the provision of quality care and reduce practice variation by focusing on skills, outcomes, and the learner, and by de-emphasizing time-based training [9].
The objective of this program evaluation was to assess the effectiveness of using a combined experiential, competency-based, and team-based learning activity for teaching outbreak investigations to first-year medical students (M1) and their perceptions about the usefulness of such an approach. A secondary objective was to assess the agreement between the competencies gained as judged by the instructor and as perceived by the students.
Methods
A prospective evaluation measured the effectiveness of this innovative interactive learning activity and students’ perceptions of it. The class of 84 M1 students in fall 2019 fit into 17 teams, with one case study per team. In the following year (fall 2020), students again self-assembled into 17 teams. Students also self-selected the condition by choosing a number between 1 and 17; each number had an associated case study unknown to the students. This many cases allowed portraying a wide variety of conditions with different transmission modes, settings, burden, epidemic curves, and populations at risk (Online Resource 1). Data about outbreaks from the literature or health departments’ websites facilitated more realistic scenarios and reflected situations experienced mostly in the USA. The idea was to create awareness about the presence of all these conditions in the USA.
The learning activity developed incorporates notions of ELT [1]. Students would themselves conduct many of the steps of an outbreak investigation given a particular case study. This classifies the activity as a low-fidelity simulation, since students must perform most but not all of the steps of an outbreak investigation to complete the assignment. Effective learning requires feedback during the experience, and the Research Electronic Data Capture (REDCap) [10] tool was designed to provide some feedback by not revealing the data until students include the condition in question within the differential diagnosis.
This learning activity incorporates three out of four steps of Kolb’s ELT:
Concrete experience: examination of the two initial clinical cases, which helps students develop a differential diagnosis and identify a potential outbreak.
Abstract conceptualization: students identify gaps in their knowledge and seek information to fill those gaps while building the differential diagnosis.
Reflection: students assess how they view this learning experience through the feedback survey.
REDCap is a user-friendly, secure, HIPAA-compliant, web-based database application that allows data collection for studies. The REDCap survey made the experience interactive with respect to identifying the possible outbreak, the condition involved, and the symptoms (Online Resource 2). The survey provided the first two cases of the potential outbreak and a recent baseline case number for the condition within the locality, so that teams could detect an outbreak if applicable. The cases allowed the team to formulate a definition in terms of place, time, affected population, and signs and symptoms, and finally a differential diagnosis. Teams could indicate up to three conditions in the differential as possible candidates for the outbreak. If the “unknown” condition assigned to them was among those three alternatives, data about the individuals in the outbreak, with onset time, relevant symptoms, and social history or other features as needed, appeared on the screen for students to retrieve.
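The conditional reveal can be sketched as follows. This is a hypothetical re-creation in Python for illustration only; the actual survey implemented the equivalent check with REDCap branching logic, and the function name and inputs are assumptions, not the study's code.

```python
# Hypothetical re-creation of the survey's conditional reveal (the real
# implementation used REDCap branching logic, not Python): the hidden
# outbreak line list is shown only when the assigned "unknown" condition
# appears among the team's up-to-three differential diagnoses.
def reveal_outbreak_data(assigned_condition: str,
                         differential: list[str]) -> bool:
    """Return True when the hidden line-list data should be revealed."""
    # Only the first three differential entries count, per the activity rules
    candidates = {d.strip().lower() for d in differential[:3]}
    return assigned_condition.strip().lower() in candidates

# Example: the assigned condition appears in the team's differential
print(reveal_outbreak_data("measles",
                           ["Rubella", "Measles", "Parvovirus B19"]))  # True
```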
A second step of the activity consisted of a 10-min team presentation to report findings. Students presented their project in person in 2019 and through videoconference in 2020. Likewise, preparatory classes in 2019 were in person, while in 2020 they were remote. The competencies sought in the presentation, with competencies one through six outlined in the corresponding program of the Northwest Center for Public Health Practice of the University of Washington [11], were:
To identify an outbreak
To identify the condition to generate a case definition
To perform an epidemiologic description (i.e., orientation to person, place, and time) including building an epidemic curve
To formulate a hypothesis about the possible causes of the outbreak and related risk factors
To design a study that could test the formulated hypothesis
To propose a potential remedial measure to contain the outbreak
To indicate a novel epidemiological tool used in outbreak investigations, or an issue brought about by the condition examined (e.g., emergence, resurgence, antimicrobial resistance)
To identify a recent outbreak similar to the one they examined preferably within the USA
Teams in the 2019 cohort had about 3 months to complete the work at their own pace. Given the shorter time students used in the first iteration, teams in the 2020 cohort had 1 month to complete the work.
A second REDCap survey gathered students’ perceptions anonymously after they completed the presentations (Online Resource 3). Questions framed on a 5-point Likert scale measured the degree of agreement (from strongly disagree to strongly agree) with the presented statements. An additional question inquired whether students perceived that the activity provided information beyond what they had received in a few slides about the steps of an outbreak investigation and different epidemic curves during a 1-h lecture on the Epidemiology of Infectious Diseases; the lecture preceded the activity. Finally, one question inquired about aspects they considered could be improved among the following: organization, required preparation time, depth, information provided, delivery format of the information, and information requested.
Corrective measures addressing deficiencies noticed in the first cohort with respect to the competencies sought would be implemented in the second cohort as part of a quality improvement process.
Statistical Analysis
Generalized estimating equations (GEE) [12], as implemented in SAS v9.4 for Windows (SAS Institute Inc., Cary, NC), estimated the results from the perception survey, accounting for the (exchangeable, i.e., similar across clusters) correlation among responses from students on the same team. Not accounting for this correlation could lead to biased and/or inefficient estimates. For each question phrased in a positive mode, agree and strongly agree responses were lumped together, with the other three levels as the reference. For the question about whether students considered the activity necessary, the labeling was reversed since the question was phrased negatively, i.e., disagree and strongly disagree were lumped together, with the other three levels joined as the reference.
Results
Table 1 summarizes the information gathered from the interactive survey. Almost half of the teams in the 2019 cohort correctly identified an outbreak, a cluster, or the need for additional information to make the determination. This recognition improved to 88% of teams in the 2020 cohort after careful examination of how background prevalence information was presented. All teams correctly oriented the condition to place, time, and person. Over 80% of the teams in both cohorts recognized completely or almost completely the most frequent related symptoms and placed the condition in the first option.
Table 1.
Performance in detecting the disease in question
| Feature | First cohort, Fall 2019 (n = 17) | Second cohort, Fall 2020 (n = 17) |
|---|---|---|
| Outbreak detection | Eight (47%) teams correctly identified the type of phenomenon during the work on the REDCap platform: five teams indicated an outbreak (4 correct); six teams indicated a cluster but not an outbreak (2 correct); six teams indicated that they required more information to make a determination (2 correct) | Fifteen (88%) teams correctly identified an outbreak (background information on the disease in question was updated to resemble an outbreak in all 17 conditions for this second cohort) |
| Condition in differential (condition in question) | Conditions in 1st, 2nd, or 3rd option — first iteration: 13, 1, 1; subsequent iteration: 2, 0, 0 | Conditions in 1st, 2nd, or 3rd option — first iteration: 14, 2, 0; subsequent iteration: 0, 1, 0 |
| Condition in differential (other conditions) | Related conditions correct in 1st, 2nd, and 3rd options: 2/2, 10/12, 6/10. Four groups reported only one disease in the differential, and it was the correct one; two groups reported two conditions, and the first one was correct | Related conditions correct in 1st, 2nd, and 3rd options: 2/3, 5/11, 7/10. Three groups reported only one disease in the differential, and it was the correct one; one group reported two conditions, and the first one was correct |
Table 2 summarizes the proportion of agree and strongly agree responses for each question in the students’ feedback survey and the competencies achieved as recognized by the instructor from the presentations. Fifty-five (65%, 2019) and 43 (51%, 2020) eligible students completed the perceptions survey. At least two individuals from each team provided feedback through the survey in 2019, and one individual from all but one team did so in 2020. Teams took 4 h on average (minimum 1, maximum 11) to work on the activity. Twenty-six responders from twelve teams (2019) and 24 responders from 11 teams (2020) agreed or strongly agreed with most of the items stated. Twenty-eight people from 14 teams in 2019 and 17 responders from 12 teams in 2020 were overall neutral as to the value of each individual item. Finally, one individual in 2019 and two in 2020 disagreed with the statements provided for each of the items.
Table 2.
Competencies gained as assessed by instructor from group presentation (n = 17) and as perceived by students (n = 55 responders in 2019, 43 responders in 2020)
| Competency | 2019: per presentationa (n = 17 teams) | 2019: per surveyb (n = 55 students) | 2020: per presentationa (n = 17 teams) | 2020: per surveyb (n = 43 students) |
|---|---|---|---|---|
| Case definition | 17 partially complete (0.00) | 30; 0.56 (0.40, 0.71) | 11 complete, 6 partially complete (0.65) | 33; 0.77 (0.63, 0.87)c |
| Epi description | 17 (1.00) | 41; 0.74 (0.65, 0.81)c | 14 (0.82) | 38; 0.88 (0.77, 0.94) |
| Build epi curve | 17 (1.00) | 35; 0.64 (0.50, 0.76) | 17 (1.00) | 33; 0.78 (0.65, 0.87)c |
| ID type of epi curve | 12 (0.71) | – | 13 (0.76) | – |
| Indicate incubation period | 13 (0.77) | – | 11 (0.65) | – |
| Generate hypothesis | 15 (0.88) | 41; 0.75 (0.58, 0.87)c | 17 (1.00) | 39; 0.90 (0.80, 0.95)c |
| Design proper study | 9 (0.53) | 37; 0.67 (0.53, 0.79) | 9 (0.53) | 31; 0.72 (0.55, 0.84) |
| Propose remedial measures | 15 (0.88) | 40; 0.74 (0.58, 0.85) | 16 (0.94) | 35; 0.82 (0.67, 0.90) |
| Propose issue and/or tools | 16 (0.94) | – | 17 (1.00) | – |
| ID similar outbreak in the USA | 15 (0.88) | – | 16 (0.94) | – |
| Report results thoroughly | 4 (0.24) | 39; 0.71 (0.58, 0.81) | 14 (0.82) | 39; 0.91 (0.80, 0.96)c |
aNumbers in column represent the number of teams presenting the competency and the corresponding proportion out of 17 (within parentheses)
bFirst figure in column corresponds to number of students who considered achieving the competency, figures after semicolon represent the corresponding proportion out of 55 students (43 students for 2020), and 95% confidence interval (within parentheses), all obtained from the generalized estimating equation model to account for clustering of students’ responses within teams
cSame number of responses may lead to slightly different estimate and/or confidence intervals depending on the patterns of responses within teams, in addition to the potential effect of rounding to two decimal places
All teams captured the condition, symptomatology, and transmission mode. Twelve of the 17 teams (70.6%) in 2019 added other features to their presentations; essentially, they reported the results of a self-motivated literature search about the basic reproduction number R0. Nine of the 17 teams (52.9%) in 2020 reported on R0. Seven teams in each cohort used a time interval in the epidemic curve that agreed with the incubation period. Twelve teams in 2019 also categorized the spread (i.e., point source, continuous, propagated), with eight (75%) doing so correctly, as did six of 13 teams (46%) in 2020. Team presentations revealed deficiencies in designing an adequate study to test the proposed hypothesis about the most likely source and potential risk factors.
Some disagreement exists between students’ perceptions of the competencies they gained and what they portrayed in their team presentations (Table 2). Students seem to underestimate their acquisition of skills to provide an epidemiologic description and build an epidemic curve, and to a lesser extent to generate a hypothesis and propose remedial measures. They overestimated their acquisition of skills to design a proper study to answer the hypothesis and to report results thoroughly. The latter was notable in the 2019 cohort; thus, providing clear expectations about the report content constituted a quality improvement step for the 2020 cohort. This resulted in a substantial increase in teams providing a complete report in the second cohort. Differences in performance between the 2019 and 2020 cohorts ranged between 0 and 5% for most competency items as evaluated by the instructor, and between 15 and 20% as evaluated by the students.
Figure 1a displays students’ perception about the utility of the learning activity.
Fig. 1.
a Students’ perception about utility of learning activity from two cohorts. b Students’ suggested aspects to improve in learning activity from two cohorts
Over two-thirds of responders considered the activity stimulating, felt it gave them hands-on experience (the main motivation of this learning innovation), and felt that the group work facilitated learning (Fig. 1a). For the last two items, the proportion perceiving them was greater in the 2020 cohort. Over 80% of responders in both cohorts found that the activity allowed covering a wider range of conditions and that the two initial cases were clear enough to allow further work. The 2019 cohort rated the clarity of the two cases higher. Students found the preparation time and the information requested sufficient (Fig. 1b). One-quarter of respondents found no need for changes to the activity (closer to 40% in 2020), while a quarter suggested improvements in the organization, the information provided, or its delivery format.
Discussion
The platform developed and the team presentations showed the activity to be effective in gaining six of the eight sought competencies, in particular:
Developing a differential diagnosis and selecting the condition most likely to be the source of the outbreak, whether provided with clear signs or more subtle symptoms, and thus being able to generate a case definition
Building an epidemic curve and identifying potential sources of the disease outbreak
Choosing appropriate remedial and/or control measures
Recognizing modern tools that assist in epidemiology research such as genotyping, surveillance systems, and social media
Regarding the clarity of the two cases, key signs described nine conditions (Online Resource 1), which decreased the likelihood of confusion with other potential conditions in the differential. For these cases, all responders except six students (three in each cohort) agreed or strongly agreed that the two cases were informative enough. Vaguer signs described the other eight conditions; still, most students indicated that there was enough information to identify the condition correctly.
Close scrutiny of the first cohort’s unattained competencies (i.e., outbreak identification and designing a study to test the hypothesis) signaled areas that needed improvement and emphasis during the Epidemiology of Infectious Diseases lecture, as well as in learning events dealing with study designs. These areas were as follows:
How to interpret the background prevalence in order to identify a potential outbreak and distinguish it from clusters of disease
How to select an appropriate time scale in the epidemic curve in agreement with the incubation period of the condition, which also facilitates identifying the type of epidemic
How to design the most appropriate study for the hypothesis posed, considering the data available and the time constraints of conducting an epidemiologic study within an outbreak
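The second area above, selecting a time scale tied to the incubation period, can be illustrated with a short sketch. The data and the specific binning rule below are assumptions for illustration (a common rule of thumb sets the interval at roughly one-quarter of the average incubation period); they are not taken from the study's case materials.

```python
# Illustrative sketch with synthetic onset dates (not study data):
# bin cases into an epidemic curve whose interval is derived from the
# condition's incubation period, per a common rule of thumb.
import pandas as pd

onsets = pd.to_datetime([
    "2019-09-02", "2019-09-03", "2019-09-03", "2019-09-04",
    "2019-09-04", "2019-09-04", "2019-09-05", "2019-09-07",
])

incubation_days = 4                        # assumed average incubation period
bin_days = max(1, incubation_days // 4)    # rule-of-thumb bin width

# Count one case per onset date, then aggregate into bins of bin_days
curve = pd.Series(1, index=onsets).resample(f"{bin_days}D").sum()
print(curve)  # case counts per interval, i.e., the epidemic curve
```

An interval much wider than this rule of thumb can mask the shape of the curve (e.g., a point-source peak), while a much narrower one fragments it, which is why teams needed to reconcile the time scale with the incubation period before labeling the epidemic type.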
I incorporated corrective measures for these issues into the 2020 cohort (i.e., a quality improvement task) and evaluated them in a similar fashion to determine their effectiveness in overcoming the limitations of the initial iteration. The changes achieved improvement in recognizing an outbreak, elaborating a refined case definition, and reporting results thoroughly. However, limitations in designing the most appropriate study remained. Improving outbreak recognition is important because this item is explicitly outlined in the USMLE; thus, the enhanced recognition should improve students’ performance on the test. On the other hand, difficulties in recognizing the proper study design are concerning because this is also an explicit USMLE item and reflects the overall understanding of many epidemiology concepts. This indicates that delivery of study design content during the preparatory sessions still needs improvement; the root cause of the difficulty requires further examination so that changes in the delivery of this content can be effective.
No other studies with similar features for medical students were encountered in the literature. A PubMed search identified seven additional studies examining experiential learning activities for public health practitioners, like the ones described in the Introduction [13–17], to build local capacity to deal with outbreaks, or for veterinary students [18, 19]. A common feature of all these studies is the use of a single case study rather than a compendium of cases that exposes learners to a broad range of outbreaks with their particular characteristics. Some studies indicated that the learning activity should take place in the classroom and would take 2 or 3 h to complete. The latter agrees with the average time it took students to perform the outbreak investigation in the current study, to which the time for the presentations, about two 90-min sessions, would need to be added.
The two studies developed within the veterinary curriculum [18, 19] ran contemporaneously with the current work. The first study [18] highlighted the use of a gamification exercise to motivate students to learn outbreak investigations. Students could perform the activity individually or in teams. The authors objectively evaluated learning among the subset of 58 students who conducted the work alone; these students exhibited a 27-percentage-point increase in correct responses between the pre- and post-test evaluations (to 76% on the post-test). The second study focused on assessing the acceptance of an activity combining a case-based approach with blended learning (i.e., a combination of in-class and online content delivery). Eighty percent of participants found the activity acceptable, and 76% reported a positive outcome in their self-assessed learning. The current study shares the highlighted features of each of the veterinary studies.
It is important to highlight that, in the current activity, students’ initiative to broach the topic of the basic reproduction number underscores the learner-centeredness of the experiential learning approach. By design, the activity did not emphasize R0, given its quantitative demands on primarily non-quantitative participants. Still, more than half of the teams in both cohorts touched upon the issue. One could speculate that, during and after the reflection stage, students considered the outbreak investigation incomplete without a proper discussion of the concept and thus made the extra effort to acquire and share the necessary information.
In terms of instructor effort, conceptualization, researching the conditions, retrieving and conditioning the data (including outbreak and baseline prevalence), building two introductory cases for each condition, and designing and developing the REDCap tools required about 132 h of work for the 17 cases developed. Once the process is streamlined, it takes about 7–8 h to identify a case study, generate the pertinent data for the epidemic curve, build the narrative of the first two cases, and upload the information into REDCap.
Current times underscore that teaching about outbreaks in medical school, and likely to other practitioners, needs to be robust. The interactive REDCap tool developed allowed teaching outbreaks differently and effectively; it engages students and is instructor-friendly. Future work will consider including more components of the outbreak investigation within the REDCap tool to simulate the whole process more realistically.
Although I examined the activity in medical students, and thus emphasized aspects requiring skills they had developed during their first year as a motivating piece, the motivating aspects of the activity can be adapted to the skills and strengths of the learners’ backgrounds (e.g., public health, epidemiology, and biostatistics). Finally, it is important to emphasize that the mode of delivery, i.e., in-person or remote, did not seem to alter the results, as evidenced by the lack of change in components not subjected to quality improvement, from the instructor’s perspective. These aspects highlight the versatility of this activity, although the study design did not assess them specifically.
Supplementary Information
Below is the link to the electronic supplementary material.
Acknowledgements
The author wishes to thank Anita F. Bell, data technician, for her assistance in fine-tuning the REDCap application that facilitated the interaction with students; the POM-PR administration which facilitated developing the learning event and performing its corresponding evaluation; Lisa Graves, M.D., for her insightful suggestions; and the anonymous reviewers for their thorough and encouraging comments that helped to shape the final product. Results from the first cohort were presented at the 39th Annual Kalamazoo Community Medical and Health Sciences Research Day (2021).
Declarations
Ethics Approval
The institutional IRB granted exempt status to this project under 45 CFR 46.101(b) Category 1 (Ref WMed-2019–0531).
Consent to Participate
Consent to participate in the voluntary survey was obtained from the participants who completed the survey.
Conflict of Interest
The author declares no competing interests.
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.Kolb DA, Boyatzis RE, Mainemelis C. Experiential learning theory: previous research and new directions. In Sternberg RJ, Zhang LF, editors. Perspectives on Cognitive, Learning, and Thinking Styles. NJ: Lawrence Erlbaum; 2000.
- 2.Dong H, Lio J, Shere R, Jiang I. Some learning theories for medical educators. Med Sci Educ. 2021;31(3):1157–1172. doi: 10.1007/s40670-021-01270-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Reed S, Shell R, Kassis K, Tartaglia K, Wallihan R, Smith K, et al. Applying adult learning practices in medical education. Curr Probl Pediatr Adolesc Health Care. 2014;44(6):170–181. doi: 10.1016/j.cppeds.2014.01.008. [DOI] [PubMed] [Google Scholar]
- 4.Steinert Y, Mann K, Centeno A, Dolmans D, Spencer J, Gelula M, et al. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach. 2006;28(6):497–526. 10.1080/01421590600902976. [DOI] [PubMed]
- 5.Burch GF, Giambatista R, Batchelor JH, Burch JJ, Hoover JD, Heller NA. A meta-analysis of the relationship between experiential learning and learning outcomes. Decis Sci J Innov Educ. 2019;17(3):239–273. doi: 10.1111/dsji.12188. [DOI] [Google Scholar]
- 6.Jroundi I, Belarbi A. Foodborne outbreak simulation to teach field epidemiology: the Moroccan Field Epidemiology Training Program. Tunis Med. 2016;94(11):658. [PubMed] [Google Scholar]
- 7.Ameme DK, Nyarko KM, Afari EA, Antara S, Sackey SO, Wurapa F. Training Ghanaian frontline healthcare workers in public health surveillance and disease outbreak investigation and response. Pan Afr Med J. 2016;25(Suppl 1):2. doi: 10.11604/pamj.supp.2016.25.1.6179. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Rocky Mountain Public Health Training Center. Outbreak playbook. 2018. https://rmphtc.org/COE/OutbreakPlaybook/index.html. Accessed 17 Sept 2019.
- 9.Cooney R, Chan TM, Gottlieb M, Abraham M, Alden S, Mongelluzzo J, et al. Academic primer series: key papers about competency-based medical education. West J Emerg Med. 2017;18(4):713–720. doi: 10.5811/westjem.2017.3.33409. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Harris PA, Taylor R, Thielke R, Payne J, González N, Conde JG. Research electronic data capture (REDCap) – a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–381. doi: 10.1016/j.jbi.2008.08.010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Baseman JG, Marsden-Haug N, Holt VL, Stergachis A, Goldoft M, Gale JL. Epidemiology competency development and application to training for local and regional public health practitioners. Public Health Rep. 2008;123(Suppl 1):44–52. doi: 10.1177/00333549081230S111. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Zeger SL, Liang KY, Albert PS. Models for longitudinal data: a generalized estimating equation approach. Biometrics. 1988;44(4):1049–1060. doi: 10.2307/2531734. [DOI] [PubMed] [Google Scholar]
- 13.Abubakar AA, Gobir AA, Nda II, Kusfa IU, Obafemi B, Alabi O, et al. Outbreak investigation of measles in Kaduna State, Northwestern Nigeria, 2015. Pan Afr Med J. 2018;30(Suppl 1):3. doi: 10.11604/pamj.supp.2018.30.1.15445. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Asiedu EK, Nyarko KM, Kenu E, Afari EA, Frimpong JA, Park MM, et al. Meningitis outbreak investigation in Nkoranza South Municipality in Brong Ahafo Region, Ghana, February 2016. Pan Afr Med J. 2018;30(Suppl 1):4. doi: 10.11604/pamj.supp.2018.30.1.15261. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Syed MA, Shumuye NA, Anyorikeya M, Usenbaev N, Mertens E, Bellali H, et al. Outbreak investigation of an unknown gastrointestinal illness in District Victoria, Country Mala, 2016. Pan Afr Med J. 2021;40(Suppl 2):1. doi: 10.11604/pamj.supp.2021.40.2.30992. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Oberle MW, Foy HM, Alexander R, Kobayashi J, Helgerson SD. Enhancing student practicum opportunities: the outbreak investigation option. J Public Health Manag Pract. 1995;1(2):69–73. doi: 10.1097/00124784-199500120-00012. [DOI] [PubMed] [Google Scholar]
- 17.Stehr-Green J, Gathany N. Training in outbreak investigation through use of an online discussion group. J Environ Health. 2005;68(4):9–13. [PubMed] [Google Scholar]
- 18.Shearer AEH, Kniel KE. Foodborne illness outbreak investigation for one health postsecondary education. J Microbiol Biol Educ. 2021;22(2):e00129–e221. doi: 10.1128/jmbe.00129-21. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Duckwitz V, Vogt L, Hautzinger C, Bartel A, Reinhardt J, Haase S, et al. Teaching outbreak investigations with an interactive blended learning approach. J Vet Med Educ. 2022;49(3):312–322. doi: 10.3138/jvme-2020-0077. [DOI] [PubMed] [Google Scholar]
Associated Data
This section collects any data citations, data availability statements, or supplementary materials included in this article.

