Public Health Reports. 2023 Sep 7;138(6):878–884. doi: 10.1177/00333549231190050

Design and Implementation of an Innovative, Rapid Data-Monitoring Strategy for Public Health Emergencies: Pilot of the United States School COVID-19 Mitigation Strategies Project

Marci F Hertz 1, Rhodri Dierst-Davies 2, Kimberley Freire 3, Jorge M Vallery Verlenden 1, Laini Whitton 4, John Zimmerman 4, Sally Honeycutt 1, Richard Puddy 1, Grant T Baldwin 1
PMCID: PMC10576489  PMID: 37675484

Abstract

During the COVID-19 pandemic, an urgent need existed for near–real-time data collection to better understand how individual beliefs and behaviors, state and local policies, and organizational practices influenced health outcomes. We describe the processes, methods, and lessons learned during the development and pilot testing of an innovative rapid data collection process designed to inform decision-making during the COVID-19 public health emergency. We used a fully integrated mixed-methods approach to develop a structured process for triangulating quantitative and qualitative data from traditional (cross-sectional surveys, focus groups) and nontraditional (social media listening) sources. Respondents included students, parents, teachers, and key school personnel (eg, nurses, administrators, mental health providers). During the pilot phase (February–June 2021), data from 12 cross-sectional and sector-based surveys (n = 20 302 participants), 28 crowdsourced surveys (n = 26 820 participants), 10 focus groups (n = 64 participants), and 11 social media platforms (n = 432 754 503 responses) were triangulated with other data to support COVID-19 mitigation in schools. We disseminated findings through internal dashboards, triangulation reports, and policy briefs. This pilot demonstrated that triangulating traditional and nontraditional data sources can provide rapid data about barriers and facilitators to mitigation implementation during an evolving public health emergency. Such a rapid feedback and continuous improvement model can be tailored to strengthen response efforts. This approach emphasizes the value of nimble data modernization efforts to respond in real time to public health emergencies.

Keywords: COVID-19, case study, data systems, public health emergencies


Since the COVID-19 pandemic began in the United States, federal, state, territorial, local, and tribal officials implemented, to varying degrees, measures to slow the spread of COVID-19 and its health, economic, and social impacts.1,2 However, prior to vaccine availability, health officials reported varying levels of uptake of prevention guidelines, including stay-at-home orders, face mask mandates, and physical distancing recommendations.3-5 In addition, compliance varied, resulting in a patchwork of implementation and enforcement. 6 Given the inconsistencies in implementation of prevention guidelines, the Centers for Disease Control and Prevention (CDC) identified a need for real-time information on adoption of COVID-19 prevention strategies nationwide to inform public health action and tailor response efforts. While rapid data collection is common following disasters, the scope, scale, complexity, and fast-changing geographic and setting-specific challenges of the COVID-19 pandemic strained the capacity of surveillance systems to meet time-sensitive needs. Prior to the pandemic, public health officials recognized the need to modernize public health data systems to include faster, more cutting-edge data collection methods, but COVID-19 underscored the urgency to accelerate efforts.7,8 Given the unique challenges of the COVID-19 pandemic and the need to explore faster, more innovative data collection methods, CDC leadership charged a subgroup with developing and piloting a novel rapid data collection process. The goal was to better understand the contextual factors that influence implementation and effectiveness of CDC-recommended mitigation strategies across populations and settings.

The influence of contextual factors on COVID-19 mitigation in the United States was particularly evident in kindergarten through 12th grade (K-12) schools. After the emergency declaration on January 31, 2020, extended school closures occurred through the end of the 2019-2020 academic year and beyond. 9 Approximately 93% of students engaged in some form of distance learning beyond fall 2020. 10 As virtual learning periods extended, concern about the effects of distance learning on student mental health and family economics increased, as closures hindered parents’ regular workplace routines.11,12 In addition, emerging evidence suggested that continued disruptions could exacerbate preexisting sociostructural inequalities 12 and academic disparities 13 and reduce the ability of schools to provide social and nutritional services. 14 Based on these concerns, school reopening was a key component of the Opening Up America Again plan. 15

Purpose

To reopen schools more broadly, a need existed to understand the real-time impact of COVID-19 and the extent to which schools implemented mitigation strategies effectively. To address this gap, CDC convened a team with expertise in emergency response, surveillance, epidemiology, informatics, program implementation, health policy, school health, and evaluation. The team’s primary goal was to assess the feasibility of conducting near–real-time monitoring and evaluation of implementation of community mitigation strategies and their impact on disease incidence, with an initial focus on schools as a proxy for other settings such as workplaces or public transportation. The project sought to layer data from multiple sources with a short lag from data collection and analysis to dissemination. The objective of this case study was to describe the processes and methods used, define lessons learned, and discuss future directions to modernize data collection during public health emergencies.

Methods

CDC identified 4 key actions: (1) create a structured process for integrating multiple data strategies in rapid cycles; (2) conduct formative research to identify data sources that could be triangulated to assess mitigation implementation, test the feasibility of methods, and modify the model; (3) test the refined model; and (4) develop rapid dissemination reports to inform decision-making. CDC conducted the first 2 actions during August–December 2020 with assistance from Deloitte and with CDC Foundation funding. Actions 3 and 4 took place during January–June 2021 and were implemented by Deloitte with funding and support from the CDC Foundation and technical assistance from CDC. Data collection took place across 2 phases: formative (action 2) during September–November 2020 and pilot (action 3) during February–June 2021. We fielded many crowdsourced surveys simultaneously because they involved unique populations (Figure 1). We launched several continuous social media queries, while others were active during specific periods to coincide with a survey or topic.

Figure 1.

Timeline and tasks to develop and pilot test a near–real-time data collection process. Abbreviations: A, action; IRB, institutional review board.

Action 1: Develop a Framework (August 2020)

CDC’s approach was grounded in the CDC Framework for Program Evaluation (eFigure 1 in Supplemental Material). 16 The framework consists of 6 interconnected steps and 4 categories of standards. The steps include the following: engage stakeholders, describe the program, focus the evaluation design, gather credible evidence, justify conclusions, and ensure use and share lessons learned. The categories of standards are utility, feasibility, propriety, and accuracy.

The purpose of the CDC Framework for Program Evaluation is to guide the development and implementation of a program evaluation strategy that encompasses all steps and standards in the framework and reflects program context. Engaging partners and focusing the evaluation design are described in actions 2 and 3. For the evaluation design for the formative research portion of this project, we selected a fully integrated mixed-methods approach incorporating both qualitative data collection through focus groups and social media listening and quantitative data obtained through surveys. Social media listening refers to the analysis of conversation trends through queries of keywords, phrases, or hashtags. The mixed-methods approach enabled data triangulation to improve validity, avoid confirmation bias, and increase representativeness.17,18 We used rapid feedback evaluation to inform data collection and integration strategies and to guide analyses, interpretation, and application of findings. Effective rapid evaluation balances relevance and rigor, often including mixed-methods approaches, to prioritize data for decision-making when circumstances change quickly. These methods yielded an initial set of questions to inform data source selection (actions 2 and 3) and potential partners to support response efforts (action 4). Uniquely, this framework facilitated the collection of near–real-time data, applied findings rapidly, and modified approaches to address evolving concerns about the pandemic. 19

Action 2: Formative Research (September–December 2020)

Formative research focused on identifying data sources and collection methods and on validating the data collection approaches. CDC defined criteria to organize methods and establish priorities, including whether data could (1) address specific evaluation questions, (2) be collected frequently and reported rapidly, (3) include sampling from priority populations across jurisdictions, and (4) be feasibly used by federal agencies and/or the general public. During planning, CDC reviewed internal and external data collection initiatives and various rapid data collection methods. Online surveys, crowdsourcing platforms, virtual focus groups, and social media monitoring were deemed most relevant for this pilot.

Next, we developed indicators to evaluate the impact of school-based prevention strategies on COVID-19 transmission. We selected indicators based on their alignment with governmental guidance, monitoring feasibility, ability to reflect implementation rather than policy or practice content, and likelihood of impact on COVID-19 incidence. We iteratively field-tested data collection instruments and modalities and used preliminary results (action 2) to refine collection methods, survey constructs, and populations to inform pilot rollout. Initial social media listening activities helped refine search terms and collection channels. Data collection and analysis were conducted consistent with federal law and CDC policy. An independent accredited institutional review board (IRB), the Center for IRB Intelligence, reviewed all activities. Because data were not for research purposes, an exemption was granted.

Formative Outcomes

The team created a schematic diagram of the development and implementation process to illustrate its application to the mitigation of COVID-19 in schools (Figure 2). In total, we conducted 26 surveys, 6 focus groups, and social media listening activities. We used results to refine intended populations, survey constructs and questions, social media queries, and the dashboard. Key lessons learned that were applicable to action 3 included the need to (1) increase the reach of data collection to underrepresented populations through crowdsourced surveys, oversampling, and focus groups; (2) engage partners to identify constructs and obtain population buy-in; (3) coordinate constructs across qualitative and quantitative modalities to avoid incongruency during triangulation; and (4) synthesize and tailor dissemination procedures to ensure greater reach and usefulness.

Figure 2.

Application of a rapid feedback model as applied to COVID-19 mitigation strategies in school settings.

Action 3: Pilot the Model (February–June 2021)

Based on formative findings and discussions with national partner organizations (eg, National Association of School Nurses, National Association of Secondary School Principals), the project team refined evaluation questions across instruments. Questions and constructs focused on which strategies were implemented and to what degree, demographic and contextual factors that influenced implementation, and the mental health of respondents. Questions and constructs reflected topics of immediate relevance to decision makers and the public, including reopening, special populations, health equity, COVID-19 testing as a mitigation strategy, and vaccine confidence. Revised indicators guiding instrument development fell into 2 broad categories: (1) individuals’ attitudes, beliefs, motivations, and behaviors and (2) mitigation strategies (individual, structural, institutional). Example topics included adoption, adherence, and enforcement of mitigation strategies; vaccine sentiments; and mental, educational, and physical impacts.

Pilot Participants

Inclusion and exclusion criteria varied based on investigatory need. Eligible participants included enrolled K-12 students aged 13-20 years, parents with school-aged children in the household, and K-12 teachers, staff, and administrators. All participants provided informed consent. The team obtained parental consent prior to student assent when appropriate. The Center for IRB Intelligence reviewed and approved this pilot phase.

Surveys

Qualtrics XM administered cross-sectional web-panel and sector-based surveys through its online platform. Recruitment of potential respondents occurred through various partner double-opt-in market research panels. Panel respondents provide basic sociodemographic information and can volunteer to participate in surveys on various topics of interest. Eligible respondents then receive a recruitment email with a link to a survey and complete additional screening information to ensure eligibility. To ensure representativeness, sampling targets for cross-sectional surveys were parents (n = 4000), students (n = 2000), and teachers (n = 1800); for these surveys, targets were minimum adequate sample sizes calculated to ensure 80% power. Targets for sector-based surveys ranged from 500 to 1500 and were not based on sample size estimates because no stable denominators existed. We based parent and student estimates on the 2020 US Census and adjusted across categories of interest within US regions. We used data from the National Center for Education Statistics to generate estimates for teachers. 20 Pollfish, an online marketing and research firm, fielded crowdsourced surveys through its network of 140 000 application partners. We used double-opt-in recruitment techniques similar to those described previously. We provided no compensation other than the benefits respondents received for participating in their panels.
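The power-based sample-size logic above can be illustrated with a standard two-proportion calculation. This is a generic sketch, not the authors' actual computation; the 5-percentage-point detectable difference and the hard-coded z quantiles are our assumptions.

```python
from math import sqrt, ceil

def min_sample_size(p1: float, p2: float,
                    alpha_z: float = 1.96, power_z: float = 0.8416) -> int:
    """Per-group n to detect a difference between two proportions p1 and p2
    with ~80% power at a two-sided alpha of .05 (z quantiles hard-coded to
    avoid a SciPy dependency)."""
    pbar = (p1 + p2) / 2
    numerator = (alpha_z * sqrt(2 * pbar * (1 - pbar))
                 + power_z * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# e.g., detecting a 50% vs 55% difference in reported mask adherence
n = min_sample_size(0.50, 0.55)
```

Smaller detectable differences drive the required n up quickly, which is one reason the parent target (n = 4000) exceeds the teacher target (n = 1800) for subgroup comparisons.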

Virtual Focus Groups

Partners of the CDC Foundation provided guidance on sampling and recruitment for adult focus groups. Qualtrics supported recruitment for students. All focus groups relied on convenience sampling of 6 to 8 participants, lasted 90 minutes, were audio-recorded for analysis purposes, and were conducted by using the Zoom platform; participants received a $75 to $150 gift card for participating.

Social Media Listening

Using Sprinklr, a customer experience platform that synthesizes data from multiple online platforms, the Deloitte team had access to >350 million publicly available data sources. We developed 30 queries aligned to strategies or topics, half of which focused on general conversations about schools and mitigation strategies and half of which captured posts from key populations. Results were updated daily, and weekly reports that outlined activity (eg, number of hits, spikes in activity) were used to develop a dashboard.
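The daily spike-flagging described above can be sketched as a simple trailing-window anomaly check. The 7-day window and the mean-plus-2-SD threshold are illustrative assumptions, not the metrics Sprinklr or the project team actually used.

```python
from statistics import mean, stdev

def flag_spikes(daily_counts: list[int], window: int = 7, z: float = 2.0) -> list[int]:
    """Return indices of days whose mention count exceeds the trailing
    window's mean by more than z sample standard deviations."""
    spikes = []
    for i in range(window, len(daily_counts)):
        prior = daily_counts[i - window:i]
        m, s = mean(prior), stdev(prior)
        if s > 0 and daily_counts[i] > m + z * s:
            spikes.append(i)
    return spikes

# Hypothetical daily face-mask mention counts with a jump on the last day
counts = [1000, 1100, 950, 1020, 980, 1050, 1010, 1000, 4000]
```

A flagged day would then prompt analysts to read the underlying posts and note the spike in the weekly report.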

Data Analyses and Dissemination

Because data type and structure varied across collection modes, data management procedures also varied. For cross-sectional and some sector-based survey data, we used additional postcollection techniques (imputation, weighting) to reduce bias and increase generalizability. For focus groups, we used deductive (moderator-guided topics) and inductive (emergent themes) coding methods to facilitate analyses and develop a codebook. Data triangulation began by investigating a research question and potential responses across all collection mediums. Generally, the team compared quantitative findings across all survey types and then explored qualitative and social media findings to provide additional context. We synthesized findings into briefs or reports and presented them to CDC officials.
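The weighting step can be illustrated with a minimal cell-based post-stratification sketch. The region strata and census shares below are hypothetical; the project's actual imputation and weighting procedures are not described in detail here.

```python
from collections import Counter

def poststratify(sample_strata: list[str],
                 population_shares: dict[str, float]) -> dict[str, float]:
    """Cell weight per stratum = population share / sample share.
    Respondents in underrepresented strata receive weights > 1."""
    n = len(sample_strata)
    sample_counts = Counter(sample_strata)
    return {
        stratum: share / (sample_counts[stratum] / n)
        for stratum, share in population_shares.items()
        if sample_counts[stratum] > 0
    }

# Hypothetical sample that overrepresents one region relative to census shares
sample = ["south"] * 50 + ["west"] * 20 + ["northeast"] * 30
census = {"south": 0.38, "west": 0.24, "northeast": 0.17}  # illustrative shares
weights = poststratify(sample, census)
```

In practice, weights would be computed over joint cells (eg, region by age by race/ethnicity) and applied when estimating survey proportions.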

Action 4: Dissemination (March–June 2021)

Aligned with the fully integrated mixed-methods approach, we triangulated data across platforms to facilitate near–real-time insights into whether, how, and why mitigation strategies were being implemented and to address gaps. This process included developing an integrated platform using the CDC Data Hub’s Amazon Web Services environment to populate a Tableau dashboard. The team triangulated qualitative findings by comparing themes and emerging trends during marked increases in discussion threads and hashtags. We identified quantitative findings according to their alignment with, or divergence from, key outcomes and used them to provide context. The team translated results into reports accessible to CDC COVID-19 emergency response officials, vetted key findings, and made them publicly available.
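As a toy illustration of the triangulation step (the production pipeline used Amazon Web Services and Tableau, which we do not reproduce), findings from each modality can be keyed by topic and merged into a single record, with a simple convergence check across survey estimates. All field names and the 10-point tolerance are our assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class TriangulatedFinding:
    topic: str
    survey_estimates: dict[str, float] = field(default_factory=dict)  # source -> % agreeing
    focus_group_themes: list[str] = field(default_factory=list)
    social_media_mentions: int = 0

    def converges(self, tolerance: float = 10.0) -> bool:
        """True if all survey estimates fall within `tolerance` percentage
        points of one another (alignment vs divergence across sources)."""
        values = list(self.survey_estimates.values())
        return max(values) - min(values) <= tolerance if values else False

# Illustrative record for the face-mask topic; numbers are hypothetical
masking = TriangulatedFinding(
    topic="face masks",
    survey_estimates={"cross-sectional": 66.0, "crowdsourced": 61.0},
    focus_group_themes=["enforcement burden", "special education challenges"],
    social_media_mentions=893_669,
)
```

A divergent record (estimates far apart) would flag the topic for closer review of sampling differences before it appeared in a triangulation report.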

Outcomes

Pilot Results

We conducted 40 surveys and 10 focus groups (eFigure 2 in Supplemental Material). Average fielding periods were longer for sector-based and cross-sectional surveys (26.5 and 21.3 days, respectively) than for crowdsourced surveys (4.5 days), allowing more crowdsourced surveys (n = 28), covering diverse topics, to be fielded than sector-based or cross-sectional surveys (n = 6 each). In addition, more respondents participated in crowdsourced surveys (n = 26 820) than in sector-based (n = 4241) or cross-sectional (n = 16 061) surveys.

The data triangulation process can be applied to other public health emergencies; for example, we explored the effect of vaccines among teachers and students (eFigure 3 in Supplemental Material). Survey results demonstrated that although wearing face masks was important (teachers, 66%), adherence to recommendations was challenging, particularly for special education staff (44%). Differences by data source were noted but anticipated because of varied sampling techniques and sample sizes. For social media listening, mentions of face masks, face masking, or school face masking policy were tracked from March through May 2021. Overall, 893 669 unique mentions were identified. To further align triangulated efforts, the team reviewed focus group themes and compared response patterns.

In total, 19 products were created, including triangulation reports, executive summaries, infographics, policy briefs, and dashboards. Results from this project were shared across CDC COVID-19 response staff and were broadly integrated with other data sources to inform the development of school mitigation resources.

Lessons Learned

This pilot demonstrated that combining traditional and nontraditional data collection methods can convey a more complete picture of epidemiologic and behavioral trends during rapidly evolving emergencies than any individual data source. Research on the implementation of evidence-based practices in schools illustrated the need for prevention strategies to reflect school and community culture and context to support uptake and effectiveness. 21 Keeping purposefulness and utility central helped ensure that data were examined, considered, and used to guide improvements in mitigation practice. This approach aligns with the tenets of public health surveillance22-24 while incorporating new methodologies and technologies consistent with data modernization initiatives. 25

This study had several limitations. First, information overload can make it difficult to identify the most critical and salient findings; in balancing timeliness and rigor, the prioritization of one over the other had to be stated early and explicitly. 26 Second, although this process was expedited, building and implementing an entirely new model was time-consuming. Third, survey modes were known to have varying levels of representativeness and sampling accuracy, raising concerns about selection and misclassification bias; the research team accounted for these issues during the triangulation process.

Several lessons learned are particularly applicable to future emergencies. First, partnerships were vital at all stages to increase sustainability for future emergency responses. Future applications could incorporate additional collection modes to improve capacity and accurately forecast community needs, such as insights from the CDC Center for Forecasting and Outbreak Analytics. 27 Integration of rapid data collection platforms with surveillance efforts implemented through CDC’s Data Modernization Initiative 25 could increase the effectiveness, accuracy, and sustainability of public health data collection systems at all levels (federal, state, and local). Moving forward, improvements can be made to ensure that the dissemination of information collected through surveillance efforts is timely, understandable, and accessible. Social media may be beneficial in disseminating health messages because it can shape norms and behaviors and serve as a barometer of sentiment; for example, we observed spikes in activity (~120 000 instances) in March 2021, when the American Rescue Plan Act 28 was signed, and in May 2021, when CDC issued revised face mask guidance. 29 Social media, in conjunction with technology (eg, artificial intelligence, machine learning), can be used to identify emerging trends faster than traditional methods.26,30 Finally, enhancing real-time monitoring and visualization through expanded use of data dashboards would be beneficial.

This work can serve as a prototype for the use of near–real-time data collection methods during an emergency response, wherein investigatory approaches need to balance scientific relevance with rigorous analytics. Before the COVID-19 pandemic, public health experts asserted a need to apply novel dynamic research and surveillance methods to epidemic investigations and data collection practices. 7 The approaches outlined here provide a roadmap to quickly collect, synthesize, and use data to assess implementation of, and communication about, guidance during a public health emergency. This project demonstrates the benefits of this novel process for integrating data in near-real time.

Supplemental Material

sj-pptx-1-phr-10.1177_00333549231190050 – Supplemental material for Design and Implementation of an Innovative, Rapid Data-Monitoring Strategy for Public Health Emergencies: Pilot of the United States School COVID-19 Mitigation Strategies Project

Supplemental material, sj-pptx-1-phr-10.1177_00333549231190050 for Design and Implementation of an Innovative, Rapid Data-Monitoring Strategy for Public Health Emergencies: Pilot of the United States School COVID-19 Mitigation Strategies Project by Marci F. Hertz, Rhodri Dierst-Davies, Kimberley Freire, Jorge M. Vallery Verlenden, Laini Whitton, John Zimmerman, Sally Honeycutt, Richard Puddy and Grant T. Baldwin in Public Health Reports

Acknowledgments

The authors thank the CDC response management for its support and the members of the CDC Falcon team, who were instrumental in the development, refinement, and implementation of the model. Additionally, we thank the organizations that supported recruitment for the focus groups (Family Voices, School Nutrition Association, National Rural Educators Association, and The School Superintendents Association) and for sector-specific surveys (The School Superintendents Association, Alliance for a Healthier Generation, National Association of School Nurses, National Association of Secondary School Principals, American School Counselor Association, and National School Boards Association).

Footnotes

Disclaimers: The findings and conclusions in this article are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention (CDC) or the CDC Foundation. This publication contains general information only and Deloitte is not, by means of this publication, rendering accounting, business, financial, investment, legal, tax, or other professional advice or services. This publication is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional advisor. Deloitte shall not be responsible for any loss sustained by any person who relies on this publication.

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This project was funded by the CDC Foundation, using funds from its COVID-19 Emergency Response Fund.

ORCID iDs: Marci F. Hertz, MS https://orcid.org/0000-0002-3027-4278

Rhodri Dierst-Davies, PhD, MPH https://orcid.org/0000-0001-8020-3198

Kimberley Freire, PhD, MPH https://orcid.org/0000-0001-8754-4606

Supplemental Material: Supplemental material for this article is available online. The authors have provided these supplemental materials to give readers additional information about their work. These materials have not been edited or formatted by Public Health Reports’s scientific editors and, thus, may not conform to the guidelines of the AMA Manual of Style, 11th Edition.

References



