PLOS ONE. 2021 Sep 30;16(9):e0257806. doi: 10.1371/journal.pone.0257806

The cost-effectiveness of common strategies for the prevention of transmission of SARS-CoV-2 in universities

Zafar Zafari 1,*, Lee Goldman 2, Katia Kovrizhkin 3, Peter Alexander Muennig 3
Editor: Kevin Schwartzman
PMCID: PMC8483333  PMID: 34591874

Abstract

Background

Most universities that re-opened in the United States (US) for in-person instruction implemented the Centers for Disease Control and Prevention (CDC) guidelines. The value of additional interventions to prevent the transmission of SARS-CoV-2 is unclear. We calculated the cost-effectiveness of, and the cases averted by, each intervention in combination with implementing the CDC guidelines.

Methods

We built a decision-analytic model to examine the cost-effectiveness of interventions to re-open universities. The interventions included implementing the CDC guidelines alone and in combination with 1) a symptom-checking mobile application, 2) university-provided standardized, high filtration masks, 3) thermal cameras for temperature screening, 4) one-time entry (‘gateway’) polymerase chain reaction (PCR) testing, and 5) weekly PCR testing. We also modeled a package of interventions (‘package intervention’) that combines the CDC guidelines with using the symptom-checking mobile application, standardized masks, gateway PCR testing, and weekly PCR testing. The direct and indirect costs were calculated in 2020 US dollars. We also provided an online interface that allows the user to change model parameters.

Results

All interventions averted cases of COVID-19. When the prevalence of actively infectious cases reached 0.1%, providing standardized, high filtration masks saved money and improved health relative to implementing the CDC guidelines alone and in combination with using the symptom-checking mobile application, thermal cameras, and gateway testing. Compared with standardized masks, weekly PCR testing cost $9.27 million/QALY gained (95% Credible Interval [CrI]: cost-saving to $77.36 million). Compared with weekly PCR testing, the ‘package’ intervention cost $137,877/QALY gained (95% CrI: $3,108 to $19.11 million). At both a prevalence of 1% and 2%, the ‘package’ intervention saved money and improved health compared with all the other interventions.

Conclusions

All interventions were effective at averting infection from COVID-19. However, when the prevalence of actively infectious cases in the community was low, only standardized, high filtration masks clearly provided value.

Introduction

In September of 2020, roughly half of United States (US) universities and colleges allowed at least some students back for in-person instruction [1–4]. Re-opening protocols for universities in the US are based on guidelines set by the Centers for Disease Control and Prevention (CDC) [5]. These include social distancing, masks, an emphasis on handwashing, and enhanced cleaning procedures in all parts of the university [5, 6]. Many universities attempted to supplement the core CDC guidelines with additional preventive interventions.

To address uncertainties surrounding the cost and effectiveness of interventions to prevent the spread of COVID-19, we developed the Columbia Covid-19 Model [7]. This is a user-accessible model that allows different universities to alter input parameters via an online interface based on their unique characteristics. Our aim was to calculate the cost-effectiveness of commonly used interventions for re-opening universities relative to implementing the CDC guidelines alone. We examined the cost-effectiveness of implementing the CDC guidelines in combination with: 1) a symptom-checking mobile phone application; 2) providing standardized, high filtration masks; 3) using thermal cameras for temperature screening at university entrances; 4) gateway polymerase chain reaction (PCR) testing; and 5) weekly PCR testing. We also modeled a ‘package’ intervention that combines the CDC guidelines with providing a symptom-checking mobile phone application, standardized high filtration masks, and gateway PCR testing at the beginning of the semester followed by weekly PCR testing thereafter. Finally, we developed an accompanying online tool that can evaluate novel interventions while also allowing local university decisionmakers to change the model parameters so that they more closely align with those of their own university setting [1, 8].

Materials and methods

Overview

The Columbia Covid-19 Model is a decision-analytic model that deploys a Monte Carlo simulation. In this model, a cohort of students and a cohort of staff/faculty cycle daily through a 90-day semester [7]. As each day passes, the model calculates the risk of infection, hospitalization, or death among the students and other university affiliates.

For the present analysis, we used Columbia University as a case study because we had information on the socio-demographic characteristics of university affiliates (students, faculty, and staff who returned to campus in the Fall of 2020), novel survey data, and detailed cost information. The data that we collected include information from the extended contact tracing team, procurement costs, and expert input from the Public Health Committee. In addition, our team administered theory-grounded standard gamble exercises to graduate public health students at Columbia University to obtain data on risk-taking proclivities and willingness-to-pay for tuition when classes are held online only versus in-person (S1 Table in S1 Appendix). These students were chosen because they had studied the risks associated with contracting COVID-19 among student-aged populations. Our model allows for stepwise cost-effectiveness comparisons across interventions [9].

Interventions

We compared the CDC recommendations alone (“status quo”) with the CDC guidelines combined with each of the interventions under study [1]. Our online model allows the user to compare any given intervention against either the CDC guidelines or no guidelines at all [7].

Our interventions were chosen because they were the most commonly used preventive modalities at American universities at the start of the Fall 2020 semester [4]. This determination was made using a survey of universities [4] and input from the Columbia University Public Health Committee (a group of leading experts in infectious disease and university administrators).

The interventions fell into two categories: 1) reducing the number of potentially infectious affiliates on campus (screening); and 2) reducing transmission on campus (S1 File in S1 Appendix).

Interventions for removing potentially infectious affiliates

Symptom-checking mobile application. We evaluated a requirement that university affiliates self-report COVID-19-associated symptoms using the university-mandated mobile phone application, which is available on iOS and Android systems and is required for entry to campus [10]. The symptom-checking application was designed to increase the proportion of exposed affiliates who self-isolate when they develop symptoms of COVID-19 (Table 1). After users attest to having no symptoms related to COVID-19, the application presents a green screen that can be shown to security guards.

Table 1. Major assumptions used in modeling the cost-effectiveness of strategies to improve infection control for COVID-19 in the university setting.
1. The campus would be closed and classes would be held online for the remainder of the semester if the cumulative number of incident cases among students/staff reached 500.*
2. If a super-spreader attended a community party, 5 or more of the university affiliates at that party (half of the average attendance of 10 university affiliates) would be exposed.†
3. 85% of students would self-isolate when they developed symptoms of COVID-19. We assumed a 10-percentage-point improvement in this parameter associated with the use of the symptom-checking mobile application.† This assumption was modeled probabilistically and tested in a one-way sensitivity analysis.†
4. The average infected student would have an average of 10 close contacts/day (<6 feet for more than 10 minutes) on campus and 2 close contacts/day off campus prior to detection.‡
5. Viral loads did not differ by sex, age, or severity of disease.†
6. All wages were valued at the median hourly wage in the U.S. [9]
7. When an otherwise healthy person was misdiagnosed by a test or thermal screening, the relevant indirect cost was the time lost during quarantine, valued at the national average wage [9].
8. Fevers detected using thermal cameras would be re-checked using a second method, such as a tympanic membrane thermometer.†
9. The efficacy of university-provided masks was equal to the mean efficacy of the “average” mask used by the public and an N95 mask without vents (see Table 2) [13, 34].
10. Students would not commute to or from multi-generational households with older members or have direct contact with people over the age of 60.†
11. We assumed that the duration of illness was 14 days and accounted for the possibility of long-term symptoms.
12. We assumed that, over weekends, any reduction in the number of close contacts between students on campus would be offset by a higher chance of spending time in the community and at social gatherings; the total number of contacts therefore remained the same throughout the week. This assumption was based in part on survey data we collected on student behaviors.†

*Based upon New York State guidelines.

†Based upon expert estimations from the Columbia University Public Health Committee or outside experts.

‡Based upon student survey.

Thermal camera. We also assessed thermal monitoring cameras at facility entry points to prevent people with a fever from entering the campus. The objective was to reduce the number of affiliates with any febrile illness, including COVID-19. Those who screened positive were subsequently screened with a tympanic membrane thermometer to reduce the number of false positive screens (see S2 Table in S1 Appendix for more information on how we modeled the intervention effect). This intervention also carries the benefit of removing affiliates who may have infectious diseases other than COVID-19.

Gateway and weekly PCR testing. Finally, we assessed one-time entry (“gateway”) testing for SARS-CoV-2 for all affiliates with or without weekly testing for acute infection using PCR tests from a commercial provider (Broad Institute, Cambridge MA). Those who tested positive were required to quarantine in a campus facility for 14 days.

Interventions for reducing transmission

Regular, disposable face mask and frequent hand hygiene. For the status quo arm, we modeled the effects of CDC-recommended baseline measures. These included wearing face masks and frequent hand hygiene. For the effects of these interventions in preventing transmission of SARS-CoV-2, we used evidence from recent published studies, including a systematic review and meta-analysis [11, 12].

Standardized, high filtration mask. We assessed a policy under which universities provide standardized, high quality, high filtration masks. This policy was adopted at Columbia University because university decisionmakers felt that highly effective N95 masks would be difficult to wear during class, but that some masks made or purchased by students would be less effective. The masks that we evaluated are snug fitting and dual ply. Although no efficacy data were available, the masks were assumed to fall roughly at the mid-point between surgical masks and N95 masks (Table 2) [11, 13]. Providing such masks would reduce the number of students using thin, loosely fitting homemade masks, which were assumed to perform similarly to surgical masks [11, 13].

Table 2. Total costs and probabilities used as model inputs for estimating the cost-effectiveness of strategies to improve infection control for Covid-19 in a university setting with 16,000 students and 4,500 employees on campus during a 90-day semester.
Parameters Baseline Distribution*
Population
Number of students on campus 16,000 -
Number of staff/faculty on campus 4,500 -
Daily number of close contacts
Between each student and other students on campus 10 Gamma (25, 2.5)
Between each student and staff/faculty on campus 1 Gamma (4, 4)
Between each student and community members 2 Gamma (4, 2)
Between each staff/faculty and students on campus 4 Gamma (16, 4)
Between each staff/faculty on campus 1 Gamma (4, 4)
Between each staff/faculty and community members 2 Gamma (16, 8)
Time values
Incubation time (r_inc) [26, 27] 5 days Triangular (3, 14, 5)
Infectiousness to symptoms onset (r_s) 2 days Triangular (1, 3, 2)
Exposure to infectiousness 3 days Probability distribution of r_inc minus probability distribution of r_s
Duration of infectiousness after symptoms onset [28, 29] 10 days Triangular (6, 14, 10)
Probabilities and rates
Transmission rate per close contact [30] 0.066 Normal (0.066, 0.005)
Infection hospitalization rate among students [17, 31] 0.008 Beta (99.192, 12299.81)
Infection hospitalization rate among staff/faculty [17, 31] 0.018 Beta (98.182, 5356.374)
Infection fatality rate among students [17] 0.0002 Beta (99.98, 499799)
Infection fatality rate among staff/faculty [17] 0.0015 Beta (99.85, 66465.82)
Probability of long COVID-19 [32] 0.133 Beta (86.567, 564.313)
Proportion of students’ compliance with stay-home order when they notice their symptoms 0.85 Triangular (0.75, 0.9, 0.85)
Proportion of community members’ compliance with wearing masks outside of campus [33] 0.78 Triangular (0.72, 0.78, 0.78)
Direct costs (U.S. dollars in 2020 USD)
Hospitalization [21, 34] $23,489 -
CDC guidelines [5]
    Adhering to cleaning protocol costs [35] $318,798 -
    Custodial staff [35] $979,503 -
    Personal protective equipment [35] $1,386,898 -
Temperature cameras (see S2 Table in S1 Appendix) $485,000 -
PCR test (per test) $45 -
Indirect costs (U.S. dollars in 2020 USD)
COVID-19 infection without hospitalization for a symptomatic employee who was either detected or self-quarantined (productivity losses over 2 weeks of self-isolation) $2,800 Gamma (100, 0.036)
COVID-19 hospitalization among employees (productivity losses over 3 weeks) $4,200 Gamma (100, 0.024)
Lost tuition value per day for online vs. in-person classes among students (calculated from a student survey and average tuition for the Fall 2020 semester at Columbia University; see S1 and S2 Tables in S1 Appendix for more details) $46 -
Intervention effects
CDC guidelines
    Hand washing/sanitizer (incidence rate ratio of infection) [12] 0.64 -
    Regular mask use (odds ratio of infection) [11] 0.33 -
    Hand washing/sanitizer plus regular mask use (odds ratio of infection) [11] 0.21 Beta (78.669, 293.816)
Symptom-checking application (percentage-point change in compliance of university affiliates with quarantine upon noticing symptoms of COVID-19) 10% Triangular (0.75, 0.9, 0.85)+0.1
Standardized masks (combined effect with frequent handwashing/sanitizing, odds ratio of infection) (see S2 Table in S1 Appendix) [11, 36] 0.128 Beta (87.072, 593.178)
Test for SARS-CoV-2
Sensitivity 0.95 -
Specificity 1 -
Health-related quality of life
Losses of QALYs associated with a COVID-19 symptomatic case [37] 0.008 Beta (99.192, 12299.81)
Losses of QALYs associated with a long COVID-19 infection [37] 0.034 Beta (96.566, 2743.61)
Losses of QALYs associated with a COVID-19 hospitalization [37] 0.020 Beta (97.970, 4776.154)
Losses of QALYs associated with a COVID-19 death among student population (adjusted for average age at death, age-dependent QALYs of the US general population, and discounting future values at 3%) 23.94 Normal (23.94, 2.40)
Losses of QALYs associated with a COVID-19 death among employee population (adjusted for average age at death, age-dependent QALYs of the US general population, and discounting future values at 3%) 18.33 Normal (18.33, 1.83)

Note: A close contact is defined as person-to-person contact < 6 feet for > 10 minutes. See S2 Table in S1 Appendix for further details on the model inputs.

*For triangular distributions, the parameters listed are lower limit, upper limit, and mode; for normal distributions, parameters are mean and standard deviation; for beta distribution, parameters are shape 1 and shape 2; and for gamma distributions, parameters are shape and rate.

†Expert opinion based on video conferences with the Public Health Committee at Columbia University, which is comprised of a range of infectious disease experts and administrators.

‡Costs reflect actual costs paid by Columbia University including personnel.

We also modeled a ‘package’ intervention that combined implementing the CDC guidelines with using the symptom-checking mobile application, university-provided standardized, high filtration masks, and one-time entry (gateway) PCR testing plus weekly PCR testing thereafter.

Outcome measures

We examined: 1) the incremental cost of each intervention after accounting for medical and intangible costs (e.g., productivity losses from quarantine for diagnosed or hospitalized affiliates and the perceived monetary instructional value of in-person versus online classes); 2) the incremental quality-adjusted life years (QALYs) gained [9, 14]; and 3) the incremental cost-effectiveness ratio (ICER). The ICER is computed as the change in costs divided by the change in QALYs. A QALY, which can be conceptualized as a year of life lived in perfect health, is calculated as the product of the remaining years of life and the health-related quality of life (HRQL) score [15].
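As a minimal sketch of these outcome measures, with illustrative numbers rather than the paper's estimates (the model itself was built in R, so R is used for the sketches in this article):

```r
# Illustrative ICER and QALY arithmetic; all numbers are hypothetical.
incremental_cost  <- 1.5e6  # added cost of an intervention vs. its comparator (2020 USD)
incremental_qalys <- 12.5   # QALYs gained vs. the comparator

icer <- incremental_cost / incremental_qalys  # change in costs / change in QALYs

# A QALY is the product of remaining life years and the HRQL score
remaining_life_years <- 60
hrql_score <- 0.85
qalys <- remaining_life_years * hrql_score
```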

Model specification

Students and staff/faculty were treated as two separate but interacting populations with different baseline ages, average numbers of close contacts, exposures, risks of illness, hospitalization, and death due to COVID-19 [16]. We used data from Columbia University on the age of each student, staff, and faculty member. We obtained age-related risks of hospitalization and death from the published literature and from the CDC [17, 18]. We then calculated the weighted average risks separately for students and staff/faculty by weighting the age-related risks of hospitalization and death by each group's age distribution.
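A minimal sketch of this weighting step, assuming hypothetical age bands and age-specific hospitalization risks (the paper's actual inputs are in Table 2 and S1 Appendix):

```r
# Hypothetical student age distribution and age-specific hospitalization risks
age_share <- c("18-24" = 0.70, "25-34" = 0.25, "35-44" = 0.05)
age_risk  <- c("18-24" = 0.005, "25-34" = 0.010, "35-44" = 0.020)

# Weighted average risk for the student cohort: sum over age bands of
# (share of cohort in band) x (age-specific risk)
student_risk <- sum(age_share * age_risk)
# equivalently: weighted.mean(age_risk, w = age_share)
```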

We divided the simulation cohort into five states: susceptible (those who have not developed the disease and are at risk), exposed (those who are exposed but not yet infectious or symptomatic), infected (those who are currently infected and contagious), recovered (those who were infected in the past but have since recovered), and dead. The states and disease pathways are presented graphically in Fig 1. The cycle length of the model was one day, and the time horizon was one semester (90 days).

Fig 1. Graphical representation of the states of disease pathways.


Our model computed the daily probability of becoming infected among the susceptible population based on the average number of close contacts, the transmission rate per close contact, and the estimated prevalence of infectious cases inside and outside of the campus [16]. For each susceptible student and staff/faculty member, the probability of becoming infected outside of campus was calculated as follows:

$$P[\text{infection outside of campus} \mid \text{simulation unit } i] = 1 - (1 - p_c \cdot r)^{c_{o_i}},$$

where $p_c$ represents the prevalence of infectious COVID-19 cases in the local community outside of the campus; $r$ is the transmission rate per close contact; and $c_{o_i}$ represents the average number of daily close contacts that each simulation unit $i$ (a student or staff/faculty member) makes in the local community outside of the campus.

The prevalence of actively infectious cases is an adjusted estimate of the proportion of people who, on any given day, might plausibly transmit SARS-CoV-2 to a close contact. The prevalence was calculated by dividing the number of actively infectious cases within the New York City (NYC) neighborhoods in which the university is situated by the number of residents of those same neighborhoods.

To obtain the prevalence of actively infectious cases, we first obtained the daily incident cases reported to the New York State Department of Health for the NYC-defined neighborhoods of interest. This number underestimates the actual incident cases on any given day because: 1) some people are asymptomatic; and 2) many people with symptoms will not be tested for COVID-19 [19, 20]. To estimate the actual daily incident cases in the community, we applied a multiplier of 5, which was estimated using a COVID-19 projection model also in use by the CDC [19, 20]. We then added the incident cases of the current day to those from past days who were still infectious. The surrounding community was defined as the area around the university in which students tend to live, in this case the official boundary of the NYC-defined neighborhoods within which the university resides.
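These steps can be sketched as follows. The multiplier of 5 is from the text; the reported case series, the neighborhood population, and the 12-day infectious window (2 days of pre-symptomatic infectiousness plus 10 days after symptom onset, per Table 2) are illustrative assumptions:

```r
# Hypothetical daily incident cases reported for the surrounding neighborhoods
reported_daily_cases    <- c(40, 35, 50, 45, 60, 55, 48, 52, 47, 44, 58, 61)
underreport_multiplier  <- 5       # reported-to-actual multiplier (from the text)
infectious_window       <- 12      # assumed: 2 days pre-symptomatic + 10 days post-onset (Table 2)
neighborhood_population <- 250000  # hypothetical residents of the NYC neighborhoods

actual_daily_cases <- reported_daily_cases * underreport_multiplier

# Active cases today: today's incident cases plus those from past days
# who are still within the infectious window
active_cases <- sum(tail(actual_daily_cases, infectious_window))
prevalence   <- active_cases / neighborhood_population
```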

We also accounted for the proportion of the population wearing face masks outside of campus by applying a multiplier factor, $1 - C_o \cdot RR_{\text{wearing mask}}$, where $C_o$ represents the compliance rate with wearing face masks in the local community outside of the campus, and $RR_{\text{wearing mask}}$ represents the risk reduction associated with wearing face masks.

Similarly, the probability of becoming infected inside the campus was calculated as follows:

$$P[\text{infection inside campus} \mid \text{simulation unit } i] = 1 - (1 - p_s(t) \cdot r)^{c_{s_i}} \cdot (1 - p_e(t) \cdot r)^{c_{e_i}},$$

where $p_s(t)$ and $p_e(t)$ represent the prevalence of infectious cases among students and staff/faculty, respectively, at time $t$; $r$ is the transmission rate per close contact; and $c_{s_i}$ and $c_{e_i}$ represent the average number of daily close contacts that each simulation unit $i$ makes with students and with staff/faculty on campus, respectively.
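Both formulas can be sketched for a single susceptible student as follows. Base-case values are taken from Table 2 where available; the on-campus prevalences for day $t$, the community risk reduction from mask wearing, and the placement of the mask multiplier on the outside-campus probability are our illustrative reading of the text:

```r
r   <- 0.066  # transmission rate per close contact (Table 2)
p_c <- 0.001  # community prevalence of infectious cases (0.1% scenario)
c_o <- 2      # daily close contacts with community members (Table 2)

# Community mask multiplier 1 - C_o * RR, with RR read as a risk *reduction*
C_o     <- 0.78  # community compliance with mask wearing (Table 2)
RR_mask <- 0.67  # assumed risk reduction (1 minus an odds ratio of 0.33; illustrative)
mask_multiplier <- 1 - C_o * RR_mask

p_outside <- (1 - (1 - p_c * r)^c_o) * mask_multiplier

# On campus, mixing with students (prevalence p_s) and staff/faculty (p_e);
# both prevalences are hypothetical values for a single day t
p_s <- 0.002; p_e <- 0.001
c_s <- 10; c_e <- 1  # daily close contacts with students and staff/faculty (Table 2)
p_inside <- 1 - (1 - p_s * r)^c_s * (1 - p_e * r)^c_e
```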

We also modeled the probability of a super-spreader event based upon the prevalence of actively infectious cases in the community, the daily probability of students’ participation in a party within the community, and the average number of attendees in each community party (S2 File in S1 Appendix).

Once infected, three consecutive phases of disease progression were possible, denoted as the time between: 1) primary exposure and infectiousness; 2) infectiousness and the onset of symptoms; and 3) symptom onset and the end of infectiousness (Table 2). For asymptomatic infected affiliates, the model excluded the second phase. At the end of the third phase, infected affiliates were classified as ‘recovered.’ In addition, infected affiliates faced a chance of illness and hospitalization, incurred costs, experienced changes in HRQL, and faced a probability of death [21].

Lost productivity and leisure time were valued at the average American wage [22]. Intangible costs associated with online versus in-person instruction were valued using a survey administered to students who had experienced learning in each format. Risk tolerance was assessed using a standard gamble exercise (for details, refer to S1 Table in S1 Appendix). We tested the effect of the perceived value of online vs. in-person instruction in a one-way sensitivity analysis in which the value of tuition was varied from 0% to 100%.

The model accounted for interventions that: 1) remove infected affiliates from the university community (screening interventions); or 2) reduce SARS-CoV-2 transmission while on campus (e.g., wearing face masks), with the post-intervention odds of infection computed as the product of the adjusted odds ratio of infection associated with the intervention and the background odds of infection in the absence of the intervention.
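For the transmission-reducing interventions, this odds-scale adjustment can be sketched as follows (the background probability is hypothetical; the odds ratio for standardized masks plus hand hygiene is from Table 2):

```r
p_background <- 0.004  # hypothetical daily infection probability without the intervention
or_masks     <- 0.128  # odds ratio, standardized masks + hand hygiene (Table 2)

odds_background   <- p_background / (1 - p_background)
odds_intervention <- or_masks * odds_background  # OR times the background odds
p_intervention    <- odds_intervention / (1 + odds_intervention)  # back to a probability
```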

The campus would close, and instruction would switch to online-only learning for the remainder of the semester, if the model reached a total of 500 cumulative COVID-19 cases among students/staff/faculty over the semester.

Analysis

We ran a probabilistic analysis using a Monte Carlo simulation with 1,000 iterations. In each iteration, all model parameters were simultaneously sampled from their probability distributions. We assessed 3 scenarios for the prevalence of actively infectious cases of COVID-19: “low prevalence” (roughly 0.1%), “moderate prevalence” (1%), and “high prevalence” (2%), representing the range of values seen across the US over the Fall semester. We calculated stepwise cost-effectiveness comparisons, which provide information on the value of incrementally investing in strategies (e.g., investing in the most cost-effective strategy, and then adding the next most cost-effective strategy to that). We also conducted one-way sensitivity analyses to evaluate the variables that had a large influence on the ICER. We used the common maximum willingness-to-pay threshold of $200,000 per QALY gained as a point of reference in our sensitivity analyses [9, 14, 23]. The willingness-to-pay threshold is a hypothetical reference point against which one can compare the ICER of an intervention to the maximum value society is willing to pay for one QALY gained [9, 14, 23]. In addition, we ran a series of multi-way sensitivity analyses on core model parameters, including the number of close contacts, the transmission rate per close contact, the willingness-to-pay value, compliance with wearing masks in the community, and the prevalence of actively infectious cases in the community. Our model was built on the R statistical platform (R Foundation for Statistical Computing) [7].
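A skeletal version of the probabilistic analysis is sketched below. The distribution parameterizations follow the footnote to Table 2 (gamma as shape and rate; beta as shape1 and shape2; triangular as lower limit, upper limit, and mode); run_semester() is a stand-in for the model's daily 90-day simulation, not the authors' code:

```r
set.seed(1)

# Triangular sampler via the inverse CDF (base R has no built-in)
rtriangular <- function(n, lower, upper, mode) {
  u <- runif(n)
  f <- (mode - lower) / (upper - lower)
  ifelse(u < f,
         lower + sqrt(u * (upper - lower) * (mode - lower)),
         upper - sqrt((1 - u) * (upper - lower) * (upper - mode)))
}

# Stand-in for the daily semester simulation; it returns a toy outcome so
# that the sketch runs end to end
run_semester <- function(params) {
  1 - (1 - 0.001 * params$transmission_rate)^params$contacts_students
}

n_iter <- 1000
results <- replicate(n_iter, {
  params <- list(
    transmission_rate = rnorm(1, mean = 0.066, sd = 0.005),          # Normal(mean, sd)
    contacts_students = rgamma(1, shape = 25, rate = 2.5),           # Gamma(shape, rate)
    or_masks          = rbeta(1, shape1 = 87.072, shape2 = 593.178), # Beta(shape1, shape2)
    incubation_days   = rtriangular(1, lower = 3, upper = 14, mode = 5)
  )
  run_semester(params)
})

mean(results)                       # point estimate across iterations
quantile(results, c(0.025, 0.975))  # a 95% credible interval
```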

Results

Predicted number of infections

At a 0.1% prevalence of actively infectious cases in the community, 968 of the 20,500 university affiliates in our model would contract COVID-19 over the 90-day semester if no CDC guidelines were implemented. At a prevalence of 1% and 2%, infections would rise to 4,598 and 7,865, respectively. When the CDC guidelines were implemented alone, infections dropped to roughly 482 (0.1% prevalence), 3,982 (1%), and 7,430 (2%), respectively.

Stepwise cost-effectiveness of additional interventions relative to CDC guidelines alone

0.1% prevalence of actively infectious cases

At this prevalence, requiring standardized, high filtration masks in addition to implementing the CDC guidelines both saved money and resulted in a gain in QALYs compared with: 1) implementing the CDC guidelines alone; or 2) implementing the CDC guidelines in combination with the symptom-checking mobile application, thermal cameras, or gateway testing. Compared with standardized, high filtration masks, weekly PCR testing plus CDC guidelines produced additional costs of $10,235,673 and resulted in 1.1 QALYs gained, for an ICER of $9,273,023/QALY gained. Compared with weekly PCR testing, the ‘package’ intervention cost an additional $40,958 and gained 0.3 QALYs, for an ICER of $137,877/QALY gained. Fig 2 shows the efficiency frontier curve for the cost-effectiveness of each of the interventions under study at a 0.1% prevalence of actively infectious cases. The stepwise calculations of ICERs from the least to the most effective intervention, along with the 95% credible intervals around the outcomes’ point estimates, are presented in Table 3. The ICER values in terms of incremental cost per infection averted are presented in S3 Table in S1 Appendix. In addition, the probabilistic results, presented as cost-effectiveness planes and cost-effectiveness acceptability curves, are shown in S1 and S2 Figs in S1 Appendix.

Fig 2. Efficiency frontier curve for cost-effectiveness of strategies for the prevention of transmission of SARS-CoV-2 in universities.


The efficiency frontier curve presents the incremental cost of each intervention under study in constant 2020 US dollars relative to the change in effectiveness as measured in quality-adjusted life years (QALYs). Each intervention is paired with the Centers for Disease Control and Prevention (CDC) guidelines. Each strategy is represented by a dot in a consistent greyscale, with the CDC guidelines in black and the multi-component “package” intervention in the lightest gray. Note: CDC guidelines = the Centers for Disease Control and Prevention guidelines for preventing the transmission of COVID-19 in a university setting. The “package” intervention combines the CDC guidelines with using the symptom-checking mobile application, standardized masks, gateway PCR testing, and weekly PCR testing.

Table 3. Model outcomes for average number of days that the university will remain open, costs, Quality-Adjusted Life Years (QALYs), and Incremental Cost-Effectiveness Ratio (ICER).
Days university open Number of infections Incremental costs ($) Incremental QALYs ICER ($/QALY) ICER ($/QALY), excluding dominated strategies
100 Cases/100,000
CDC guidelines 79 (37, 90) 482 (62, 1054) Reference Reference Reference
Symptom-checking mobile application plus CDC guidelines 81 (40, 90) 437 (58, 1026) -$1,901,841 (-$8,325,490, $1,835) 0.60 (0.06, 1.55) -$3,165,450
Thermal cameras plus CDC guidelines 81 (41, 90) 430 (56, 1016) $3,345,729 ($824,777, $4,622,951) 0.08 (-0.21, 0.39) $39,500,535
Gateway testing plus CDC guidelines 83 (47, 90) 388 (21, 950) -$4,043,021 (-$11,416,977, -$1,863,169) 0.55 (-0.16, 2.34) -$7,398,283
Standardized masks plus CDC guidelines 89 (72, 90) 236 (31, 696) -$5,154,184 (-$26,890,309, -$851,424) 1.98 (-0.24, 5.91) -$2,601,899 -$2,601,899
Weekly testing plus CDC guidelines 90 (90, 90) 152 (17, 373) $10,235,673 (-$2,162,557, $11,062,938) 1.10 (0.14, 4.89) $9,273,023 $9,273,023
‘Package’ intervention 90 (90, 90) 129 (16, 309) $40,958 ($2,811, $82,009) 0.30 (0.00, 0.90) $137,877 $137,877
1000 Cases/100,000
CDC guidelines 25 (11, 90) 3982 (459, 5929) Reference Reference Reference
Symptom-checking mobile application plus CDC guidelines 26 (11, 90) 3930 (429, 5922) -$782,349 (-$6,185,102, $104,257) 0.71 (0.05, 4.66) -$1,106,337
Thermal cameras plus CDC guidelines 27 (12, 90) 3844 (416, 5810) $916,464 (-$386,891, $3,733,634) 1.19 (0.19, 2.32) $772,052
Standardized masks plus CDC guidelines 38 (15, 90) 3331 (234, 5624) -$10,329,971 (-$46,263,518, -$2,416,245) 6.85 (0.23, 37.30) -$1,508,613
Gateway testing plus CDC guidelines 40 (17, 90) 3257 (154, 5498) $130,466 (-$6,035,111, $7,891,180) 0.95 (-5.26, 6.76) $137,319
Weekly testing plus CDC guidelines 52 (20, 90) 2620 (126, 5251) -$2,928,562 (-$15,088,412, $10,351,925) 8.53 (0.27, 21.74) -$343,318
‘Package’ intervention 57 (22, 90) 2377 (119, 5091) -$2,906,565 (-$8,356,499, $82,892) 3.23 (0.03, 9.64) -$900,687 -$900,687
2000 Cases/100,000
CDC guidelines 13 (7, 29) 7430 (4656, 9841) Reference Reference Reference
Symptom-checking mobile application plus CDC guidelines 13 (7, 32) 7412 (4531, 9838) -$46,978 (-$731,014, $213,804) 0.25 (0.02, 1.24) -$186,482
Thermal cameras plus CDC guidelines 14 (7, 34) 7274 (4340, 9687) $567,255 (-$152,899, $1,086,572) 1.95 (1.30, 2.94) $291,003
Standardized masks plus CDC guidelines 21 (8, 90) 6861 (456, 9660) -$6,694,723 (-$41,433,630, -$1,522,290) 5.69 (-0.82, 49.96) -$1,177,028
Gateway testing plus CDC guidelines 28 (10, 90) 6312 (299, 9432) -$4,288,175 (-$30,250,266, $1,167,593) 7.45 (0.88, 46.64) -$575,598
Weekly testing plus CDC guidelines 34 (11, 90) 5820 (245, 9243) $270,282 (-$15,340,524, $10,349,996) 6.60 (0.30, 39.83) $40,941
‘Package’ intervention 36 (12, 90) 5649 (231, 9148) -$1,321,270 (-$4,488,178, $207,549) 2.33 (0.06, 7.20) -$567,571 -$567,571

All results are averages over the 1,000 runs of the probabilistic Monte Carlo simulation. ICERs were calculated as average incremental costs over average incremental QALYs and were calculated in a stepwise approach (each intervention was compared against the intervention with the next-lower costs, provided the comparator intervention was not dominated or ruled out because of extended dominance). Negative ICERs in this table represent a cost-saving scenario, indicating that the comparator saves money and improves health compared with the baseline intervention. Under each prevalence scenario, the intervention most likely to be cost-effective at a willingness-to-pay value of $200,000/QALY is shown in bold text.

CDC: Centers for Disease Control and Prevention. CDC guidelines included social distancing, mask use, frequent handwashing, and sanitization of spaces. For the probabilistic results, see S1 and S2 Figs in S1 Appendix.

*Costs and ICERs include the monetary value of in-person vs. online-only instruction, which was derived from a student survey at Columbia University. For details, see S1 and S2 Tables in S1 Appendix.
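The stepwise approach described in the note to Table 3 amounts to pruning dominated strategies before computing ICERs along the efficiency frontier. A sketch of that pruning logic, using generic strategies and illustrative numbers rather than the paper's results:

```r
strategies <- data.frame(
  name = c("A", "B", "C", "D", "E"),
  cost = c(0, 1e6, 2e6, 3e6, 8e6),  # incremental costs (illustrative)
  qaly = c(0, 10, 8, 10.5, 13)      # incremental QALYs (illustrative)
)

# Order by cost, then drop strictly dominated strategies
# (fewer QALYs than a cheaper alternative)
frontier <- strategies[order(strategies$cost, strategies$qaly), ]
frontier <- frontier[frontier$qaly == cummax(frontier$qaly), ]

# Drop extended-dominated strategies: one whose ICER exceeds the ICER of the
# next more effective strategy is ruled out; repeat until the stepwise ICERs
# increase monotonically along the frontier
repeat {
  icer <- diff(frontier$cost) / diff(frontier$qaly)
  drop <- which(diff(icer) < 0) + 1
  if (length(drop) == 0) break
  frontier <- frontier[-drop[1], ]
}
frontier  # surviving strategies; stepwise ICERs are diff(cost) / diff(qaly)
```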

1% prevalence of actively infectious cases

At this prevalence, the ‘package’ intervention produced both monetary savings and QALYs gained compared with either implementing the CDC guidelines alone or implementing the CDC guidelines in combination with the symptom-checking mobile application, thermal cameras, standardized masks, gateway PCR testing, and weekly PCR testing (Table 3).

2% prevalence of actively infectious cases

As with a prevalence of 1%, at a prevalence of 2% the ‘package’ intervention resulted in both monetary savings and QALYs gained compared with all other interventions, including implementing the CDC guidelines alone and implementing the CDC guidelines in combination with the symptom-checking mobile application, thermal cameras, standardized masks, gateway testing, and weekly testing (Table 3).

Sensitivity analyses

Prevalence of actively infectious cases in the community

Until the prevalence of actively infectious cases in the community reached 0.22%, the use of standardized, high filtration masks in addition to implementing the CDC guidelines provided the highest value given the threshold of $200,000/QALY. When the prevalence exceeded 0.22%, weekly PCR testing in addition to implementing the CDC guidelines, as well as the package of interventions, provided better value for money than standardized masks.

Value of online instruction

Varying the perceived value of online vs. in-person tuition did not substantively change the model outcomes. Even when the perceived value of online-only tuition was equal to that of in-person classes, the use of standardized, high filtration masks in addition to implementing CDC guidelines provided the highest value.

Influence of transmission rate, close contacts, and mask use

Although the prevalence of actively infectious cases was the most important driver of cost-effectiveness, the transmission rate, the number of close contacts, and the general use of masks among members of the surrounding community were important drivers of cost-effectiveness.

For example, the value of the package of interventions was sensitive both to the number of close contacts per student on campus and to the transmission rate. At the base-case 0.1% prevalence of actively infectious cases, increases in the transmission rate changed the most cost-effective intervention from standardized, high filtration masks to the ‘package’ intervention at the willingness-to-pay threshold of $200,000/QALY. In addition, increasing the daily number of close contacts between students from 10 to 14 made the ‘package’ intervention the most cost-effective approach. However, standardized, high filtration masks remained the most cost-effective intervention at a 0.1% prevalence of actively infectious cases when the average number of close contacts between students decreased from 10 to 2. Fig 3 depicts the most cost-effective intervention at different values of the transmission rate, the daily number of close contacts between students, and willingness-to-pay. Additional multi-way sensitivity analyses at prevalences of 1% and 2% are available in S3 and S4 Figs in S1 Appendix.

Fig 3. Multi-way sensitivity analysis identifying the most cost-effective intervention at different values of the number of close contacts between students on campus, the transmission rate per close student contact, and willingness-to-pay at a 2% prevalence of actively infectious cases in the community.


Reducing the proportion of people in the community who complied with wearing regular face masks to 25% or below changed the most cost-effective intervention to the ‘package’ intervention. A multi-way sensitivity analysis across the proportion of people wearing face masks in the community, the prevalence of actively infectious cases in the community, and the willingness-to-pay value is presented in S5 Fig in S1 Appendix.

At Columbia University, including or excluding faculty and staff over 65 or 70 years of age did not have a substantial impact under any scenario because of their relatively small numbers. Finally, reducing by 50% (from 500 to 250) the threshold number of cumulative infections that would trigger campus closure (with classes switching to online-only instruction) made the ‘package’ intervention the most cost-effective approach, because the ‘package’ intervention would keep the campus open for more days and therefore preserve more of the instructional value of in-person vs. online-only classes compared with the other interventions.

Discussion

Our model showed that the prevalence of actively infectious cases of COVID-19 in the neighborhood surrounding the university was the most important driver of cost-effectiveness when the CDC guidelines were in place. At a prevalence of 0.1% (e.g., as in New York in July 2020), the most value would be realized by requiring university affiliates to use university-provided standardized, high filtration masks in addition to implementing the CDC guidelines. When the prevalence exceeded 0.22%, the ‘package’ intervention provided the most value. However, variables such as the number of contacts between affiliates, the transmission rate per close contact, and face mask use in the community were also important determinants of the cost-effectiveness of the interventions we studied. As shown in Fig 3, the number of close contacts per person and the level of face mask use had a substantial influence on the likelihood of the spread of disease and therefore on the cost-effectiveness of interventions to reduce that spread. Readers are encouraged to change the model inputs to suit their particular university characteristics using the online version of the model [7].

Our results are in line with a recent study by Paltiel and colleagues that recommended testing for SARS-CoV-2 when the prevalence is 0.2% [24]. We also found that at prevalence estimates of 0.22% or above, the ‘package’ intervention, which requires one-time entry testing and weekly testing thereafter, would provide the highest cost-effectiveness value at Columbia University. Nevertheless, we adopted a different modeling approach and different assumptions surrounding: 1) the infection fatality rate, 2) the risk of transmission on campus, and 3) the number of close contacts per student, and 4) we used cost/QALY gained as an outcome measure rather than cost/case averted. If students have a higher number of close contacts or live in multi-generational households, we would expect otherwise less cost-effective interventions to increase in value.

Our assumed infection fatality rate (0.02% for students and 0.15% for staff/faculty, Table 2) was lower than the average rate for the U.S. (0.5%) [17, 24] because the populations of both students and staff/faculty were younger than the general population. Users of our online model should be careful to define risks specific to their university demographics.

Universities should consider standardizing the masks that students wear, such that their fit and filtration are superior to what students would choose to purchase on their own [11, 25]. Such standardized, high quality masks can provide the largest value, especially when the prevalence of actively infectious cases in the community is low. For example, Columbia University provided two $2 2-ply masks to each student [8].

When the prevalence of actively infectious cases in the community is high, or when the average student has more close contacts, the chances of early campus closure increase. When the university closes early, the money spent on any interventions is wasted, and large indirect costs associated with online-only instruction are incurred. Therefore, any intervention that reduces the possibility of students attending mass events should be prioritized.

The major limitation of our analysis was the considerable uncertainty in parameter estimates. For example, estimates of infection fatality rates can quadruple when hospitals are overwhelmed with cases [17, 20]. However, the model was generally robust to different parameter inputs and assumptions about the interventions. The variables that we used as model inputs should be adjusted as new information and new strains of SARS-CoV-2 emerge. In addition, there were considerable uncertainties surrounding factors outside of campus, such as the enforcement of more restrictive measures when infections rose in the community. We therefore held the prevalence of actively infectious cases of COVID-19 in the community where the university is situated constant throughout the semester. The model will not perform well when the university population is large relative to the surrounding community.

Another limitation was that universities vary considerably with respect to socio-demographic composition and risk-taking among students. The standard gamble exercises we used were administered to students who may be more risk averse than other students. We accounted for differences in student risk preferences by varying the number of assumed contacts between students, both on and off campus, in sensitivity analyses. Finally, our model greatly underestimates risk for universities in which many students commute to and from multi-generational households.

When tailored to the conditions within which the university operates, our model should provide a robust estimate of the cost-effectiveness of interventions to prevent the spread of COVID-19. As COVID-19 becomes a seasonal illness that is complicated by variants of the virus, our model can be used by university decisionmakers to ascertain how much of an investment will be necessary to manage risk.

Supporting information

S1 Appendix

(DOCX)

Acknowledgments

We would like to acknowledge the help and contributions of Wafaa El-Sadr, Melanie Bernitz, Steven Shea, Wan Yang, Jeffrey Shaman, and the Public Health Committee for Reopening Columbia University.

Data Availability

All relevant data are within the paper and its Supporting Information files.

Funding Statement

Our study was funded by the Columbia University Mailman School of Public Health. The funder had no role in the design of the study; the collection, analysis, or interpretation of data; or the writing of the manuscript.

References

Decision Letter 0

Kevin Schwartzman

5 May 2021

PONE-D-21-07113

The cost-effectiveness of common strategies for the prevention of transmission of SARS-CoV-2 in universities

PLOS ONE

Dear Dr. Zafari,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Thank you for submitting your manuscript to PLOS ONE. It considers an interesting and timely topic. However, the reviewers have identified a number of areas requiring revision and improvement. Please address these carefully.

Please submit your revised manuscript by Jun 17 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Kevin Schwartzman

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Thank you for stating the following in the Funding Section of your manuscript:

"This study is funded by Columbia University Mailman School of Public Health. The

funder had no roles in the design of the study and collection, analysis, interpretation of data, and

writing of the manuscript."

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form.

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows:

"The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Prior to my comments, I will justify my scores above.

For (1) I selected "partly" because there are some methodological questions I have. The conclusions drawn do represent the findings presented.

For (2) I selected "Yes" but I think this is with caveats. The authors have missed an opportunity to analyze multiple interventions at once and there are some questions with underlying assumptions.

For (3) I selected "Yes" as the model is open source and parameters are listed.

For (4) I selected "No" but really this is "in between" - there are very well written parts and other parts that could use proofreading for flow and grammar. Since no copyediting is provided by PLoS, I think this is the safest score.

OVERVIEW

Zafari and colleagues detail an analysis of various interventions implemented in an American university during the Fall 2020 semester across three scenarios of assumed active, infectious COVID-19 prevalence in the community. The question is certainly interesting and useful beyond the American setting as the 2021 Fall semester approaches and countries without widespread vaccination may consider returns to school. However, I have concerns. Some of the model assumptions don’t appear justified or could be flawed; I believe the analyses are incomplete and represent only simple scenarios of implementation which are unlikely to be what are done in practice (single interventions rather than multiple); and there are issues with flow and clarity and some sections of the manuscript appearing to be “artifacts” of previous iterations (e.g., estimating community prevalence has no relevance to this paper, but is mentioned in reasonable detail).

COMMENTS

Abstract

• Specify this is in the USA (“Most universities that re-opened in the USA for in-person…”)

• Second sentence appears to be a missed edit – remove “To determine”

• Methods are lacking and need to be added. What type of model used, costs considered, etc.

• Length is <200 words but max is 300. Please add more detail in all aspects of methods/results/interpretation. Are these interventions additive? Or used in a silo? It is tough to interpret any results presented in the abstract without information such as this.

Intro

• Please add the specific aim(s) of the present study.

Methods

• Typo in first sentence (“a Markov model a Monte Carlo simulation”)

• Specify what is in the novel survey data (in “Overview”)

• The final sentence of “Overview” is strange. Does this analysis/paper follow CHEERS guidelines for reporting in CEA studies? I am not sure the ideas linked in the sentence are related.

• Suggest you number the additional interventions (on top of CDC guidelines) being modeled to help readability and allow readers to more easily track what is being compared. As a reviewer… some formatting to indicate subsections would have been helpful.

• Who is the manufacturer of the SARS-CoV-2 test kits? What specimen was assumed to be used? Did you take into account other specimen types that are similarly sensitive (e.g., NP swabs, saliva, gargle)? In any event, the base analysis parameters (specimen type, manufacturer of kit, and sens/spec) should be present – were costs of specimen collection considered? These can be substantial, particularly for NP swabs (collection alone estimated to be ~$10USD with swabs, media, PPE, and nurse time). *I see in appendix a reference for sens/spec of 95% each – in fact the reference suggests specificity of 100% (https://sites.broadinstitute.org/safe-for-school/assay-performance); is this a typo? Specificity of 95% for PCR is unacceptable given low prevalence (1%), you would expect 5 false positives for every 1 true positive. Please clarify.

• It is not clear if the “one way sensitivity analyses” referred to in the “Interventions” section actually refer to interventions evaluated or sensitivity analyses?

• Regarding the formula for calculating the prevalence – I suggest there may be something amiss. I agree with multiplying the number of cases recorded each day by 5 to account for underreporting… but then multiplying that value by 7 is a bit puzzling and may lead to systematically underestimating TRUE prevalence when prevalence is declining (the ratio is prevalence is underestimated ~3x higher than the rate of decrease—e.g., detected cases dropping by 0.5% per day leads to underestimation in prevalence of 1.5%); if cases are increasing, you will overestimate prevalence in the same way. This stems from the assumption that prevalence is ONLY related to cases occurring that day – which is not true. What would be more accurate is to sum the cases from the current day and the previous 6 days to estimate prevalence. This point aside… It is not clear to me how this enters the present analysis since prevalence estimates were fixed—suggest remove unnecessary info.

• Can the authors provide a model figure of the state transitions possible and the pathways experienced? This would help supplement the text description.

• Table 2 is mentioned before Table 1.

• Provide reference for the risk tolerance exercise.

• “simultaneously changed” – do you mean all parameters were simultaneously sampled from their assigned probabilistic distribution?

• Did you run the model with the ‘baseline’ parameters and then run the model probabilistically to get the credible intervals? Or was the point estimate derived from PSA? Please specify and give the percentile from the PSA used to create the credible interval (and point estimate, if appropriate).

• “We also conducted 1-way sensitivity analyses on variables that produced a large influence on the ICER” – do you mean you ran these analyses to determine variables that produced a large influence on the ICER?

• In looking at the parameters and distributions used – limiting the daily number of close contacts via a triangular distribution is a questionable assumption. We know some people have very few close contacts… but others (a minority) have a great deal. I think this parameter would be better served with a gamma distribution to capture this “long tail” in the N contacts distribution.

• Continuing on parameters… the effect of handwashing is HUGE – larger than masks. I wonder about this assumption which appears to come from a 2008 sys review and is based on SARS studies in largely healthcare settings (perhaps not generalizable to schools/community settings)—in fact one of the studies that was done in the community found no significant effect of handwashing. There are few cases of documented fomite transmission of SARS-CoV-2, which is where handwashing would really be beneficial – in fact most ‘supposed cases of fomite transmission’ I have seen are more likely to be explained by aerosols. I suggest the authors re-consider the effect size in light of data related to SARS-CoV-2.

• Any packages in R used to help develop the model?

Results

• I would imagine what is most useful to policymakers and university personnel would be the effect of multiple interventions on COVID-19. It is hard to think of scenarios where testing is used, but other interventions are not (as these are usually progressive). The authors must consider this.

• Moreover, the authors should consider showing their findings on an efficiency frontier, which would make the claims of extended dominance easier to see and allow a more visual comparison of the various interventions.

• I think Table 3 would benefit from stating the cumulative number of covid-19 cases occurring—to allow direct comparison with what is mentioned in the first section of results. This could also be visualized on a graph/plot.

• Credible intervals are quoted, but was this analysis done in a Bayesian framework? Is this the correct terminology – it seems analysis was done in a frequentist framework to this reviewer.

• The value of online instruction sensitivity analysis is not mentioned in the methods as an explicit sensitivity analysis. It is also not entirely clear to this reviewer how to interpret the findings.

• The finding of N close contacts being a big driver of the CE of weekly testing is highly interesting.

Discussion

• The cost of masks quoted ($4) is extremely high! A 3-ply surgical mask can be procured for <$0.50 at volume. I was not able to verify this cost at the provided reference.

• Would suggest you end with a summary of findings and major take home message.

Reviewer #2: This is a detailed, excellent study of COVID19 mitigation measures on a university campus. With correction of the errors that I found and implementation of several presentation recommendations, I feel that it meets criteria for publication.

Presentation recommendations:

* please list all 7 interventions early in the methods, and if possible in the abstract. I kept reading 7 interventions, but didn't actually find out what they were until several pages into the paper.

* the word "coupling" refers to pairing of 2 items. I suggest replacing it throughout the paper with "comparing" or deleting it. In doing incremental cost effectiveness, you are comparing a more effective strategy with a less effective one, after ordering all strategies by effectiveness.

* please bold or underline section headings. Often, they were on the next line without paragraph breaks, which made me wonder whether they were incomplete sentences.

* please provide more detail on the methods used to cost the interventions. This would be acceptable to place into an appendix, but it still needs to be in the paper.

* in Table 3, order in each prevalence section the intervention strategies by least to most effective. Since you used QALYs lost, the interventions having the greatest QALYs lost should be at the top and least at the bottom. Incremental changes in costs and QALYs are then calculated by comparing each row to the preceding one. While I determined from the write-up in the results section that you did arrive at the correct conclusions for your data, it is more apparent to readers if you present your analysis and results in the standard manner (as described above).

* since the difference in QALY is very small between strategies, I think that presenting COVID19 cases prevented would be more interesting than presenting QALYs, especially in a population less likely to die from COVID19. Most often in cost effectiveness analyses, both cases prevented and QALYs (gained) are presented as outcomes.

Here are the instances that I interpreted as errors:

* In the first paragraph of the results, you state that a prevalence of 0.1% results in 351 cases. However, 351/20,500 = 1.7%. Likewise, 230/20,500 = 1.1%, not 0.1%; 1664/20,500 = 8.1%; and 3126/20,500 = 15.2%.

* on p. 7, you state that you used 7 days for infectiousness. Table 2 lists exposure, symptoms, and infectiousness, with infectiousness of 9 days after symptom start. CDC lists the incubation period as 2-14 days, and infectiousness of 15 total days capturing 95% of all infectiousness. So, if 7 days was used, that is too low and doesn't match your table 2. You should consider using the standard 14 or 15 days of infectiousness.

* it appears that your base case used 2-3 contacts per student. While you did conduct a sensitivity analysis in Figure 1 on this, with up to 10 contacts per student, I would have used 10 contacts per student as the base case. In tuberculosis studies, TB cases average 15 airborne contacts per case at diagnosis. I would think that COVID19 would result in a similar number of contacts, and possibly more for gregarious students.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Suzanne Marks

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachment

Submitted filename: PLOS table 3.xlsx

PLoS One. 2021 Sep 30;16(9):e0257806. doi: 10.1371/journal.pone.0257806.r002

Author response to Decision Letter 0


29 Jun 2021

Dear Dr. Schwartzman,

Thank you very much for the opportunity to respond to the helpful reviewer comments, which we feel greatly improved our manuscript. We have responded to each comment in a point-by-point response letter below. We have made extensive revisions to the manuscript. To accommodate the additional text required to respond to the reviewer comments, we have edited other sections of the document to reduce the word count.

Thank you for your time and help with our manuscript.

Best,

Peter Muennig and Zafar Zafari

From the editor

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

[Response] Thank you. We have checked our formatting to ensure that it adheres to the journal’s requirements.

2. Thank you for stating the following in the Funding Section of your manuscript:

"This study is funded by Columbia University Mailman School of Public Health. The

funder had no roles in the design of the study and collection, analysis, interpretation of data, and writing of the manuscript."

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form.

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows:

"The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

[Response] Thank you. We deleted text from the manuscript that referred to funding. As requested, we updated the funding statement in the cover letter.

Reviewer 1:

Zafari and colleagues detail an analysis of various interventions implemented in an American university during the Fall 2020 semester across three scenarios of assumed active, infectious COVID-19 prevalence in the community. The question is certainly interesting and useful beyond the American setting as the 2021 Fall semester approaches and countries without widespread vaccination may consider returns to school. However, I have concerns. Some of the model assumptions don’t appear justified or could be flawed; I believe the analyses are incomplete and represent only simple scenarios of implementation which are unlikely to be what is done in practice (single interventions rather than multiple); and there are issues with flow and clarity, with some sections of the manuscript appearing to be “artifacts” of previous iterations (e.g., estimating community prevalence has no relevance to this paper, but is mentioned in reasonable detail).

[Response] We would like to thank the reviewer for his/her time in reviewing our paper and for the excellent comments. We have addressed all the comments in depth in a point-by-point response below. We believe addressing the comments has significantly increased the quality of our paper.

As to your point regarding the model assumptions, we have now addressed all your comments in the revised version, including updating model parameters and their statistical distributions (e.g., updating the distribution of the number of contacts to a gamma distribution). Please see our point-by-point response below for details.

For your point regarding single rather than multiple interventions, please note that each of the interventions in our model is implemented in combination with the CDC guidelines. For example, we compared weekly testing in addition to implementing the CDC guidelines with implementing the CDC guidelines alone. We agree with the reviewer that in real practice some universities may opt to implement multiple interventions at once. Therefore, to accommodate the reviewer’s comment, in the revised version of the paper we added the ‘package’ intervention, which combines implementing the CDC guidelines, using a symptom-checking mobile application, providing standardized, high filtration masks, gateway testing, and weekly testing. In the revised version, we now have 7 interventions in total, as follows: 1) CDC guidelines alone, 2) CDC guidelines combined with a symptom-checking mobile application, 3) CDC guidelines combined with standardized, high filtration masks, 4) CDC guidelines combined with thermal cameras, 5) CDC guidelines combined with gateway testing, 6) CDC guidelines combined with weekly testing, and 7) the ‘package’ intervention.

We also built a transparent online platform for our model so that, in case a university opts to implement an intervention that was not modeled in our study (or to implement multiple interventions at once), the user can investigate any desired, user-defined inputs for the effectiveness and cost of a ‘hypothetical intervention’ in our web application.

For your point regarding the prevalence of disease in the areas surrounding the university, please note that we needed this parameter to model the probability of infection when students/staff make contact with people in the surrounding community outside of campus. The user can simply define this prevalence in the online application given their unique context. As suggested, we have now removed the redundant iterations of this throughout the paper.

In the new version of the paper, we have carefully copyedited the text. We have also asked a professional academic writer to edit and proofread our paper. We hope our careful response to all the points raised meets the approval of the reviewer.

COMMENTS

Abstract

1. Specify this is in the USA (“Most universities that re-opened in the USA for in-person…”)

[Response] Thank you. We made this change.

2. Second sentence appears to be a missed edit – remove “To determine”

[Response] Thank you. Corrected.

3. Methods are lacking and need to be added. What type of model used, costs considered, etc.

[Response] In response to your concerns, we have expanded the Methods section, adding information on the model used and the costs that were considered.

4. Length is <200 words but max is 300. Please add more detail in all aspects of methods/results/interpretation. Are these interventions additive? Or used in a silo? It is tough to interpret any results presented in the abstract without information such as this.

[Response] We have made the suggested changes and expanded the Abstract. Please see changes throughout the abstract.

Intro

5. Please add the specific aim(s) of the present study.

[Response] Thank you. We added specific aims to the last paragraph of the introduction.

Methods

6. Typo in first sentence (“a Markov model a Monte Carlo simulation”)

[Response] Thank you. This typo has been corrected.

7. Specify what is in the novel survey data (in “Overview”)

[Response] Thank you. We have now expanded the relevant section of the paper for clarity and have provided additional survey data in the online appendix.

8. The final sentence of “Overview” is strange. Does this analysis/paper follow CHEERS guidelines for reporting in CEA studies? I am not sure the ideas linked in the sentence are related.

[Response] Thank you. We have edited that sentence.

9. Suggest you number the additional interventions (on top of CDC guidelines) being modeled to help readability and allow readers to more easily track what is being compared. As a reviewer… some formatting to indicate subsections would have been helpful.

[Response] Thank you. We have added headers and sub-headers. We also created a separate section for the interventions under study.

10. Who is the manufacturer of the SARS-CoV-2 test kits? What specimen was assumed to be used? Did you take into account other specimen types that are similarly sensitive (e.g., NP swabs, saliva, gargle)? In any event, the base analysis parameters (specimen type, manufacturer of kit, and sens/spec) should be present – were costs of specimen collection considered? These can be substantial, particularly for NP swabs (collection alone estimated to be ~$10 USD with swabs, media, PPE, and nurse time). *I see in the appendix a reference for sens/spec of 95% each – in fact, the reference suggests a specificity of 100% (https://sites.broadinstitute.org/safe-for-school/assay-performance); is this a typo? A specificity of 95% for PCR is unacceptable given low prevalence (1%): you would expect 5 false positives for every 1 true positive. Please clarify.

[Response] We obtained the test parameters from Columbia University, including the cost, sensitivity, specificity, and manufacturer. These have changed somewhat since we originally submitted the paper for review. Based on your comment and our recent communication with the university’s Public Health Committee, we have updated this information and re-ran the model.

We updated the cost per test to $45, including collection, personnel, and supplies. This is in line with your suggested additional $10 for collection (previously the cost was $31). Also, as per your suggestion, we updated the test’s specificity to 100%. It is not clear why the manufacturer’s claimed test specificity (95%) differs from what would be expected of a PCR test. We therefore used a specificity of 100% as the model input parameter, putting it more in line with other manufacturers and your suggestion.
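
To make the false-positive arithmetic concrete, here is a back-of-envelope check in R (an illustrative sketch with a hypothetical screened population, not output from our model):

# False positives vs. true positives at low prevalence (illustrative)
prevalence  <- 0.01    # 1% of the screened population actively infectious
sensitivity <- 0.95
specificity <- 0.95    # the originally assumed specificity
n <- 10000             # hypothetical number of people screened
true_pos  <- n * prevalence * sensitivity               # 95 true positives
false_pos <- n * (1 - prevalence) * (1 - specificity)   # 495 false positives
false_pos / true_pos                                    # ~5.2 false positives per true positive

With specificity set to 100%, false_pos drops to zero, which is why this parameter matters so much at low prevalence.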

11. It is not clear if the “one way sensitivity analyses” referred to in the “Interventions” section actually refer to interventions evaluated or sensitivity analyses?

[Response] At the time that we conducted the HVAC and far-UVC analyses in the old version of the paper, there was significant disagreement among infectious disease epidemiologists as to the extent to which SARS-CoV-2 was primarily transmitted by aerosols or by fomites. We therefore conducted 1-way sensitivity analyses on the percentage of infections attributed to aerosol transmission. The intent of the one-way analyses on these two preventive modalities was to show the reader whether assumptions about the proportion of all infections attributed to aerosol infections influenced the ICER. In the new version of the paper, to avoid confusion, we have removed these two interventions from the paper.

For the core model parameters, we have a section for the sensitivity analysis.

12. Regarding the formula for calculating the prevalence – I suggest there may be something amiss. I agree with multiplying the number of cases recorded each day by 5 to account for underreporting… but then multiplying that value by 7 is puzzling and may lead to systematically underestimating TRUE prevalence when prevalence is declining (the ratio: prevalence is underestimated ~3x more than the rate of decrease; e.g., detected cases dropping by 0.5% per day leads to an underestimate of prevalence of 1.5%); if cases are increasing, you will overestimate prevalence in the same way. This stems from the assumption that prevalence is related ONLY to cases occurring that day – which is not true. It would be more accurate to sum the cases from the current day and the previous 6 days to estimate prevalence. This point aside… it is not clear to me how this enters the present analysis, since prevalence estimates were fixed; I suggest removing this unnecessary information.

[Response] This is an excellent point, and we fully agree. We have done exactly what the reviewer suggests: we now recommend that prevalence be computed using cases from the current day plus the preceding days of the entire infectiousness period.

Our online model allows universities to run the model based upon local prevalence. We have also edited and shortened the corresponding section, removing any unnecessary information.
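
As a concrete illustration of the revised calculation (a minimal sketch with hypothetical case counts; the 5x underreporting multiplier is the model’s input, and a 7-day window stands in for the infectiousness period):

# Estimating point prevalence from detected cases (illustrative)
detected   <- c(13, 10, 14, 11, 9, 15, 12)    # hypothetical cases: current day + previous 6 days
population <- 100000                          # hypothetical surrounding-community population
prevalence <- 5 * sum(detected) / population  # underreporting multiplier x summed cases
prevalence                                    # 0.0042, i.e., ~0.42% actively infectious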

13. Can the authors provide a model figure of the state transitions possible and the pathways experienced? This would help supplement the text description.

[Response] Thank you for the suggestion. We added Figure 1 to show the states of the disease pathways.

14. Table 2 is mentioned before Table 1.

[Response] Thank you for catching this. Corrected.

15. Provide reference for the risk tolerance exercise.

[Response] Added, thank you.

16. “simultaneously changed” – do you mean all parameters were simultaneously sampled from their assigned probabilistic distribution?

[Response] Yes. We have now modified the text as per your suggestion.

17. Did you run the model with the ‘baseline’ parameters and then run the model probabilistically to get the credible intervals? Or was the point estimate derived from PSA? Please specify and give the percentile from the PSA used to create the credible interval (and point estimate, if appropriate).

[Response] The point estimates were derived from the PSA. Based on your comment, we have now reported all the PSA results, including the 95% credible intervals, in Table 3, as well as the cost-effectiveness planes and cost-effectiveness acceptability curves in the Online Appendix.
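
In sketch form, these summaries were computed from the Monte Carlo draws along the following lines (the normal draws below are placeholders for the model’s per-iteration incremental costs and QALYs, not our actual outputs):

# Point estimates and 95% credible intervals from PSA iterations (illustrative)
set.seed(42)
d_cost <- rnorm(10000, mean = 5e5, sd = 2e5)   # placeholder incremental costs per iteration
d_qaly <- rnorm(10000, mean = 3.5, sd = 1.0)   # placeholder incremental QALYs per iteration
icer_point <- mean(d_cost) / mean(d_qaly)      # point estimate derived from the PSA
quantile(d_cost, c(0.025, 0.975))              # 2.5th and 97.5th percentiles form the interval
quantile(d_qaly, c(0.025, 0.975))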

18. “We also conducted 1-way sensitivity analyses on variables that produced a large influence on the ICER” – do you mean you ran these analyses to determine variables that produced a large influence on the ICER?

[Response] Yes. We have modified the text accordingly.

19. In looking at the parameters and distributions used – limiting the daily number of close contacts via a triangular distribution is a questionable assumption. We know some people have very few close contacts… but others (a minority) have a great deal. I think this parameter would be better served with a gamma distribution to capture this “long tail” in the N contacts distribution.

[Response] Thank you for the suggestion. In the new analyses, we changed the distribution of the number of contacts to a gamma distribution. Accordingly, we updated the results, figures, and tables.
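
For illustration, the switch can be parameterized by the method of moments (the mean and variance below are hypothetical, not the calibrated model inputs):

# Gamma-distributed daily close contacts to capture the long right tail (illustrative)
m <- 10                 # hypothetical mean daily contacts
v <- 50                 # hypothetical variance, wide enough for a heavy tail
shape <- m^2 / v        # method of moments: shape = 2
rate  <- m / v          # rate = 0.2
contacts <- rgamma(10000, shape = shape, rate = rate)
quantile(contacts, c(0.5, 0.95))  # median sits below the mean; a minority have many contacts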

20. Continuing on parameters… the effect of handwashing is HUGE – larger than masks. I wonder about this assumption, which appears to come from a 2008 systematic review and is based on SARS studies largely in healthcare settings (perhaps not generalizable to school/community settings). In fact, one of the studies that was done in the community found no significant effect of handwashing. There are few documented cases of fomite transmission of SARS-CoV-2, which is where handwashing would really be beneficial; in fact, most ‘supposed cases of fomite transmission’ I have seen are more likely explained by aerosols. I suggest the authors re-consider the effect size in light of data related to SARS-CoV-2.

[Response] Thank you. Please note that in the previous model, the effect of face masks was still larger than that of handwashing. Table 2 reports the odds ratios (OR) of infection rather than the reduction in infection relative to no intervention. For face masks, the OR was 0.33 (a ~67% reduction), and for handwashing the OR was 0.45 (a ~55% reduction). However, we agree with the reviewer that the gap in efficacy should be larger, and that fomite transmission likely represents a minority of cases. In light of changes in our knowledge of transmission that have emerged since our original submission, we conducted a more recent search for the effect of frequent handwashing in community settings. One recent study reported an adjusted relative incidence of infection of 0.64 (a ~36% reduction) for frequent handwashing. As suggested by the reviewer, we have changed the OR of frequent handwashing accordingly. This parameter does not exert much influence on our model outcomes, so even if handwashing is ineffective at preventing the spread of COVID-19, it should not greatly change the results.

21. Any packages in R used to help develop the model?

[Response] We used R packages only to generate statistical distributions; the rest of the code was written by ZZ. We have made the code available on GitHub.
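
For context, the distributional sampling works along these lines (a hypothetical sketch, not the GitHub code itself): in each Monte Carlo iteration, every parameter is drawn from its assigned distribution before the semester is simulated.

# One illustrative PSA draw: all parameters sampled simultaneously (base R stats functions)
draw_params <- function() {
  list(
    or_masks    = exp(rnorm(1, mean = log(0.33), sd = 0.2)),  # hypothetical lognormal OR
    contacts    = rgamma(1, shape = 2, rate = 0.2),           # gamma daily close contacts
    sensitivity = rbeta(1, 95, 5)                             # beta-distributed test sensitivity
  )
}
draws <- replicate(2000, draw_params(), simplify = FALSE)     # e.g., 2,000 PSA iterations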

Results

22. I would imagine what is most useful to policymakers and university personnel would be the effect of multiple interventions on COVID-19. It is hard to think of scenarios where testing is used, but other interventions are not (as these are usually progressive). The authors must consider this.

[Response] Based on the reviewer’s request, we have now added a ‘package’ of interventions that combines implementing the CDC guidelines with using the symptom-checking mobile application, standardized, high filtration masks, gateway testing, and weekly testing. This package of interventions did appear to be more cost-effective as a whole than most of the individual interventions within the package.

There are many possible combinations of interventions. Single interventions that are less cost-effective when compared with the CDC guidelines alone are likely to remain less cost-effective when added on the margin of the package of interventions. Therefore, we also present data for single interventions. Note also that all of the individual interventions are presented in combination with implementing the CDC guidelines.

23. Moreover, the authors should consider showing their findings on an efficiency frontier, which would make the claims of extended dominance easier to see and allow a more visual comparison of the various interventions.

[Response] Thank you. We have now added an efficiency frontier plot. Please see the new Figure 2 in the revised paper.

24. I think Table 3 would benefit from stating the cumulative number of covid-19 cases occurring—to allow direct comparison with what is mentioned in the first section of results. This could also be visualized on a graph/plot.

[Response] Thank you. As suggested, we have now included the number of infections in the table. We also report the incremental costs per infection averted in the online appendix.

25. Credible intervals are quoted, but was this analysis done in a Bayesian framework? Is this the correct terminology? It seems to this reviewer that the analysis was done in a frequentist framework.

[Response] You are correct that the analysis was done using a frequentist approach. However, since the 95% interval was derived from a Monte Carlo simulation, we refer to it as a credible interval.

26. The value of online instruction sensitivity analysis is not mentioned in the methods as an explicit sensitivity analysis. It is also not entirely clear to this reviewer how to interpret the findings.

[Response] We now mention this explicitly and provide additional text to guide the reader on how to interpret the findings. The sensitivity analysis around the value of online instruction is now described in the Methods:

“We tested the effect of the perceived value of online vs. in-person instruction in the one-way sensitivity analysis in which the value of tuition was varied from 0% to 100%.”

As well as in the results:

“Value of online instruction. Varying the perceived value of online vs. in-person tuition did not substantively change the model outcomes. Even when the perceived value of online-only tuition was equal to that of in-person classes, the use of standardized, high filtration masks in addition to implementing CDC guidelines provided the highest value.”

27. The finding of N close contacts being a big driver of the CE of weekly testing is highly interesting.

[Response] Thank you. We now highlight this in the discussion. In addition, we ran the model with additional close contacts per student, recognizing that students who live at home with their families while attending university may have more contacts than the average student at Columbia. This was done in response to another reviewer’s comments. Other universities (particularly those where students live at home with parents) may have higher numbers of close contacts.

Discussion

28. The cost of masks quoted ($4) is extremely high! A 3-ply surgical mask can be procured for <$0.50 at volume. I was not able to verify this cost at the provided reference.

[Response] We agree. We used the actual cost per mask that Columbia paid for standardized, high filtration masks plus the cost of their distribution (Columbia distributes masks by mail). They are more expensive than surgical masks. However, the original cost was obtained at the height of the pandemic, when masks were more expensive. In response to the reviewer’s concern, we re-consulted the Columbia University Public Health Committee. The cost per mask has since fallen to $2, a figure we have updated in the revised model.

29. Would suggest you end with a summary of findings and major take home message.

[Response] Thank you. We have done so. We have added:

“When tailored to the conditions within which the university operates, our model should provide a robust estimate of the cost-effectiveness of interventions to prevent the spread of COVID-19. As COVID-19 becomes a seasonal illness that is complicated by variants of the virus, our model can be used by university decisionmakers to ascertain how much of an investment will be necessary to manage risk.”

Reviewer #2: This is a detailed, excellent study of COVID19 mitigation measures on a university campus. With correction of the errors that I found and implementation of several presentation recommendations, I feel that it meets criteria for publication.

[Response] We would like to thank the reviewer for her time in reviewing our paper and for the excellent comments. We have addressed all the comments in depth in a point-by-point response below. We believe addressing the comments has significantly increased the quality of our paper.

Presentation recommendations:

1. please list all 7 interventions early in the methods, and if possible in the abstract. I kept reading 7 interventions, but didn't actually find out what they were until several pages into the paper.

[Response] Thank you. We now mention each of the interventions in the abstract and at the beginning of the Methods section.

2. the word "coupling" refers to pairing of 2 items. I suggest replacing it throughout the paper with "comparing" or deleting it. In doing incremental cost effectiveness, you are comparing a more effective strategy with a less effective one, after ordering all strategies by effectiveness.

[Response] Thank you. As suggested, we removed the word “coupling” and have rewritten most of the paper for clarity. We assumed that most universities would implement the CDC guidelines as a minimum, so the CDC guidelines serve as the “status quo” comparator. By “coupling,” we meant pairing the intervention with the CDC guidelines relative to implementing the CDC guidelines alone. This allowed us to estimate the cost-effectiveness of each intervention on the margin of what we believe to be the status quo. In the revised version, based on a comment from another reviewer, we removed HVAC systems and far-UVC lighting and added a ‘package’ intervention that combines implementing the CDC guidelines with using a symptom-checking mobile application, standardized, high filtration masks, gateway testing, and weekly testing all at once.

We made changes to the paper throughout and modified the intervention section in the revised paper.

3. please bold or underline section headings. Often, they were on the next line without paragraph breaks, which made me wonder whether they were incomplete sentences.

[Response] Thank you for your suggestion. We have now either bolded or underlined the headers and sub-headers throughout the paper to make them distinct from the rest of the text. We have also added additional headers and sub-headers in response to comments from another reviewer.

4. please provide more detail on the methods used to cost the interventions. This would be acceptable to place into an appendix, but it still needs to be in the paper.

[Response] Thank you. We have now expanded the costs section both in the main text and in the online appendix (please see Online S1 File and S2 Table). The costs were informed either from published data or from the Columbia University Public Health Committee.

5. in Table 3, order in each prevalence section the intervention strategies by least to most effective. Since you used QALYs lost, the interventions having the greatest QALYs lost should be at the top and least at the bottom. Incremental changes in costs and QALYs are then calculated by comparing each row to the preceding one. While I determined from the write-up in the results section that you did arrive at the correct conclusions for your data, it is more apparent to readers if you present your analysis and results in the standard manner (as described above).

[Response] Thank you so much for your comment. As per your suggestion, we have now modified Table 3 by re-ordering the list of interventions from the least to the most effective. We also provide the stepwise calculations of the ICERs.
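
In sketch form, the stepwise calculation runs down the re-ordered table like this (hypothetical numbers; each row is compared with the preceding, less effective one):

# Stepwise ICERs for strategies ordered from least to most effective (illustrative)
cost      <- c(1.00e6, 1.15e6, 1.60e6)  # hypothetical total costs per strategy
qaly_lost <- c(12.0, 9.5, 9.2)          # hypothetical QALYs lost (smaller is better)
d_cost <- diff(cost)                    # incremental cost vs. the preceding row
d_qaly <- -diff(qaly_lost)              # QALYs gained vs. the preceding row
d_cost / d_qaly                         # stepwise ICERs: $60,000 and $1,500,000 per QALY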

6. since the difference in QALY is very small between strategies, I think that presenting COVID19 cases prevented would be more interesting than presenting QALYs, especially in a population less likely to die from COVID19. Most often in cost effectiveness analyses, both cases prevented and QALYs (gained) are presented as outcomes.

[Response] Thank you. As per your suggestion, we have added the number of infections for each intervention in Table 3. We also reported incremental cost-effectiveness per cases averted in the online appendix.

Here are the instances that I interpreted as errors:

7. In the first paragraph of the results, you state that a prevalence of 0.1% results in 351 cases. However, 351/20,500 = 1.7%. Likewise, 230/20,500 = 1.1%, not 0.1%; 1664/20,500 = 8.1%; and 3126/20,500 = 15.2%.

[Response] Thank you. We apologize that we did a poor job of describing the estimation of prevalence. The number of cases among university affiliates is computed by estimating transmission from members of the community off campus to those who are on campus, as well as transmission between university affiliates while on campus. We do this because students make contacts both on and off campus. The 351 cases we report reflect the estimated cumulative cases over the semester arising from transmission to university affiliates. Thus, at a 0.1% prevalence in the community surrounding the campus, when CDC guidelines are implemented, we expect to see 351 infections over the entire semester among university affiliates. (Our model did a very good job of predicting the actual number of infections at the Columbia campus over the course of the Fall 2020 semester.) We have made revisions throughout the document for clarity.
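
A stylized illustration of why cumulative semester cases can far exceed a single day’s prevalence times the population (the per-day risk below is hypothetical, chosen only to show the mechanism, not taken from the model):

# Cumulative infections over a semester vs. daily point prevalence (illustrative)
population <- 20500
daily_risk <- 0.00015   # hypothetical per-person daily infection risk
days <- 110             # approximate semester length
population * (1 - (1 - daily_risk)^days)  # ~335 cumulative cases over the semester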

8. on p. 7, you state that you used 7 days for infectiousness. Table 2 lists exposure, symptoms, and infectiousness, with infectiousness of 9 days after symptom start. CDC lists the incubation period as 2-14 days, and infectiousness of 15 total days capturing 95% of all infectiousness. So, if 7 days was used, that is too low and doesn't match your table 2. You should consider using the standard 14 or 15 days of infectiousness.

[Response] Thank you. The “7” in the text was a typo. We corrected this typo in the revised paper.

As per your suggestion, we updated the days of infectiousness after symptom onset in the revised model. Based on data from CDC (https://www.cdc.gov/coronavirus/2019-ncov/hcp/duration-isolation.html#:~:text=For%20most%20adults%20with%20COVID,with%20improvement%20of%20other%20symptoms.) and (https://www.journalofinfection.com/article/S0163-4453(20)30651-4/fulltext), it appears that for the majority of patients, the duration of infectiousness after symptom onset is 10 days. We therefore updated our estimate for the days of infectiousness after symptom onset to 10 days (previously, it was 9 days). We also model these parameters probabilistically, so that in every Monte Carlo simulation run a random number is generated from our distributions, capturing individuals falling within the tails of these distributions.

9. it appears that your base case used 2-3 contacts per student. While you did conduct a sensitivity analysis in Figure 1 on this, with up to 10 contacts per student, I would have used 10 contacts per student as the base case. In tuberculosis studies, TB cases average 15 airborne contacts per case at diagnosis. I would think that COVID19 would result in a similar number of contacts, and possibly more for gregarious students.

[Response] Thank you for this observation. As suggested, we updated the number of daily contacts per student to 10 in the revised model. We also modified the statistical distribution for this parameter to a gamma distribution, based on a comment from another reviewer. We agree that many students, particularly gregarious students or those who live with their families, will have more close contacts. At Columbia, the average number of close contacts per student reported in extended contact tracing in the Fall of 2020 was 2, but this number is likely an underestimate.

Thanks again for your time and effort in helping us improve the paper!


Attachment

Submitted filename: Point-by-point Response.docx

Decision Letter 1

Kevin Schwartzman

21 Jul 2021

PONE-D-21-07113R1

The cost-effectiveness of common strategies for the prevention of transmission of SARS-CoV-2 in universities

PLOS ONE

Dear Dr. Zafari,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

Please submit your revised manuscript by Sep 04 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Kevin Schwartzman

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments (if provided):

Thank you for all your updates in response to the reviewer comments.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thanks to the authors for addressing my comments. Only a few minor comments remain:

-Repeated word "iterations" in the first sentence of Analyses

-Repeated word "actively" in the third sentence of Analyses

-In Table 3 - how should readers interpret negative ICERs? I suggest a footnote to explain this or to remove confidence intervals around ICERs since confidence intervals require context for interpretation (e.g., you can have a negative ICER in quadrant II and IV, and a positive ICER in quadrant I and III -- though interpretation of positive and negative ICERs vary GREATLY depending on quadrant). In the supplement, it appears most results fall in quadrant I and II. Please provide more clarity on interpretation in a footnote or text.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No


PLoS One. 2021 Sep 30;16(9):e0257806. doi: 10.1371/journal.pone.0257806.r004

Author response to Decision Letter 1


8 Sep 2021

Reviewer #1: Thanks to the authors for addressing my comments. Only a few minor comments remain:

[Response] We would like to thank the reviewer for his/her excellent comments and help throughout the revision process. Your comments throughout this process have greatly improved our paper. Thank you again for your time and invaluable comments/feedback.

-Repeated word "iterations" in the first sentence of Analyses

[Response] Thank you. We have corrected this problem, which appears to be an error in PDF conversion.

-Repeated word "actively" in the third sentence of Analyses

[Response] Thank you. As above, this appeared to be a conversion issue.

-In Table 3 - how should readers interpret negative ICERs? I suggest a footnote to explain this or to remove confidence intervals around ICERs since confidence intervals require context for interpretation (e.g., you can have a negative ICER in quadrant II and IV, and a positive ICER in quadrant I and III -- though interpretation of positive and negative ICERs vary GREATLY depending on quadrant). In the supplement, it appears most results fall in quadrant I and II. Please provide more clarity on interpretation in a footnote or text.

[Response] Thank you. As suggested, we removed the intervals around the ICERs in Table 3 and added a footnote that describes the meaning of negative ICERs in this table.
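
For readers, the quadrant logic behind that footnote can be sketched as follows (a hypothetical helper; d_cost and d_qaly are a strategy’s incremental cost and QALYs versus its comparator):

# Interpreting an ICER by cost-effectiveness plane quadrant (illustrative)
quadrant <- function(d_cost, d_qaly) {
  if (d_qaly > 0 && d_cost > 0) "I: more effective, more costly; positive ICER is interpretable"
  else if (d_qaly < 0 && d_cost > 0) "II: less effective, more costly; dominated (negative ICER)"
  else if (d_qaly < 0 && d_cost < 0) "III: less effective, less costly; positive ICER"
  else "IV: more effective, less costly; dominant (negative ICER)"
}
quadrant(d_cost = -1e5, d_qaly = 2)  # a negative ICER here signals dominance, not low value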

Attachment

Submitted filename: Point-by-point Response.docx

Decision Letter 2

Kevin Schwartzman

13 Sep 2021

The cost-effectiveness of common strategies for the prevention of transmission of SARS-CoV-2 in universities

PONE-D-21-07113R2

Dear Dr. Zafari,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Kevin Schwartzman

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Thank you for these final revisions.


Acceptance letter

Kevin Schwartzman

22 Sep 2021

PONE-D-21-07113R2

The cost-effectiveness of common strategies for the prevention of transmission of SARS-CoV-2 in universities

Dear Dr. Zafari:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Kevin Schwartzman

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Appendix

    (DOCX)

    Attachment

    Submitted filename: PLOS table 3.xlsx

    Attachment

    Submitted filename: Point-by-point Response.docx

    Attachment

    Submitted filename: Point-by-point Response.docx

    Data Availability Statement

    All relevant data are within the paper and its Supporting Information files.

