Table 3. Parameter combinations for the 10 best-fit simulations for the northeastern U.S. as validated with New York state P&I mortality data.
Rank | RMS Error | Correlation Coefficient (r) | L (years) | D (days) | ϕ | R0* |
1 | 0.0070 | 0.94 | 5.59 | 5.69 | 3.83 | 2.83 |
2 | 0.0071 | 0.94 | 5.64 | 5.43 | 4.99 | 2.48 |
3 | 0.0071 | 0.90 | 9.78 | 5.59 | 2.43 | 3.69 |
4 | 0.0072 | 0.88 | 9.80 | 3.59 | 2.55 | 3.83 |
5 | 0.0072 | 0.90 | 2.53 | 6.04 | 2.86 | 2.44 |
6 | 0.0073 | 0.89 | 9.71 | 3.68 | 5.54 | 1.94 |
7 | 0.0074 | 0.89 | 3.28 | 4.31 | 2.13 | 3.22 |
8 | 0.0075 | 0.90 | 8.73 | 3.74 | 6.79 | 3.02 |
9 | 0.0077 | 0.91 | 6.44 | 5.72 | 4.24 | 2.59 |
10 | 0.0077 | 0.91 | 6.29 | 3.17 | 9.27 | 1.74 |
3000 simulations were performed at each site, with the parameters L (mean duration of immunity), D (mean infectious period), ϕ (vitamin D scaling), and R0* (the basic reproduction number if γi,t = 1) chosen randomly from within specified ranges. Parameters λ (inflection point) and η (inflection point slope) were held fixed. Best-fit simulations were selected by RMS error after scaling the 31-year mean daily infection number to the 31-year mean observed daily excess P&I mortality rate.
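The selection procedure described above amounts to a random parameter search: draw (L, D, ϕ, R0*) from specified ranges, run the model, rescale each simulation's long-term mean to the observed mean, and rank by RMS error. A minimal sketch of that workflow is below; the `simulate_daily_infections` function, the synthetic "observed" series, and the sampling ranges are all placeholders invented for illustration — the paper's mechanistic model and actual parameter ranges are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)
N_DAYS = 31 * 365  # 31-year daily record, as in the table note

def simulate_daily_infections(L, D, phi, r0_star, n_days=N_DAYS):
    """Hypothetical stand-in for the epidemic model: a seasonal cycle
    whose amplitude and phase depend loosely on the parameters."""
    t = np.arange(n_days)
    amp = 1.0 / r0_star
    phase = D / 10.0
    return (phi / L) * (1.0 + amp * np.sin(2 * np.pi * t / 365.0 + phase))

# Synthetic "observed" excess P&I mortality, for illustration only.
observed = simulate_daily_infections(5.6, 5.7, 3.8, 2.8) * 1e-3
observed += rng.normal(0.0, 1e-6, size=N_DAYS)

results = []
for _ in range(3000):
    # Draw each parameter uniformly from assumed ranges
    # (the paper's actual ranges are not given in this excerpt).
    L = rng.uniform(1.0, 10.0)     # mean duration of immunity (years)
    D = rng.uniform(2.0, 7.0)      # mean infectious period (days)
    phi = rng.uniform(1.0, 10.0)   # vitamin D scaling
    r0 = rng.uniform(1.5, 4.0)     # basic reproduction number R0*

    sim = simulate_daily_infections(L, D, phi, r0)
    # Scale the simulated 31-year mean to the observed 31-year mean
    # before computing RMS error, as the table note describes.
    scaled = sim * (observed.mean() / sim.mean())
    rms = np.sqrt(np.mean((scaled - observed) ** 2))
    r = np.corrcoef(scaled, observed)[0, 1]
    results.append((rms, r, L, D, phi, r0))

# Keep the 10 best-fit parameter combinations, ranked by RMS error.
best10 = sorted(results, key=lambda row: row[0])[:10]
for rank, (rms, r, *params) in enumerate(best10, 1):
    print(rank, round(rms, 6), round(r, 2), *[round(p, 2) for p in params])
```

Ranking by RMS error after mean-scaling, rather than by correlation alone, penalizes simulations whose seasonal amplitude is wrong even when their timing is right, which is why Table 3 reports both statistics.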