2019 Dec 10;19:1659. doi: 10.1186/s12889-019-7966-8

Table 3.

Summary of results from the FluSight influenza forecast challenges*

|                                 | 2013–14 season | 2014–15 season | 2015–16 season | 2016–17 season | 2017–18 season |
|---------------------------------|----------------|----------------|----------------|----------------|----------------|
| Number of participating teams   | 9              | 5              | 11             | 21             | 22             |
| Number of submitted forecasts†  | 13             | 7              | 14             | 28             | 29             |
| Season onset top skill          | N/A**          | 0.41           | 0.18           | 0.78           | 0.69           |
| Peak week top skill             | N/A            | 0.49           | 0.20           | 0.49           | 0.50           |
| Peak intensity top skill        | N/A            | 0.17           | 0.66           | 0.36           | 0.26           |
| 1-week-ahead top skill          | N/A            | 0.43           | 0.89           | 0.60           | 0.54           |
| 2-week-ahead top skill          | N/A            | 0.36           | 0.76           | 0.46           | 0.37           |
| 3-week-ahead top skill          | N/A            | 0.37           | 0.66           | 0.41           | 0.29           |
| 4-week-ahead top skill          | N/A            | 0.35           | 0.58           | 0.38           | 0.26           |
| Overall top-performing team     | Columbia University | Delphi group, Carnegie Mellon University | Delphi group, Carnegie Mellon University | Delphi group, Carnegie Mellon University | Delphi group, Carnegie Mellon University |

*Skill scores for the 2016–17 and 2017–18 challenges have not been published. Results from the 2018–19 challenge were not complete as of August 2019

†The number of submitted forecasts does not include the unweighted average ensemble or historical average forecasts

**The logarithmic scoring rule used to determine forecast skill scores was not introduced until the second year of the challenge (2014–15). Skill scores for the challenge pilot (2013–14) are therefore not available
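For readers unfamiliar with how the skill values in the table are derived, the sketch below illustrates the general idea of a logarithmic scoring rule applied to binned probabilistic forecasts, with skill reported as the exponentiated mean log score (a geometric-mean probability). The function names, the bin tolerance, and the log-score floor are illustrative assumptions, not the challenge's exact specification, which varied by target and season.

```python
import math

def log_score(bin_probs, true_bin, tolerance=1):
    """Log of the total probability assigned to bins within
    `tolerance` of the observed bin. (The +/-1-bin tolerance and the
    floor on the probability are assumptions for illustration.)"""
    total = sum(p for b, p in bin_probs.items()
                if true_bin - tolerance <= b <= true_bin + tolerance)
    return math.log(max(total, 1e-10))  # floor avoids log(0)

def skill(log_scores):
    """Skill as the exponentiated mean log score, i.e. the geometric
    mean of the probabilities assigned near the truth."""
    return math.exp(sum(log_scores) / len(log_scores))

# Example: a forecast spreading probability over three bins,
# scored with no tolerance against an observed value in bin 2.
forecast = {1: 0.1, 2: 0.7, 3: 0.2}
s = log_score(forecast, true_bin=2, tolerance=0)  # log(0.7)
```

Under this convention a skill of 0.89 (the best 1-week-ahead score in 2015–16) corresponds to assigning, on geometric average, 89% probability to outcomes near the observed value.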