Table 7. Checklist for Addressing Recommended Evaluation Procedures.
| No. | Recommendation |
|---|---|
| (1) | Specify which dataset(s) are used in the analysis. |
| (2) | Specify the aggregation interval(s) used. |
| (3) | Indicate whether the approach is multivariate or univariate. |
| (4) | If not all metrics in the dataset are used, clearly state which ones are included. |
| (5) | Document all preprocessing steps, including filtering, normalization, and handling gaps in time series. |
| (6) | Ensure the training phase starts from the beginning of the dataset’s time frame (2023-10-09). |
| (7) | Specify the duration of the training window. |
| (8) | Define and describe the validation window if employed. |
| (9) | Clearly describe the retraining process if the model is retrained during the evaluation phase, as in the rolling-evaluation sketch following this table. |
| (10) | Specify the forecasting horizon (length of time into the future for predictions). |
| (11) | Clearly specify the evaluation metrics used in the article. |
| (12) | Provide an overall comparison across each time series using statistical distributions and aggregate statistics. |
| (13) | Assess and document the computational requirements and deployability of the model. |
| (14) | Make the source code of your experiments and models publicly available to the community. |
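
Items (6) through (12) jointly describe a rolling evaluation protocol: a training window anchored at the dataset's start date, an optional validation window, periodic retraining during the evaluation phase, a fixed forecasting horizon, explicitly named error metrics, and aggregate statistics across time series. The sketch below is a minimal illustration of how these items could be made explicit in code; it assumes hourly aggregation, two synthetic univariate series, a seasonal-naive placeholder forecaster, and MAE as the error metric, so names such as `fit_model`, `TRAIN_DAYS`, `HORIZON`, and the series names are illustrative assumptions, not part of the checklist or of any specific study.

```python
# Minimal rolling-origin evaluation sketch touching checklist items (6)-(12).
# All constants, series, and the placeholder forecaster are illustrative assumptions.
import numpy as np
import pandas as pd

DATASET_START = pd.Timestamp("2023-10-09")  # item (6): training starts at the dataset's first timestamp
TRAIN_DAYS = 28                             # item (7): duration of the training window
VALID_DAYS = 7                              # item (8): optional validation window (reserved, unused below)
HORIZON = 24                                # item (10): forecast 24 steps (hours) ahead
RETRAIN_EVERY = 24                          # item (9): retrain once per day during the evaluation phase

rng = np.random.default_rng(0)
index = pd.date_range(DATASET_START, periods=24 * 60, freq="h")  # 60 days at hourly aggregation (item 2)

# Two synthetic univariate series (item 3); a real study would name its dataset and metrics (items 1, 4, 5).
series = {
    "cpu_usage": pd.Series(50 + 10 * np.sin(np.arange(len(index)) * 2 * np.pi / 24)
                           + rng.normal(0, 2, len(index)), index=index),
    "mem_usage": pd.Series(70 + 5 * np.cos(np.arange(len(index)) * 2 * np.pi / 24)
                           + rng.normal(0, 1, len(index)), index=index),
}

def fit_model(train: pd.Series):
    """Placeholder 'model': a seasonal-naive forecaster repeating the last 24 observations."""
    last_day = train.iloc[-24:].to_numpy()
    return lambda horizon: np.resize(last_day, horizon)

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Item (11): the evaluation metric is named and defined explicitly."""
    return float(np.mean(np.abs(y_true - y_pred)))

results = []
for name, ts in series.items():
    # Evaluation begins after the training and (here unused) validation windows.
    eval_start = 24 * (TRAIN_DAYS + VALID_DAYS)
    for origin in range(eval_start, len(ts) - HORIZON, RETRAIN_EVERY):
        train = ts.iloc[origin - 24 * TRAIN_DAYS:origin]      # rolling training window
        model = fit_model(train)                              # retrained at every evaluation step (item 9)
        forecast = model(HORIZON)
        actual = ts.iloc[origin:origin + HORIZON].to_numpy()
        results.append({"series": name, "origin": ts.index[origin], "mae": mae(actual, forecast)})

# Item (12): aggregate statistics and error distributions per time series.
summary = pd.DataFrame(results).groupby("series")["mae"].describe()
print(summary)
```

Reporting the resulting per-series error distribution (the `describe()` summary above) addresses item (12), while the constants declared at the top make the choices required by items (6) through (10) explicit and auditable.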