
Table 3.

Final hyperparameters for each model, selected by 10-fold cross-validation and used to train the final models. LSTM: long short-term memory.

Model                     Hyperparameters
Hourly LSTM               Batch size: 512; learning rate: 0.005; criterion: binary cross-entropy; optimizer: Adam
Daily LSTM                Batch size: 256; learning rate: 0.005; criterion: binary cross-entropy; optimizer: Adam
Neural network            Batch size: 256; learning rate: 0.01; criterion: binary cross-entropy; optimizer: Adam
Support vector machine    Kernel: linear; C: 1; gamma: scale
Random forest             Number of estimators: 200; max depth: 10; criterion: entropy
XGBoost                   Number of estimators: 200; max depth: 200; learning rate: 0.1
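
For orientation, the sketch below shows one way the values in this table could be instantiated with common Python libraries (scikit-learn, XGBoost, PyTorch). The table records only the tuned hyperparameters, so the network architecture sizes, feature dimensions, and data shapes used here are placeholder assumptions rather than the authors' implementation.

```python
"""Minimal sketch: plugging the Table 3 hyperparameters into standard libraries.

Assumptions (not specified in the table): input_size, hidden_size, and the
dummy tensor shapes below are placeholders for illustration only.
"""
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier

# Classical models, configured with the values listed in Table 3.
svm = SVC(kernel="linear", C=1, gamma="scale")
rf = RandomForestClassifier(n_estimators=200, max_depth=10, criterion="entropy")
xgb = XGBClassifier(n_estimators=200, max_depth=200, learning_rate=0.1)

# Hourly LSTM training configuration: learning rate 0.005, binary
# cross-entropy criterion, Adam optimizer. The layer sizes are assumptions.
hourly_lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
criterion = nn.BCELoss()
optimizer = torch.optim.Adam(hourly_lstm.parameters(), lr=0.005)

# Dummy hourly sequences (1000 samples x 24 time steps x 16 features),
# only to show how the batch size of 512 enters through the DataLoader.
X = torch.randn(1000, 24, 16)
y = torch.randint(0, 2, (1000, 1)).float()
loader = DataLoader(TensorDataset(X, y), batch_size=512, shuffle=True)
```

The daily LSTM and the feed-forward neural network would follow the same pattern with batch size 256 and learning rates of 0.005 and 0.01, respectively.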