2025 Aug 25;17:e66798. doi: 10.2196/66798

Table 1. Results of model hyperparameter tuning.

| Model | Model hyperparameters^a | Hyperparameter space^b | Total combinations | Best combination |
|---|---|---|---|---|
| ARIMA^c | p, d, q | p=[0, 1, 2, 3, 4]; d=[0, 1, 2, 3]; q=[0, 1, 2, 3] | 80 | p=2, d=1, q=2 |
| Double ES^d | trend, damped | trend=['add', 'mul']; damped=[True, False] | 4 | trend='add', damped=True |
| MLP^e | n_input, n_nodes, n_batch, n_diff | n_input=[2-5]; n_nodes=[100, 150]; n_batch=[1, 150]; n_diff=[0, 1, 2, 4] | 64 | n_input=4, n_nodes=100, n_batch=1, n_diff=1 |
| LSTM^f | n_input, n_nodes, n_batch, n_diff | n_input=[2-5]; n_nodes=[100, 150]; n_batch=[1, 150]; n_diff=[0, 1, 2, 4] | 64 | n_input=4, n_nodes=100, n_batch=1, n_diff=1 |
| CNN^g | n_input, n_diff | n_input=[2-5]; n_diff=[0, 1, 2, 4] | 16 | n_input=4, n_diff=1 |

^a Hyperparameters are model-specific settings tuned to optimize performance.

^b Values in brackets represent the range or options tested using grid search.

^c ARIMA: autoregressive integrated moving average.

^d ES: exponential smoothing.

^e MLP: multilayer perceptron.

^f LSTM: long short-term memory.

^g CNN: convolutional neural network.
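The "total combinations" column in Table 1 is the size of the Cartesian product of each model's hyperparameter grid, which grid search then evaluates exhaustively. A minimal sketch of that enumeration (not the authors' code; grid values transcribed from the table, with `n_input=[2-5]` expanded to its four integer values) using Python's `itertools.product`:

```python
from itertools import product

# Hyperparameter grids as reported in Table 1.
arima_grid = {"p": [0, 1, 2, 3, 4], "d": [0, 1, 2, 3], "q": [0, 1, 2, 3]}
mlp_grid = {"n_input": [2, 3, 4, 5], "n_nodes": [100, 150],
            "n_batch": [1, 150], "n_diff": [0, 1, 2, 4]}
cnn_grid = {"n_input": [2, 3, 4, 5], "n_diff": [0, 1, 2, 4]}


def grid_combinations(grid):
    """Enumerate every hyperparameter combination (Cartesian product)."""
    keys = list(grid)
    return [dict(zip(keys, values)) for values in product(*grid.values())]


# Grid search would fit and score one model per combination,
# keeping the best-scoring one.
print(len(grid_combinations(arima_grid)))  # 80, matching the table
print(len(grid_combinations(mlp_grid)))    # 64
print(len(grid_combinations(cnn_grid)))    # 16
```

The double ES grid works the same way: 2 trend options x 2 damped options gives the 4 combinations listed.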