
Table 2.

List of hyperparameters and the candidate values explored in our study to identify the best combination for the task at hand

Hyperparameter    Corresponding values
Optimizer         Adam, SGD, Nadam, RMSprop, Adamax, Adagrad
Learning rate     0.01, 0.001, 0.0001; decay rate from 0.001 to 0.000001
Batch size        32
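
To make the search space in Table 2 concrete, the sketch below shows one way such a grid could be swept with tf.keras. The model architecture, the synthetic data, the ten-epoch budget, and the helper name build_model are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a grid sweep over the Table 2 hyperparameters.
# Assumptions: a toy binary-classification model and random data stand in
# for the paper's actual network and dataset.
import itertools
import numpy as np
import tensorflow as tf

OPTIMIZERS = {
    "Adam": tf.keras.optimizers.Adam,
    "SGD": tf.keras.optimizers.SGD,
    "Nadam": tf.keras.optimizers.Nadam,
    "RMSprop": tf.keras.optimizers.RMSprop,
    "Adamax": tf.keras.optimizers.Adamax,
    "Adagrad": tf.keras.optimizers.Adagrad,
}
LEARNING_RATES = [0.01, 0.001, 0.0001]  # a decay schedule down to 1e-6 could also be tried
BATCH_SIZE = 32

# Synthetic placeholder data (replace with the real training/validation sets).
x_train = np.random.rand(200, 16).astype("float32")
y_train = np.random.randint(0, 2, size=(200, 1))
x_val = np.random.rand(50, 16).astype("float32")
y_val = np.random.randint(0, 2, size=(50, 1))

def build_model():
    # Placeholder architecture; substitute the network under study.
    return tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

results = {}
for opt_name, lr in itertools.product(OPTIMIZERS, LEARNING_RATES):
    model = build_model()
    model.compile(
        optimizer=OPTIMIZERS[opt_name](learning_rate=lr),
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    history = model.fit(
        x_train, y_train,
        validation_data=(x_val, y_val),
        batch_size=BATCH_SIZE, epochs=10, verbose=0,
    )
    # Score each combination by its best validation accuracy.
    results[(opt_name, lr)] = max(history.history["val_accuracy"])

best = max(results, key=results.get)
print("Best combination:", best, "val_accuracy:", results[best])
```

In this layout, each optimizer/learning-rate pair is trained from scratch and ranked by validation accuracy; the batch size is held fixed at 32, matching the single value listed in the table.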