
Table 2.

CNN hyper-parameter tuning details.

| Hyper-parameter | Range explored | Chosen value |
| --- | --- | --- |
| Kernel_size | (3,3), (5,5), (7,7), (9,9) | (3,3) |
| Regularizers.l2 | 1e-2, 1e-4 | 1e-2 |
| Dropout | 65%, 75%, 85%, 90% | 85% |
| Optimizer | Adam, Nadam, Adagrad, Adamax | Nadam |
| Learning rate | 0.01, 0.001, 0.0001 | 0.0001 |
| Batch_size | 4, 8, 32, 64 | 4 |
| Epochs | 20, 30, 40, 50 | 50 |
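
The sketch below is a minimal illustration of how the chosen values in Table 2 could be wired into a Keras CNN; it is not the authors' exact architecture. The layer count, filter sizes, input shape, class count, and loss function are assumptions for demonstration, while the kernel size, L2 strength, dropout rate, optimizer, learning rate, batch size, and epoch count are taken from the table.

```python
# Hedged sketch: only the tuned values from Table 2 are sourced;
# the architecture and data shapes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers

def build_cnn(input_shape=(224, 224, 3), num_classes=2):
    # Assumed input shape and number of classes.
    model = models.Sequential([
        layers.Conv2D(32, kernel_size=(3, 3), activation="relu",      # chosen kernel size
                      kernel_regularizer=regularizers.l2(1e-2),       # chosen L2 strength
                      input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, kernel_size=(3, 3), activation="relu",
                      kernel_regularizer=regularizers.l2(1e-2)),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.85),                                          # chosen dropout rate
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Nadam(learning_rate=1e-4),      # chosen optimizer and LR
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

# Training with the chosen batch size and epoch count
# (x_train/y_train and x_val/y_val are placeholders):
# model = build_cnn()
# model.fit(x_train, y_train, batch_size=4, epochs=50,
#           validation_data=(x_val, y_val))
```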