Table 2. Hyperparameters and the ranges probed.

Hyperparameter | Range to Probe
---|---
Activation function | ReLU, ELU, Sigmoid, SELU, Tanh
Batch size | 1 to 128
Dropout rate | 0.1 to 0.5
Number of dense nodes | 32 to 1024
Gradient descent optimizer | Adam, Nadam, AdaMax, RMSProp, SGD
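For concreteness, a search space like the one in Table 2 could be encoded with Keras Tuner as sketched below. The single-hidden-layer architecture, the binary-classification loss, the random-search strategy, and the discretization steps are illustrative assumptions and are not taken from the original work.

```python
# Minimal sketch of the Table 2 search space in Keras Tuner (assumptions noted below).
import keras_tuner as kt
from tensorflow import keras


class TunableModel(kt.HyperModel):
    def build(self, hp):
        # Assumed architecture: one tunable dense layer plus a sigmoid output for a binary task.
        model = keras.Sequential([
            keras.layers.Dense(
                # Number of dense nodes: 32 to 1024 (step of 32 is an assumption)
                units=hp.Int("dense_nodes", min_value=32, max_value=1024, step=32),
                # Activation function: ReLU, ELU, Sigmoid, SELU, Tanh
                activation=hp.Choice(
                    "activation", ["relu", "elu", "sigmoid", "selu", "tanh"]
                ),
            ),
            # Dropout rate: 0.1 to 0.5
            keras.layers.Dropout(hp.Float("dropout", min_value=0.1, max_value=0.5)),
            keras.layers.Dense(1, activation="sigmoid"),  # assumed output layer
        ])
        model.compile(
            # Gradient descent optimizer: Adam, Nadam, AdaMax, RMSProp, SGD
            optimizer=hp.Choice(
                "optimizer", ["adam", "nadam", "adamax", "rmsprop", "sgd"]
            ),
            loss="binary_crossentropy",  # assumed loss
            metrics=["accuracy"],
        )
        return model

    def fit(self, hp, model, *args, **kwargs):
        # Batch size: 1 to 128, tuned by overriding fit() as Keras Tuner supports.
        return model.fit(
            *args,
            batch_size=hp.Int("batch_size", min_value=1, max_value=128),
            **kwargs,
        )


# Random search is used here only as an example tuning strategy.
tuner = kt.RandomSearch(TunableModel(), objective="val_accuracy", max_trials=20)
# tuner.search(x_train, y_train, validation_data=(x_val, y_val))
```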