Healthcare. 2020 Apr 23;8(2):107. doi: 10.3390/healthcare8020107

Table 7. Parameters of each optimizer method.

Optimizer method   Parameters
Nadam              Learning rate, beta_1, beta_2, epsilon, schedule_decay
Adamax             Learning rate, beta_1, beta_2, epsilon, decay
Adam               Learning rate, beta_1, beta_2, epsilon, decay
Adadelta           Learning rate, rho, epsilon, decay
RMSprop            Learning rate, rho, epsilon, decay
Adagrad            Learning rate, epsilon, decay
SGD                Learning rate, momentum, decay, nesterov
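The parameter names in Table 7 correspond to the constructor arguments of the standalone Keras 2.x optimizers. The sketch below shows one possible instantiation of each optimizer with those parameters; the specific values used are illustrative assumptions (Keras defaults of that era), not settings reported in the article, and argument names may differ in newer TensorFlow/Keras releases.

from keras import optimizers

optimizer_configs = {
    # Nadam: learning rate, beta_1, beta_2, epsilon, schedule_decay
    "Nadam": optimizers.Nadam(lr=0.002, beta_1=0.9, beta_2=0.999,
                              epsilon=1e-7, schedule_decay=0.004),
    # Adamax: learning rate, beta_1, beta_2, epsilon, decay
    "Adamax": optimizers.Adamax(lr=0.002, beta_1=0.9, beta_2=0.999,
                                epsilon=1e-7, decay=0.0),
    # Adam: learning rate, beta_1, beta_2, epsilon, decay
    "Adam": optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999,
                            epsilon=1e-7, decay=0.0),
    # Adadelta: learning rate, rho, epsilon, decay
    "Adadelta": optimizers.Adadelta(lr=1.0, rho=0.95, epsilon=1e-7, decay=0.0),
    # RMSprop: learning rate, rho, epsilon, decay
    "RMSprop": optimizers.RMSprop(lr=0.001, rho=0.9, epsilon=1e-7, decay=0.0),
    # Adagrad: learning rate, epsilon, decay
    "Adagrad": optimizers.Adagrad(lr=0.01, epsilon=1e-7, decay=0.0),
    # SGD: learning rate, momentum, decay, nesterov
    "SGD": optimizers.SGD(lr=0.01, momentum=0.9, decay=0.0, nesterov=True),
}

# Any entry can then be passed to model.compile(), for example:
# model.compile(optimizer=optimizer_configs["Adam"],
#               loss="binary_crossentropy", metrics=["accuracy"])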