
Table 1. Parameter settings of HyAdamC and the compared first-order optimization methods.

Algorithms   Parameter Settings
HyAdamC      α = 10⁻³, β₁ = 0.9, β₂ = 0.99, ε = 10⁻⁸
SGD          α = 10⁻³
RMSProp      learning rate = 10⁻², α = 0.99, ε = 10⁻⁸
Adam         α = 10⁻³, β₁ = 0.9, β₂ = 0.99
AdamW        α = 10⁻³, β₁ = 0.9, β₂ = 0.99
Adagrad      α = 10⁻², β₁ = 0.9, ε = 10⁻¹⁰
AdaDelta     α = 1.0, ρ = 0.9, ε = 10⁻⁶
Rprop        α = 10⁻², η⁻ = 0.5, η⁺ = 1.2, step sizes = [10⁻⁶, 50]
Yogi         α = 10⁻², β₁ = 0.9, β₂ = 0.99, ε = 10⁻³
Fromage      α = 10⁻²
TAdam        α = 10⁻³, β₁ = 0.9, β₂ = 0.99, ν = d, k_ν = 1.0
diffGrad     α = 10⁻³, β₁ = 0.9, β₂ = 0.99
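As a minimal sketch, assuming the experiments use PyTorch, the built-in baselines from Table 1 could be instantiated as below. The model and the dictionary names are purely illustrative, and the optimizers that are not part of torch.optim (HyAdamC, Yogi, Fromage, TAdam, diffGrad) are omitted because they require their authors' reference implementations.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)           # placeholder model, for illustration only
params = list(model.parameters())  # a list can be passed to every optimizer

# Built-in optimizers configured with the settings listed in Table 1.
# Note: the β₁ listed for Adagrad has no counterpart in torch.optim.Adagrad,
# so it is not passed here.
optimizers = {
    "SGD":      torch.optim.SGD(params, lr=1e-3),
    "RMSProp":  torch.optim.RMSprop(params, lr=1e-2, alpha=0.99, eps=1e-8),
    "Adam":     torch.optim.Adam(params, lr=1e-3, betas=(0.9, 0.99)),
    "AdamW":    torch.optim.AdamW(params, lr=1e-3, betas=(0.9, 0.99)),
    "Adagrad":  torch.optim.Adagrad(params, lr=1e-2, eps=1e-10),
    "AdaDelta": torch.optim.Adadelta(params, lr=1.0, rho=0.9, eps=1e-6),
    "Rprop":    torch.optim.Rprop(params, lr=1e-2, etas=(0.5, 1.2),
                                  step_sizes=(1e-6, 50)),
}
```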