| Hyperparameter | Search space |
| --- | --- |
| Differentiable augmentation [34] | TRUE/FALSE |
| Activation fn of discriminator | ReLU/LeakyReLU/ELU/SELU |
| Activation fn of generator | ReLU/LeakyReLU/ELU/SELU |
| Normalization layer of discriminator | BatchNorm [35]/InstanceNorm [36] |
| Normalization layer of generator | BatchNorm [35]/InstanceNorm [36] |
| Number of filters of discriminator | 16/32/64/128 |
| Number of filters of generator | 16/32/64/128 |
| Use spectral norm for discriminator | TRUE/FALSE |
| Use spectral norm for generator | TRUE/FALSE |
| Weight initialization function | Normal/Xavier/Xavier Uniform/Kaiming He |
| Weight initialization gain | 0.01/0.02/0.1/1.0 |
| Gradient penalty loss weight (WGAN-GP only) | 0/0.1/1.0/10.0 |
| Weight clipping value (WGAN only) | 0/0.01/0.1 |
| Feature matching loss weight | 0/1.0/10.0 |
| VGG loss weight | 0/1.0/10.0 |
| Learning rate | 0.00004/0.00005/0.0001/0.0002/0.001 |
| Use of label smoothing [29] | TRUE/FALSE |
| Use of data augmentation | TRUE/FALSE |
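The table above enumerates the discrete options considered for each hyperparameter. As a minimal sketch of how such a search space might be encoded, the snippet below represents each row as a list of candidate values and draws one configuration uniformly at random; the names (`SEARCH_SPACE`, `sample_config`) and the sampling routine are illustrative assumptions, not the paper's actual implementation.

```python
import random

# Hypothetical encoding of the hyperparameter search space from the table above.
# All names and the uniform sampling routine are illustrative assumptions.
SEARCH_SPACE = {
    "differentiable_augmentation": [True, False],
    "d_activation": ["ReLU", "LeakyReLU", "ELU", "SELU"],
    "g_activation": ["ReLU", "LeakyReLU", "ELU", "SELU"],
    "d_norm_layer": ["BatchNorm", "InstanceNorm"],
    "g_norm_layer": ["BatchNorm", "InstanceNorm"],
    "d_num_filters": [16, 32, 64, 128],
    "g_num_filters": [16, 32, 64, 128],
    "d_spectral_norm": [True, False],
    "g_spectral_norm": [True, False],
    "weight_init": ["Normal", "Xavier", "Xavier Uniform", "Kaiming He"],
    "weight_init_gain": [0.01, 0.02, 0.1, 1.0],
    "gradient_penalty_weight": [0, 0.1, 1.0, 10.0],   # WGAN-GP only
    "weight_clipping_value": [0, 0.01, 0.1],          # WGAN only
    "feature_matching_weight": [0, 1.0, 10.0],
    "vgg_loss_weight": [0, 1.0, 10.0],
    "learning_rate": [0.00004, 0.00005, 0.0001, 0.0002, 0.001],
    "label_smoothing": [True, False],
    "data_augmentation": [True, False],
}

def sample_config(space, rng=random):
    """Draw one configuration by picking a value uniformly from each option list."""
    return {name: rng.choice(options) for name, options in space.items()}

if __name__ == "__main__":
    config = sample_config(SEARCH_SPACE)
    print(config)
```

In practice the sampled dictionary would be passed to the training script to instantiate the generator and discriminator accordingly; a search library could replace the uniform sampler without changing the space definition.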