
TABLE 3:

Ablation studies of the input feature map size and the network architecture of the regressor. Results are reported on the segmentation-to-RGB task with a StyleGAN2 generator backbone [9] pretrained on the FFHQ dataset [53]. "All" in the Feature size column denotes that all feature maps from 4×4 to 256×256 are concatenated as the input to the regressor.

| Method | Feature size | Regressor | FID↓ | KID×10³↓ | SSIM↑ | LPIPS↓ |
|---|---|---|---|---|---|---|
| Latent Anchor (ours) | 256×256 | 6 conv layers | 83.8 | 72.8 | 0.307 | 0.442 |
| Ablation of feature size | 4×4 | - | 201.5 | 199.0 | 0.207 | 0.525 |
| | 16×16 | - | 188.9 | 217.7 | 0.269 | 0.476 |
| | 64×64 | - | 115.6 | 111.5 | 0.260 | 0.493 |
| | 128×128 | - | 116.2 | 116.2 | 0.267 | 0.481 |
| | All | - | 129.5 | 129.8 | 0.279 | 0.484 |
| Ablation of regressor | - | 3 conv layers | 93.8 | 83.5 | 0.297 | 0.451 |
| | - | 6 ResBlocks | 155.1 | 151.0 | 0.249 | 0.579 |
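
The ablation varies both the resolution of the generator feature map fed to the regressor and the depth of the regressor itself (3 vs. 6 conv layers, or 6 ResBlocks). As a rough illustration only, the sketch below shows what a conv-layer regressor of this kind could look like: it downsamples a single StyleGAN2 feature map to a latent code. The channel widths, strides, 512-d output, and the name `FeatureRegressor` are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn


class FeatureRegressor(nn.Module):
    """Hypothetical sketch of the ablated regressor: a stack of strided conv
    layers that maps one generator feature map (e.g. 256x256) to a latent code.
    All hyperparameters here are assumptions for illustration."""

    def __init__(self, in_channels=64, num_layers=6, latent_dim=512):
        super().__init__()
        layers = []
        ch = in_channels
        for _ in range(num_layers):
            out_ch = min(ch * 2, 512)
            layers += [
                nn.Conv2d(ch, out_ch, kernel_size=3, stride=2, padding=1),
                nn.LeakyReLU(0.2),
            ]
            ch = out_ch
        self.body = nn.Sequential(*layers)
        self.head = nn.Linear(ch, latent_dim)

    def forward(self, feat):
        x = self.body(feat)         # each layer halves the spatial resolution
        x = x.mean(dim=(2, 3))      # global average pooling over spatial dims
        return self.head(x)         # regressed latent code


# Example: a 256x256 feature map with 64 channels -> one 512-d latent code.
feat = torch.randn(1, 64, 256, 256)
latent = FeatureRegressor(in_channels=64, num_layers=6)(feat)
print(latent.shape)  # torch.Size([1, 512])
```

Under this reading, the "3 conv layers" and "6 ResBlocks" rows correspond to swapping the depth or block type of the body while keeping the rest of the pipeline fixed.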