
Fig. 6.

VAE architecture. Fϕ encodes x and, combined with the random sample u, produces the latent vector z. Gθ decodes the latent z to obtain x̂. u is sampled from a standard normal distribution for the reparameterization trick. (a) VAE. (b) Spatial-VAE [19], which disentangles translation/rotation features from other semantic features. (c) DivNoising [20], which enables supervised/unsupervised training of a denoising generative model by leveraging the noise model pNM(y|x).
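As a concrete illustration of the plain VAE in panel (a), the sketch below shows an encoder Fϕ, a decoder Gθ, and the reparameterization step z = μ + σ·u with u ~ N(0, I). It is a minimal PyTorch example under assumed layer widths and a fully connected architecture; it is not the specific network used in [19] or [20].

```python
# Minimal VAE sketch (Fig. 6(a)); layer sizes and the BCE reconstruction
# loss are illustrative assumptions, not taken from the cited works.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, input_dim=784, hidden_dim=256, latent_dim=16):
        super().__init__()
        # Encoder F_phi: maps x to the mean and log-variance of q(z|x)
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.fc_mu = nn.Linear(hidden_dim, latent_dim)
        self.fc_logvar = nn.Linear(hidden_dim, latent_dim)
        # Decoder G_theta: maps the latent z back to a reconstruction x_hat
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, input_dim), nn.Sigmoid(),
        )

    def reparameterize(self, mu, logvar):
        # Reparameterization trick: z = mu + sigma * u with u ~ N(0, I),
        # so gradients can flow through the sampling step.
        std = torch.exp(0.5 * logvar)
        u = torch.randn_like(std)
        return mu + std * u

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = self.reparameterize(mu, logvar)
        x_hat = self.decoder(z)
        return x_hat, mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    # ELBO: reconstruction term plus KL divergence between q(z|x) and N(0, I)
    recon = F.binary_cross_entropy(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```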