2020 Aug 26;11(35):9459–9467. doi: 10.1039/d0sc03635h

Fig. 1. Schematic diagram of the latent space simulator (LSS) comprising three back-to-back deep neural networks. A state-free reversible VAMPnet (SRV) learns an encoding E of molecular configurations into a latent space spanned by the leading eigenfunctions of the transfer operator (eqn (1)). A mixture density network (MDN) learns a propagator P to sample transition probabilities pτ(ψt+τ|ψt) within the latent space. A conditional Wasserstein GAN (cWGAN) learns a generative decoding D of molecular configurations conditioned on the latent space coordinates. The trained LSS is used to generate ultra-long synthetic trajectories by projecting the initial configuration into the latent space using the SRV, sampling from the MDN to generate long latent space trajectories, and decoding to molecular configurations using the cWGAN.
