2022 Dec 23;25(1):26. doi: 10.3390/e25010026

Figure 2.


The RealNVP neural-network implementation of the basic module of the bijector ψ, where s1, s2 and t1, t2 are all feed-forward neural networks with three layers, 64 hidden neurons, and ReLU activation functions. The si networks share parameters with each other, as do the ti networks. ⊗ and + denote element-wise product and addition, respectively. x = (x1, x2) and x′ = (x′1, x′2).
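A minimal NumPy sketch of the affine coupling transform that this module implements, assuming the standard RealNVP form x′1 = x1, x′2 = x2 ⊙ exp(s(x1)) + t(x1). The helper names (`mlp`, `init_mlp`, `coupling_forward`, `coupling_inverse`) are hypothetical illustrations, not the paper's code; the network shape (three layers, 64 hidden units, ReLU) follows the caption.

```python
import numpy as np

def init_mlp(rng, d_in, d_out, hidden=64):
    # Three-layer feed-forward net: d_in -> 64 -> 64 -> d_out (per the caption).
    sizes = [d_in, hidden, hidden, d_out]
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    # ReLU on the hidden layers, linear output layer.
    for W, b in params[:-1]:
        x = np.maximum(W @ x + b, 0.0)
    W, b = params[-1]
    return W @ x + b

def coupling_forward(s_params, t_params, x1, x2):
    # x'1 = x1;  x'2 = x2 * exp(s(x1)) + t(x1)  (element-wise product and addition).
    s = mlp(s_params, x1)
    t = mlp(t_params, x1)
    return x1, x2 * np.exp(s) + t

def coupling_inverse(s_params, t_params, y1, y2):
    # Exact inverse: x2 = (x'2 - t(x'1)) * exp(-s(x'1)); no inversion of s, t needed.
    s = mlp(s_params, y1)
    t = mlp(t_params, y1)
    return y1, (y2 - t) * np.exp(-s)
```

Because the first half of the input passes through unchanged, the Jacobian is triangular and the inverse is available in closed form, which is what makes the coupling layer a tractable bijector.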