Algorithm 1 VIB-GMM algorithm for unsupervised learning.

1: input: Dataset $\mathcal{D} := \{x_i\}_{i=1}^{n}$, parameter $s \geq 0$.
2: output: Optimal DNN weights $\theta$ and GMM parameters $\{\pi_c\}$, $\{\mu_c\}$, $\{\Sigma_c\}$.
3: initialization: Initialize $\theta$, $\{\pi_c\}$, $\{\mu_c\}$, $\{\Sigma_c\}$.
4: repeat
5:   Randomly select $b$ mini-batch samples $\{x_i\}_{i=1}^{b}$ from $\mathcal{D}$.
6:   Draw $m$ random i.i.d. samples $\{\epsilon_j\}_{j=1}^{m}$ from $\mathcal{N}(0, I)$.
7:   Compute the $m$ latent samples $u_{i,j} = \mu_{\theta}(x_i) + \Sigma_{\theta}^{1/2}(x_i)\,\epsilon_j$.
8:   For the selected mini-batch, compute the gradients of the empirical cost (20).
9:   Update $\theta$, $\{\pi_c\}$, $\{\mu_c\}$, $\{\Sigma_c\}$ using the estimated gradient (e.g., with SGD or ADAM).
10: until convergence of $\theta$, $\{\pi_c\}$, $\{\mu_c\}$, $\{\Sigma_c\}$.
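The training loop of Algorithm 1 maps directly onto a standard stochastic-gradient setup. The PyTorch sketch below illustrates steps 3-10 under several assumptions that are not taken from the paper: the `Encoder` architecture, the latent dimension, diagonal covariances for both the encoder output and the GMM components, the fixed epoch count in place of a convergence test, and, most importantly, `surrogate_cost`, a hypothetical stand-in for the empirical cost (20) (negative GMM log-likelihood of the latent samples plus an $s$-weighted KL regularizer), not the paper's objective.

```python
import math

import torch
import torch.nn as nn


class Encoder(nn.Module):
    """DNN encoder producing the mean and log-variance of q(u | x) (hypothetical architecture)."""

    def __init__(self, x_dim, u_dim, hidden=256):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, u_dim)
        self.log_var = nn.Linear(hidden, u_dim)

    def forward(self, x):
        h = self.body(x)
        return self.mu(h), self.log_var(h)


def gmm_log_likelihood(u, pi_logits, mu_c, log_var_c):
    """log sum_c pi_c N(u; mu_c, diag(exp(log_var_c))), evaluated per latent sample."""
    log_pi = torch.log_softmax(pi_logits, dim=0)                       # (k,)
    diff = u.unsqueeze(-2) - mu_c                                      # (m, b, k, d)
    log_comp = -0.5 * (log_var_c + diff.pow(2) / log_var_c.exp()
                       + math.log(2.0 * math.pi)).sum(-1)              # (m, b, k)
    return torch.logsumexp(log_pi + log_comp, dim=-1)                  # (m, b)


def surrogate_cost(u, mu, log_var, pi_logits, mu_c, log_var_c, s):
    """Stand-in for the empirical cost (20); NOT the paper's objective.
    Negative GMM log-likelihood of the latent samples plus s times a
    KL(q(u|x) || N(0, I)) term, just to give the loop a differentiable loss."""
    nll = -gmm_log_likelihood(u, pi_logits, mu_c, log_var_c).mean()
    kl = -0.5 * (1 + log_var - mu.pow(2) - log_var.exp()).sum(-1).mean()
    return nll + s * kl


def train_vib_gmm(dataset, s, x_dim, u_dim=10, n_clusters=10,
                  b=256, m=1, epochs=50, lr=1e-3):
    enc = Encoder(x_dim, u_dim)
    # Step 3: initialize DNN weights and GMM parameters (mixture logits, means, log-variances).
    pi_logits = nn.Parameter(torch.zeros(n_clusters))
    mu_c = nn.Parameter(torch.randn(n_clusters, u_dim))
    log_var_c = nn.Parameter(torch.zeros(n_clusters, u_dim))
    opt = torch.optim.Adam(list(enc.parameters()) + [pi_logits, mu_c, log_var_c], lr=lr)

    loader = torch.utils.data.DataLoader(dataset, batch_size=b, shuffle=True)
    for _ in range(epochs):                          # Steps 4/10: repeat (fixed epochs here)
        for (x,) in loader:                          # Step 5: random mini-batch of size b
            mu, log_var = enc(x)
            eps = torch.randn(m, *mu.shape)          # Step 6: m i.i.d. draws from N(0, I)
            u = mu + (0.5 * log_var).exp() * eps     # Step 7: reparameterized latent samples
            loss = surrogate_cost(u, mu, log_var, pi_logits, mu_c, log_var_c, s)
            opt.zero_grad()
            loss.backward()                          # Step 8: gradients of the (surrogate) cost
            opt.step()                               # Step 9: ADAM update of all parameters
    return enc, pi_logits, mu_c, log_var_c
```

As a usage illustration, `train_vib_gmm(torch.utils.data.TensorDataset(torch.randn(1024, 784)), s=1.0, x_dim=784)` runs the loop on random data; in practice the actual empirical cost (20) and a convergence check on the parameters would replace the surrogate loss and the fixed epoch count.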