
Figure 8.

MNIST training (upper) and test (lower) accuracy, as a function of training epoch, for the sparse versions of the RBP and SRBP algorithms. Experiments are carried out with different levels of quantization of the weight updates, obtained by controlling the bitwidth (number of bits) according to the formula given in the text (Equation 19). Quantization is applied to each example-specific update, before summing the updates within a minibatch.
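The order of operations described in the caption (quantize each example-specific update first, then sum within the minibatch) can be sketched as follows. This is only an illustrative sketch: the uniform quantizer `quantize_update` and its `scale` parameter are assumptions standing in for the actual rule of Equation 19, which is not reproduced in this caption.

```python
import numpy as np

def quantize_update(delta_w, bits, scale=1.0):
    """Quantize a single example-specific weight update to `bits` bits.

    Hypothetical uniform quantizer used as a stand-in for Equation 19
    in the main text, which defines the actual quantization rule.
    """
    levels = 2 ** (bits - 1)                      # number of signed levels
    step = scale / levels                         # quantization step size
    return np.clip(np.round(delta_w / step), -levels, levels - 1) * step

def minibatch_update(per_example_updates, bits):
    """Quantize each example-specific update, then sum over the minibatch,
    following the order of operations stated in the caption."""
    return sum(quantize_update(dw, bits) for dw in per_example_updates)
```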