Front Neurosci. 2017 Oct 10;11:538. doi: 10.3389/fnins.2017.00538

Table 2.

Array sizes, weight sharing factors, and number of MACs performed for each layer of the AlexNet* architecture (Krizhevsky et al., 2012).

Layer   RPU array size (matrix size)   Weight sharing factor (ws)   MACs
K1      96 × 363                       3,025                        106 M
K2      256 × 2,400                    729                          448 M
K3      384 × 2,304                    169                          150 M
K4      384 × 3,456                    169                          224 M
K5      256 × 3,456                    169                          150 M
W6      4,096 × 9,216                  1                            38 M
W7      4,096 × 4,096                  1                            17 M
W8      1,000 × 4,096                  1                            4 M

Total MACs = 1.14 G
* The table assumes that the weights originally distributed across two GPUs are combined into a single RPU array for each layer.
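The per-layer MAC counts in the table follow directly from the array dimensions: each layer performs (rows × columns × ws) multiply-accumulate operations, where the weight sharing factor ws is the number of times the same array is reused across input positions. A minimal sketch verifying the table's arithmetic (the dictionary layout and variable names here are illustrative, not from the paper):

```python
# Reproduce the per-layer MAC counts from Table 2.
# MACs per layer = rows * cols * weight-sharing factor (ws).
layers = {
    # name: (rows, cols, ws)
    "K1": (96,   363,  3025),
    "K2": (256,  2400, 729),
    "K3": (384,  2304, 169),
    "K4": (384,  3456, 169),
    "K5": (256,  3456, 169),
    "W6": (4096, 9216, 1),
    "W7": (4096, 4096, 1),
    "W8": (1000, 4096, 1),
}

macs = {name: rows * cols * ws for name, (rows, cols, ws) in layers.items()}
total = sum(macs.values())

for name, count in macs.items():
    print(f"{name}: {count / 1e6:.0f} M MACs")
print(f"Total MACs = {total / 1e9:.2f} G")  # ~1.14 G, matching the table
```

Note that the convolutional layers (K1-K5) dominate the total despite their smaller arrays, because weight sharing multiplies each array's reuse; the fully connected layers (W6-W8) use each weight only once per image.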