Table 1. Parameter and operation counts for various networks.
Counts are expressed in terms of the number of neurons in the input, hidden, and output layers. Note that these counts do not include the parameters of the readout layer, since that layer contributes the same number of parameters to each network (a hidden-to-output weight matrix, with or without a bias), and they assume fixed initial states.
| Network | Trainable parameters | State-update operations |
|---|---|---|
| 2-layer fully connected | | |
| MPN | | |
| MPNpre | | |
| Vanilla RNN | | |
| GRU | | |
| LSTM | | |
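
As a point of reference for the standard architectures in the table, trainable-parameter counts can be checked empirically. The sketch below (not part of the original table) counts parameters of PyTorch's built-in `nn.RNN`, `nn.GRU`, and `nn.LSTM` modules with placeholder layer sizes; it excludes the readout layer, as the caption specifies. Note that PyTorch uses two bias vectors (input-side and hidden-side) per gate, so its totals may differ slightly from counts that assume a single bias vector, and the MPN variants are not covered since their update rule is specific to this work.

```python
# Minimal sketch: empirically count trainable parameters of the standard
# recurrent architectures listed in Table 1. Layer sizes are arbitrary
# placeholders, not values from the paper.
import torch.nn as nn

n_in, n_hid = 32, 64  # example input / hidden sizes (placeholders)

def n_trainable(module: nn.Module) -> int:
    """Total number of trainable parameters in a module."""
    return sum(p.numel() for p in module.parameters() if p.requires_grad)

# Readout layer is excluded on purpose; only the recurrent core is counted.
for name, module in [
    ("Vanilla RNN", nn.RNN(n_in, n_hid)),
    ("GRU", nn.GRU(n_in, n_hid)),
    ("LSTM", nn.LSTM(n_in, n_hid)),
]:
    print(f"{name:12s} {n_trainable(module):8d}")
```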