Table 1. Network Architecture of Message, Update, and Readout Functions^a

| T | layer | message in | message out | message BN | update in | update out | update BN | readout in | readout out | readout BN |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 751 | 200 | yes | 473 | 200 | yes | | | |
| 1 | 2 | 200 | 100 | yes | 200 | 100 | yes | | | |
| 1 | 3 | 100 | 100 | no | 100 | 100 | no | | | |
| 2 | 1 | 205 | 200 | yes | 200 | 200 | yes | | | |
| 2 | 2 | 200 | 100 | yes | 200 | 100 | yes | | | |
| 2 | 3 | 100 | 100 | no | 100 | 100 | no | | | |
| 3 | 1 | 205 | 200 | yes | 200 | 200 | yes | 1100 | 300 | yes |
| 3 | 2 | 200 | 100 | yes | 200 | 100 | yes | 300 | 200 | yes |
| 3 | 3 | 100 | 100 | no | 100 | 100 | no | 200 | 100 | yes |
| 3 | 4 | | | | | | | 100 | 2 | no |
^a "In" and "out" denote the number of input and output neurons in the corresponding layer, and "BN" indicates whether a batch normalization layer is applied.
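As a minimal sketch of how the layer specifications in Table 1 could be assembled into multilayer perceptrons, the snippet below builds the message, update, and readout networks for the T = 3 step. PyTorch, the ReLU activation, and the helper name `build_mlp` are assumptions for illustration; the table specifies only the layer sizes and the placement of batch normalization.

```python
import torch
import torch.nn as nn


def build_mlp(layer_specs):
    """Build an MLP from (in_features, out_features, use_bn) tuples.

    Layers flagged BN = yes are followed by batch normalization and a ReLU
    (the activation choice is an assumption, not stated in Table 1); layers
    flagged BN = no are plain linear layers.
    """
    modules = []
    for in_f, out_f, use_bn in layer_specs:
        modules.append(nn.Linear(in_f, out_f))
        if use_bn:
            modules.append(nn.BatchNorm1d(out_f))
            modules.append(nn.ReLU())
    return nn.Sequential(*modules)


# Layer sizes transcribed from Table 1 for the T = 3 step.
message_t3 = build_mlp([(205, 200, True), (200, 100, True), (100, 100, False)])
update_t3 = build_mlp([(200, 200, True), (200, 100, True), (100, 100, False)])
readout = build_mlp([(1100, 300, True), (300, 200, True),
                     (200, 100, True), (100, 2, False)])

# Example forward pass with dummy inputs (batch size 4).
print(readout(torch.randn(4, 1100)).shape)  # torch.Size([4, 2])
```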