Table 4.
| Layer | Type | Parameters | Activation |
|---|---|---|---|
| 1 | Batch Normalization | – | – |
| 2 | Fully-Connected | #SH coefficients → 150 neurons | ReLU |
| 3 | Batch Normalization | – | – |
| 4 | Fully-Connected | 150 neurons → 150 neurons | ReLU |
| 5 | Batch Normalization | – | – |
| 6 | Fully-Connected | 150 neurons → 150 neurons | ReLU |
| 7 | Batch Normalization | – | – |
| 8 | Fully-Connected | 150 neurons → #SH coefficients | – |
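
As a reading aid, the sketch below reproduces the layer stack of Table 4 in PyTorch. The class name `SHNetwork`, the parameter `n_sh_coefficients`, and the example value of 16 coefficients are illustrative assumptions; the table itself only fixes the layer order, the 150-unit hidden width, the ReLU activations, and the linear output layer.

```python
import torch
import torch.nn as nn


class SHNetwork(nn.Module):
    """MLP following Table 4: batch normalization before every
    fully-connected layer, 150 hidden units, ReLU activations,
    and a linear output layer of the same size as the input."""

    def __init__(self, n_sh_coefficients: int, hidden: int = 150):
        super().__init__()
        self.layers = nn.Sequential(
            nn.BatchNorm1d(n_sh_coefficients),                 # layer 1
            nn.Linear(n_sh_coefficients, hidden), nn.ReLU(),   # layer 2
            nn.BatchNorm1d(hidden),                            # layer 3
            nn.Linear(hidden, hidden), nn.ReLU(),              # layer 4
            nn.BatchNorm1d(hidden),                            # layer 5
            nn.Linear(hidden, hidden), nn.ReLU(),              # layer 6
            nn.BatchNorm1d(hidden),                            # layer 7
            nn.Linear(hidden, n_sh_coefficients),              # layer 8 (no activation)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, n_sh_coefficients)
        return self.layers(x)


# Hypothetical usage: 16 coefficients would correspond to a 3rd-order SH expansion.
if __name__ == "__main__":
    net = SHNetwork(n_sh_coefficients=16)
    out = net(torch.randn(8, 16))
    print(out.shape)  # torch.Size([8, 16])
```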