Table 2. Computational comparison between the weight-pruned UNet (WP-UNet) and other models
| Model | Parameters | FLOPs |
| --- | --- | --- |
| UNet | 5,680,353 | 62.4e |
| UNet (depth-wise + BN) | 2,601,921 | 7.8e |
| WP-UNet (weight pruning + depth-wise + BN) | 1,297,441 | 7.2e |
BN – Batch normalization; WP – Weight pruning
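The drop in parameters between the first two rows comes from replacing standard convolutions with depth-wise separable convolutions plus batch normalization. The sketch below is a minimal, hypothetical illustration of that effect for a single block (it is not the authors' code, and the channel sizes `in_ch`/`out_ch` are assumed for illustration only); weight pruning, which accounts for the further reduction in the third row, is not shown here.

```python
# Minimal sketch: parameter count of a standard 3x3 convolution versus a
# depth-wise separable convolution + batch normalization for one block.
# Channel sizes are hypothetical, not taken from the paper's UNet.
import torch.nn as nn

def count_params(module: nn.Module) -> int:
    """Total number of trainable parameters in a module."""
    return sum(p.numel() for p in module.parameters() if p.requires_grad)

in_ch, out_ch = 64, 128  # assumed channel sizes for one encoder block

# Standard block: full 3x3 convolution mixing all input channels.
standard = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)

# Depth-wise separable variant: per-channel 3x3 conv (groups=in_ch)
# followed by a 1x1 point-wise conv, each with batch normalization.
separable = nn.Sequential(
    nn.Conv2d(in_ch, in_ch, kernel_size=3, padding=1, groups=in_ch),
    nn.BatchNorm2d(in_ch),
    nn.Conv2d(in_ch, out_ch, kernel_size=1),
    nn.BatchNorm2d(out_ch),
)

print("standard conv params:  ", count_params(standard))   # 73,856
print("separable conv params: ", count_params(separable))  #  9,344
```

Applying this substitution throughout the encoder and decoder is what shrinks the model from roughly 5.7M to 2.6M parameters in the table; the pruning step then removes low-magnitude weights to reach the final WP-UNet size.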