Sensors. 2021 Feb 18;21(4):1434. doi: 10.3390/s21041434

Figure 2.

The architecture of the self-designed IRUNet. The abbreviations ‘IRB’, ‘Pool’, ‘Convt’, ‘Pro_L’, ‘Conv’, and ‘Attention’ stand for interleaved residual blocks, pooling layers, transposed convolutional layers, processing layers, convolutional layers, and the attention module, respectively.
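The caption lists the building blocks of IRUNet (interleaved residual blocks, pooling, transposed convolutions, an attention module, and plain convolutions) arranged in a U-Net-style encoder-decoder. The sketch below is only a minimal illustration of how such blocks could be wired together in PyTorch; the class names, channel widths, depth, the exact interleaving inside the IRB, and the form of the attention module are assumptions for illustration, not the authors' published implementation.

```python
# Hedged sketch of a U-Net-style encoder-decoder loosely following the Figure 2
# caption (Conv -> IRB -> Pool -> IRB -> Convt -> Attention -> Conv).
# Block names, channel widths, and the interleaving scheme are assumptions.
import torch
import torch.nn as nn

class IRB(nn.Module):
    """Hypothetical interleaved residual block: two stacked convolutions
    whose output is added back to the input (residual connection)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.conv2(self.act(self.conv1(x))))

class ChannelAttention(nn.Module):
    """Simple squeeze-and-excitation-style gate standing in for the paper's
    attention module (assumed form)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1), nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.fc(self.pool(x))

class IRUNetSketch(nn.Module):
    """One-level encoder-decoder with an attention-gated skip connection."""
    def __init__(self, in_ch=1, base=32):
        super().__init__()
        self.head = nn.Conv2d(in_ch, base, 3, padding=1)       # 'Conv'
        self.enc1 = IRB(base)                                   # 'IRB'
        self.pool = nn.MaxPool2d(2)                             # 'Pool'
        self.enc2 = IRB(base)                                   # 'IRB'
        self.up = nn.ConvTranspose2d(base, base, 2, stride=2)   # 'Convt'
        self.attn = ChannelAttention(base)                      # 'Attention'
        self.dec = IRB(base)                                    # 'Pro_L' stand-in
        self.tail = nn.Conv2d(base, in_ch, 3, padding=1)        # 'Conv'

    def forward(self, x):
        e1 = self.enc1(self.head(x))
        e2 = self.enc2(self.pool(e1))
        d = self.up(e2) + self.attn(e1)   # attention-gated skip connection
        return self.tail(self.dec(d))

if __name__ == "__main__":
    y = IRUNetSketch()(torch.randn(1, 1, 64, 64))
    print(y.shape)  # torch.Size([1, 1, 64, 64])
```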