2021 May 7;21(9):3246. doi: 10.3390/s21093246

Figure 5.

The layer structure of the Convolutional Block Attention Module (CBAM), which applies both channel and spatial attention. The module was proposed in [59] and was modified here to compute the attention map for an MP volume.
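For readers unfamiliar with CBAM, the mechanism the figure depicts can be sketched in plain NumPy: channel attention pools the feature map spatially (average and max), passes both descriptors through a shared bottleneck MLP, and gates each channel; spatial attention then pools across channels and applies a 7×7 convolution to gate each location. This is an illustrative sketch following the general design in [59] — the weights, dimensions, and reduction ratio below are arbitrary placeholders, not the trained network's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(x, w1, w2):
    """Channel attention: shared MLP over avg- and max-pooled descriptors."""
    avg = x.mean(axis=(1, 2))          # (C,) global average pool
    mx  = x.max(axis=(1, 2))           # (C,) global max pool
    a = w2 @ np.maximum(w1 @ avg, 0)   # shared MLP on the avg descriptor
    m = w2 @ np.maximum(w1 @ mx, 0)    # same MLP on the max descriptor
    return sigmoid(a + m)              # (C,) gates in (0, 1)

def spatial_attention(x, kernel):
    """Spatial attention: conv over channel-wise avg and max maps."""
    avg = x.mean(axis=0)               # (H, W) average over channels
    mx  = x.max(axis=0)                # (H, W) max over channels
    stacked = np.stack([avg, mx])      # (2, H, W)
    k = kernel.shape[-1]
    pad = k // 2
    padded = np.pad(stacked, ((0, 0), (pad, pad), (pad, pad)))
    H, W = avg.shape
    out = np.zeros((H, W))
    for i in range(H):                 # naive same-padding convolution
        for j in range(W):
            out[i, j] = np.sum(padded[:, i:i + k, j:j + k] * kernel)
    return sigmoid(out)                # (H, W) gates in (0, 1)

def cbam(x, w1, w2, kernel):
    """Apply channel attention first, then spatial attention (CBAM order)."""
    ca = channel_attention(x, w1, w2)
    x = x * ca[:, None, None]          # rescale each channel
    sa = spatial_attention(x, kernel)
    return x * sa[None, :, :]          # rescale each spatial location

# Placeholder sizes: 8 channels, 16x16 feature map, reduction ratio 2.
C, H, W, r = 8, 16, 16, 2
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C)) * 0.1   # bottleneck (reduction) layer
w2 = rng.standard_normal((C, C // r)) * 0.1   # expansion layer
kernel = rng.standard_normal((2, 7, 7)) * 0.1
y = cbam(x, w1, w2, kernel)
print(y.shape)  # (8, 16, 16) -- same shape as the input volume
```

Because both attention maps are sigmoid-gated, the output keeps the input's shape; extending this to an MP volume, as in the paper's modification, amounts to adding the extra spatial axis to the pooling and convolution.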