IEEE Trans Med Imaging. 2020 Dec 16;40(3):1032–1041. doi: 10.1109/TMI.2020.3045295

Fig. 3.

The Multi-Scale Attention Block framework consists of two parts: (i) a pixel patch attention layer and (ii) a channel attention layer. The SA, DA, and UA blocks can all be realized with this framework by changing only the kernel size and stride.
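The caption gives only the high-level structure, so the following is a minimal PyTorch sketch of such a two-part block rather than the authors' implementation: the pixel patch attention is approximated with a convolutional spatial gate over local patches, and the channel attention with a squeeze-and-excitation style gate. The class name MultiScaleAttentionBlock and the parameter names kernel_size, stride, and reduction are illustrative assumptions.

```python
# Minimal sketch; the exact attention operations are not specified in the caption.
import torch
import torch.nn as nn


class MultiScaleAttentionBlock(nn.Module):
    """Two-part block: (i) pixel patch attention, (ii) channel attention.

    SA/DA/UA-like variants would differ only in `kernel_size` and `stride`
    (hypothetical parameter names).
    """

    def __init__(self, channels: int, kernel_size: int = 3, stride: int = 1,
                 reduction: int = 4):
        super().__init__()
        # (i) Pixel patch attention: score each spatial position from its
        # local k x k neighbourhood and gate the input with a sigmoid map.
        self.patch_score = nn.Conv2d(
            channels, 1, kernel_size=kernel_size, stride=stride,
            padding=kernel_size // 2,
        )
        self.stride = stride
        # (ii) Channel attention: squeeze-and-excitation style gating.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Spatial (pixel patch) attention map, resized back if stride > 1.
        attn = torch.sigmoid(self.patch_score(x))
        if self.stride > 1:
            attn = nn.functional.interpolate(attn, size=x.shape[-2:], mode="nearest")
        x = x * attn
        # Channel attention gating.
        return x * self.channel_gate(x)


if __name__ == "__main__":
    # Variants akin to SA/DA/UA differ only in kernel_size and stride.
    block = MultiScaleAttentionBlock(channels=32, kernel_size=3, stride=2)
    y = block(torch.randn(1, 32, 64, 64))
    print(y.shape)  # torch.Size([1, 32, 64, 64])
```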