2021 Nov 23;11:22744. doi: 10.1038/s41598-021-02225-y

Figure 7.

BNCA module. The attention mechanism is applied to each channel through Global Average Pooling and a sigmoid function; the size of the feature map is unchanged at the output.
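The channel-attention step the caption describes can be sketched as follows. This is a minimal illustration, not the authors' BNCA implementation: it assumes only the two operations the caption names (Global Average Pooling and a sigmoid gate) and omits any learned layers the full module may contain; the function name `channel_attention` and the tensor layout `(C, H, W)` are illustrative choices.

```python
import numpy as np

def channel_attention(x):
    """Sketch of per-channel attention: GAP, sigmoid gate, rescale.

    x: feature map of shape (C, H, W). The output has the same shape,
    matching the caption's note that the feature size is unchanged.
    """
    gap = x.mean(axis=(1, 2))             # Global Average Pooling -> (C,)
    weights = 1.0 / (1.0 + np.exp(-gap))  # sigmoid -> per-channel weight in (0, 1)
    return x * weights[:, None, None]     # broadcast the weights over H and W

x = np.random.randn(8, 16, 16)
y = channel_attention(x)
assert y.shape == x.shape  # spatial and channel dimensions preserved
```

Because the sigmoid output lies in (0, 1), each channel is scaled down in proportion to its pooled activation, while the tensor shape is left intact.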