Front Neurosci. 2025 Oct 8;19:1682603. doi: 10.3389/fnins.2025.1682603

FIGURE 2.

SMWA block. The channel attention stage pools the input through parallel MaxPool and AvgPool branches, passes both descriptors through a shared MLP with a Sigmoid gate, and recombines them via element-wise multiplication and concatenation; the spatial attention stage then applies convolution, permutation, and element-wise operations to produce the output.
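
The caption above describes a CBAM-style composition: a channel gate followed by a spatial gate. The exact SMWA wiring (in particular the permutation step and where concatenation occurs) is not recoverable from the caption alone, so the following PyTorch sketch is an assumption-laden illustration of the general channel-then-spatial attention pattern, not the authors' implementation; the class names, the `reduction` ratio, and the 7x7 spatial kernel are all hypothetical choices.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel gate: MaxPool and AvgPool descriptors share one MLP;
    a Sigmoid turns their sum into per-channel weights."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        max_desc = self.mlp(torch.amax(x, dim=(2, 3)))  # global MaxPool branch
        avg_desc = self.mlp(torch.mean(x, dim=(2, 3)))  # global AvgPool branch
        gate = self.sigmoid(max_desc + avg_desc).view(b, c, 1, 1)
        return x * gate  # element-wise multiplication

class SpatialAttention(nn.Module):
    """Spatial gate: channel-wise max and mean maps are concatenated
    and convolved into a single Sigmoid-gated spatial mask."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        max_map = torch.amax(x, dim=1, keepdim=True)
        avg_map = torch.mean(x, dim=1, keepdim=True)
        mask = self.sigmoid(self.conv(torch.cat([max_map, avg_map], dim=1)))
        return x * mask

class ChannelSpatialAttention(nn.Module):
    """Channel attention followed by spatial attention: the overall
    flow suggested by Figure 2, not the actual SMWA block."""
    def __init__(self, channels: int):
        super().__init__()
        self.channel_att = ChannelAttention(channels)
        self.spatial_att = SpatialAttention()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.spatial_att(self.channel_att(x))

if __name__ == "__main__":
    x = torch.randn(1, 32, 64, 64)  # (batch, channels, H, W)
    y = ChannelSpatialAttention(32)(x)
    print(y.shape)  # torch.Size([1, 32, 64, 64])
```

The permutation operation mentioned in the figure has no counterpart in this sketch, because its role within the SMWA block cannot be inferred from the caption alone.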