PLoS One. 2021 Apr 14;16(4):e0247388. doi: 10.1371/journal.pone.0247388

Fig 4. Diagram of the attention module.


As illustrated, the attention module utilizes both max-pooling outputs and average-pooling outputs.
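Below is a minimal sketch of a channel-attention block that combines max-pooled and average-pooled descriptors, in the spirit of the module shown in Fig 4. It assumes a CBAM-style design with a shared MLP and a sigmoid gate; the class name `ChannelAttention`, the `reduction` ratio, and the layer sizes are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Channel attention combining average-pooling and max-pooling branches
    (assumed CBAM-style design, for illustration only)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)  # global average pooling
        self.max_pool = nn.AdaptiveMaxPool2d(1)  # global max pooling
        # Shared MLP applied to both pooled descriptors (assumption).
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
        )
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Fuse the two pooled branches, squash to [0, 1], and rescale the input.
        attn = self.sigmoid(self.mlp(self.avg_pool(x)) + self.mlp(self.max_pool(x)))
        return x * attn


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)       # batch of feature maps
    out = ChannelAttention(64)(x)
    print(out.shape)                      # torch.Size([2, 64, 32, 32])
```

Summing the two branches before the sigmoid lets the gate see both the average response and the strongest activation per channel, which is one common way to exploit the two pooling outputs the caption describes.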