
Figure 3. The structure of the attention module. The input of the attention module is the features extracted by the CNN backbone; its output is forwarded to the classifier.
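The data flow described in the caption (backbone features in, attended features out to the classifier) could be realized as in the minimal sketch below. The squeeze-and-excitation-style channel attention used here is only an illustrative assumption; the paper's actual module structure is the one depicted in Figure 3, and the names `ChannelAttention`, `channels`, and `reduction` are hypothetical.

```python
# A minimal sketch of an attention module placed between a CNN backbone and a
# classifier. Channel attention is assumed here for illustration only.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Re-weights backbone feature channels before they reach the classifier."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)           # squeeze spatial dimensions
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                             # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                  # attended features for the classifier


if __name__ == "__main__":
    features = torch.randn(2, 512, 7, 7)              # e.g. CNN backbone output
    attended = ChannelAttention(512)(features)
    print(attended.shape)                             # torch.Size([2, 512, 7, 7])
```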