. 2021 Jul 31;11(8):1384. doi: 10.3390/diagnostics11081384

Figure 3.

(a) Structure of the transformer. (b) Overview of self-attention; "MatMul" denotes the matrix product of two arrays. (c) An illustration of our multi-head self-attention component; "Concat" denotes concatenation of the head representations.
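The operations named in panels (b) and (c) can be sketched in code. The following is a minimal NumPy illustration, not the paper's implementation: the weight matrices are random placeholders, and the head count and dimensions are arbitrary assumptions chosen only to show the MatMul, softmax, and Concat steps.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(q, k, v):
    # Panel (b): MatMul of queries and keys, scale by sqrt(d_k),
    # softmax over keys, then MatMul with the values
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)
    return softmax(scores) @ v

def multi_head_self_attention(x, num_heads, rng):
    # Panel (c): run several attention heads and Concat their outputs.
    # x has shape (seq_len, d_model); weights here are random
    # placeholders for illustration only.
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        wq, wk, wv = (rng.standard_normal((d_model, d_head))
                      for _ in range(3))
        heads.append(self_attention(x @ wq, x @ wk, x @ wv))
    concat = np.concatenate(heads, axis=-1)   # the "Concat" step
    wo = rng.standard_normal((d_model, d_model))
    return concat @ wo                        # final output projection

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))               # 4 tokens, d_model = 8
out = multi_head_self_attention(x, num_heads=2, rng=rng)
print(out.shape)                              # (4, 8): shape is preserved
```

Because each head produces a (seq_len, d_head) output and the heads are concatenated along the feature axis, the output shape matches the input, which lets attention blocks be stacked.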