. 2022 Oct 19;14(3):973–987. doi: 10.1007/s13042-022-01676-7

Fig. 5

Transformer encoder (left) and self-attention block (right)
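The self-attention block depicted in the figure computes scaled dot-product attention: queries, keys, and values are linear projections of the input, and each output token is a softmax-weighted sum of the values. A minimal NumPy sketch (single head; the dimensions and weight-matrix names are illustrative, not taken from the paper):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (seq_len, seq_len) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax: each row sums to 1
    return weights @ V                               # weighted sum of value vectors

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))     # 4 tokens, model dimension 8
Wq = rng.standard_normal((8, 8))
Wk = rng.standard_normal((8, 8))
Wv = rng.standard_normal((8, 8))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)                    # output keeps the input sequence shape: (4, 8)
```

In the full encoder this block is wrapped with multi-head projection, a residual connection, layer normalization, and a position-wise feed-forward network.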