Editorial
Korean J Radiol. 2024 Jan 2;25(1):113–115. doi: 10.3348/kjr.2023.0948

Fig. 1. Analogy illustrating the difference between how recurrent neural network (RNN) and transformer models process input. A: An RNN can be likened to a chain of people passing down a message, where each person adds information and retains what was received from previous steps. B: A transformer model can be likened to a group of experts in discussion, where each expert considers all parts of the input simultaneously. The attention mechanism then weighs the importance of different elements in the data and combines them.

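The attention mechanism described in Fig. 1B can be made concrete with a minimal sketch of scaled dot-product attention, the core operation of transformer models. This is an illustrative example rather than code from the editorial; the NumPy implementation, the function name, and the toy array sizes (4 tokens, 8-dimensional embeddings) are assumptions chosen for brevity.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Every query attends to all keys at once; outputs are weighted mixes of values."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # similarity of each query to every key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: importance of each element
    return weights @ V                              # combine values by importance

# Toy input: 4 tokens with 8-dimensional embeddings (hypothetical sizes for illustration)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = input
print(out.shape)  # (4, 8): each token becomes a weighted combination of all tokens
```

Because every query attends to every key in a single matrix product, each output position is computed from all input positions simultaneously, mirroring the "group of experts" analogy in Fig. 1B, whereas an RNN would build its representation one step at a time, as in the "chain of people" analogy in Fig. 1A.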