
Table 4.

Comparison of model performance across different numbers of self-attention layers. MAE, mean absolute error; RMSE, root mean square error; PCC, Pearson correlation coefficient.

Number of self-attention layers    MAE            RMSE           PCC
1                                  7.07 ± 0.61    9.49 ± 0.72    0.30 ± 0.14
2                                  5.92 ± 0.62    7.56 ± 0.78    0.44 ± 0.11
3                                  6.70 ± 0.79    8.48 ± 0.93    0.34 ± 0.12
4                                  6.95 ± 0.52    8.97 ± 0.79    0.28 ± 0.14
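
The table varies a single architectural hyperparameter, the depth of the self-attention stack, with the two-layer configuration giving the lowest errors and the highest correlation. As a minimal sketch of what such a comparison involves, the code below builds a regressor with a configurable number of self-attention layers and computes the three reported metrics from their standard definitions. This is an illustration under assumptions, not the paper's implementation: the names (AttentionRegressor, table4_metrics) and all dimensions (d_model, nhead, mean pooling over the sequence) are hypothetical placeholders.

```python
import numpy as np
import torch
import torch.nn as nn

class AttentionRegressor(nn.Module):
    """Hypothetical regressor whose self-attention depth is configurable.

    num_layers corresponds to the first column of Table 4; every other
    dimension here is a placeholder assumption, not the paper's value.
    """
    def __init__(self, d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        # nn.TransformerEncoder deep-copies `layer` num_layers times.
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)  # scalar regression output

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> predictions of shape (batch,)
        return self.head(self.encoder(x).mean(dim=1)).squeeze(-1)

def table4_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Standard definitions of the three metrics reported in Table 4."""
    err = y_pred - y_true
    return {
        "MAE": float(np.mean(np.abs(err))),               # mean absolute error
        "RMSE": float(np.sqrt(np.mean(err ** 2))),        # root mean square error
        "PCC": float(np.corrcoef(y_true, y_pred)[0, 1]),  # Pearson correlation
    }
```

Under this reading, the best row of the table corresponds to num_layers=2; the deeper three- and four-layer configurations score worse on all three metrics.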