Front Neurol. 2025 Aug 11;16:1596408. doi: 10.3389/fneur.2025.1596408

Figure 10.


Parameter sensitivity analysis curves. Each plot shows how the model's accuracy, recall, and F1 score vary with one hyperparameter: (Left) GCN layer depth, (Middle) attention dimension, and (Right) domain regularization weight. Performance peaked at three GCN layers, an attention dimension of 128, and a regularization weight of 0.5.