Figure 10.
Parameter sensitivity analysis curves. Each plot shows how the model's accuracy, recall, and F1 score vary with one hyperparameter while the others are held fixed: (Left) GCN layer depth, (Middle) attention dimension, and (Right) domain regularization weight. Optimal values were three GCN layers, an attention dimension of 128, and a regularization weight of 0.5.
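The one-at-a-time sweep behind curves like these can be sketched as follows. This is a minimal illustration, not the paper's code: `train_and_evaluate` is a hypothetical placeholder standing in for training the GCN and scoring it on a held-out set, and the grids and defaults are assumed values matching the optima reported in the caption.

```python
# Hypothetical sketch of a one-at-a-time hyperparameter sensitivity sweep:
# vary one hyperparameter over its grid while holding the others at their
# defaults, recording accuracy, recall, and F1 for each setting.

DEFAULTS = {"gcn_layers": 3, "attn_dim": 128, "reg_weight": 0.5}
GRID = {
    "gcn_layers": [1, 2, 3, 4, 5],
    "attn_dim": [32, 64, 128, 256],
    "reg_weight": [0.1, 0.3, 0.5, 0.7, 0.9],
}

def train_and_evaluate(config):
    # Placeholder scoring function standing in for real model training.
    # It peaks at the default values, so the sweep recovers the caption's
    # optima; a real study would train the model and compute test metrics.
    score = 1.0
    for key, best in DEFAULTS.items():
        score -= 0.05 * abs(config[key] - best) / max(abs(best), 1)
    return {"accuracy": score, "recall": score, "f1": score}

def sweep(param):
    # Returns a (value, metrics) curve for one hyperparameter,
    # suitable for plotting as in the figure.
    results = []
    for value in GRID[param]:
        config = dict(DEFAULTS, **{param: value})
        results.append((value, train_and_evaluate(config)))
    return results

if __name__ == "__main__":
    for param in GRID:
        curve = sweep(param)
        best_value = max(curve, key=lambda r: r[1]["f1"])[0]
        print(f"{param}: best value = {best_value}")
```

Each call to `sweep` produces one panel's curve; plotting the three metric series against the swept values reproduces the layout of the figure.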
