Table 2. Performance comparison between GNN-A2 and baselines.
To enhance the clarity of the experimental results, the best result for each dataset and metric is highlighted in bold, and the second-best is underlined.
| Model | MovieLens 1M | | | | Book-crossing | | | | Taobao | | | |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| | AUC | Logloss | NDCG@5 | NDCG@10 | AUC | Logloss | NDCG@5 | NDCG@10 | AUC | Logloss | NDCG@5 | NDCG@10 |
| FM | 0.8761 | 0.4409 | 0.8143 | 0.8431 | 0.7417 | 0.5771 | 0.7616 | 0.8029 | 0.6171 | 0.2375 | 0.0812 | 0.1120 |
| NFM | 0.8985 | 0.3996 | 0.8486 | 0.8832 | 0.7988 | 0.5432 | 0.7989 | 0.8326 | 0.6550 | 0.2122 | 0.0997 | 0.1251 |
| W&D | 0.9043 | 0.3878 | 0.8538 | 0.8869 | 0.8105 | 0.5366 | 0.8048 | 0.8381 | 0.6531 | 0.2124 | 0.0959 | 0.1242 |
| Deep-FM | 0.9049 | 0.3856 | 0.8510 | 0.8848 | 0.8127 | 0.5379 | 0.8088 | 0.8400 | 0.6550 | 0.2115 | 0.0974 | 0.1243 |
| AutoInt | 0.9034 | 0.3883 | 0.8619 | 0.8931 | 0.8130 | 0.5355 | 0.8127 | 0.8472 | 0.6434 | 0.2146 | 0.0924 | 0.1206 |
| Fi-GNN | 0.9063 | 0.3871 | 0.8705 | 0.9029 | 0.8136 | 0.5338 | 0.8094 | 0.8522 | 0.6462 | 0.2131 | 0.0986 | 0.1241 |
| L0-SIGN | 0.9072 | 0.3846 | 0.8849 | 0.9094 | 0.8163 | 0.5274 | 0.8148 | 0.8629 | 0.6547 | 0.2124 | 0.1006 | 0.1259 |
| GMCF | <u>0.9127</u> | <u>0.3789</u> | 0.9374 | 0.9436 | 0.8228 | 0.5233 | 0.8671 | 0.8951 | 0.6679 | 0.1960 | 0.1112 | 0.1467 |
| CAN | **0.9133** | **0.3773** | <u>0.9396</u> | <u>0.9442</u> | <u>0.8235</u> | <u>0.5143</u> | <u>0.8722</u> | <u>0.8996</u> | **0.6776** | **0.1919** | <u>0.1130</u> | <u>0.1494</u> |
| GNN-A2 | 0.9101 | 0.3846 | **0.9511** | **0.9506** | **0.8400** | **0.4956** | **0.9003** | **0.9137** | <u>0.6715</u> | <u>0.1944</u> | **0.1159** | **0.1526** |
| Improv. | – | – | 1.22% | 0.68% | 2.00% | 3.64% | 3.22% | 1.57% | – | – | 2.57% | 2.14% |
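Reading the last row: the Improv. values appear to be the relative gain of GNN-A2 over the strongest baseline (CAN) on each metric, with cells left blank where CAN remains ahead; for Logloss, where lower is better, the gain is taken in the opposite direction. A minimal sketch of that calculation, using a hypothetical helper and values taken from the table:

```python
def relative_improvement(ours: float, best_baseline: float, lower_is_better: bool = False) -> float:
    """Relative gain over the strongest baseline, in percent (hypothetical helper)."""
    if lower_is_better:
        return (best_baseline - ours) / best_baseline * 100
    return (ours - best_baseline) / best_baseline * 100

# MovieLens 1M NDCG@5: GNN-A2 (0.9511) vs. CAN (0.9396) -> 1.22%
print(f"{relative_improvement(0.9511, 0.9396):.2f}%")
# Book-crossing Logloss (lower is better): GNN-A2 (0.4956) vs. CAN (0.5143) -> 3.64%
print(f"{relative_improvement(0.4956, 0.5143, lower_is_better=True):.2f}%")
```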