Sci Rep. 2023 Dec 16;13:22408. doi: 10.1038/s41598-023-49708-8

Table 5.

Cumulative ratio of synonym prediction in each word embedding model. (FT: fastText, W2V: Word2vec, CBOW: continuous bag of words, SKIP: skip-gram, number: vector dimensions).

Model Precision Recall F1-score Accuracy
FT_CBOW_100 0.5286 0.8819 0.6610 0.4937
FT_CBOW_200 0.5557 0.8870 0.6833 0.5190
FT_CBOW_300 0.5567 0.8872 0.6841 0.5199
FT_CBOW_400 0.5515 0.8863 0.6799 0.5151
FT_CBOW_500 0.5484 0.8857 0.6774 0.5121
FT_CBOW_600 0.5432 0.8847 0.6731 0.5073
FT_CBOW_700 0.5359 0.8834 0.6671 0.5005
FT_CBOW_800 0.5369 0.8836 0.6680 0.5015
FT_CBOW_900 0.5276 0.8817 0.6602 0.4927
FT_SKIP_100 0.4381 0.8609 0.5807 0.4091
FT_SKIP_200 0.4984 0.8757 0.6353 0.4655
FT_SKIP_300 0.4984 0.8757 0.6353 0.4655
FT_SKIP_400 0.5213 0.8805 0.6549 0.4869
FT_SKIP_500 0.5130 0.8788 0.6478 0.4791
FT_SKIP_600 0.5317 0.8826 0.6636 0.4966
FT_SKIP_700 0.5369 0.8836 0.6680 0.5015
FT_SKIP_800 0.5297 0.8821 0.6619 0.4947
FT_SKIP_900 0.5317 0.8826 0.6636 0.4966
W2V_CBOW_100 0.3101 0.8142 0.4491 0.2896
W2V_CBOW_200 0.3757 0.8415 0.5194 0.3508
W2V_CBOW_300 0.3809 0.8433 0.5247 0.3557
W2V_CBOW_400 0.3954 0.8482 0.5394 0.3693
W2V_CBOW_500 0.3985 0.8492 0.5425 0.3722
W2V_CBOW_600 0.3913 0.8468 0.5352 0.3654
W2V_CBOW_700 0.3965 0.8486 0.5404 0.3703
W2V_CBOW_800 0.4089 0.8525 0.5527 0.3819
W2V_CBOW_900 0.3944 0.8479 0.5384 0.3683
W2V_SKIP_100 0.3195 0.8187 0.4596 0.2983
W2V_SKIP_200 0.3278 0.8225 0.4688 0.3061
W2V_SKIP_300 0.3195 0.8187 0.4596 0.2983
W2V_SKIP_400 0.3195 0.8187 0.4596 0.2983
W2V_SKIP_500 0.3070 0.8127 0.4456 0.2867
W2V_SKIP_600 0.2955 0.8068 0.4326 0.2760
W2V_SKIP_700 0.2872 0.8023 0.4230 0.2682
W2V_SKIP_800 0.2893 0.8035 0.4254 0.2702
W2V_SKIP_900 0.2851 0.8012 0.4206 0.2663

In the original table, underlined (bold) numbers mark the highest cumulative accuracy in each model family; that formatting is lost in this plain-text rendering. Reading from the table, the best rows are FT_CBOW_300 (0.5199), FT_SKIP_700 (0.5015), W2V_CBOW_800 (0.3819), and W2V_SKIP_200 (0.3061).
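As a consistency check on the table, the reported F1-scores should equal the harmonic mean of precision and recall, 2PR/(P + R). A minimal sketch in plain Python, using a few representative rows copied from Table 5 (the row selection is ours, not the authors'):

```python
# Verify F1 = 2PR/(P+R) for representative rows of Table 5.
# Each tuple: (model, precision, recall, reported_f1, accuracy).
rows = [
    ("FT_CBOW_300",  0.5567, 0.8872, 0.6841, 0.5199),
    ("FT_SKIP_700",  0.5369, 0.8836, 0.6680, 0.5015),
    ("W2V_CBOW_800", 0.4089, 0.8525, 0.5527, 0.3819),
    ("W2V_SKIP_200", 0.3278, 0.8225, 0.4688, 0.3061),
]

def f1(p: float, r: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

for name, p, r, reported, _acc in rows:
    # Allow rounding slack, since the table reports 4 decimal places.
    assert abs(f1(p, r) - reported) < 5e-4, name
```

All four rows agree with the harmonic-mean formula to within 4-decimal rounding, which suggests the table's F1 column was derived directly from its precision and recall columns.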