
Table 22.

False Positive Rate (FPR) and False Negative Rate (FNR)

| Word Embedding Model | Classification Model | FPR | FNR |
|---|---|---|---|
| TF-IDF (using unigrams and bigrams) | Neural Network | 0.04684 | 0.0742 |
| BOW (Bag of Words) | Neural Network | 0.1040 | 0.0862 |
| Word2Vec | Neural Network | 0.1320 | 0.3416 |
| GloVe | MNB | 0.1151 | 0.0752 |
| GloVe | DT | 0.3956 | 0.1303 |
| GloVe | RF | 0.3458 | 0.2259 |
| GloVe | KNN | 0.7299 | 0.1931 |
| BERT | MNB | 0.0985 | 0.0789 |
| BERT | DT | 0.1660 | 0.2429 |
| BERT | RF | 0.1245 | 0.3318 |
| BERT | KNN | 0.4037 | 0.4110 |
| GloVe | CNN | 0.0989 | 0.0776 |
| GloVe | LSTM | 0.0080 | 0.0482 |
| BERT | CNN | 0.0590 | 0.0872 |
| BERT | LSTM | 0.0077 | 0.0451 |
| BERT | FakeBERT | 0.0160 | 0.0059 |