
Table 6. Results of the first experiment, in which all models were trained on the balanced training dataset.

Validated on the imbalanced validation dataset

| Model | Accuracy | F1 | Sensitivity | Specificity | MCC | AUC |
|---|---|---|---|---|---|---|
| DeepRMethylSite CNN | 0.7819 | 0.7557 | 0.5668 | 0.9259 | 0.5428 | 0.7990 |
| DeepRMethylSite LSTM | 0.7699 | 0.7479 | 0.5480 | 0.9394 | 0.5430 | 0.8024 |
| DeepRMethylSite Merged | 0.7743 | 0.7518 | 0.5474 | 0.9394 | 0.5481 | 0.8021 |
| SMLP | 0.7209 | 0.7018 | 0.4922 | 0.9281 | 0.4719 | 0.7649 |
| SSMFN Merged | **0.8187** | **0.7943** | **0.6175** | **0.9442** | **0.6143** | **0.8359** |

Validated on the balanced validation dataset

| Model | Accuracy | F1 | Sensitivity | Specificity | MCC | AUC |
|---|---|---|---|---|---|---|
| DeepRMethylSite CNN | 0.8090 | 0.8089 | 0.7944 | 0.8251 | 0.6188 | 0.7990 |
| DeepRMethylSite LSTM | 0.7993 | 0.7993 | 0.7618 | 0.8493 | 0.6048 | 0.8024 |
| DeepRMethylSite Merged | 0.8059 | 0.8051 | 0.7659 | 0.8504 | 0.6169 | 0.8021 |
| SMLP | 0.7073 | 0.7073 | 0.7041 | 0.7107 | 0.4147 | 0.7649 |
| SSMFN Merged | **0.8360** | **0.8358** | **0.8130** | **0.8626** | **0.6738** | **0.8359** |

Tested on the test dataset

| Model | Accuracy | F1 | Sensitivity | Specificity | MCC | AUC |
|---|---|---|---|---|---|---|
| MeMo* | 0.68 | na | 0.38 | 0.99 | 0.46 | na |
| MASA* | 0.65 | na | 0.31 | 0.99 | 0.41 | na |
| BPB-PPMS* | 0.56 | na | 0.12 | **1.00** | 0.25 | na |
| PMeS* | 0.58 | na | 0.43 | 0.73 | 0.16 | na |
| iMethyl-PseAAC* | 0.59 | na | 0.18 | **1.00** | 0.3 | na |
| PSSMe* | 0.72 | na | 0.6 | 0.83 | 0.44 | na |
| MePred-RF* | 0.69 | na | 0.41 | 0.97 | 0.46 | na |
| PRmePRed** | **0.8683** | na | **0.8709** | 0.8660 | **0.7370** | **0.9000** |
| DeepRMethylSite CNN | 0.7846 | 0.7846 | 0.7803 | 0.7891 | 0.5693 | 0.7846 |
| DeepRMethylSite LSTM | 0.8000 | 0.7989 | 0.7617 | 0.8514 | 0.6065 | 0.8000 |
| DeepRMethylSite Merged | 0.7942 | 0.7929 | 0.7508 | 0.8447 | 0.5959 | 0.7904 |
| SMLP | 0.8077 | 0.8076 | 0.8175 | 0.7985 | 0.6157 | 0.8077 |
| SSMFN Merged | 0.8115 | **0.8115** | 0.8000 | 0.8240 | 0.6235 | 0.8115 |

Note: The highest value of each metric within each evaluation block is shown in bold; "na" indicates that the value is not available for that model.
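For readers reproducing these numbers, the sketch below shows one conventional way to compute the six reported metrics from a model's predicted methylation probabilities using scikit-learn. It is a minimal illustration, not the authors' evaluation code; in particular, the 0.5 classification threshold and the macro averaging of the F1-score are assumptions (macro averaging appears consistent with the reported F1 values on the imbalanced validation split, where F1 differs from accuracy).

```python
# Minimal sketch (not the authors' code): computing the Table 6 metrics
# from true labels and predicted probabilities with scikit-learn.
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, recall_score,
                             confusion_matrix, matthews_corrcoef, roc_auc_score)

def evaluate(y_true, y_prob, threshold=0.5):  # threshold is an assumption
    """Return Acc, F1, Sens, Spec, MCC, and AUC for one model on one dataset."""
    y_true = np.asarray(y_true)
    y_pred = (np.asarray(y_prob) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return {
        "Acc":  accuracy_score(y_true, y_pred),
        "F1":   f1_score(y_true, y_pred, average="macro"),  # averaging choice is an assumption
        "Sens": recall_score(y_true, y_pred),                # sensitivity = TP / (TP + FN)
        "Spec": tn / (tn + fp),                              # specificity = TN / (TN + FP)
        "MCC":  matthews_corrcoef(y_true, y_pred),
        "AUC":  roc_auc_score(y_true, y_prob),               # uses probabilities, not hard labels
    }

# Toy usage on a small imbalanced set of eight sites (1 = methylated).
y_true = [0, 0, 0, 0, 0, 1, 1, 0]
y_prob = [0.10, 0.40, 0.20, 0.65, 0.05, 0.90, 0.30, 0.15]
for name, value in evaluate(y_true, y_prob).items():
    print(f"{name}: {value:.4f}")
```

In practice the decision threshold can be tuned on the validation set; 0.5 is used here only for illustration.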