PLoS One. 2014 Jan 24;9(1):e86703. doi: 10.1371/journal.pone.0086703

Table 2. Comparison of the prediction performance of the Gaussian naïve Bayes (GNB)-based wrapper, logistic regression (LogR)-based wrapper, decision tree (DT)-based wrapper, k-nearest neighbor (KNN)-based wrapper, and two support vector machine (SVM)-based wrappers with the RBF and polynomial kernels (denoted as SVM-RBF and SVM-Poly respectively).

Wrapper method | Five-fold CV (average of 10 runs)                     | Jackknife test
               | Sen          Spe          Acc          MCC            | Sen     Spe     Acc     MCC
GNB            | 0.815±0.010  0.767±0.009  0.791±0.007  0.583±0.014    | 0.828   0.781   0.805   0.610
DT             | 0.716±0.019  0.704±0.025  0.710±0.011  0.421±0.021    | 0.684   0.700   0.692   0.384
LogR           | 0.801±0.008  0.699±0.005  0.750±0.006  0.502±0.012    | 0.805   0.704   0.754   0.511
KNN            | 0.716±0.015  0.770±0.010  0.743±0.008  0.487±0.016    | 0.721   0.771   0.746   0.492
SVM-Poly       | 0.867±0.008  0.668±0.011  0.768±0.009  0.547±0.019    | 0.855   0.687   0.771   0.550
SVM-RBF        | 0.830±0.013  0.746±0.006  0.788±0.008  0.578±0.016    | 0.848   0.754   0.801   0.605

Note: The CV tests were run ten times; means ± standard deviations are reported. The highest value in each column (shown in bold in the original table) is achieved by GNB for Acc and MCC in both tests, by SVM-Poly for Sen in both tests, and by KNN (CV) and GNB (jackknife) for Spe.
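The four metrics in the table are standard confusion-matrix statistics. As a quick illustration, the sketch below computes Sen, Spe, Acc, and MCC from raw TP/TN/FP/FN counts; the counts used are hypothetical, chosen only so the results land near the GNB jackknife row, and are not the paper's actual data.

```python
from math import sqrt

def sensitivity(tp, fn):
    # Sen = TP / (TP + FN): fraction of positives correctly recovered
    return tp / (tp + fn)

def specificity(tn, fp):
    # Spe = TN / (TN + FP): fraction of negatives correctly recovered
    return tn / (tn + fp)

def accuracy(tp, tn, fp, fn):
    # Acc = (TP + TN) / total predictions
    return (tp + tn) / (tp + tn + fp + fn)

def mcc(tp, tn, fp, fn):
    # Matthews correlation coefficient: stays informative even when
    # the two classes are imbalanced, unlike raw accuracy
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Hypothetical confusion-matrix counts (illustration only)
tp, fn, tn, fp = 83, 17, 78, 22
print(f"Sen={sensitivity(tp, fn):.3f}  Spe={specificity(tn, fp):.3f}  "
      f"Acc={accuracy(tp, tn, fp, fn):.3f}  MCC={mcc(tp, tn, fp, fn):.3f}")
# → Sen=0.830  Spe=0.780  Acc=0.805  MCC=0.611
```

Note how MCC (0.611) is noticeably lower than Acc (0.805): it penalizes both kinds of error jointly, which is why it separates the wrappers in the table more sharply than accuracy does.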