Diagnostics. 2022 Jul 8;12(7):1664. doi: 10.3390/diagnostics12071664

Table 2.

Descriptions of the 13 feature selection methods in this study.

| Type | Method | Description | Equation |
| --- | --- | --- | --- |
| FITI | MIM | Evaluates each feature by its correlation with the classes, measured by mutual information | $\mathrm{MIM}(f_i)=I(f_i;C)$ |
| FITI | MIFS / MRMR | Evaluates features by their correlation with the classes while penalizing redundancy with already-selected features | $\mathrm{MIFS}(f_i)=I(f_i;C)-\beta\sum_{f_s\in S}I(f_i;f_s)$; $\mathrm{MRMR}(f_i)=I(f_i;C)-\frac{1}{\lvert S\rvert}\sum_{f_s\in S}I(f_i;f_s)$ |
| FITI | JMI / CMIM | Evaluates features by their correlation with the classes and redundancy among features, measured by conditional mutual information | $\mathrm{JMI}(f_i)=I(f_i;C)-\frac{1}{\lvert S\rvert}\sum_{f_s\in S}\left[I(f_i;C)-I(f_i;C\mid f_s)\right]$; $\mathrm{CMIM}(f_i)=\min_{f_s\in S}I(f_i;C\mid f_s)$ |
| SIF | Fisher / LS | Compares features by the ratio of between-class variance to within-class variance | $\mathrm{Fisher}(k)=\frac{R_B(k)}{R_W(k)}$; $\mathrm{LS}(f_i)=\frac{\sum_{a,b}(f_{ra}-f_{rb})^2 W_{ab}}{\mathrm{Var}(f_r)}$ |
| SIF | ReliefF | Scores features by their ability to distinguish between nearby samples of different classes | $\mathrm{ReliefF}(f_i,R_1,R_2)=\frac{\lvert R_1(A)-R_2(A)\rvert}{\max(A)-\min(A)}$ |
| STF | FS | Scores each feature by the separation of the positive- and negative-class means from the overall mean, relative to the within-class variances | $\mathrm{FS}(i)=\frac{(\bar{f}_i^{(+)}-\bar{f}_i)^2+(\bar{f}_i^{(-)}-\bar{f}_i)^2}{\frac{1}{n_+-1}\sum_{k=1}^{n_+}(f_{k,i}^{(+)}-\bar{f}_i^{(+)})^2+\frac{1}{n_--1}\sum_{k=1}^{n_-}(f_{k,i}^{(-)}-\bar{f}_i^{(-)})^2}$ |
| STF | TS | Scores each feature by the difference of the class means relative to the within-class variances | $\mathrm{TS}(i)=\frac{\bar{f}_i^{(+)}-\bar{f}_i^{(-)}}{\frac{1}{n_+-1}\sum_{k=1}^{n_+}(f_{k,i}^{(+)}-\bar{f}_i^{(+)})^2+\frac{1}{n_--1}\sum_{k=1}^{n_-}(f_{k,i}^{(-)}-\bar{f}_i^{(-)})^2}$ |
| SSL | MCFS | Combines cluster analysis with sparse regression coefficients; each feature is scored by its largest coefficient magnitude across cluster directions | $\mathrm{MCFS}(i)=\max_k \lvert f_{k,i}\rvert$ |
| SSL | Alpha | Evaluates features by dynamically adjusting a threshold on the error reduction that a candidate feature must achieve to be selected | $E(N_i)/E(M_i)<\alpha\Delta/(1-\alpha\Delta)$ |
| SSL | Lasso | Uses L1 regularization to drive some learned feature weights to exactly zero, yielding sparsity and feature selection | $\mathrm{Lasso}(\beta)=\arg\min\Bigl\{\sum_{i=1}^{n}\bigl(y_i-\beta_0-\sum_{j=1}^{p}\beta_j x_{ij}\bigr)^2+\lambda\sum_{j=1}^{p}\lvert\beta_j\rvert\Bigr\}$ |
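To illustrate the information-theoretic scores (MIM, MRMR), the following is a minimal sketch for discrete features, not the authors' implementation: `mutual_info` is a plain empirical plug-in estimator, and `mrmr` is the standard greedy forward selection that trades relevance $I(f_i;C)$ against mean redundancy with the already-selected set $S$.

```python
import numpy as np

def mutual_info(x, y):
    """Empirical mutual information I(x; y) for discrete arrays, in nats."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            p_xy = np.mean((x == xv) & (y == yv))
            if p_xy > 0:
                mi += p_xy * np.log(p_xy / (np.mean(x == xv) * np.mean(y == yv)))
    return mi

def mim(X, c):
    """MIM(f_i) = I(f_i; C): score each feature independently."""
    return np.array([mutual_info(X[:, i], c) for i in range(X.shape[1])])

def mrmr(X, c, k):
    """Greedy MRMR: relevance I(f_i; C) minus mean redundancy over selected S."""
    relevance = mim(X, c)
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for i in range(X.shape[1]):
            if i in selected:
                continue
            redundancy = np.mean([mutual_info(X[:, i], X[:, s]) for s in selected])
            if relevance[i] - redundancy > best_score:
                best, best_score = i, relevance[i] - redundancy
        selected.append(best)
    return selected
```

Because MRMR subtracts the redundancy term, a feature that duplicates an already-selected one scores lower than an independent, even weakly relevant, one — the behaviour that separates it from plain MIM ranking.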
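The FS and TS rows can be sketched directly from their formulas. Note this follows the table's form, where TS divides by the same pooled within-class variance term as FS (the conventional two-sample t-statistic instead uses the square root of per-class variances over class sizes); a minimal NumPy version:

```python
import numpy as np

def f_score(X, y):
    """F-score per the table: squared distances of the class means from the
    overall mean, over the sum of per-class sample variances (binary y)."""
    pos, neg = X[y == 1], X[y == 0]
    num = (pos.mean(0) - X.mean(0)) ** 2 + (neg.mean(0) - X.mean(0)) ** 2
    den = pos.var(0, ddof=1) + neg.var(0, ddof=1)  # the 1/(n±-1) sums
    return num / den

def t_score(X, y):
    """TS per the table: difference of class means over the same
    within-class variance term."""
    pos, neg = X[y == 1], X[y == 0]
    return (pos.mean(0) - neg.mean(0)) / (pos.var(0, ddof=1) + neg.var(0, ddof=1))
```

A feature whose class-conditional means are well separated relative to its within-class spread gets a high score; a feature with identical class means scores near zero.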
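The Lasso row's sparsity effect is easy to demonstrate with scikit-learn (assumed here; note sklearn's `Lasso` minimizes $\frac{1}{2n}\lVert y-X\beta\rVert^2+\alpha\lVert\beta\rVert_1$, so its `alpha` plays the role of the table's $\lambda$ up to a $1/2n$ scaling):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
# Only feature 0 drives the target; the other four columns are noise.
y = 3.0 * X[:, 0] + 0.1 * rng.normal(size=400)

# L1 penalty shrinks irrelevant weights to exactly zero.
model = Lasso(alpha=0.5).fit(X, y)
selected = np.flatnonzero(model.coef_)  # indices of features kept
```

The non-zero entries of `coef_` are the selected features, which is why Lasso doubles as a feature selection method rather than only a regularized regressor.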