Table 9. Advantages and disadvantages of feature selection methods.
Algorithms | Advantages | Disadvantages |
---|---|---|
GA [292] | Designed to avoid becoming trapped in locally optimal solutions | Does not guarantee an optimal solution and has a high computational cost |
mRMR [293] | Effectively removes redundant features while keeping the relevant ones | Mutual information is difficult to estimate for continuous features, which typically must be discretized |
LASSO [294] | Gives accurate predictions, reduces overfitting, and improves model interpretability | Regression coefficients may not be consistently interpretable as independent risk factors |
SFFS [295] | Mitigates the nesting problem and discards unnecessary features | Cannot exhaustively evaluate all feature subsets |
PCA [296] | Extracts a small number of informative components from the original features, reduces the dimensionality of the samples, and improves classification accuracy | Captures only linear relationships and ignores higher-order interactions between variables |
WONN-MLB [288] | Integrates the maximum-relevance and minimum-redundancy criteria | A certain number of irrelevant attributes remain after selection |
HSOGR [90] | Effectively selects optimized features | Execution is computationally complex |
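
As a concrete illustration of how one of the tabulated methods is applied in practice, the snippet below is a minimal sketch of LASSO-style feature selection using scikit-learn's `SelectFromModel` wrapped around an L1-penalised logistic regression. The synthetic dataset and the regularisation strength `C=0.1` are illustrative assumptions, not settings taken from any of the cited studies.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression

# Synthetic data standing in for a high-dimensional feature matrix.
X, y = make_classification(n_samples=200, n_features=50,
                           n_informative=8, random_state=0)

# The L1 penalty drives the coefficients of uninformative features to zero;
# SelectFromModel then keeps only the features with non-zero weights.
l1_model = LogisticRegression(penalty="l1", solver="liblinear",
                              C=0.1, random_state=0)
selector = SelectFromModel(l1_model).fit(X, y)

X_reduced = selector.transform(X)
print(f"Kept {X_reduced.shape[1]} of {X.shape[1]} features")
```

In this sketch the regularisation strength controls the trade-off noted in the table: stronger penalties shrink more coefficients to zero and yield a sparser, more interpretable feature set, at the risk of discarding weakly relevant predictors.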