Sensors. 2019 Jun 10;19(11):2629. doi: 10.3390/s19112629

Table 5.

Categories of feature selection methods, with their advantages, limitations, and examples.

Wrapper methods (deterministic)
  Advantages: simple; models feature dependencies; interacts with the classifier.
  Limitations: slower than randomized methods; high risk of over-fitting; more prone than randomized methods to getting stuck in a local optimum; classifier-dependent selection.
  Examples: sequential forward selection (SFS); sequential backward elimination (SBE); beam search.
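
The deterministic wrapper search can be illustrated with a minimal sequential forward selection (SFS) loop. This is a hypothetical sketch: `toy_score` is an invented stand-in for the cross-validated accuracy of a real classifier, which is what makes the selection classifier-dependent.

```python
def sequential_forward_selection(features, evaluate):
    """Greedily add the single feature that most improves the score."""
    selected, remaining = [], list(features)
    best_score = evaluate(selected)
    while remaining:
        # Try each remaining feature on top of the current subset.
        score, best_f = max((evaluate(selected + [f]), f) for f in remaining)
        if score <= best_score:  # stop when no single addition helps
            break
        best_score = score
        selected.append(best_f)
        remaining.remove(best_f)
    return selected, best_score

# Hypothetical objective: stands in for cross-validated classifier accuracy.
# Features "a" and "c" help, "b" hurts, "d" is neutral.
def toy_score(subset):
    gains = {"a": 0.30, "b": -0.05, "c": 0.20, "d": 0.0}
    return 0.5 + sum(gains[f] for f in subset)

subset, score = sequential_forward_selection(["a", "b", "c", "d"], toy_score)
print(subset, round(score, 2))  # → ['a', 'c'] 1.0
```

Because every candidate subset is re-scored by the (stand-in) classifier, the loop is simple but much slower than a filter over the same features.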

Wrapper methods (randomized)
  Advantages: models feature dependencies; less prone to local optima; interacts with the classifier.
  Limitations: classifier-dependent selection; higher risk of over-fitting than deterministic methods.
  Examples: simulated annealing; randomized hill climbing; genetic algorithms; estimation of distribution algorithms.
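
A randomized wrapper can be sketched in the same setting. Randomized hill climbing toggles one randomly chosen feature per step and keeps the change only when the stand-in score improves; the toy objective and feature names below are illustrative assumptions, not from the source.

```python
import random

def randomized_hill_climbing(features, evaluate, iters=200, seed=0):
    """Toggle one random feature per step; keep only improving moves."""
    rng = random.Random(seed)
    current = {f for f in features if rng.random() < 0.5}  # random start
    best_score = evaluate(current)
    for _ in range(iters):
        candidate = current ^ {rng.choice(features)}  # flip one feature in/out
        score = evaluate(candidate)
        if score > best_score:  # greedy: accept only improvements
            current, best_score = candidate, score
    return current, best_score

# Hypothetical objective standing in for cross-validated accuracy.
def toy_score(subset):
    gains = {"a": 0.30, "b": -0.05, "c": 0.20, "d": -0.10}
    return 0.5 + sum(gains[f] for f in subset)

subset, score = randomized_hill_climbing(["a", "b", "c", "d"], toy_score)
print(sorted(subset), round(score, 2))
```

On a separable toy objective like this one, a few hundred greedy toggles reliably reach the informative subset {'a', 'c'}; on real, non-separable objectives the random restarts and moves are what make this family less prone to local optima than SFS.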

Filter methods (univariate)
  Advantages: fast; scalable; independent of the classifier.
  Limitations: ignores feature dependencies; ignores interaction with the classifier.
  Examples: information gain (IG); chi-square (χ²) test; t-test.
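
A univariate filter scores each feature against the class label on its own, with no classifier in the loop. The sketch below uses the chi-square statistic on a toy discrete dataset; the data and feature names are invented for illustration.

```python
def chi_square(feature, labels):
    """Chi-square statistic of one discrete feature against the class label."""
    n = len(labels)
    stat = 0.0
    for v in set(feature):
        for c in set(labels):
            observed = sum(1 for f, y in zip(feature, labels) if f == v and y == c)
            expected = feature.count(v) * labels.count(c) / n
            stat += (observed - expected) ** 2 / expected  # assumes expected > 0
    return stat

labels = [1, 1, 1, 0, 0, 0]
features = {
    "informative": [1, 1, 1, 0, 0, 0],  # perfectly tracks the label
    "noisy":       [1, 0, 1, 0, 1, 0],  # unrelated to the label
}
# Rank features by their individual chi-square score, highest first.
ranked = sorted(features, key=lambda f: chi_square(features[f], labels), reverse=True)
print(ranked)  # → ['informative', 'noisy']
```

Each feature is scored in isolation, which is why this family is fast and scalable but blind to feature dependencies.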

Filter methods (multivariate)
  Advantages: models feature dependencies; independent of the classifier; better time complexity than wrapper methods.
  Limitations: slower than univariate methods; less scalable than univariate methods; ignores interaction with the classifier.
  Examples: correlation-based feature selection (CFS); Markov blanket filter (MBF); fast correlation-based feature selection (FCBF).
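
A multivariate filter additionally checks redundancy between features. The sketch below follows the spirit of FCBF: rank features by relevance to the class, then drop any feature that is more correlated with an already-kept feature than with the class. Pearson correlation stands in here for FCBF's symmetrical uncertainty measure, and the toy data is invented.

```python
from math import sqrt

def pearson(x, y):
    """Plain Pearson correlation; stands in for symmetrical uncertainty."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

def redundancy_filter(data, labels):
    """Keep a feature only if it is closer to the class than to any kept feature."""
    ranked = sorted(data, key=lambda f: abs(pearson(data[f], labels)), reverse=True)
    kept = []
    for f in ranked:
        relevance = abs(pearson(data[f], labels))
        if all(abs(pearson(data[f], data[g])) < relevance for g in kept):
            kept.append(f)
    return kept

labels = [1, 1, 1, 0, 0, 0]
data = {
    "a": [5, 6, 7, 1, 2, 3],  # strongly tracks the label
    "b": [5, 6, 7, 1, 2, 4],  # near-duplicate of "a": redundant
    "c": [9, 2, 8, 3, 1, 1],  # weaker but non-redundant signal
}
kept = redundancy_filter(data, labels)
print(kept)  # → ['a', 'c']
```

A univariate filter would have ranked "b" above "c"; the pairwise redundancy check is what costs the extra time and scalability relative to univariate scoring.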

Embedded methods
  Advantages: models feature dependencies; interacts with the classifier; better time complexity than wrapper methods.
  Limitations: classifier-dependent selection.
  Examples: decision trees; weighted naive Bayes; feature selection using the weight vector of an SVM.
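
Embedded selection falls out of training itself. The sketch below ranks features by the magnitude of a linear model's learned weights, in the spirit of selection via the weight vector of an SVM; as an assumption, a plain perceptron stands in for the SVM, and the toy data is invented.

```python
def train_perceptron(X, y, epochs=20):
    """Train a linear classifier; labels y must be -1 or +1."""
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            activation = sum(wj * xj for wj, xj in zip(w, xi))
            if yi * activation <= 0:  # misclassified: nudge weights toward xi
                w = [wj + yi * xj for wj, xj in zip(w, xi)]
    return w

# Toy data: column 0 separates the classes; columns 1 and 2 carry no signal.
X = [[2, 1, 0], [3, 0, 1], [2, 1, 1], [-2, 1, 0], [-3, 0, 1], [-2, 1, 1]]
y = [1, 1, 1, -1, -1, -1]

w = train_perceptron(X, y)
# Embedded selection: rank feature indices by learned weight magnitude.
ranked = sorted(range(len(w)), key=lambda j: abs(w[j]), reverse=True)
print(ranked[0])  # → 0 (the informative column)
```

One training run yields both the classifier and the ranking, which is why embedded methods beat wrappers on time complexity while remaining classifier-dependent.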