Table 2.
Mutual information-based feature selection algorithms.
Algorithms | Description |
---|---|
MIFS [20] | A greedy approach that selects only highly informative features to form an optimal feature subset. It identifies the non-linear relationship between each selected feature and its output class to reduce redundancy and uncertainty in the feature vector [61,62,63]. |
mRMR [21,22] | An incremental feature selection algorithm that forms an optimal feature subset by selecting features with minimum redundancy and maximum relevance. |
CIFE [23] | Forms an optimal feature subset that maximizes the class-relevant information carried by the features while reducing the redundancy among them. |
JMI [24] | An extension of mutual information that uses conditional mutual information to define the joint mutual information among the features and eliminates redundant features, if any. |
CMI [25] | Selects a feature only if it carries additional information that eases prediction of the output class. |
DISR [26] | Measures symmetrical relevance and combines all feature variables to describe more information about the output class, rather than focusing on individual feature information. |
ICAP [27] | Selects features based on their interactions, capturing the regularities of the feature set. |
CONDRED [28] | Identifies the conditional redundancy that exists between the features. |
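To make the contrast between these criteria concrete, the greedy mRMR step can be sketched as follows. This is a minimal illustration, not the reference implementation from [21,22]: it assumes discrete-valued features, estimates mutual information from empirical counts, and at each step picks the feature maximizing relevance to the class minus mean redundancy with the already-selected features. All function and variable names here (`mutual_info`, `mrmr`) are hypothetical.

```python
import math
from collections import Counter

def mutual_info(x, y):
    """Empirical mutual information (in nats) between two discrete sequences."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        # p(a,b) * log( p(a,b) / (p(a) * p(b)) ), with counts instead of probs
        mi += (c / n) * math.log(c * n / (px[a] * py[b]))
    return mi

def mrmr(features, labels, k):
    """Greedy mRMR: select up to k feature indices from a list of columns.

    Score of a candidate f = I(f; labels) - mean_{s in selected} I(f; s),
    i.e. maximum relevance minus (mean) redundancy.
    """
    selected = []
    remaining = list(range(len(features)))
    while remaining and len(selected) < k:
        def score(i):
            relevance = mutual_info(features[i], labels)
            redundancy = (sum(mutual_info(features[i], features[j])
                              for j in selected) / len(selected)
                          if selected else 0.0)
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a toy example with a duplicated feature, the second pick skips the duplicate (high redundancy with the first pick) even when the alternative is less relevant on its own, which is exactly the trade-off that distinguishes mRMR from the purely relevance-driven selection in MIFS.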