Table 5.
Model | Characterization | References
---|---|---
NB | A network composed of a main node and associated descendant nodes, with classification following Bayes’ theorem [65]. | [13,35,40,53]
SVM | Builds the maximum-margin hyperplane that optimally separates two classes of a data set [65]. | [13,37,38,39,40,41,50,51,52,53,54,55,61,66]
RF | Builds a large number of uncorrelated decision trees from random selections of predictor variables and aggregates their predictions [67]. | [13,61]
DT | Builds a decision tree in which each node specifies a test on an attribute, each branch descending from that node corresponds to one of the possible values of that attribute, and each leaf holds the class labels associated with the instance. Instances of the training set are classified by following the path from the root to a leaf according to the results of the tests along the path [68]. | [39,53,54,55]
KNN | Based on the memory principle: it stores all training cases and classifies new cases by similarity measures [65]. | [42,46,48]
LR | A model that finds an equation predicting the outcome of a binary variable from one or more predictor variables [69]. | [42,51]
LDA | A discriminant approach based on the differences between samples of given groups. A supervised learning technique whose objective is to maximize the ratio of between-group variance to within-group variance [70]. | [54,55]
ANN (DNN, CNN, RNN, MLP) | Biologically inspired models. A supervised learning approach based on a theory of association (pattern recognition) between cognitive elements [71]. Many variants exist, differing in their elements, structures, layers, etc. The larger the number of parameters, the larger the data set must be. | [42,43,46,47,48,52,53]
NB: Naive Bayes; RF: Random Forest; LDA: Linear Discriminant Analysis; SVM: Support Vector Machine; DT: Decision Trees; ANN: Artificial Neural Networks; RNN: Recurrent Neural Network; CNN: Convolutional Neural Networks; MLP: Multilayer Perceptron; KNN: k-Nearest Neighbors; DNN: Deep Neural Networks; LR: Logistic Regression.
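To make the "memory principle" behind KNN concrete, the following is a minimal sketch (not from the cited works) of a k-nearest-neighbors classifier in pure Python: all training cases are stored verbatim, and a new case is labeled by majority vote among its k closest stored cases under Euclidean distance. The function name, toy data, and choice of distance are illustrative assumptions, not part of the reviewed studies.

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among the k nearest stored training
    cases (Euclidean distance) — the memory-based principle of KNN."""
    # Store-and-compare: distances from x to every stored training case.
    dists = sorted((math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y))
    # Majority vote among the k closest cases.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D data: two well-separated clusters, labels "A" and "B".
train_X = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1),
           (1.0, 1.0), (0.9, 1.1), (1.1, 0.9)]
train_y = ["A", "A", "A", "B", "B", "B"]

print(knn_predict(train_X, train_y, (0.05, 0.10)))  # "A"
print(knn_predict(train_X, train_y, (1.05, 0.95)))  # "B"
```

Because the model "trains" only by storing the data, all computation happens at prediction time, which is why KNN scales poorly with very large training sets.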