Decision Trees [30]
An inverted tree-like graph with the input at the root node, internal nodes that classify based on feature values, and outputs at the leaf nodes.
Simplifies complex relationships and allows easy interpretation. However, it has limited robustness, and strong correlation among the input variables may lead to inaccurate results.
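A decision tree's prediction path can be sketched as nested threshold tests; the following is a minimal hand-built illustration (the two features and flower classes are hypothetical example values, not from the source):

```python
# A hand-built decision tree over two numeric features. Internal nodes
# test a feature against a threshold; leaves emit the class label.

def predict(sample):
    """Classify one sample by walking from the root node to a leaf."""
    petal_length, petal_width = sample
    if petal_length < 2.5:          # root node: split on petal length
        return "setosa"             # leaf node
    elif petal_width < 1.75:        # internal node: split on petal width
        return "versicolor"         # leaf node
    else:
        return "virginica"         # leaf node

print(predict((1.4, 0.2)))   # -> setosa
print(predict((4.5, 1.5)))   # -> versicolor
```

Because the model is just a chain of readable threshold tests, its decisions are easy to interpret, which is the strength noted above.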
Boosted Trees [31] |
An ensemble of multiple decision trees in which each tree is built depending on the results of the previous tree.
Each new tree corrects the errors made by the previous trees, and combining numerous trees increases robustness.
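The "each tree corrects the previous trees' errors" loop can be sketched for regression; here each learner is reduced to a one-split stump for brevity, and the data and round count are illustrative:

```python
# Boosting sketch: each new stump is fit to the residual errors left by
# the ensemble so far, so later learners correct earlier mistakes.

def fit_stump(xs, residuals):
    """Best single-threshold split minimizing squared error of residuals."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x < t]
        right = [r for x, r in zip(xs, residuals) if x >= t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x < t else rm

xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.0, 1.0, 3.0, 3.0]
pred = [0.0] * len(xs)
for _ in range(5):                                   # boosting rounds
    residuals = [y - p for y, p in zip(ys, pred)]    # what is still wrong
    stump = fit_stump(xs, residuals)
    pred = [p + stump(x) for p, x in zip(pred, xs)]  # learning rate 1 here

print(pred)   # -> [1.0, 1.0, 3.0, 3.0]
```

After the first round the residuals are already zero on this toy data; real boosted trees use many shallow trees and a small learning rate.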
Random Forests [27] |
An ensemble of numerous independent decision trees, each built from randomly selected samples; the results of the individual trees are averaged to produce the final result.
It is less susceptible to biased samples and has higher robustness.
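The bootstrap-then-vote recipe can be sketched with one-feature stumps standing in for full trees (the toy data and tree count are illustrative assumptions):

```python
import random

random.seed(0)

# Toy 1-D data: class 0 for small x, class 1 for large x.
data = [(x, 0) for x in [1, 2, 3, 4]] + [(x, 1) for x in [6, 7, 8, 9]]

def fit_stump(sample):
    """Pick the threshold that best separates the classes in this sample."""
    best = None
    for t, _ in sample:
        err = sum((0 if x < t else 1) != y for x, y in sample)
        if best is None or err < best[0]:
            best = (err, t)
    return best[1]

# Each "tree" sees its own bootstrap sample (drawn with replacement),
# so the trees are independent and individual sampling bias averages out.
thresholds = [fit_stump([random.choice(data) for _ in data])
              for _ in range(25)]

def forest_predict(x):
    """Majority vote over all trees."""
    votes = sum(0 if x < t else 1 for t in thresholds)
    return 1 if votes > len(thresholds) / 2 else 0
```

A single stump trained on one unlucky bootstrap sample can be wrong, but the majority vote over 25 of them is robust, which is the point made above.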
Support Vector Machine [9] |
Finds a hyperplane that separates the data into two categories, with a margin on either side of the hyperplane.
Maximizing the margin reduces the generalization error, yet misclassified data points that do not lie within the margin may lower the accuracy.
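A minimal sketch of a linear SVM trained by sub-gradient descent on the hinge loss (minimizing the hinge loss with a small weight penalty is what maximizes the margin); the data, learning rate, and epoch count are illustrative choices, not from the source:

```python
# Toy linear SVM via hinge-loss sub-gradient descent (Pegasos-style).
data = [((1.0, 1.0), -1), ((1.5, 0.5), -1),   # class -1
        ((3.0, 3.0), 1), ((3.5, 2.5), 1)]     # class +1

w, b = [0.0, 0.0], 0.0
lr, lam = 0.1, 0.01             # step size and regularization strength

for _ in range(500):
    for (x1, x2), y in data:
        margin = y * (w[0] * x1 + w[1] * x2 + b)
        if margin < 1:          # inside the margin or misclassified
            w[0] += lr * (y * x1 - lam * w[0])
            w[1] += lr * (y * x2 - lam * w[1])
            b += lr * y
        else:                   # correct side with room: only shrink weights
            w[0] -= lr * lam * w[0]
            w[1] -= lr * lam * w[1]

def classify(x1, x2):
    """Side of the learned separating hyperplane."""
    return 1 if w[0] * x1 + w[1] * x2 + b >= 0 else -1
```

Only points violating the margin contribute to the update, mirroring the remark above that points falling on the wrong side of the margin are what drive the error.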
Generalized Linear Model [32] |
A linear relationship between the observed features and the real output is fitted, with a similar number of samples lying on either side of the regression line. The output can then be predicted from the regression line given known feature values.
The results may be affected when the number of features is much larger than the number of outputs.
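The simplest member of this family is ordinary least squares with an identity link; fitting the line and predicting from it can be sketched as follows (the toy data are illustrative):

```python
# Least-squares fit of y = a*x + b from the normal equations:
# slope = cov(x, y) / var(x), intercept from the means.

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.1, 8.0, 9.9]   # roughly y = 2x

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     / sum((x - mx) ** 2 for x in xs))
b = my - a * mx                  # line passes through the mean point

print(round(a, 2), round(b, 2))  # -> 1.97 0.09
y_new = a * 6.0 + b              # predict the output for a new feature value
```

With many more features than samples the analogous multivariate normal equations become ill-conditioned, which is the limitation noted above.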
Artificial Neural Networks [33] |
A feed-forward neural network consisting of an input layer, hidden layers, and an output layer.
Able to learn nonlinear and complex relationships, yet difficult to interpret. It works well with tabular data.
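The input/hidden/output structure, and why it can capture nonlinear relationships, can be shown with a tiny forward pass whose weights are hand-picked (an illustrative construction, not a trained model) to compute XOR, a function no single linear model can represent:

```python
# Forward pass of a 2-2-1 feed-forward network with a threshold
# activation; the hand-set weights implement XOR.

def step(z):
    """Threshold activation: the unit fires when its weighted sum is > 0."""
    return 1 if z > 0 else 0

def forward(x1, x2):
    h1 = step(x1 + x2 - 0.5)        # hidden unit: "at least one input on"
    h2 = step(x1 + x2 - 1.5)        # hidden unit: "both inputs on"
    return step(h1 - 2 * h2 - 0.5)  # output unit: h1 AND NOT h2 == XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", forward(a, b))  # prints the XOR truth table
```

The learned weights of a real network play the same role as these hand-set ones, but there may be millions of them, which is why interpretation is hard.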
Convolution Neural Networks [33] |
A convolution filter (kernel) is applied across the image to form feature maps, automatically extracting the relevant features from the input data.
It captures the spatial features of an image well.
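The filter-sliding step that produces a feature map can be sketched directly; the 4x4 "image" and edge-detecting kernel are illustrative:

```python
def conv2d(img, k):
    """Slide the kernel over every valid position; each dot product is
    one entry of the output feature map."""
    H, W = len(img), len(img[0])
    kh, kw = len(k), len(k[0])
    return [[sum(img[i + a][j + b] * k[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(W - kw + 1)]
            for i in range(H - kh + 1)]

img = [[1, 1, 0, 0]] * 4    # left half bright, right half dark
kernel = [[1, -1],
          [1, -1]]          # fires on a bright-to-dark vertical edge

print(conv2d(img, kernel))  # -> [[0, 2, 0], [0, 2, 0], [0, 2, 0]]
```

The high response appears only where the edge sits, regardless of its row, showing how the shared kernel picks up a spatial feature anywhere in the image.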
Recurrent Neural Networks [33] |
It has an ANN-like structure but with a recurrent loop in the hidden layer.
The recurrent loop captures sequential features well, and it works well with text data.
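The recurrent loop can be sketched in a single line: the hidden state at each step depends on both the current input and the previous hidden state, which is how sequential context is carried forward. The weights below are hand-picked (an illustrative assumption, not a trained model) so the cell simply counts the ones in a binary sequence:

```python
def rnn(sequence, w_in=1.0, w_rec=1.0, h0=0.0):
    """Run one recurrent cell over a sequence and return the final state."""
    h = h0
    for x in sequence:
        h = w_rec * h + w_in * x   # same cell reused at every time step
    return h

print(rnn([1, 0, 1, 1]))   # -> 3.0 (three ones seen so far)
```

A trained RNN replaces this scalar update with learned matrices and a nonlinearity, but the state-carrying loop is the same, which is what makes it suitable for text and other sequences.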