Table 1.
Method | Description | Examples
---|---|---
Regression based | |
Logistic regression | Uses parametric regression to estimate the probabilities of dichotomous outcomes (Dasgupta et al., 2011) | Cox, 1958; Yu et al., 2014; Niriella et al., 2018
Neural network | Uses multiple layers of non-parametric regressions and transformations to map input data to outputs (Mehta et al., 2019) | Rosenblatt, 1962; Montañez et al., 2015; Xue et al., 2018
Support vector machine (SVM) | Uses non-parametric regression to construct hyperplanes in a high-dimensional space that discriminate between output classes (Yu, 2010) | Cortes and Vapnik, 1995; Abraham et al., 2014; Han, 2018
Regression based regularization | |
Lasso | Applies an L1 penalty to the regression loss function (Okser et al., 2014) | Tibshirani, 1996; Wei et al., 2013; Song et al., 2018
Elastic net | Applies both L1 and L2 penalties to the regression loss function (Okser et al., 2014) | Zou and Hastie, 2005; Abraham et al., 2013; Rashkin et al., 2018
Tree-based | |
Decision tree | Uses binary splitting rules to model the relationships between input data and outputs (Mehta et al., 2019) | Quinlan, 1986; Geurts et al., 2009; Li et al., 2018
Random forest | Uses an ensemble of randomized decision trees to map input data to outputs (Mehta et al., 2019) | Breiman, 2001; Worachartcheewan et al., 2015; Dai et al., 2018
For each method, the examples list the founding paper followed by representative applications as of December 2018.
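As an illustration only (not part of the source), the methods in Table 1 can be sketched with scikit-learn estimators; the library choice, toy dataset, and hyperparameters below are assumptions for demonstration, not the cited authors' configurations.

```python
# Hypothetical sketch: mapping the methods in Table 1 to scikit-learn
# estimators on a toy binary-classification task. Hyperparameters are
# illustrative defaults, not tuned settings from any cited study.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression, Lasso, ElasticNet
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# Toy data: 200 samples, 10 features, dichotomous outcome (0/1).
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

models = {
    # Regression based
    "Logistic regression": LogisticRegression(max_iter=1000),
    "Neural network": MLPClassifier(hidden_layer_sizes=(16,),
                                    max_iter=2000, random_state=0),
    "SVM": SVC(kernel="rbf"),
    # Regression based regularization (regressors fit on 0/1 targets)
    "Lasso (L1)": Lasso(alpha=0.01),
    "Elastic net (L1 + L2)": ElasticNet(alpha=0.01, l1_ratio=0.5),
    # Tree-based
    "Decision tree": DecisionTreeClassifier(max_depth=3, random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X, y)
    # .score() is accuracy for classifiers and R^2 for the regressors.
    print(f"{name}: training score = {model.score(X, y):.2f}")
```

Note that lasso and elastic net are shown here as penalized linear regressors fit on 0/1 labels; in practice the L1/L2 penalties are more often combined with logistic regression for dichotomous outcomes.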