Table 1.
| Model characteristics | All models (n = 152), n (%) |
|---|---|
| **Regression-based models** | 42 (28) |
| Logistic regression | 26 |
| Cox regression | 7 |
| Linear regression | 3 |
| LASSO (logistic regression) | 1 |
| LASSO (Cox regression) | 1 |
| LASSO (model not specified) | 3 |
| Best subset regression with leave-one-out cross-validation | 1 |
| **Non-regression-based models** | 71 (47) |
| Neural network (including deep learning) | 18 |
| Classification tree (e.g., CART, decision tree) | 28 |
| Support vector machine | 12 |
| Naive Bayes | 6 |
| K-nearest neighbours | 3 |
| Other^a | 4 |
| **Ensemble models** | 39 (26) |
| Random forest (including random survival forest) | 23 |
| Gradient boosting machine | 8 |
| RUSBoost (boosted random forests) | 1 |
| Bagging with J48 selected by Auto-WEKA | 1 |
| CoxBoost (boosted Cox regression) | 1 |
| XGBoost (eXtreme Gradient Boosting) | 1 |
| Gradient boosting machine and Nystroem, combined using elastic net | 1 |
| AdaBoost | 1 |
| Bagging, method not specified | 1 |
| Partitioning Around Medoids algorithm and complete linkage method | 1 |
| Median number of models developed per study [IQR], range | 2 [1–4], 1–6 |
CART: Classification And Regression Tree; LASSO: Least Absolute Shrinkage and Selection Operator.
^a Other includes: voted perceptron; fuzzy logic, soft set theory and soft set computing; hierarchical clustering model based on unsupervised learning for survival data using the distance matrix of survival curves; Bayes point machine.
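For orientation only, the sketch below (not drawn from any of the reviewed studies) shows how the model families most frequently reported in Table 1, such as logistic regression, LASSO-penalised logistic regression, classification trees, support vector machines, naive Bayes, k-nearest neighbours, neural networks, random forests, and gradient boosting machines, could be instantiated with scikit-learn. All class choices and hyperparameters are illustrative assumptions rather than settings used in the reviewed studies.

```python
# Illustrative only: scikit-learn counterparts of the model families most
# frequently reported in Table 1. Hyperparameters are placeholder defaults,
# not values taken from the reviewed studies.
from sklearn.linear_model import LogisticRegression, LogisticRegressionCV
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

candidate_models = {
    "Logistic regression": LogisticRegression(max_iter=1000),
    # LASSO-type logistic regression: L1 penalty with cross-validated strength
    "LASSO (logistic regression)": LogisticRegressionCV(
        penalty="l1", solver="saga", max_iter=5000
    ),
    "Classification tree": DecisionTreeClassifier(),
    "Support vector machine": SVC(probability=True),
    "Naive Bayes": GaussianNB(),
    "K-nearest neighbours": KNeighborsClassifier(),
    "Neural network": MLPClassifier(max_iter=1000),
    "Random forest": RandomForestClassifier(n_estimators=500),
    "Gradient boosting machine": GradientBoostingClassifier(),
}

# Each estimator exposes the same fit/predict_proba interface, e.g.:
#   model.fit(X_train, y_train)
#   predicted_risk = model.predict_proba(X_new)[:, 1]
```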