Table 4.
Study, year | ML methods | Best model | Best performance metrics | External validation
Inselman et al [27], 2022 | GLMNet^a, RF^b, and GBM^c | GBM | | No
Hurst et al [25], 2022 | Lasso, RF, and XGBoost^e | XGBoost | | No
Hogan et al [28], 2022 | Cox proportional hazard, LR^f, and ANN^g | ANN | | No
Zein et al [29], 2021 | LR, RF, and GBDT^h | GBDT | | No
Sills et al [30], 2021 | AutoML, RF, and LR | AutoML | | No
Hozawa et al [31], 2021 | XGBoost | XGBoost | | No
Lisspers et al [32], 2021 | XGBoost, LGBM^j, RNN^k, and LR (Lasso, Ridge, and Elastic Net) | XGBoost | | No
Ananth et al [23], 2021 | LR, DT^l, and ANN | LR | | No
Tong et al [33], 2021 | WEKA^m and XGBoost | XGBoost | | Yes
Mehrish et al [24], 2021 | GLM^n, correlation models, and LR | LR | | No
Xiang et al [4], 2020 | LR, MLP^o, and LSTM^p with an attention mechanism | LSTM with an attention mechanism | | No
Cobian et al [34], 2020 | LR, RF, and LSTM | LR with L1 (Lasso) regularization | | No
Luo et al [35], 2020 | WEKA and XGBoost | XGBoost | | No
Roe et al [22], 2020 | XGBoost, NN^q, LR, and KNN^r | XGBoost | | No
Luo et al [26], 2020 | WEKA and XGBoost | XGBoost | | Yes
Wu et al [21], 2018 | LSTM | LSTM | | Yes
Patel et al [11], 2018 | DT, Lasso, RF, and GBDT | GBDT | | No
^a GLMNet: Lasso and Elastic-Net Regularized Generalized Linear Models.
^b RF: random forest.
^c GBM: gradient boosting machine.
^d AUC: area under the curve.
^e XGBoost: extreme gradient boosting.
^f LR: logistic regression.
^g ANN: artificial neural network.
^h GBDT: gradient boosting decision tree.
^i ED: emergency department.
^j LGBM: light gradient boosting machine.
^k RNN: recurrent neural network.
^l DT: decision tree.
^m WEKA: Waikato Environment for Knowledge Analysis.
^n GLM: generalized linear model.
^o MLP: multilayer perceptron.
^p LSTM: long short-term memory.
^q NN: neural network.
^r KNN: K-nearest neighbor.
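The performance metric most commonly reported for the models above is the AUC (footnote d), which equals the probability that a randomly chosen positive case receives a higher predicted score than a randomly chosen negative case. As an illustration only (the function below is ours, not taken from any cited study), this pairwise definition can be computed directly in pure Python:

```python
def auc(labels, scores):
    """Area under the ROC curve via the pairwise-ranking definition.

    labels: iterable of 0/1 outcomes; scores: predicted risk scores.
    AUC = P(score of a random positive > score of a random negative),
    with ties counted as half a win.
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos
        for n in neg
    )
    return wins / (len(pos) * len(neg))


# Toy example: two negatives, two positives; one pair is mis-ranked.
print(auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

In practice the reviewed studies would have used library implementations (e.g., scikit-learn's `roc_auc_score` gives the same result on this example), but the sketch makes explicit what the metric measures.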