Table A1. Attribute-wise comparison of the multiclass, multi-label, and ensemble-based studies.

| SN | Attributes | Multiclass Characteristics | Multi-Label Characteristics | Ensemble Characteristics |
| --- | --- | --- | --- | --- |
| – | Total Studies | 14 [69–82] | 8 [83–90] | 32 [80,91–121] |
| 1 | Data Size | 212–66,363 [69–82] | 300–46,520 [83–90] | 459–823,627 [80,91–121] |
| 2 | Risk Factors | Low [69–82] | Large [83–90] | Moderate [80,91–121] |
| 3 | Family History | Frequently Considered [69,71,76,77,80,82] | Seldom Considered [83,84,90] | Considered Intermittently [80,91,96,97,99,100,102,105,106,110–112,114–120] |
| 4 | BMI | Less Considered [72,74–76,80] | Considered Moderately [84–86] | Highly Considered [46–52,80,91,93–97,99,100,102,106,107,112] |
| 5 | Ethnicity | Less Considered [72,74–76,80] | Considered Moderately [84–86] | Highly Considered |
| 6 | Type of Data | OBBM and LBBM [69–82] | OBBM, LBBM, and Image [83–90] | OBBM and LBBM [80,91–121] |
| 7 | Hypertension | Low Usage [72,74–76,80] | High Usage [83–90] | Moderate Usage [46–52,80,91,93–97,99,100,102,106,107,112] |
| 8 | Smoking | Low Usage [72,74–76,80] | High Usage [83–90] | Moderate Usage [80,91,96,97,99,100,102,105,106,110–112,114–120] |
| 9 | Multicenter | Low Usage [72,74–76,80] | High Usage [83–90] | Moderate Usage [80,91,96,97,99,100,102,105,106,110–112,114–120] |
| 10 | MRI | Considered Moderately [71,80] | Considered Moderately [83,89] | Less Considered [80] |
| 11 | ECG | Partially Considered [72,74,75,78,79,81,82] | Strongly Considered [83,86,87,89] | Not Considered |
| 12 | CUSIP | Moderate Usage | Moderate Usage | Low Usage |
| 13 | # GT | Only 1 [69–82] | Very high (10–4) [83–90] | Average (1, 2) [80,91–121] |
| 14 | # Algorithms | ✗ | ✓ [83–90] | ✗ |
| 15 | Type of Algorithm | ✗ | – | ✗ |
| 16 | # Classifiers | Ranging from 1–4 [69–82] | Ranging from 1–9 [83–90] | Ranging from 1–10 [80,91–121] |
| 17 | Classifier Type | SVM, RF, CNN, DT, k-NN, Agatston classifier, Elastic Net, NN, NB, XGBoost, ELM, OAO, OAA, DDAG, ECOC [69–82] | RF, SVM, DT, k-NN, LDA, LR, XGBoost, AdaBoost, GBA, Basic RNN, GRU RNN, CNN, AAM [83–90] | k-NN, GaussNB, LDA, QDA, RF, MLP, CNN, LSTM, GRU, BiLSTM, BiGRU, Bagging, XGBoost, AdaBoost, DNN, NB, NN, RS, GAMs, Elastic Net, GBMs, DT, CART, MARS, Logistic, EB, SMO, Boosting, MLDS, AVEn, MVEn, WAVEn, HTSA [80,91–121] |
| 18 | # Classes | ✓ [69–82] | ✗ | ✗ |
| 19 | Hyperparameters Used | ✓ [79] | ✓ [83,84,90] | ✓ [92,98–100] |
| 20 | Protocol | K-10 [64–82] | K-10, K, K-5 [83–90] | K-10, K, K-5 [80,91–121] |
| 21 | # PE Parameters | Ranging from 1–5 [69–82] | Ranging from 1–8 [83–90] | Ranging from 1–8 [80,91–121] |
| 22 | Precision | ✓ [72,73,77,81,82] | ✗ | ✓ [80,91–121] |
| 23 | PPV | ✗ | ✓ [84,86] | ✓ [80,91–121] |
| 24 | NPV | ✗ | ✓ [84,86] | ✓ [80,91–121] |
| 25 | FPR | ✗ | ✓ [84,90] | ✓ [80,91–121] |
| 26 | FNR | ✗ | ✓ [84] | ✓ [80,91–121] |
| 27 | Hamming Loss | ✗ | ✓ [87] | ✗ |
| 28 | C-index | ✗ | ✓ [83] | ✗ |
| 29 | Statistical Analysis | ✗ | ✓ [83–90] | ✓ [80,91–121] |
| 30 | Power Analysis | ✗ | ✓ [83,84] | ✗ |
| 31 | Hazard Analysis | ✗ | ✓ [83] | ✗ |
| 32 | Survival Test | ✗ | ✓ [83] | ✗ |
SN: Serial number; SVM: Support vector machine; RF: Random forest; CNN: Convolutional neural network; DT: Decision tree; k-NN: k-Nearest neighbor; NN: Neural network; ELM: Extreme learning machine; OAO: One against one; OAA: One against all; DDAG: Decision direct acyclic graph; ECOC: Error-correcting output codes; LDA: Linear discriminant analysis; RNN: Recurrent neural network; GRU: Gated recurrent unit; AAM: Algorithm adaptation methods; MARS: Multivariate adaptive regression splines; GAMs: Generalized additive models; PLR: Penalized logistic regression; GBM: Gradient boosted machines; MLP: Multilayer perceptron; CART: Classification and regression trees; SMO: Sequential minimal optimization; DNN: Deep neural network; NB: Naive Bayes; LSTM: Long short-term memory network; EB: Ensemble boosting; MLDS: Multi-layer defense system; PPV: Positive predictive value; NPV: Negative predictive value; FPR: False positive rate; FNR: False negative rate; # GT: Number of ground truths.
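As a point of reference for rows 20–26 of Table A1, the sketch below illustrates how the listed performance-evaluation (PE) parameters (precision/PPV, NPV, FPR, FNR) can be derived from a confusion matrix under a K-10 cross-validation protocol. This is a minimal, hedged example, not a reproduction of any surveyed study's pipeline: the synthetic dataset and the Random Forest classifier are assumptions chosen only to keep the code self-contained.

```python
# Illustrative sketch: K-10 (stratified 10-fold) protocol with the PE metrics
# from Table A1. Synthetic data and Random Forest are assumptions, not the
# setup of any cited study.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import confusion_matrix

# Synthetic, imbalanced binary-risk dataset standing in for a feature table.
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.7, 0.3], random_state=0)

skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)  # K-10 protocol
fold_metrics = []
for train_idx, test_idx in skf.split(X, y):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    tn, fp, fn, tp = confusion_matrix(y[test_idx],
                                      clf.predict(X[test_idx])).ravel()
    fold_metrics.append({
        "PPV (precision)": tp / (tp + fp),  # positive predictive value
        "NPV": tn / (tn + fn),              # negative predictive value
        "FPR": fp / (fp + tn),              # false positive rate
        "FNR": fn / (fn + tp),              # false negative rate
    })

# Average each metric across the 10 folds.
for name in fold_metrics[0]:
    print(f"{name}: {np.mean([m[name] for m in fold_metrics]):.3f}")
```

For the multi-label studies, the Hamming loss in row 27 could be computed analogously from per-label predictions, e.g., with sklearn.metrics.hamming_loss.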