Table 2. Summary of studies comparing ensemble learning models for chronic kidney disease prediction.
Ref. | Base Learner Models | Ensemble Model | Data Type | Preprocessing Technique | Positive/Negative Cases | Dataset | Attributes/Instances | Accuracy | Best Model |
---|---|---|---|---|---|---|---|---|---|
[45] | NB, LR, MLP, SVM, DS, RT | AdaBoost, Bagging, Voting, Stacking | Clinical | Handled missing values, feature selection, and sample filtering | - | Razi Hospital | 42/936 | Bagging = 99.1%, Boosting (AdaBoost) = 99.1%, Voting = 96.6%, Stacking = 97.1% | Boosting, Bagging |
[46] | NB, LR, ANN, CART, SVM | Gradient Boosting, RF | Clinical | Feature selection, handling missing values, and imputation | 250/150 | UCI Chronic Kidney | 25/400 | Bagging (RF) = 96.5%, Boosting (Gradient Boosting) = 90.4% | Bagging |
[2] | - | AdaBoost, RF, ETC (Extra Trees) Bagging, Gradient Boosting | Clinical | Feature engineering | 250/150 | UCI Chronic Kidney | 25/400 | Bagging (Extra Trees) = 98%, Bagging = 96%, Bagging (RF) = 95%, Boosting (AdaBoost) = 99%, Boosting (Gradient) = 97% | Boosting
[47] | LR, KNN, SVC | Gradient Boosting, RF | Clinical | Handled missing values | 250/150 | UCI Chronic Kidney | 25/400 | Bagging (RF) = 99%, Boosting (Gradient) = 98.7% | Bagging |
[13] | - | AdaBoost, Bagging, and Random Subspaces | Clinical | Feature extraction | 250/150 | UCI Chronic Kidney | 25/400 | Boosting (AdaBoost) = 99.25%, Bagging = 98.5%, Bagging (Random Subspace) = 99.5% | Bagging
[11] | NB, SMO, J48, RF | Bagging, AdaBoost | Clinical | Feature selection and handling missing values | 250/150 | UCI Chronic Kidney | 25/400 | Bagging = 98%, Bagging (RF) = 100%, Boosting (AdaBoost) = 99% | Bagging
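
The studies in Table 2 all contrast bagging-style ensembles (Random Forest, Random Subspaces, Extra Trees) with boosting (AdaBoost, Gradient Boosting) and, in some cases, voting and stacking, most often on the UCI Chronic Kidney Disease data. The sketch below is a minimal, illustrative version of that comparison in scikit-learn, not code from any of the cited works: the file name `chronic_kidney_disease.csv`, the `class` label column, and the chosen base learners and hyperparameters are assumptions, and the preprocessing simply mirrors the "handled missing values" step reported in the table.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier,
                              VotingClassifier)
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.tree import DecisionTreeClassifier

# Hypothetical local copy of the UCI Chronic Kidney Disease dataset
# (400 instances, 25 attributes, 250 ckd / 150 notckd as listed above).
# "?" is the dataset's missing-value marker.
df = pd.read_csv("chronic_kidney_disease.csv", na_values="?")
y = (df["class"] == "ckd").astype(int)   # assumed label column and value
X = df.drop(columns=["class"])

numeric_cols = X.select_dtypes(include="number").columns
categorical_cols = X.select_dtypes(exclude="number").columns

# Missing-value imputation and encoding, mirroring the preprocessing
# reported in the "Preprocessing Technique" column.
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric_cols),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("onehot", OneHotEncoder(handle_unknown="ignore"))]),
     categorical_cols),
], sparse_threshold=0.0)  # dense output so GaussianNB can be a base learner

# Simple base learners reused by the voting and stacking ensembles.
base_learners = [("nb", GaussianNB()),
                 ("lr", LogisticRegression(max_iter=1000)),
                 ("dt", DecisionTreeClassifier(random_state=0))]

ensembles = {
    "Bagging (decision trees)": BaggingClassifier(n_estimators=100, random_state=0),
    "Bagging (RF)": RandomForestClassifier(n_estimators=100, random_state=0),
    "Boosting (AdaBoost)": AdaBoostClassifier(n_estimators=100, random_state=0),
    "Boosting (Gradient)": GradientBoostingClassifier(random_state=0),
    "Voting": VotingClassifier(estimators=base_learners, voting="soft"),
    "Stacking": StackingClassifier(estimators=base_learners,
                                   final_estimator=LogisticRegression(max_iter=1000)),
}

# 10-fold cross-validated accuracy per ensemble family,
# analogous to the Accuracy column of Table 2.
for name, model in ensembles.items():
    pipe = Pipeline([("prep", preprocess), ("model", model)])
    scores = cross_val_score(pipe, X, y, cv=10, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Wrapping imputation and encoding inside the cross-validation pipeline keeps the preprocessing fitted only on the training folds, avoiding the optimistic bias that can inflate reported accuracies when the whole dataset is imputed before splitting.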