Table 3.
First Author (Ref. #) | Disease Application | Sample Size | Variable Input | Output | Algorithms | Results |
---|---|---|---|---|---|---|
Huang et al (30) | Prediction of hypertension | 3,054 | Occupation, family history, educational level, alcohol intake, vegetable and fruit intake, salt intake, animal offal (organ meat) intake, physical exercise, body mass index, and blood pressure | Prevalent hypertension | LRM and ANN | The ANN model outperformed the LRM in predicting the presence of hypertension |
AlKaabi et al (31) | Prediction of hypertension | 987 | Age, sex, education, employment, tobacco use, physical activity, consumption of fruits and vegetables, maternal history of hypertension, diabetes, cholesterol, and abdominal obesity | Prevalent hypertension | DT, RF, and LRM | The RF model had the best prediction accuracy for screening for the presence of hypertension |
Kanegae et al (38) | Prediction of hypertension | 18,258 | Medical history, lifestyle factors, anthropometrics, and biochemical measurements | Incident hypertension | XGBoost, ensemble, and LRM | The ML approach yielded a highly precise model for predicting incident hypertension |
Katz et al (35) | Classification of hypertension | 1,273 | Demographics, physical characteristics, laboratory, and echocardiographic indices | Hypertension phenotypes | Agglomerative hierarchical clustering | 2 distinct hypertension phenotypes with different cardiac substrates were identified |
Wu et al (20) | Prediction of outcome | 508 | Left atrial diameter, HDL-C, big endothelin-1, right arm diastolic BP, right/left leg systolic BP, right leg diastolic BP, left arm systolic BP, mean nocturnal arterial oxygen saturation, past maximum systolic BP, and urea | Clinical outcomes | Recursive feature elimination and XGBoost | The ML model was comparable with the Cox proportional hazards model for outcome prediction and outperformed the recalibrated Framingham risk score model |
ANN = artificial neural network; BP = blood pressure; DT = decision tree; HDL-C = high-density lipoprotein cholesterol; LRM = logistic regression model; ML = machine learning; RF = random forest; XGBoost = extreme gradient boosting.
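For readers who want a concrete sense of the LRM-versus-ensemble comparisons summarized in Table 3 (Huang et al, AlKaabi et al, Kanegae et al), the minimal sketch below illustrates the general workflow on synthetic tabular data. It is not taken from any of the cited studies: the feature set, sample size, model settings, and the use of scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost are all illustrative assumptions.

```python
# Schematic comparison of a logistic regression baseline against tree-ensemble
# classifiers for hypertension prediction, mirroring the LRM-vs-RF/boosting
# comparisons in Table 3. Data are synthetic; nothing here reproduces the
# cited studies' cohorts, variables, or results.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for tabular risk factors (age, BMI, BP, lifestyle, ...)
# with a moderately imbalanced hypertension outcome.
X, y = make_classification(n_samples=3000, n_features=12, n_informative=6,
                           weights=[0.7, 0.3], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "LRM": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "RF": RandomForestClassifier(n_estimators=300, random_state=0),
    "GBM": GradientBoostingClassifier(random_state=0),  # stand-in for XGBoost
}

for name, model in models.items():
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name}: held-out AUC = {auc:.3f}")
```

Discrimination is reported here as the area under the receiver-operating characteristic curve on a held-out split; the cited studies used their own cohorts, feature sets, and validation schemes, so the numbers produced by this sketch are not comparable to their reported results.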