Table 1. Selected examples of the use of interpretable machine learning approaches to NICU data
Article | Study population | Data set(s) | Predicted variable | Machine learning algorithm applied | Interpretability technique(s) |
---|---|---|---|---|---|
Overweg et al. [40] | ICU and TBI | CENTER-TBI, MIMIC-III | ICU/NICU mortality | Bayesian neural network (BNN) | HorseshoeBNN, a novel approach proposed by the authors: a horseshoe prior is placed on the first layer of the BNN to induce sparsity, enabling feature selection |
Caicedo-Torres and Gutierrez [41] | ICU | MIMIC-III | ICU mortality | Multiscale deep convolutional neural network (ConvNet) | DeepLIFT, visualizations |
Thorsen-Meyer et al. [42] | ICU | 5 Danish medical and surgical ICUs | All-cause 90-day mortality | Recurrent neural network with LSTM architecture | SHAP |
Wang et al. [43] | ICU patients diagnosed with cardiovascular disease | MIMIC-III | Survival | LSTM network | Counterfactual explanations |
Fong et al. [44] | ICU | eICU Collaborative Research Database and 5 ICUs in Hong Kong | Hospital mortality | XGBoost | SHAP |
Che et al. [45] | Pediatric ICU patients with acute lung injury | Pediatric ICU at Children’s Hospital Los Angeles | Mortality, ventilator-free days | Interpretable mimic learning (using gradient boosting trees) | Partial dependence plots, feature importance, intrinsic interpretability of tree structure |
Shickel et al. [46] | ICU | UFHealth, MIMIC-III | In-hospital mortality | RNN with GRU | Modified GRU-RNN with a final self-attention mechanism (to identify feature importance) |
Farzaneh et al. [47] | TBI | ProTECT III | Functional outcome (GOSE) at 6 months | XGBoost | SHAP |
Gao et al. [48] | TBI | NICU at Cambridge University Hospitals, Cambridge | Mortality at 6 months after brain injury | Decision tree | Intrinsic interpretability of the model |
Thoral et al. [49] | ICU | AmsterdamUMCdb | ICU readmission and/or death within 7 days of ICU discharge | XGBoost | SHAP |
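Several of the studies in Table 1 pair a gradient-boosted tree model with SHAP (e.g. Fong et al. [44], Farzaneh et al. [47], Thoral et al. [49]). The minimal sketch below illustrates that general pattern only; it uses synthetic data in place of an ICU cohort, and the feature names, hyperparameters, and plotting call are illustrative assumptions rather than any of the published pipelines.

```python
# Minimal sketch of the XGBoost + SHAP pattern; not any study's actual pipeline.
# Synthetic data stands in for an ICU feature matrix; all names are assumptions.
import numpy as np
import shap
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic binary outcome (e.g. hospital mortality) with 10 tabular features.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
feature_names = [f"feature_{i}" for i in range(X.shape[1])]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Gradient-boosted tree classifier, as in the XGBoost rows of Table 1.
model = xgb.XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
model.fit(X_train, y_train)

# TreeExplainer computes per-prediction SHAP values for tree ensembles:
# each value is one feature's additive contribution to one patient's prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)

# Global importance: mean absolute SHAP value per feature.
importance = np.abs(shap_values).mean(axis=0)
for name, score in sorted(zip(feature_names, importance), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")

# Beeswarm summary plot of per-sample feature contributions.
shap.summary_plot(shap_values, X_test, feature_names=feature_names)
```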