Table 3. Feature extraction strategies, classifiers/models, and validation schemes of the reviewed studies.
Articles | Feature Extraction Strategy | Classifier/Model | Validation (trn:tst) |
---|---|---|---|
Hu et al. [48] | - | 7 ResNet18 models on different segmentation approaches | 71:29 |
Pereira et al. [49] | Predetermined SWE statistical features and SWE features extracted by circular Hough transform | Logistic regression, naïve Bayes, SVM, decision tree; fully trained CNN (2-layer) model for B-mode and SWE; pretrained CNN18 for B-mode and SWE; classifications combined by averaging class probabilities of the trained B-mode and SWE models | 82:18 |
Qin et al. [50] | Pretrained VGG16 with 3 fusion methods (MT, FEx-reFus, and Fus-reFEx) | 3 classifier layers (FCL, SPP, and GAP) | 82:18 |
Săftoiu et al. [51] | - | MLP (3- and 4-layer) | 10-fold cxv |
Săftoiu et al. [52] | - | MLP (4-layer) | 10-fold cxv |
Sun et al. [53] | Deep feature extractor on SWE US; predetermined statistical and radiomics features on B-mode US | Logistic regression, naïve Bayes, and SVM on both SWE and B-mode features; classifications of the two models combined and hybridized by an uncertainty decision-theory-based voting system (pessimistic, optimistic, and compromise approaches) | 5-fold cxv |
Udriștoiu et al. [54] | CNN on B-mode, contrast harmonic sequential images taken at 0, 10, 20, 30, and 40 s, color Doppler, and elastography; LSTM on contrast harmonic sequential images taken at 0, 10, 20, 30, and 40 s | CNN and LSTM merged by a concatenation layer | 80:20 |
Zhang et al. [55] | 11 predetermined B-mode features; 1 predetermined elastography feature | Logistic regression, linear discriminant analysis, random forest, kernel SVM, adaptive boosting, KNN, neural network, naïve Bayes, CNN | 60:40, 10-fold cxv |
Zhao et al. [56] | 20 predetermined radiomics features | Logistic regression, random forest, XGBoost, SVM, MLP, KNN | - |
Zhao et al. [57] | Machine-learning-assisted approach (6 predetermined B-mode and 5 SWE features); radiomics features | Decision tree, naïve Bayes, KNN, logistic regression, SVM, KNN-based bagging, random forest, XGBoost, MLP, gradient boosting tree | Training: 520; testing: 223; external testing: 106 |
Zhou et al. [58] | Predetermined statistical features; feature extraction by GLCOM-GLRLM and MSCOM | RBM + Bayesian | - |
CNN: convolutional neural network; FCL: fully connected layers; FEx-reFus: feature extraction followed by refusion; Fus-reFEx: fusion followed by feature re-extraction; GAP: global average pooling; GLCOM-GLRLM: gray-level co-occurrence matrix and gray-level run-length matrix; KNN: k-nearest neighbors; LSTM: long short-term memory; MLP: multilayer perceptron; MSCOM: multiple subgraph co-occurrence matrix based on multilevel wavelet; MT: mixed training; RBM: restricted Boltzmann machine; SPP: spatial pyramid pooling; SVM: support vector machine; SWE: shear-wave elastography; US: ultrasound; trn: training; tst: testing; cxv: cross-validation; XGBoost: extreme gradient boosting.
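
Several of the studies in Table 3 fuse modality-specific classifiers at decision level by averaging their predicted class probabilities (e.g., the trained B-mode and SWE models of Pereira et al. [49]). The snippet below is a minimal sketch of that late-fusion step, not the authors' code: the feature matrices, feature counts, and classifier choices are illustrative placeholders only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-ins for per-modality feature matrices; the counts
# (11 B-mode, 5 SWE) echo the feature tallies in Table 3 but are otherwise
# arbitrary placeholders.
rng = np.random.default_rng(0)
n = 200
X_bmode = rng.normal(size=(n, 11))
X_swe = rng.normal(size=(n, 5))
y = rng.integers(0, 2, size=n)  # benign (0) vs. malignant (1)

# One classifier per modality.
clf_bmode = LogisticRegression().fit(X_bmode, y)
clf_swe = GaussianNB().fit(X_swe, y)

# Late fusion: average the class-probability outputs of the two models
# and predict the class with the highest averaged probability.
proba = (clf_bmode.predict_proba(X_bmode) + clf_swe.predict_proba(X_swe)) / 2.0
y_pred = proba.argmax(axis=1)
```

Averaging probabilities (rather than hard votes) lets a confident model outweigh an uncertain one, which is why it is a common default for combining heterogeneous modality models.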
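
Zhou et al. [58] rely on handcrafted texture descriptors such as gray-level co-occurrence matrix (GLCM) features. The sketch below shows generic GLCM feature extraction with scikit-image under assumed settings (synthetic patch, distance, angles, and property set); it is not the study's configuration, and its GLRLM and MSCOM features have no scikit-image equivalent, so they are omitted.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Synthetic 8-bit patch; a real pipeline would use the ultrasound ROI image.
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# Co-occurrence matrices at a 1-pixel offset and four angles.
glcm = graycomatrix(
    patch,
    distances=[1],
    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
    levels=256,
    symmetric=True,
    normed=True,
)

# Scalar texture descriptors, averaged over angles, of the kind typically
# fed to the classical classifiers listed in Table 3.
features = {
    prop: graycoprops(glcm, prop).mean()
    for prop in ("contrast", "homogeneity", "energy", "correlation")
}
print(features)
```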