Table 3.
Model | Algorithm | Category | Strengths | Weaknesses | Example |
---|---|---|---|---|---|
DenseNet | CNN | Radiomics for TMB and survival prediction | Achieves better performance with fewer parameters and lower computational cost through dense connections and feature reuse | Worse performance than other algorithms under the same GPU memory usage | [5] |
SResCNN | CNN | Radiomics for PD-L1 and survival prediction | Alleviates the network degradation problem caused by deeper layers and improves the generalization ability of the network | Network layer redundancy; insufficient effective depth | [14] |
RF | ML | Radiomics for PD-L1 and survival prediction | Less likely to overfit; suitable for uneven data sets with missing variables; easier to interpret; higher accuracy | Memory usage grows with the number of decision trees; not suitable for applications with strict real-time requirements | [22] |
Lunit SCOPE IO | DNN | Pathology images for TIL and prognosis prediction | Extracts richer data features and has larger model capacity | Training is difficult: gradient explosion, vanishing gradients, etc. | [43] |
LCI-RPV | LR | Multi-omics for PD-L1 and pneumonia prediction | Suitable for linear variables; easier to interpret | Difficult to handle nonlinear data or polynomial regression when data features are correlated | [20] |
MLP | ANN | Gut microbiome for survival prediction | Suitable for nonlinear models and real-time learning; stronger self-learning capability | Slower training rate; parameters are difficult to determine | [52] |
SVM | ML | Combined biomarkers for efficacy prediction | Suitable for high-dimensional spaces; high accuracy; does not suffer from multicollinearity; flexible choice of kernels for nonlinear correlations | Inefficient to train; not suitable for large numbers of training examples | [46] |
DenseNet Densely Connected Convolutional Network, SResCNN Small Residual Convolutional Neural Network, CNN Convolutional Neural Network, TMB Tumor Mutation Burden, PD-L1 Programmed Death Ligand 1, RF Random Forest, ML Machine Learning, DNN Deep Neural Network, TIL Tumor-Infiltrating Lymphocyte, LR Logistic Regression, ANN Artificial Neural Network, MLP Multilayer Perceptron, SVM Support Vector Machine
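To make the practical trade-offs summarized in Table 3 more concrete, the sketch below compares four of the classical algorithms listed (RF, SVM, LR, MLP) on a synthetic feature matrix with scikit-learn. This is a minimal illustration only: the dataset, feature dimensions, and hyperparameters are placeholders and do not correspond to the radiomics or multi-omics data used in the cited studies.

```python
# Illustrative sketch only: toy comparison of several classical algorithms
# from Table 3 (RF, SVM, LR, MLP) on synthetic data. The data and
# hyperparameters are placeholders, not those of the cited studies.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for a radiomics/biomarker feature matrix.
X, y = make_classification(n_samples=300, n_features=50, n_informative=10,
                           random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf", probability=True, random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000,
                         random_state=0),
}

# Five-fold cross-validated AUC gives a rough sense of how each model
# behaves on the same feature set (e.g., RF and SVM handling the
# high-dimensional, correlated features better than plain LR).
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: AUC = {scores.mean():.3f} +/- {scores.std():.3f}")
```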