| Method | Advantages | Disadvantages | Example applications |
| --- | --- | --- | --- |
| Logistic regression | Provides a probabilistic interpretation of model parameters; model can be updated quickly to incorporate new data | Limited to predicting discrete outcomes; sensitive to outliers | - |
| K-nearest neighbors | Nonparametric model; used for both classification and regression problems | Time-consuming and computationally expensive; the number of neighbors must be defined in advance; low interpretability | Nerve identification [10] |
| Naïve Bayes | Suitable for relatively small datasets; handles both binary and multi-class classification problems; fast to apply and computationally efficient | Classes must be mutually exclusive; dependency between attributes causes a loss of accuracy; distributional assumptions (e.g., normality) might be invalid | - |
| Support vector machines | Good prediction performance across different tasks; can handle multiple feature spaces | "Black box" characteristics; sensitive to manual parameter tuning and kernel choice | Lumbar spine classification [11]; synovitis grading [12]; nerve identification [10] |
| Decision trees | Perform well on datasets with a large number of features; require little parameter tuning; high representational power and easy to interpret | Splits are limited to axis-aligned rectangles; inadequate for regression and continuous-value prediction problems; an error at a higher-level node propagates into its subtrees | Nerve identification [10] |
| Random forest | Provides estimates of variable (attribute) importance in the classification; ensemble-based classification shows relatively good performance | Complex and computationally expensive; the number of base classifiers needs to be defined; overfitting has been observed on noisy data | Myositis classification [13]; hip 2-D US adequacy classification [14] |
| Neural networks | Process images directly; can map complex nonlinear relationships between dependent and independent variables | "Black box" characteristics; many parameters to fine-tune; require a large, well-annotated dataset to achieve good performance | Nerve identification [10] |
| K-means | Can process large datasets; simple to understand and implement | The number of clusters must be defined in advance | Nerve localization [15] |
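As a rough illustration of how the supervised classifiers listed in the table could be applied to a feature-based task, the sketch below trains one instance of each with scikit-learn on a synthetic dataset. The dataset, hyperparameter values, and train/test split are illustrative assumptions, not settings used in the cited studies.

```python
# Minimal sketch (assumes scikit-learn is available; the synthetic dataset and
# hyperparameters are illustrative, not those of the cited studies).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for a feature-based ultrasound classification task.
X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# One instance per method in the table; comments mark where a key
# hyperparameter must be chosen in advance, as noted in the table.
models = {
    "Logistic regression": LogisticRegression(max_iter=1000),
    "K-nearest neighbors": KNeighborsClassifier(n_neighbors=5),   # k fixed in advance
    "Naive Bayes": GaussianNB(),                                   # assumes Gaussian features
    "SVM (RBF kernel)": SVC(kernel="rbf", C=1.0),                  # kernel choice matters
    "Decision tree": DecisionTreeClassifier(max_depth=5),
    "Random forest": RandomForestClassifier(n_estimators=100),     # number of base classifiers
    "Neural network (MLP)": MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000),
}

for name, model in models.items():
    clf = make_pipeline(StandardScaler(), model)   # scaling helps KNN, SVM, and MLP
    clf.fit(X_train, y_train)
    print(f"{name}: test accuracy = {clf.score(X_test, y_test):.3f}")
```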
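The table also notes that random forests provide estimates of variable or attribute importance. The following sketch shows how such estimates can be read from a fitted model; again, the data and the choice of 200 trees are hypothetical.

```python
# Minimal sketch of the variable-importance estimates mentioned for random
# forests (synthetic data and n_estimators=200 are illustrative assumptions).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=10, n_informative=4,
                           random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Impurity-based importances, one per feature, summing to 1.
for idx in np.argsort(forest.feature_importances_)[::-1]:
    print(f"feature {idx}: importance = {forest.feature_importances_[idx]:.3f}")
```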
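Finally, for K-means, the only unsupervised method in the table, the sketch below illustrates the limitation that the number of clusters must be specified before fitting. The two-dimensional blob data and the choice of k = 3 are purely illustrative.

```python
# Minimal K-means sketch (synthetic 2-D data; k=3 is an assumed, pre-specified
# number of clusters, which the table lists as the method's main limitation).
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

print("cluster centers:\n", kmeans.cluster_centers_)
print("first ten labels:", kmeans.labels_[:10])
```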