Table 5. Per-class and weighted-average (W. Avg) classification performance of the ML models for SCLC.
| ML model | Class | Accuracy | Precision | Recall | F1 score |
|---|---|---|---|---|---|
| Logistic regression | 0 | 0.83 | 0.76 | 0.68 | 0.68 |
| | 1 | 0.70 | 0.75 | 0.68 | 0.68 |
| | W. Avg | 0.76 | 0.76 | 0.68 | 0.68 |
| k-nearest neighbours | 0 | 0.85 | 0.77 | 0.70 | 0.84 |
| | 1 | 0.79 | 0.76 | 0.76 | 0.79 |
| | W. Avg | 0.82 | 0.77 | 0.73 | 0.82 |
| Support vector machine | 0 | 0.88 | 0.86 | 0.74 | 0.88 |
| | 1 | 0.82 | 0.84 | 0.87 | 0.71 |
| | W. Avg | 0.85 | 0.85 | 0.80 | 0.79 |
| Random forest classifier | 0 | 0.88 | 0.87 | 0.79 | 0.87 |
| | 1 | 0.77 | 0.83 | 0.88 | 0.79 |
| | W. Avg | 0.85 | 0.85 | 0.85 | 0.84 |
| XGBoost algorithm | 0 | 0.80 | 0.81 | 0.86 | 0.80 |
| | 1 | 0.79 | 0.80 | 0.79 | 0.85 |
| | W. Avg | 0.81 | 0.81 | 0.83 | 0.82 |
| AdaBoost algorithm | 0 | 0.85 | 0.82 | 0.80 | 0.84 |
| | 1 | 0.81 | 0.83 | 0.83 | 0.79 |
| | W. Avg | 0.83 | 0.83 | 0.81 | 0.81 |
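The per-class precision, recall, and F1 scores together with the weighted averages reported in Table 5 can be obtained with scikit-learn's `classification_report`. The sketch below is illustrative only and is not the authors' pipeline: the synthetic data, the 80/20 split, and the default hyperparameters are assumptions, and XGBoost is omitted so the example depends on scikit-learn alone.

```python
# Minimal sketch (not the authors' pipeline) of how per-class and
# weighted-average metrics such as those in Table 5 can be produced.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.metrics import classification_report

# Synthetic stand-in for the SCLC feature matrix (binary target: class 0 / class 1);
# the real study would use its own feature set and train/test split.
X, y = make_classification(n_samples=500, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

models = {
    "Logistic regression": LogisticRegression(max_iter=1000),
    "k-nearest neighbours": KNeighborsClassifier(),
    "Support vector machine": SVC(),
    "Random forest classifier": RandomForestClassifier(random_state=42),
    "AdaBoost algorithm": AdaBoostClassifier(random_state=42),
    # XGBoost would come from xgboost.XGBClassifier(); left out here to keep
    # the sketch dependent on scikit-learn only.
}

for name, model in models.items():
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    # classification_report prints per-class precision/recall/F1, the
    # "weighted avg" row, and overall accuracy for each classifier.
    print(f"--- {name} ---")
    print(classification_report(y_test, y_pred, digits=2))
```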