Table 3. Performance of six ML classifiers on MSFT stock (Goal I).
| Model | Accuracy (TD1) | Accuracy (TD2) | F1-score (TD1) | F1-score (TD2) | Precision (TD1) | Precision (TD2) | Recall (TD1) | Recall (TD2) |
|---|---|---|---|---|---|---|---|---|
| KNN | 0.520 | 0.451 | 0.515 | 0.443 | 0.515 | 0.441 | 0.520 | 0.451 |
| LR | 0.536 | **0.533** | 0.380 | 0.383 | 0.294 | 0.406 | 0.536 | **0.533** |
| NB | **0.600** | 0.482 | **0.573** | 0.417 | **0.604** | 0.428 | **0.600** | 0.482 |
| RF | 0.464 | 0.463 | 0.459 | 0.459 | 0.457 | 0.458 | 0.464 | 0.463 |
| SVM | 0.488 | 0.502 | 0.417 | 0.433 | 0.546 | **0.570** | 0.488 | 0.502 |
| Tree | 0.472 | 0.494 | 0.467 | **0.484** | 0.465 | **0.484** | 0.472 | 0.494 |
Bold values represent the best performances.
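As a point of reference for how the four reported metrics relate, the sketch below (illustrative only, not the authors' pipeline) computes accuracy, precision, recall, and F1-score with class-support weighting; under weighted averaging, recall equals accuracy, which is consistent with the identical Accuracy and Recall columns in Table 3.

```python
# Hypothetical sketch: support-weighted classification metrics,
# computed from scratch so no assumptions about the authors' toolkit
# are needed. Weighted recall reduces to plain accuracy.
from collections import Counter


def metrics(y_true, y_pred):
    labels = sorted(set(y_true))
    support = Counter(y_true)          # samples per true class
    n = len(y_true)
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / n
    prec = rec = f1 = 0.0
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        pred_c = sum(p == c for p in y_pred)
        p_c = tp / pred_c if pred_c else 0.0   # per-class precision
        r_c = tp / support[c]                  # per-class recall
        f_c = 2 * p_c * r_c / (p_c + r_c) if (p_c + r_c) else 0.0
        w = support[c] / n                     # support weight
        prec += w * p_c
        rec += w * r_c
        f1 += w * f_c
    return acc, prec, rec, f1
```

For example, `metrics([1, 1, 0, 0, 1], [1, 0, 0, 0, 1])` returns an accuracy of 0.8 and a weighted recall of 0.8, illustrating the equality seen in the table.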