Sensors. 2017 Jul 4;17(7):1564. doi: 10.3390/s17071564

Table 4.

The confusion matrix produced by the proposed method.

                      Predicted
Actual label    1      2      3      4      5      6      7      8      Sum      PA (%)
1               2470   1      1      0      6      1      21     6      2506     98.56
2               1      2454   0      1      0      7      0      0      2463     99.63
3               0      0      2501   0      1      0      3      7      2512     99.56
4               0      3      0      2509   0      0      0      0      2512     99.88
5               0      0      0      0      2545   0      0      0      2545     100
6               1      13     0      1      2      2438   6      2      2463     98.99
7               3      1      1      1      1      0      2507   11     2525     99.29
8               9      4      3      0      4      0      28     2426   2474     98.06
Sum             2484   2476   2506   2512   2559   2446   2565   2452   20,000
UA (%)          99.44  99.11  99.80  99.88  99.45  99.67  97.74  98.94

Overall accuracy: 99.25%; Kappa coefficient: 0.9914. Note: PA denotes the Producer’s Accuracy; UA denotes the User’s Accuracy.
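The figures in Table 4 follow the standard definitions: PA is the per-class recall (diagonal count divided by the row sum), UA is the per-class precision (diagonal count divided by the column sum), overall accuracy is the diagonal total divided by the number of test samples, and the Kappa coefficient compares the observed agreement with the agreement expected by chance. The following is a minimal sketch, not from the paper, that reproduces these values from the raw counts in Table 4; it assumes NumPy is available.

```python
# Sketch: recompute PA, UA, overall accuracy and Cohen's kappa from Table 4.
import numpy as np

# Rows = actual labels 1-8, columns = predicted labels 1-8 (counts from Table 4).
cm = np.array([
    [2470,    1,    1,    0,    6,    1,   21,    6],
    [   1, 2454,    0,    1,    0,    7,    0,    0],
    [   0,    0, 2501,    0,    1,    0,    3,    7],
    [   0,    3,    0, 2509,    0,    0,    0,    0],
    [   0,    0,    0,    0, 2545,    0,    0,    0],
    [   1,   13,    0,    1,    2, 2438,    6,    2],
    [   3,    1,    1,    1,    1,    0, 2507,   11],
    [   9,    4,    3,    0,    4,    0,   28, 2426],
], dtype=float)

n = cm.sum()                                # total test samples (20,000)
row_sums = cm.sum(axis=1)                   # samples per actual class
col_sums = cm.sum(axis=0)                   # samples per predicted class
diag = np.diag(cm)                          # correctly classified per class

pa = diag / row_sums                        # Producer's Accuracy (per-class recall)
ua = diag / col_sums                        # User's Accuracy (per-class precision)

oa = diag.sum() / n                         # overall accuracy -> 0.9925
pe = (row_sums * col_sums).sum() / n ** 2   # chance agreement
kappa = (oa - pe) / (1 - pe)                # Cohen's kappa -> ~0.9914

print("PA (%):", np.round(100 * pa, 2))
print("UA (%):", np.round(100 * ua, 2))
print(f"Overall accuracy: {100 * oa:.2f}%, Kappa: {kappa:.4f}")
```

Running this on the tabulated counts returns the reported per-class PA/UA values, an overall accuracy of 99.25%, and a Kappa coefficient of about 0.9914.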