Table 7. Comparison of the proposed work with the literature.
| Reference | # Species | # Images | Features | Classifier | Accuracy | Precision | Recall | F-measure |
|---|---|---|---|---|---|---|---|---|
| Bayer & Du Buf (2002) | 37 | 781 | Geometrical, textural, morphological, and frequency | Bagging Tree | 0.9690 | – | – | – |
| Luo et al. (2010) | 6 | 78 | Texture | BP neural network | 0.9600 | – | – | – |
| Dimitrovski et al. (2012) | 38 | 837 | Morphological, Texture | Random forest | 0.9797 | – | – | – |
|  | 48 | 1,019 |  |  | 0.9715 | – | – | – |
|  | 55 | 1,098 |  |  | 0.9617 | – | – | – |
| Bueno et al. (2017) | 80 | 24,000 | Morphological, statistical, textural, space-frequency | Bagging tree | 0.9810 | – | – | – |
| Pedraza et al. (2017) | 80 | 24,000 | AlexNet | Softmax | 0.9562 | – | – | – |
|  |  | 160,000 |  |  | 0.9951 | – | – | – |
| Sánchez, Cristóbal & Bueno (2019) | 8 | 703 | Elliptical Fourier descriptors, phase congruency descriptors, Gabor filters | Supervised: k-NN, SVM; unsupervised: k-means, hierarchical agglomerative clustering, BIRCH | 0.9900 | – | – | – |
| Libreros et al. (2019) | – | 365 | GoogleNet | Softmax | 0.9200 | 0.8400 | 0.9800 | 0.9000 |
|  |  |  | ResNet |  | 0.8900 | 0.6000 | 0.6700 | 0.6300 |
|  |  |  | AlexNet |  | 0.9900 | 0.8400 | 0.9500 | 0.8900 |
| Chaushevska et al. (2020) | 55 | 1,100 | Inceptionv3 | Bagging | 0.8027 | – | – | – |
|  |  |  |  | Random forest | 0.8636 | – | – | – |
|  |  |  |  | SVM | 0.9109 | – | – | – |
|  |  |  |  | Fine-tuned CNN | 0.9872 | – | – | – |
| Proposed work | 68 | 12,108 | AlexNet | Softmax | 0.9802 | 0.9818 | 0.9802 | 0.9792 |
|  |  |  | DiatomNet |  | 0.9895 | 0.9898 | 0.9895 | 0.9892 |
|  |  |  | GoogleNet |  | 0.9851 | 0.9853 | 0.9851 | 0.9847 |
|  |  |  | Inceptionv3 |  | 0.9774 | 0.9788 | 0.9774 | 0.9771 |
|  |  |  | ResNet18 |  | 0.9835 | 0.9851 | 0.9835 | 0.9828 |
|  |  |  | VGG16 |  | 0.9758 | 0.9769 | 0.9758 | 0.9747 |
|  |  |  | Xception |  | 0.9703 | 0.9711 | 0.9703 | 0.9688 |
|  |  |  | TL with AlexNet |  | 0.9818 | 0.9836 | 0.9818 | 0.9808 |
|  |  |  | TL with GoogleNet |  | 0.9818 | 0.9829 | 0.9818 | 0.9816 |
|  |  |  | TL with Inceptionv3 |  | 0.9901 | 0.9911 | 0.9901 | 0.9902 |
|  |  |  | TL with ResNet18 |  | 0.9807 | 0.9827 | 0.9807 | 0.9808 |
|  |  |  | TL with VGG16 |  | 0.9884 | 0.9887 | 0.9884 | 0.9882 |
|  |  |  | TL with Xception |  | 0.9873 | 0.9878 | 0.9873 | 0.9868 |
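The last four columns report standard multi-class classification metrics. As a minimal sketch of how they relate, the snippet below derives accuracy, precision, recall, and F-measure from a confusion matrix, assuming macro averaging over classes (the averaging convention and the confusion-matrix counts shown are illustrative assumptions, not values from the compared works):

```python
import numpy as np

def classification_metrics(conf):
    """Compute accuracy and macro-averaged precision, recall, and
    F-measure from a multi-class confusion matrix
    (rows = true class, columns = predicted class)."""
    conf = np.asarray(conf, dtype=float)
    tp = np.diag(conf)                      # correct predictions per class
    accuracy = tp.sum() / conf.sum()
    # Per-class precision (column-wise) and recall (row-wise),
    # guarding against classes with no predictions or no samples.
    col_sums = conf.sum(axis=0)
    row_sums = conf.sum(axis=1)
    precision = np.divide(tp, col_sums, out=np.zeros_like(tp), where=col_sums > 0)
    recall = np.divide(tp, row_sums, out=np.zeros_like(tp), where=row_sums > 0)
    denom = precision + recall
    f1 = np.divide(2 * precision * recall, denom, out=np.zeros_like(tp), where=denom > 0)
    return accuracy, precision.mean(), recall.mean(), f1.mean()

# Toy 3-class confusion matrix (hypothetical counts).
conf = [[48, 1, 1],
        [2, 45, 3],
        [0, 2, 48]]
acc, prec, rec, f1 = classification_metrics(conf)
print(f"accuracy={acc:.4f} precision={prec:.4f} recall={rec:.4f} F={f1:.4f}")
```

Per-class F-measure is the harmonic mean of precision and recall, which is why the F column tracks slightly below precision and recall in the proposed-work rows.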