J Imaging. 2021 May 15;7(5):89. doi: 10.3390/jimaging7050089

Table 2. Characteristics of the dimensionality-reduction-based feature extraction methodologies.

| Ref. | Key Features | Advantages | Disadvantages |
|------|--------------|------------|---------------|
| [45] | Application of pattern map images with PCA | Fast, with a high identification rate (100%) | High number of feature vectors (40 features); results depend on parameters |
| [46] | Application of manifold learning | Robust against pose variation; low EER (0.80%) | Low RR (97.80%) |
| [47] | Combination of B2DPCA with eigenvalue normalization | Improves upon the original 2DPCA method and other methods | Low RR (97.73%) |
| [48] | Combination of the Radon transform and PCA | Low FAR (0.008) and FRR (0) | An in-house dataset is used instead of a benchmark one |
| [49] | Application of linear discriminant analysis with PCA | Very fast and retains the main feature vector | Low accuracy (98.00%) |
| [50] | Application of (2D)²PCA | High RR (99.17%) | Samples must be augmented with SMOTE |
| [51] | Comparison of multiple PCA algorithms | Can reach an accuracy of up to 100% | Requires a large training set |
| [52] | Application of KPCA | High accuracy (up to 100%) | Accuracy depends on the kernel, feature output, and training size |
| [53] | Combination of KMMC and 2DPCA | Improves upon the recognition time of KMMC alone | Very slow recognition time |
| [54] | Combination of MFRAT and GridPCA | Fast and robust to variations in vein structure, illumination, and rotation | Low RR (95.67%) |
| [55] | Application of a pseudo-elliptical sampling model with PCA | Retains the spatial distribution of vein patterns; reduces redundant information and differences | High EER (1.59%) and low RR (97.61%) |
| [56] | Application of Discriminative Binary Codes | Fast extraction and matching with a low EER (0.0144%) | Requires the construction of a relation graph |
| [57] | Combination of Gabor filters and LDA | Low EER (0.12%) | Part of a multi-modal system |
| [58] | Application of multi-scale uniform LMP with block (2D)²PCA | Preserves local features with a high RR (99.32%) | Does not retain global features; the EER varies across datasets (from high to low) |
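Most of the methods in Table 2 share the same underlying pipeline: the vein image is flattened or sampled into a vector, projected onto a low-dimensional subspace (PCA, 2DPCA, KPCA, LDA, etc.), and the resulting feature vector is matched against a gallery. The following sketch is not a reproduction of any cited method; it illustrates this pipeline with plain PCA and 1-nearest-neighbour matching on placeholder data, where the 64 × 64 image size, the 40 retained components, and the gallery/probe split are illustrative assumptions.

```python
# Minimal sketch of PCA-based feature extraction for vein identification.
# Placeholder data only; image size, component count, and matching rule
# are assumptions, not taken from any of the cited references.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Placeholder gallery: 100 grayscale "vein images" (64x64), 10 subjects,
# 10 images per subject.
images = rng.random((100, 64, 64))
labels = np.repeat(np.arange(10), 10)

# Flatten each image into a vector and project it onto the leading
# principal components to obtain a compact feature vector per image.
X = images.reshape(len(images), -1)
pca = PCA(n_components=40)          # e.g., a 40-dimensional feature vector
features = pca.fit_transform(X)

# Identification: match each probe feature vector against the gallery
# with a 1-nearest-neighbour classifier (Euclidean distance).
gallery, probe = features[::2], features[1::2]
gallery_y, probe_y = labels[::2], labels[1::2]
clf = KNeighborsClassifier(n_neighbors=1).fit(gallery, gallery_y)
rank1 = (clf.predict(probe) == probe_y).mean()
print(f"Rank-1 recognition rate on placeholder data: {rank1:.2%}")
```

The variants in Table 2 differ mainly in what replaces the projection step (e.g., 2DPCA operating directly on the image matrix, a kernelized projection in KPCA, or a discriminative projection in LDA) and in the distance or classifier used for matching.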