
Table 12. Summary of the reviewed HSI classification approaches using transfer learning.

| Year | Method used | Dataset and classification overall accuracy (COA) | Research remarks and future scope |
|---|---|---|---|
| 2018 | Deep mapping-based heterogeneous transfer learning model (DLTM) [150] | Washington DC Mall: 96.25% | Capable of binary classification; future scope: extension to multiclass classification |
| 2018 | Active learning (AL) with stacked sparse autoencoder (AL-SSAE) [151] | UP: 99.48%; Pavia Center: 99.8%; SV: 99.45% | Both the source and target domains have finely tuned hyperparameters; future scope: further modification of architectural parameters to enhance classification accuracy |
| 2020 | Heterogeneous transfer learning based on CNN with attention mechanism (HT-CNN-attention) [152] | SV: 99%; UP: 97.78%; KSC: 99.56%; IP: 96.99% | Efficient approach regardless of the chosen sample-selection strategy |
| 2020 | ELM-based ensemble transfer learning (TL-ELM) [26] | UP: 98.12%; Pavia Center: 96.25% | High accuracy and transferability with fast training; future scope: inclusion of SuperPCA and knowledge transfer |
| 2020 | Lightweight shuffled group convolutional neural network (SG-CNN) [153] | Botswana: 99.67%; HU: 99.4%; Washington DC: 97.06% | Fine-tuned model with lower training cost than standard CNN architectures; future scope: inclusion of more grouped convolutional architectures |
| 2021 | Super-pixel pooling convolutional neural network with transfer learning (SP-CNN) [154] | SV: 95.99%; UP: 93.18%; IP: 94.45% | Better parameter optimization and higher accuracy with a limited number of samples and short training and testing times; future scope: optimal super-pixel segmentation and merging with different CNN architectures |