Front Artif Intell. 2022 Feb 21;5:780405. doi: 10.3389/frai.2022.780405

Table 7. Strengths and limitations of transfer learning approaches.

| TL method | Strengths | Limitations |
| --- | --- | --- |
| KL | Requires little computation time and few resources. | Cannot be implemented on CNN algorithms. |
| FF | Requires no training or fine-tuning and few computational resources. | Generalizes poorly when the source and target datasets are very different. Fewer applications. |
| FT | Convolutional layers act as a feature extractor and need no retraining. Diverse applications. Faster than all methods except FF. Second-best performance after TT. | Needs more hardware resources than FF because the FC layers are fine-tuned. May fall short of TT when the source and target datasets are very different. |
| FI | Convolutional layers act as a feature extractor and need no retraining. Diverse applications. Faster than most methods other than FF. | Needs more resources because the FC layers are trained from scratch. Slow. |
| TT | Best performance of all methods. Very flexible. | Needs more resources to fine-tune both the convolutional and FC layers. Slow. |
| TI | Strong performance, almost as good as FT. Much more flexible than the other methods, including TT. | Slowest. Needs far more resources than the other methods. |
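To make the trade-offs in Table 7 concrete, the sketch below shows how the trainable portion of a network differs under the FF, FT/FI, and TT regimes. It is a minimal, hypothetical PyTorch example, not code from the reviewed studies: it assumes torchvision's ImageNet-pretrained ResNet-18 as the source model, and the helper name `build_model` and the `num_classes` parameter are illustrative.

```python
# Minimal sketch (assumed setup, not the paper's code): configure which
# layers are trainable under three of the transfer-learning regimes above.
import torch.nn as nn
from torchvision import models


def build_model(method: str, num_classes: int) -> nn.Module:
    # Assumed source model: ImageNet-pretrained ResNet-18 from torchvision.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

    if method == "FF":
        # FF: pure feature extraction. Freeze everything and expose the
        # pooled features; a separate classifier (e.g., an SVM) is trained
        # on them, so the network itself needs no training or fine-tuning.
        for p in model.parameters():
            p.requires_grad = False
        model.fc = nn.Identity()
    elif method in ("FT", "FI"):
        # FT/FI: the frozen convolutional layers act as the feature
        # extractor; only the FC head is trained. FT fine-tunes the head
        # from transferred weights, while FI trains it from random
        # initialization, which this freshly created nn.Linear illustrates.
        for p in model.parameters():
            p.requires_grad = False
        model.fc = nn.Linear(model.fc.in_features, num_classes)
    elif method == "TT":
        # TT: fine-tune the entire network (convolutional + FC layers),
        # the best-performing but slowest and most resource-hungry option.
        model.fc = nn.Linear(model.fc.in_features, num_classes)
    else:
        raise ValueError(f"unknown TL method: {method}")
    return model
```

In such a setup the optimizer would be given only the trainable parameters, e.g. `filter(lambda p: p.requires_grad, model.parameters())`, which is where the resource differences between the frozen and fully fine-tuned regimes show up in practice.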