
Table 2.

Cross-validation techniques used in AI-based models (see the code sketch following the table).

SN Cross-Validation Type Brief Description
1 Leave-one-out cross-validation An extreme form of CV in which a single sample is held out as the validation set and the remaining n − 1 samples are used to train the model; this is repeated once for every sample.
2 Hold-out cross-validation The usual train/test split: a CV technique in which the dataset is arbitrarily partitioned into two parts, one for training and one for testing (validation).
3 k-fold cross-validation The dataset is partitioned into k parts such that, in each iteration, k − 1 parts are used as the training set and the remaining part as the validation set.
4 Stratified k-fold cross-validation A small variation of k-fold CV in which each fold contains approximately the same class proportions (strata) as the whole dataset.
5 Nested cross-validation Also known as double cross-validation; k-fold cross-validation is employed within each fold of an outer cross-validation loop, typically to tune hyperparameters during model evaluation.
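
The following is a minimal sketch, not part of the original article, illustrating how the five cross-validation schemes in Table 2 can be set up with scikit-learn. The dataset, the logistic-regression classifier, and the hyperparameter grid are placeholders chosen only for demonstration.

```python
# Illustrative sketch of the cross-validation schemes in Table 2 (assumes scikit-learn).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (
    LeaveOneOut, train_test_split, KFold, StratifiedKFold,
    GridSearchCV, cross_val_score,
)

# Placeholder data and model; any estimator/dataset could be substituted.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
model = LogisticRegression(max_iter=1000)

# 1. Leave-one-out CV: n splits, each holding out a single sample for validation.
loo_scores = cross_val_score(model, X, y, cv=LeaveOneOut())

# 2. Hold-out CV: a single arbitrary train/test partition of the dataset.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
holdout_score = model.fit(X_tr, y_tr).score(X_te, y_te)

# 3. k-fold CV: k splits; each fold serves once as the validation set.
kfold_scores = cross_val_score(
    model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0)
)

# 4. Stratified k-fold CV: folds preserve the class proportions of y.
strat_scores = cross_val_score(
    model, X, y, cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
)

# 5. Nested CV: an inner k-fold loop tunes a hyperparameter (here C),
#    while an outer k-fold loop estimates generalization performance.
inner = GridSearchCV(model, param_grid={"C": [0.1, 1.0, 10.0]}, cv=3)
nested_scores = cross_val_score(
    inner, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0)
)
```

In this sketch the per-fold scores returned by cross_val_score would typically be averaged to report a single performance estimate for each scheme.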