1 |
Leave one out cross-validation |
An extreme form of CV: from n total samples, one sample is held out as the validation set and the remaining n − 1 samples are used to train the model; this is repeated n times so that every sample serves as the validation set exactly once.
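The index bookkeeping behind leave-one-out can be sketched in pure Python; the helper name below is illustrative, not from the source:

```python
def leave_one_out_splits(n):
    """Yield (train_indices, val_index) pairs for n samples.

    Each sample is the validation set exactly once; the other
    n - 1 samples form the training set for that round.
    """
    for i in range(n):
        train = [j for j in range(n) if j != i]
        yield train, i

# For n samples this produces n splits, each training on n - 1 samples.
splits = list(leave_one_out_splits(4))
```

Because the model is refit n times, leave-one-out is usually reserved for small datasets.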
2 |
Hold-out cross-validation |
This is the usual train/test split: a CV technique in which the dataset is randomly partitioned into two parts, one used for training and the other for testing (validation).
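A minimal sketch of a shuffled hold-out split using only the standard library (the function name and the 80/20 default are illustrative assumptions):

```python
import random

def holdout_split(data, test_fraction=0.2, seed=0):
    """Shuffle indices, then split the data into train and test parts."""
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)  # seeded for reproducibility
    n_test = int(len(data) * test_fraction)
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    return [data[i] for i in train_idx], [data[i] for i in test_idx]
```

The shuffle matters: splitting an ordered dataset without it can put systematically different samples in the two parts.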
3 |
k-fold cross-validation |
In k-fold cross-validation, the dataset is partitioned into k parts such that, in each of k iterations, one of the k parts is used as the validation set and the remaining k − 1 parts are used as the training set.
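The rotation of folds can be sketched as follows; a pure-Python sketch with an illustrative helper name, splitting indices contiguously for clarity:

```python
def k_fold_splits(n, k):
    """Partition indices 0..n-1 into k folds; each fold is the
    validation set once, with the other k - 1 folds as training."""
    # Distribute any remainder so fold sizes differ by at most one.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    for i in range(k):
        val = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, val
```

Every sample is validated exactly once across the k iterations, so the k validation sets together cover the whole dataset.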
4 |
Stratified k-fold cross-validation |
It is a small variation of k-fold CV in which each fold contains approximately the same proportion of samples from each stratum (e.g., each class) as the full dataset.
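One simple way to preserve class proportions is to deal each class's samples round-robin across the folds; a pure-Python sketch under that assumption:

```python
from collections import defaultdict

def stratified_k_fold(labels, k):
    """Assign each class's samples round-robin to k folds, so every
    fold keeps roughly the overall class proportions."""
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    folds = [[] for _ in range(k)]
    for indices in by_class.values():
        for pos, idx in enumerate(indices):
            folds[pos % k].append(idx)
    for i in range(k):
        val = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, val
```

This matters most for imbalanced datasets, where plain k-fold can leave a minority class nearly absent from some folds.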
5 |
Nested cross-validation |
Also known as double cross-validation: a k-fold cross-validation is run within each fold of an outer cross-validation, typically to tune hyperparameters in the inner loop while the outer loop estimates model performance.
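The two-loop structure can be sketched as below. Everything here is illustrative: `fit_score` stands in for training a model with hyperparameter `p` and returning its validation score, and the inner k-fold helper is a simple interleaved split:

```python
def k_fold(n, k):
    """Simple interleaved k-fold split of indices 0..n-1."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i in range(k):
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, folds[i]

def nested_cv(data, labels, params, fit_score, outer_k=3, inner_k=2):
    """Outer loop: estimate generalization performance.
    Inner loop: pick a hyperparameter using only the outer training set."""
    outer_scores = []
    for tr, va in k_fold(len(data), outer_k):
        # Inner CV sees only the outer-fold training indices,
        # so the outer validation data never influences tuning.
        best_p = max(
            params,
            key=lambda p: sum(
                fit_score(p,
                          [tr[i] for i in itr], [tr[i] for i in iva],
                          data, labels)
                for itr, iva in k_fold(len(tr), inner_k)
            ),
        )
        outer_scores.append(fit_score(best_p, tr, va, data, labels))
    return sum(outer_scores) / outer_k
```

Keeping tuning inside the inner loop avoids the optimistic bias that arises when the same folds are used both to choose hyperparameters and to report performance.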