Cross Validation

How it works?

K-fold
1. Create a $k$-fold partition of the dataset.
2. Estimate $k$ hold-out predictors, each using $1$ partition as the validation set and the remaining $k-1$ partitions as the training set (see the sketch below).

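A minimal sketch of the k-fold procedure (not from the notes), assuming generic `fit` and `score` callables as hypothetical stand-ins for any learner and error metric:

```python
import numpy as np

def k_fold_cv(X, y, fit, score, k=5, seed=0):
    """Estimate the validation error by k-fold cross validation.

    fit(X_train, y_train) -> model and score(model, X_val, y_val) -> error
    are hypothetical callables standing in for any learner / metric.
    """
    n = len(X)
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)                 # shuffle before partitioning
    folds = np.array_split(idx, k)           # step 1: k-fold partition
    errors = []
    for i in range(k):                       # step 2: k hold-out predictors
        val = folds[i]                                   # 1 partition for validation
        train = np.concatenate(folds[:i] + folds[i+1:])  # k-1 partitions for training
        model = fit(X[train], y[train])
        errors.append(score(model, X[val], y[val]))
    return float(np.mean(errors))            # average hold-out error
```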
Leave-One-Out (LOO)
Special case of k-fold with $k = n$: estimate $n$ hold-out predictors, each using $1$ point as the validation set and the remaining $n-1$ points as the training set (see the example below).

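LOO made explicit as the $k = n$ case; a self-contained toy example with hypothetical least-squares `fit`/`score` helpers and synthetic data, purely for illustration:

```python
import numpy as np

# Hypothetical helpers (not from the notes): ordinary least squares + MSE.
def fit(Xtr, ytr):
    return np.linalg.lstsq(Xtr, ytr, rcond=None)[0]       # OLS coefficients

def score(w, Xval, yval):
    return float(np.mean((Xval @ w - yval) ** 2))          # validation MSE

# Synthetic data, only to make the example runnable.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=30)

n = len(X)
errors = []
for i in range(n):                      # n hold-out predictors
    val = np.array([i])                 # 1 point for validation
    train = np.delete(np.arange(n), i)  # remaining n - 1 points for training
    w = fit(X[train], y[train])
    errors.append(score(w, X[val], y[val]))

print(np.mean(errors))                  # LOO estimate of the error
```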
Random sub-sampling
1. Randomly sample $\alpha \cdot n$ data points for validation, where $\alpha \in (0,1)$.
2. Train on the remaining points and validate; repeat $K$ times (see the sketch below).
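A corresponding sketch of random sub-sampling (sometimes called Monte Carlo cross validation), again assuming the same hypothetical `fit`/`score` interface:

```python
import numpy as np

def random_subsampling_cv(X, y, fit, score, alpha=0.2, K=10, seed=0):
    """Repeat K random train/validation splits and average the errors.

    alpha is the fraction of points held out for validation on each repeat;
    fit/score follow the same hypothetical interface as in the k-fold sketch.
    """
    n = len(X)
    n_val = int(alpha * n)                    # step 1: alpha * n validation points
    rng = np.random.default_rng(seed)
    errors = []
    for _ in range(K):                        # step 2: repeat K times
        idx = rng.permutation(n)
        val, train = idx[:n_val], idx[n_val:]
        model = fit(X[train], y[train])
        errors.append(score(model, X[val], y[val]))
    return float(np.mean(errors))
```

Unlike k-fold, the validation sets of different repeats may overlap, and a point can be left out of validation entirely.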

🎥 Explanation