Cross Validation
| Method | How it works |
|---|---|
| $k$-fold | 1. Partition the dataset into $k$ folds. 2. Estimate $k$ hold-out predictors, each using one fold as the validation set and the remaining $k-1$ folds as the training set. |
| Leave-One-Out (LOO) | Special case of $k$-fold with $k=n$: estimate $n$ hold-out predictors, each using one data point as the validation set and the other $n-1$ points as the training set. |
| Random sub-sampling | 1. Randomly sample $\alpha n$ data points for validation, with $\alpha \in (0,1)$. 2. Train on the remaining points and validate; repeat $K$ times. |
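The three splitting schemes above can be sketched as plain index generators; this is a minimal illustration (function names are my own, not from a specific library), with $k$-fold shuffling controlled by a seed:

```python
import random

def k_fold_cv(n, k, seed=0):
    """Yield (train, validation) index splits for k-fold CV.

    Indices 0..n-1 are shuffled once, then partitioned into k
    roughly equal folds; each fold serves as validation exactly once.
    """
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i, val in enumerate(folds):
        train = [j for m, fold in enumerate(folds) if m != i for j in fold]
        yield train, val

def loo_cv(n):
    """Leave-one-out: the special case of k-fold with k = n."""
    return k_fold_cv(n, n)

def random_subsampling(n, alpha, K, seed=0):
    """Repeat K times: hold out a random alpha-fraction for validation."""
    rng = random.Random(seed)
    m = int(alpha * n)
    for _ in range(K):
        idx = list(range(n))
        rng.shuffle(idx)
        yield idx[m:], idx[:m]
```

For example, `k_fold_cv(100, 5)` yields 5 splits with 80 training and 20 validation indices each, and every index appears in exactly one validation fold.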