- Let us say you are writing nice, clean machine-learning code (e.g. linear regression).
- As the name suggests, cross-validation is the natural next step after learning linear regression, because it helps you evaluate and improve your predictions using the K-fold strategy.
- We divide the dataset into K equal parts (the K folds, often passed as `cv`).
- Then, for each fold, we train the model on the larger portion and test it on the smaller held-out portion.
- This graph shows K-fold cross-validation for the Boston dataset with a linear regression model.
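The steps above can be sketched with scikit-learn's `cross_val_score`. A minimal sketch follows; note the Boston dataset has been removed from recent scikit-learn releases, so a synthetic regression dataset of the same shape (506 samples, 13 features) stands in for it here:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Stand-in for the Boston housing data: 506 samples, 13 features.
X, y = make_regression(n_samples=506, n_features=13, noise=10.0, random_state=0)

model = LinearRegression()

# cv=5 splits the data into 5 equal folds; each fold is held out once
# for testing while the model trains on the other 4 folds.
scores = cross_val_score(model, X, y, cv=5)  # default scoring is R^2

print(scores)        # one R^2 score per fold
print(scores.mean()) # average score across the 5 folds
```

Each entry in `scores` comes from a different train/test split, so their spread gives a feel for how stable the model's predictions are.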
Cross-validation helps to improve your predictions using the K-fold strategy. What is K-fold, you ask? Check out this post for a visualized explanation.
Continue reading “Visualizing Cross-validation Code”