- Consecutive trees have a similar structure
- After each tree is built, the gradients are recomputed (hover the cursor over any ‘plus’ sign to see them)
- Each time you change a parameter, gradient boosting is retrained from scratch (yes, the algorithm is quite fast)
- Datasets from other playgrounds are too easy for gradient boosting, so some more challenging datasets were added
- If you set the learning rate high (without using the Newton-Raphson update), only the first several trees contribute significantly; the later trees are barely used
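The loop the bullets describe (fit a tree to the current negative gradient, add it with a learning-rate factor, recompute gradients, repeat) can be sketched in a few lines. This is a minimal numpy illustration with squared loss and depth-1 stumps, not the playground's actual implementation; `fit_stump` and `gradient_boost` are hypothetical names introduced here.

```python
import numpy as np

def fit_stump(x, residual):
    """Fit a depth-1 regression tree (stump) to a 1-D feature,
    choosing the threshold that minimizes squared error."""
    order = np.argsort(x)
    xs, rs = x[order], residual[order]
    best = None
    for i in range(1, len(xs)):
        if xs[i] == xs[i - 1]:
            continue
        thr = (xs[i] + xs[i - 1]) / 2
        left, right = rs[:i].mean(), rs[i:].mean()
        err = ((rs[:i] - left) ** 2).sum() + ((rs[i:] - right) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, thr, left, right)
    _, thr, left, right = best
    return lambda z: np.where(z < thr, left, right)

def gradient_boost(x, y, n_trees=50, learning_rate=0.1):
    """Plain gradient boosting for squared loss on a 1-D feature."""
    pred = np.zeros_like(y, dtype=float)
    for _ in range(n_trees):
        # For squared loss the negative gradient is the residual;
        # it is recomputed after every tree, as described above.
        grad = y - pred
        stump = fit_stump(x, grad)
        pred += learning_rate * stump(x)
    return pred
```

With `learning_rate=1.0` the very first stump can fit the residual of this toy step-function target exactly, so every later tree fits a zero residual and contributes nothing, which mirrors the last bullet about high learning rates.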
Read the full article for details.
Gradient Boosting Interactive Playground