MountainAI
classical-ml · supervised · trees · ensembles

Gradient boosting

XGBoost, LightGBM, CatBoost — the workhorses of tabular ML competitions and production pipelines.

Depth levels

L0 · Intro · ~2 h

Reads a boosted-tree prediction; knows "many weak learners → strong model".
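At this level the key idea is that a boosted model's prediction is nothing more than a base score plus the learning-rate-scaled sum of each tree's output. A minimal sketch (the leaf values below are made up, not from a real model):

```python
# A boosted-tree prediction: base score plus the shrunken sum of
# per-round tree outputs. Values are hypothetical leaf outputs.
base_score = 0.5
learning_rate = 0.1
tree_outputs = [2.0, 1.5, -0.5, 0.8]  # one leaf value per boosting round

prediction = base_score + learning_rate * sum(tree_outputs)
print(prediction)  # 0.5 + 0.1 * 3.8 = 0.88
```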

L1 · Basics · ~10 h

Derives AdaBoost; writes gradient-boosting pseudocode; tunes n_estimators, learning rate, and tree depth.

L2 · Working · ~20 h

Ships XGBoost/LightGBM with cross-validation, early stopping, and monotonic constraints; handles categorical features with CatBoost.

L3 · Advanced · ~30 h

Reads histogram-based tree-construction code; implements custom losses and objectives; interprets models with SHAP.
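A custom objective in the XGBoost style is just a function returning per-sample gradient and hessian of the loss with respect to the raw score; the library then sets each leaf to the Newton-step value -G/(H + λ) from its second-order expansion. A sketch with log-cosh loss (names and data are illustrative, not a real library call):

```python
# XGBoost-style custom objective: gradient and hessian of log-cosh
# loss w.r.t. the raw predictions, plus the Newton leaf value
# -G / (H + lambda) from the second-order objective expansion.
import numpy as np

def log_cosh_objective(preds, labels):
    """grad, hess of log(cosh(pred - label)) w.r.t. preds."""
    d = preds - labels
    grad = np.tanh(d)
    hess = 1.0 - np.tanh(d) ** 2  # sech^2(d), always positive
    return grad, hess

def newton_leaf_value(grad, hess, reg_lambda=1.0):
    """Optimal leaf weight for the samples routed to one leaf."""
    return -grad.sum() / (hess.sum() + reg_lambda)

preds = np.array([0.0, 0.0, 0.0])
labels = np.array([1.0, 2.0, 0.5])
g, h = log_cosh_objective(preds, labels)
print(newton_leaf_value(g, h))  # positive: labels sit above the raw scores
```

Plugged into `xgb.train(..., obj=...)`, a function with this (preds, labels) → (grad, hess) shape is all a custom objective needs; a strictly positive hessian keeps the leaf values well defined.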

L4 · Research · ~60 h

Contributes to tree-structure learning or boosting theory (margin bounds, functional gradient descent).

Resources