classical-ml
Decision trees and ensembles
CART, Random Forest, and Gradient Boosting — the dominant classical ML approaches for tabular data.
Depth levels
L0 — Intro (~2h)
Understands a decision tree as a series of if/else splits; knows that boosting improves weak learners.
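The "series of if/else splits" view can be made concrete with a minimal plain-Python sketch (names here are illustrative, not from any library): a single decision stump that scans candidate thresholds and keeps the one minimizing weighted Gini impurity, which is exactly the split criterion CART applies recursively.

```python
def gini(labels):
    # Gini impurity for binary labels: 2 * p * (1 - p)
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n  # fraction of positive labels
    return 2 * p * (1 - p)

def best_split(xs, ys):
    # Try every observed value as an "x <= t" threshold and
    # keep the one with the lowest weighted child impurity.
    best_t, best_score = None, float("inf")
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

xs = [1, 2, 3, 10, 11, 12]
ys = [0, 0, 0, 1, 1, 1]
threshold, impurity = best_split(xs, ys)
print(threshold, impurity)  # → 3 0.0 (a clean split at x <= 3)
```

A full tree simply applies `best_split` recursively to each child until the leaves are pure or a depth limit is reached.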
L1 — Basics (~15h)
Trains Random Forest and XGBoost models; tunes tree depth, number of estimators, and learning rate; understands feature importance.
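The L1 workflow above can be sketched with scikit-learn (assumed installed; the dataset and grid values are illustrative): fit a Random Forest, tune depth and estimator count with a small grid search, and read off impurity-based feature importances.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic tabular data: 8 features, only 3 carry signal.
X, y = make_classification(n_samples=300, n_features=8,
                           n_informative=3, random_state=0)

# Tune two of the usual knobs: tree depth and number of trees.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    {"max_depth": [3, None], "n_estimators": [50, 100]},
    cv=3,
)
grid.fit(X, y)
best = grid.best_estimator_

print(grid.best_params_)
# Impurity-based importances; they sum to 1 across features.
print(best.feature_importances_.round(2))
```

For XGBoost the loop is the same shape, with `learning_rate` added to the grid; impurity-based importances can overstate high-cardinality features, which is one reason SHAP appears at L2.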
L2 — Working (~20h)
Handles class imbalance, missing values; applies early stopping and DART; interprets with SHAP.
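Two of the L2 skills can be shown in one short scikit-learn sketch (library assumed installed; data and parameter values are illustrative): reweighting an imbalanced dataset with balanced sample weights, and early stopping that halts boosting when a held-out validation score stops improving.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.utils.class_weight import compute_sample_weight

# Imbalanced binary problem: ~90% negatives, ~10% positives.
X, y = make_classification(n_samples=500, weights=[0.9, 0.1],
                           random_state=0)
# "balanced" gives the rare class proportionally larger weight.
w = compute_sample_weight("balanced", y)

clf = GradientBoostingClassifier(
    n_estimators=500,          # upper bound on boosting rounds
    validation_fraction=0.2,   # hold out 20% to monitor
    n_iter_no_change=5,        # stop after 5 rounds without improvement
    random_state=0,
)
clf.fit(X, y, sample_weight=w)

# Rounds actually trained; early stopping usually ends well before 500.
print(clf.n_estimators_)
```

XGBoost and LightGBM expose the same ideas as `scale_pos_weight` and `early_stopping_rounds`; DART is enabled there with `booster="dart"`, and SHAP interpretation comes from the separate `shap` package.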
L3 — Advanced (~30h)
Understands the information gain derivation and AdaBoost convergence; implements custom objective functions in XGBoost/LightGBM.
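The information gain definition underlying the L3 derivation is short enough to compute directly in plain Python (function names are illustrative): IG = H(parent) minus the size-weighted entropies of the children, with H the Shannon entropy.

```python
import math

def entropy(labels):
    # Shannon entropy H = -sum_c p_c * log2(p_c)
    n = len(labels)
    counts = {c: labels.count(c) for c in set(labels)}
    return -sum((k / n) * math.log2(k / n) for k in counts.values())

def information_gain(parent, children):
    # IG = H(parent) - sum_k (n_k / n) * H(child_k)
    n = len(parent)
    return entropy(parent) - sum(len(c) / n * entropy(c) for c in children)

parent = [0, 0, 1, 1]
ig = information_gain(parent, [[0, 0], [1, 1]])
print(ig)  # → 1.0: a perfect split recovers the full parent entropy
```

A custom objective in XGBoost/LightGBM works at the next level down: instead of a split criterion, you supply the per-sample gradient and hessian of your loss, and the library derives the splits from those.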
L4 — Research (~60h)
Contributes to new boosting algorithms, differentiable trees, or tabular deep learning comparisons.
Resources
L1 — Basics