MountainAI

Regularization

Dropout, weight decay, early stopping, data augmentation — preventing overfitting in deep networks.

Depth levels

L0 Intro (~1 h)

Knows what overfitting is and that regularization reduces it; has heard of dropout and weight decay.

L1 Basics (~8 h)

Applies L1/L2 weight decay, dropout, early stopping, and basic data augmentation.
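The L1 techniques can be sketched framework-free. Below is a minimal NumPy illustration of inverted dropout, an L2 weight-decay penalty, and a patience-based early-stopping check; the function and class names are illustrative, not from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    """Inverted dropout: zero each unit with prob p, rescale survivors by 1/(1-p)
    so the expected activation is unchanged. At inference, pass through."""
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def l2_penalty(weights, lam=1e-2):
    """L2 weight decay term added to the training loss: (lam/2) * sum ||w||^2."""
    return 0.5 * lam * sum(np.sum(w * w) for w in weights)

class EarlyStopping:
    """Signal a stop when validation loss fails to improve `patience` times in a row."""
    def __init__(self, patience=3):
        self.patience = patience
        self.best = np.inf
        self.bad_checks = 0

    def step(self, val_loss):
        if val_loss < self.best:
            self.best = val_loss
            self.bad_checks = 0
        else:
            self.bad_checks += 1
        return self.bad_checks >= self.patience  # True -> stop training
```

In a training loop, `l2_penalty` would be added to the data loss before backprop, and `EarlyStopping.step` would be called once per validation pass.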

L2 Working (~15 h)

Diagnoses under/overfitting via learning curves; applies BatchNorm, MixUp, label smoothing, and stochastic depth.
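Two of the L2 techniques, MixUp and label smoothing, fit in a few lines of NumPy. The sketch below follows the standard formulations (Beta-distributed mixing weight; uniform smoothing mass eps/K); the names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def label_smoothing(onehot, eps=0.1):
    """Soften one-hot targets: keep 1-eps on the labeled class and
    spread eps uniformly over all K classes."""
    k = onehot.shape[-1]
    return onehot * (1.0 - eps) + eps / k

def mixup(x1, y1, x2, y2, alpha=0.2):
    """MixUp: convexly blend two examples and their targets with a
    weight lam ~ Beta(alpha, alpha)."""
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2
```

Both produce soft targets, so they pair with a cross-entropy loss that accepts probability vectors rather than class indices.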

L3 Advanced (~30 h)

Understands PAC-Bayes bounds; applies sharpness-aware minimization (SAM), R-Drop, and calibration.
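The core of SAM is a two-step update: perturb the weights a distance rho along the normalized gradient (the locally worst-case direction), then descend using the gradient taken at that perturbed point. A minimal NumPy sketch of one such step, assuming a user-supplied `grad_fn`:

```python
import numpy as np

def sam_step(w, grad_fn, lr=0.1, rho=0.05):
    """One sharpness-aware minimization (SAM) update.

    1. g = gradient at w; eps = rho * g / ||g|| probes the sharpest nearby point.
    2. Descend from w using the gradient evaluated at w + eps.
    """
    g = grad_fn(w)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)  # small constant avoids /0
    g_sharp = grad_fn(w + eps)
    return w - lr * g_sharp
```

On a flat minimum, `g_sharp` is close to `g` and SAM behaves like plain gradient descent; on a sharp one, the perturbed gradient pushes the iterate toward flatter regions, which is the mechanism behind SAM's generalization gains.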

L4 Research (~60 h)

Contributes to implicit regularisation theory or generalisation bounds research.

Resources