deep-learning · training
Backpropagation
Automatic differentiation and the chain rule applied to neural networks — how gradients flow.
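At its core this is just the chain rule: for a composition f(g(x)), the derivative is f'(g(x)) · g'(x), and backprop applies this repeatedly through every layer. A minimal sketch (the functions f and g here are my own illustrative choices), checked against a finite-difference estimate:

```python
import math

# Chain rule on f(g(x)) with f(u) = sin(u), g(x) = x**2:
# df/dx = cos(x**2) * 2x. Compare against a finite-difference estimate.
def f(x):
    return math.sin(x ** 2)

def analytic_grad(x):
    return math.cos(x ** 2) * 2 * x

def numeric_grad(x, eps=1e-6):
    return (f(x + eps) - f(x - eps)) / (2 * eps)

x = 0.7
print(abs(analytic_grad(x) - numeric_grad(x)) < 1e-6)
```

The same analytic-vs-numeric comparison scales up to full networks and is the standard way to gradient-check a hand-written backward pass.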
Depth levels
L0: Intro (~1 h)
Knows that backprop computes gradients automatically so we can update weights.
L1: Basics (~8 h)
Traces gradient flow through a 2-layer network by hand; understands forward/backward pass.
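Tracing a 2-layer network by hand looks like this sketch (weights and the tanh/MSE choices are illustrative): a forward pass that caches activations, a backward pass that applies the chain rule layer by layer, and a numeric check on one gradient.

```python
import math

# Tiny 2-layer net: h = tanh(W1 x + b1), y = W2 . h + b2, loss = (y - t)^2.
x, t = 0.5, 1.0
W1, b1 = [0.3, -0.2], [0.1, 0.0]   # two hidden units
W2, b2 = [0.4, 0.6], -0.1

# forward pass (cache z, h, y for the backward pass)
z = [W1[i] * x + b1[i] for i in range(2)]
h = [math.tanh(zi) for zi in z]
y = sum(W2[i] * h[i] for i in range(2)) + b2
loss = (y - t) ** 2

# backward pass: chain rule, layer by layer
dy = 2 * (y - t)                                   # dL/dy
dW2 = [dy * h[i] for i in range(2)]
db2 = dy
dh = [dy * W2[i] for i in range(2)]
dz = [dh[i] * (1 - h[i] ** 2) for i in range(2)]   # tanh'(z) = 1 - tanh(z)^2
dW1 = [dz[i] * x for i in range(2)]
db1 = dz[:]

# numeric sanity check on dW1[0]
eps = 1e-6
W1p = [W1[0] + eps, W1[1]]
zp = [W1p[i] * x + b1[i] for i in range(2)]
hp = [math.tanh(zi) for zi in zp]
yp = sum(W2[i] * hp[i] for i in range(2)) + b2
num = ((yp - t) ** 2 - loss) / eps
print(abs(dW1[0] - num) < 1e-4)
```

Note how the backward pass mirrors the forward pass in reverse order and reuses the cached activations h.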
L2: Working (~20 h)
Implements autograd from scratch; can debug vanishing/exploding gradients; understands gradient checkpointing.
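An "autograd from scratch" at this level usually means a scalar reverse-mode engine in the style of micrograd. A minimal sketch (class and method names are my own): each operation records its parents and a closure that propagates gradients, and `backward()` walks the graph in reverse topological order.

```python
import math

class Value:
    """A scalar that tracks the operations producing it, for reverse-mode AD."""
    def __init__(self, data, parents=()):
        self.data, self.grad = data, 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():          # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():          # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():          # tanh'(z) = 1 - tanh(z)^2
            self.grad += (1 - t ** 2) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological sort, then propagate gradients from the output node
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

a, b = Value(2.0), Value(-3.0)
out = (a * b + a).tanh()
out.backward()
# for y = tanh(a*b + a): dy/da = (1 - tanh(u)^2) * (b + 1) with u = a*b + a
u = 2.0 * -3.0 + 2.0
expected = (1 - math.tanh(u) ** 2) * (-3.0 + 1)
print(abs(a.grad - expected) < 1e-9)
```

The `+=` in each closure matters: it accumulates gradients when a node (like `a` above) feeds into the graph along multiple paths, which is also where vanishing/exploding behavior becomes visible when many small or large factors multiply along a path.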
L3: Advanced (~35 h)
Implements custom backward passes; understands higher-order derivatives; applies backpropagation through time (BPTT) to sequential models.
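BPTT is ordinary backprop applied to the unrolled recurrence. A minimal sketch for a scalar RNN (the weights w, u and the loss here are illustrative assumptions): the forward pass stores every hidden state, and the backward loop walks time steps in reverse, accumulating gradients for the shared weights.

```python
import math

# Scalar RNN: h_t = tanh(w * h_{t-1} + u * x_t), loss = (h_T - target)^2.
w, u = 0.5, 0.9
xs, target = [1.0, -0.5, 0.25], 0.2

# forward: keep every hidden state for the backward pass
hs = [0.0]
for x in xs:
    hs.append(math.tanh(w * hs[-1] + u * x))
loss = (hs[-1] - target) ** 2

# backward through time: t = T .. 1, chain rule at each step
dw = du = 0.0
dh = 2 * (hs[-1] - target)          # dL/dh_T
for t in range(len(xs), 0, -1):
    dz = dh * (1 - hs[t] ** 2)      # through tanh
    dw += dz * hs[t - 1]            # shared weight: gradients accumulate
    du += dz * xs[t - 1]
    dh = dz * w                     # gradient flowing back to h_{t-1}

# numeric check on dw
eps = 1e-6
hp = [0.0]
for x in xs:
    hp.append(math.tanh((w + eps) * hp[-1] + u * x))
num = ((hp[-1] - target) ** 2 - loss) / eps
print(abs(dw - num) < 1e-4)
```

The repeated multiplication by w in `dh = dz * w` is exactly why long sequences suffer vanishing or exploding gradients: the factor compounds once per time step.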
L4: Research (~70 h)
Contributes to automatic differentiation research, implicit differentiation, or meta-learning backprop variants.
Resources