
Information theory

Entropy, cross-entropy, KL divergence and mutual information — the language behind most ML loss functions.

Depth levels

L0 · Intro · ~3 h

Understands Shannon entropy as "average surprise"; knows bits vs nats.
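
A minimal Python sketch of this idea; the four-outcome distribution is an illustrative assumption:

```python
import math

# Hypothetical distribution over four outcomes (assumed for illustration).
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy H(X) = -sum p(x) log p(x): the average "surprise" -log p(x).
# Log base 2 gives bits; the natural log gives nats (1 nat = log2(e) bits).
h_bits = -sum(px * math.log2(px) for px in p)
h_nats = -sum(px * math.log(px) for px in p)

print(f"H(X) = {h_bits:.3f} bits = {h_nats:.3f} nats")  # nats = bits * ln(2)
```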

L1 · Basics · ~12 h

Computes H(X), H(X,Y), mutual information; derives cross-entropy as a loss.
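
A sketch of these quantities on a made-up 2x2 joint table; the joint distribution and the target/model pair for the cross-entropy are assumptions chosen for illustration:

```python
import math

def entropy(dist):
    """H = -sum p log2(p) over non-zero probabilities, in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical joint distribution p(x, y).
joint = [[0.25, 0.25],
         [0.40, 0.10]]

p_x = [sum(row) for row in joint]        # marginal p(x)
p_y = [sum(col) for col in zip(*joint)]  # marginal p(y)

h_x = entropy(p_x)
h_xy = entropy([p for row in joint for p in row])
mi = h_x + entropy(p_y) - h_xy           # I(X;Y) = H(X) + H(Y) - H(X,Y)

# Cross-entropy H(p, q) = -sum p log2(q): the expected code length when
# data from p is encoded with model q; minimised over q at q = p.
p, q = [0.7, 0.3], [0.6, 0.4]
cross_ent = -sum(pi * math.log2(qi) for pi, qi in zip(p, q))

print(f"H(X)={h_x:.3f}  H(X,Y)={h_xy:.3f}  I(X;Y)={mi:.3f}  H(p,q)={cross_ent:.3f}")
```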

L2 · Working · ~20 h

Uses KL/JS divergences in model regularisation; implements variational bounds (ELBO).
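
A sketch of the ELBO with a single binary latent variable, small enough to check against the exact evidence; the prior, likelihood, and variational posterior values are illustrative assumptions:

```python
import math

p_z = [0.5, 0.5]          # prior p(z) over a binary latent (assumed)
p_x_given_z = [0.8, 0.3]  # likelihood p(x|z) for one observation x (assumed)
q_z = [0.7, 0.3]          # variational posterior q(z|x) (assumed)

# KL(q || p) = sum_z q(z) log(q(z) / p(z))  >=  0
kl = sum(q * math.log(q / p) for q, p in zip(q_z, p_z))

# ELBO = E_q[log p(x|z)] - KL(q(z|x) || p(z))  <=  log p(x)
elbo = sum(q * math.log(lik) for q, lik in zip(q_z, p_x_given_z)) - kl

# Exact evidence, tractable here because z takes only two values.
log_px = math.log(sum(p * lik for p, lik in zip(p_z, p_x_given_z)))

print(f"ELBO = {elbo:.4f} <= log p(x) = {log_px:.4f}")
```

The gap between the two printed numbers is exactly KL(q(z|x) || p(z|x)), which is why maximising the ELBO over q tightens the bound.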

L3 · Advanced · ~30 h

Applies rate-distortion theory, channel capacity; analyses information bottlenecks in networks.
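
As a concrete instance of channel capacity, a sketch of the binary symmetric channel, whose capacity has the closed form C = 1 - H2(p) for crossover probability p:

```python
import math

def h2(p):
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Capacity of the binary symmetric channel: C = 1 - H2(p) bits per use.
# It is 1 for a noiseless channel and drops to 0 at p = 0.5 (pure noise).
for p in (0.0, 0.05, 0.11, 0.5):
    print(f"crossover p = {p:.2f}: C = {1 - h2(p):.3f} bits/use")
```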

L4 · Research · ~80 h

Contributes to information-theoretic analysis of deep learning (generalisation bounds, MDL).
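
A toy sketch of the MDL idea via two-part coding: choose the model that minimises L(model) + L(data | model). The bit costs here (32 bits per coefficient, a fixed quantisation precision) are illustrative assumptions, not a calibrated coding scheme:

```python
import math

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)  # data with a truly linear trend

BITS_PER_PARAM = 32  # assumed cost of encoding one polynomial coefficient
DELTA = 0.01         # assumed quantisation precision for the residuals

for degree in (1, 3, 6):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    sigma2 = max(resid.var(), 1e-12)
    # Gaussian code length for residuals quantised to DELTA, in bits.
    data_bits = (0.5 * x.size * math.log2(2 * math.pi * math.e * sigma2)
                 - x.size * math.log2(DELTA))
    model_bits = BITS_PER_PARAM * (degree + 1)
    print(f"degree {degree}: L = {model_bits + data_bits:7.1f} bits")
```

Under these assumed costs the linear model minimises the total description length, matching how the data was generated; richer models shrink the residuals slightly but pay more for their own description.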

Resources