MountainAI
deep-learning · sequences · rnn

LSTM and GRU

Gated recurrent architectures that mitigated the vanishing-gradient problem — the workhorses of sequence modelling before transformers.

Depth levels

L0 Intro (~2 h)

Knows that an LSTM has "memory cells" whose state persists across time steps.

L1 Basics (~10 h)

Draws LSTM gates (input, forget, output); implements a language model.
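The gates at this level can be sketched as a single numpy forward step. This is a minimal illustration, not any particular framework's API: `W` stacks the four gate weight matrices row-wise in the (assumed) order input, forget, cell-candidate, output.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    x: input (D,), h_prev/c_prev: previous hidden and cell state (H,).
    W: stacked gate weights (4*H, D+H), b: stacked biases (4*H,).
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0:H])        # input gate: how much new content to write
    f = sigmoid(z[H:2*H])      # forget gate: how much old memory to keep
    g = np.tanh(z[2*H:3*H])    # candidate cell content
    o = sigmoid(z[3*H:4*H])    # output gate: how much memory to expose
    c = f * c_prev + i * g     # memory cell persists across time
    h = o * np.tanh(c)         # hidden state passed to the next step/layer
    return h, c
```

With all weights at zero, every gate sits at 0.5 and the candidate at 0, so the cell state simply halves each step — a quick sanity check that the gating arithmetic is wired correctly.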

L2 Working (~15 h)

Compares LSTM vs GRU vs BiLSTM; uses teacher forcing; applies gradient clipping for stability.
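Gradient clipping at this level is usually clipping by global norm: rescale all gradients jointly so their combined L2 norm never exceeds a threshold. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so that their joint L2 norm
    is at most max_norm; returns the scaled grads and the pre-clip norm.
    This is the standard stabilisation trick for exploding RNN gradients."""
    total_norm = np.sqrt(sum(np.sum(g * g) for g in grads))
    scale = min(1.0, max_norm / (total_norm + 1e-12))
    return [g * scale for g in grads], total_norm
```

Clipping by global norm (rather than element-wise) preserves the direction of the update, only shrinking its magnitude.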

L3 Advanced (~20 h)

Analyses peephole connections, layered/residual RNNs; attention on top of LSTM.
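"Attention on top of LSTM" typically means scoring each hidden state against a query and pooling them into a context vector. A minimal dot-product version, assuming a `(T, H)` stack of hidden states:

```python
import numpy as np

def attention_pool(states, query):
    """Dot-product attention over a (T, H) stack of LSTM hidden states.

    Returns the (H,) context vector and the (T,) attention weights.
    """
    scores = states @ query                      # alignment score per step
    scores = scores - scores.max()               # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()
    context = weights @ states                   # weighted sum of states
    return context, weights
```

The context vector replaces (or augments) the final hidden state, letting the model attend to any time step rather than relying on the cell to carry everything to the end.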

L4 Research (~40 h)

Linear-time alternatives (Mamba, SSMs) and their connection to gated RNNs.
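The connection to gated RNNs is easiest to see in a diagonal linear state-space recurrence: the update looks like an LSTM cell update with the nonlinearity removed. A toy sketch (sequential here for clarity; because the recurrence is linear in the state, it also admits a parallel associative scan, which Mamba-style SSMs exploit even with input-dependent coefficients):

```python
import numpy as np

def linear_ssm_scan(a, b, u):
    """Diagonal linear SSM: x_t = a * x_{t-1} + b * u_t, per channel.

    a, b: decay and input coefficients of shape (H,).
    u: input sequence of shape (T, H). Returns all states, shape (T, H).
    Compare with an LSTM cell update c_t = f*c_{t-1} + i*g: here the
    "gates" a and b are fixed instead of computed from the input.
    """
    x = np.zeros_like(a)
    states = []
    for u_t in u:
        x = a * x + b * u_t   # no tanh/sigmoid: the map is linear in x
        states.append(x.copy())
    return np.stack(states)
```

With `a = 0.5` and a constant input, the state geometrically approaches a fixed point, mirroring how a forget gate below 1 bounds how long the cell retains information.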

Resources