MountainAI
deep-learning · nlp · sequence

Recurrent neural networks

LSTM, GRU, and sequence modelling — the precursor to transformers for sequential data.

Depth levels

L0 · Intro · ~1h

Understands that RNNs maintain a hidden state to process sequences; knows that the LSTM was introduced to mitigate vanishing gradients.
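The hidden-state idea can be sketched in a few lines of NumPy (sizes and weights here are illustrative, not from any particular model): one weight set is reused at every timestep, and the hidden state is the network's running summary of everything seen so far.

```python
import numpy as np

# Minimal vanilla-RNN sketch: h_t = tanh(W_h h_{t-1} + W_x x_t).
# All sizes and the random weights are hypothetical.
rng = np.random.default_rng(0)
H, X = 4, 3                              # hidden size, input size
W_h = rng.normal(0, 0.5, (H, H))         # recurrent weights (shared across steps)
W_x = rng.normal(0, 0.5, (H, X))         # input weights (shared across steps)

def rnn_forward(xs):
    h = np.zeros(H)                      # hidden state starts empty
    for x in xs:                         # same weights reused at every step
        h = np.tanh(W_h @ h + W_x @ x)   # state update mixes memory and input
    return h                             # final state summarises the sequence

seq = rng.normal(size=(10, X))           # a toy 10-step sequence
h_final = rnn_forward(seq)
print(h_final.shape)  # (4,)
```

Backpropagating through this loop multiplies Jacobians of the same recurrence over and over, which is exactly where the vanishing gradients that motivated the LSTM come from.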

L1 · Basics · ~12h

Implements a simple RNN and LSTM in PyTorch for sequence classification or language modelling.
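An L1-level implementation might look like the following PyTorch sketch of an LSTM sequence classifier; the class name, vocabulary size, and dimensions are all assumptions for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical LSTM-based sequence classifier (all names/sizes illustrative).
class LSTMClassifier(nn.Module):
    def __init__(self, vocab_size=100, embed_dim=16, hidden_dim=32, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):            # tokens: (batch, seq_len) int64
        emb = self.embed(tokens)          # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(emb)      # h_n: (num_layers, batch, hidden_dim)
        return self.head(h_n[-1])         # logits from the last layer's state

model = LSTMClassifier()
batch = torch.randint(0, 100, (8, 20))    # toy batch of token ids
logits = model(batch)
print(logits.shape)  # torch.Size([8, 2])
```

Using the final hidden state `h_n[-1]` as the sequence summary is the standard choice for classification; for language modelling one would instead project every timestep's output to vocabulary logits.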

L2 · Working · ~20h

Applies bidirectional LSTMs, stacked RNNs, and encoder-decoder models with attention to seq2seq tasks.
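The core pieces combine naturally: a stacked bidirectional LSTM encoder plus dot-product attention from a decoder state. This is a minimal sketch under assumed dimensions, not a full seq2seq model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# 2-layer bidirectional encoder: outputs concatenate both directions.
enc = nn.LSTM(input_size=8, hidden_size=16, num_layers=2,
              bidirectional=True, batch_first=True)

src = torch.randn(4, 12, 8)               # (batch, src_len, features), toy data
enc_out, _ = enc(src)                     # (4, 12, 32): 16 per direction

dec_state = torch.randn(4, 32)            # stand-in for one decoder hidden state
scores = torch.bmm(enc_out, dec_state.unsqueeze(2)).squeeze(2)   # (4, 12)
attn = F.softmax(scores, dim=1)           # weights over source positions
context = torch.bmm(attn.unsqueeze(1), enc_out).squeeze(1)       # (4, 32)
print(context.shape)
```

The context vector is a per-example weighted mix of encoder states; feeding it into the decoder at each step is what lets seq2seq models handle long inputs without squeezing everything into one fixed vector.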

L3 · Advanced · ~30h

Analyses gradient flow through BPTT; applies advanced regularisation tricks (zoneout, variational dropout); understands when to prefer an RNN over a Transformer.
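One way to see BPTT gradient flow directly is to compare the loss gradient with respect to early versus late inputs of a vanilla RNN. This probe (the scale 0.1 and length 50 are arbitrary choices that make the recurrence contractive) exposes the vanishing pattern:

```python
import torch

# Probe: unroll a vanilla RNN and inspect dL/dx_t at both ends of the
# sequence. With a contractive recurrence the early-step gradient is
# exponentially smaller -- the classic vanishing-gradient signature.
torch.manual_seed(0)
T, H = 50, 16
W = torch.randn(H, H) * 0.1               # small weights -> contractive map
xs = [torch.randn(H, requires_grad=True) for _ in range(T)]

h = torch.zeros(H)
for x in xs:
    h = torch.tanh(W @ h + x)             # BPTT will chain T Jacobians
loss = h.sum()
loss.backward()

g_first = xs[0].grad.norm().item()        # gradient reaching step 0
g_last = xs[-1].grad.norm().item()        # gradient at the final step
print(g_first, g_last)
```

Tricks like zoneout (randomly keeping hidden units unchanged) and variational dropout (one dropout mask shared across timesteps) are aimed precisely at regularising this recurrence without destroying the gradient path.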

L4 · Research · ~60h

Contributes to state-space models (SSMs such as Mamba), linear-complexity sequence models, or time-series deep learning.
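The idea underlying SSM-style linear-complexity models can be sketched as a linear recurrence processed in a single O(T) pass. The matrices below are random placeholders, not a trained or parameterised model like Mamba's:

```python
import numpy as np

# Toy linear state-space recurrence: h_t = A h_{t-1} + B x_t, y_t = C h_t.
# A, B, C are illustrative placeholders; real SSMs learn/structure them.
rng = np.random.default_rng(0)
N, D, T = 8, 4, 32                        # state size, feature size, length
A = np.diag(rng.uniform(0.5, 0.95, N))    # stable diagonal state matrix
B = rng.normal(0, 0.3, (N, D))
C = rng.normal(0, 0.3, (D, N))

def ssm_scan(xs):
    h = np.zeros(N)
    ys = []
    for x in xs:                          # one linear pass over the sequence
        h = A @ h + B @ x
        ys.append(C @ h)
    return np.stack(ys)

ys = ssm_scan(rng.normal(size=(T, D)))
print(ys.shape)  # (32, 4)
```

Because the recurrence is linear (no per-step nonlinearity on the state), it can also be evaluated as a convolution or a parallel scan, which is what gives these models their efficiency edge over both classic RNNs and quadratic attention.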

Resources