MountainAI

Embeddings and representation learning

word2vec, GloVe, fastText, sentence-transformers — dense vector representations of discrete items.

Depth levels

L0 · Intro · ~2 h

Reads "king − man + woman ≈ queen"; understands dense vs sparse features.
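The analogy above is plain vector arithmetic: subtract, add, then find the nearest word by cosine similarity. A minimal sketch with toy hand-picked 3-d vectors (invented for illustration; real word2vec vectors are 100–300-dimensional and trained, not hand-set):

```python
import numpy as np

# Toy vectors invented for illustration only.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.5, 0.9, 0.0]),
    "woman": np.array([0.5, 0.1, 0.0]),
    "queen": np.array([0.9, 0.0, 0.1]),
    "apple": np.array([0.0, 0.2, 0.9]),
}

def analogy(a, b, c):
    """Return the word whose vector is closest (by cosine) to a - b + c."""
    target = emb[a] - emb[b] + emb[c]

    def cos(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Exclude the query words themselves, as analogy evaluations usually do.
    candidates = {w: v for w, v in emb.items() if w not in {a, b, c}}
    return max(candidates, key=lambda w: cos(candidates[w], target))

print(analogy("king", "man", "woman"))  # → queen
```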

L1 · Basics · ~10 h

Trains word2vec / GloVe; uses cosine similarity for lookups.
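Cosine lookups boil down to one trick: L2-normalise the embedding matrix once, and cosine similarity becomes a dot product, so a whole-vocabulary search is a single matrix-vector multiply. A sketch with random stand-in embeddings (the vocabulary and vectors are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["cat", "dog", "car", "truck", "apple"]
E = rng.normal(size=(len(vocab), 50))          # stand-ins for trained embeddings
E /= np.linalg.norm(E, axis=1, keepdims=True)  # L2-normalise once up front

def nearest(query_vec, k=3):
    """Top-k cosine neighbours: with unit-norm rows, cosine = dot product."""
    q = query_vec / np.linalg.norm(query_vec)
    scores = E @ q
    idx = np.argsort(-scores)[:k]
    return [(vocab[i], float(scores[i])) for i in idx]

# Querying with the "cat" vector itself returns "cat" first with score ~1.0.
hits = nearest(E[0])
print(hits)
```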

L2 · Working · ~15 h

Uses sentence-transformers / BGE for semantic search; fine-tunes embeddings with triplet loss.
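Triplet loss pulls an anchor towards a positive example and pushes it away from a negative one until they are separated by a margin. A minimal NumPy sketch of the loss itself (the 2-d sentence vectors are invented for illustration; in practice this would be `losses.TripletLoss` in sentence-transformers or `nn.TripletMarginLoss` in PyTorch, applied to encoder outputs):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """max(0, d(a, p) - d(a, n) + margin) with Euclidean distances."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

a = np.array([1.0, 0.0])
p = np.array([0.9, 0.1])   # stand-in for a semantically similar sentence
n = np.array([-1.0, 0.5])  # stand-in for an unrelated sentence

print(triplet_loss(a, p, n))  # 0.0: this triplet already satisfies the margin
print(triplet_loss(a, n, p))  # > 0: swapped roles violate the margin
```

During fine-tuning, only triplets with non-zero loss produce gradients, which is why hard-negative mining matters so much in practice.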

L3 · Advanced · ~25 h

Analyses isotropy and anisotropy of embedding spaces; understands matrix-factorisation and contrastive framings of embedding training.
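One common isotropy diagnostic is the mean pairwise cosine similarity of the embeddings: near 0 means directions are spread evenly (isotropic), near 1 means a dominant shared direction (anisotropic). A sketch on synthetic data (the Gaussian cloud and shared-offset construction are illustrative, not drawn from any real model):

```python
import numpy as np

def mean_pairwise_cosine(X):
    """Average cosine similarity between distinct rows of X."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    S = Xn @ Xn.T
    n = len(X)
    return float((S.sum() - n) / (n * (n - 1)))  # drop the diagonal of ones

rng = np.random.default_rng(0)
iso = rng.normal(size=(500, 64))        # isotropic Gaussian cloud
aniso = iso + 5.0 * np.ones(64)         # shared offset -> anisotropic

print(mean_pairwise_cosine(iso))    # near 0
print(mean_pairwise_cosine(aniso))  # near 1
```

Post-processing fixes such as mean-centering or removing the top principal components aim to push this statistic back towards zero.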

L4 · Research · ~50 h

Multi-modal embeddings, retrieval-aware pretraining, memory-efficient embedding tables.

Resources