====== Word Embeddings ======

===== Papers =====

  * [[https://arxiv.org/pdf/1402.3722.pdf|Goldberg & Levy 2014 - word2vec Explained: Deriving Mikolov et al.’s Negative-Sampling Word-Embedding Method]]
  * [[https://nlp.stanford.edu/pubs/glove.pdf|Pennington et al. 2014 - GloVe: Global Vectors for Word Representation]]

===== Resources =====

  * [[https://nlp.stanford.edu/projects/glove/|GloVe Embeddings]] ([[https://github.com/stanfordnlp/GloVe|github]])

For non-contextualized (fixed) word embeddings, GloVe is usually the best choice (see the loading sketch at the end of this page).

===== Related Pages =====

  * [[Pretraining]]
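
===== Example: Loading GloVe =====

A minimal sketch of using the pre-trained vectors: load a GloVe text file into a dict and compare two words by cosine similarity. It assumes the ''glove.6B.100d.txt'' file from the GloVe download linked above has been unzipped into the working directory.

<code python>
# Minimal sketch: load pre-trained GloVe vectors from their plain-text
# format (one word per line: token followed by its vector components)
# and compare two words by cosine similarity.
# Assumes glove.6B.100d.txt has been downloaded and unzipped from the
# GloVe project page linked above.
import numpy as np


def load_glove(path):
    """Read a GloVe text file into a dict mapping word -> numpy vector."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors


def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))


embeddings = load_glove("glove.6B.100d.txt")
print(cosine(embeddings["king"], embeddings["queen"]))
</code>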