Embeddings
Embeddings are mappings from discrete items to dense, real-valued vectors in a continuous space. They are designed so that geometric relationships among vectors reflect semantic or syntactic relationships among the items. This dense representation contrasts with one-hot encodings, which are high-dimensional and sparse. The arrangement of vectors often enables simple operations, such as measuring similarity with cosine similarity or performing vector arithmetic that mirrors linguistic relations.
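The similarity measurement mentioned above can be made concrete with a minimal sketch. The vectors below are hand-made placeholders, not output from a trained model; in practice they would come from a learned embedding table.

```python
import math

def cosine_similarity(u, v):
    # Dot product divided by the product of magnitudes; 1.0 means
    # the vectors point in exactly the same direction.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional embeddings (illustrative values only).
cat = [0.9, 0.8, 0.1]
dog = [0.8, 0.9, 0.2]
car = [0.1, 0.2, 0.9]

print(cosine_similarity(cat, dog))  # high: semantically similar items
print(cosine_similarity(cat, car))  # low: dissimilar items
```

Related items end up with high cosine similarity because training pushes their vectors toward similar directions in the space.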
Word embeddings are a common focus. Static word embeddings assign a single vector to each token, learned from co-occurrence statistics in large text corpora; word2vec and GloVe are well-known examples. Contextual embeddings, produced by models such as BERT, instead compute a vector that depends on the surrounding text, so the same word receives different representations in different contexts.
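The vector arithmetic that static embeddings support is often illustrated with analogies such as king − man + woman ≈ queen. The sketch below uses hand-crafted 2-dimensional vectors chosen so the analogy works; real embeddings acquire such structure only through training.

```python
# Toy static embeddings with hand-chosen dimensions (royalty, gender).
# Purely illustrative -- real vectors come from training, e.g. word2vec.
emb = {
    "king":  [1.0, 1.0],
    "queen": [1.0, -1.0],
    "man":   [0.0, 1.0],
    "woman": [0.0, -1.0],
}

def analogy(a, b, c):
    # Compute a - b + c, then return the nearest word in the vocabulary
    # by squared Euclidean distance.
    target = [x - y + z for x, y, z in zip(emb[a], emb[b], emb[c])]
    def dist(w):
        return sum((t - e) ** 2 for t, e in zip(target, emb[w]))
    return min(emb, key=dist)

print(analogy("king", "man", "woman"))  # -> queen
```

With trained embeddings the result is approximate rather than exact, and the query words themselves are usually excluded from the candidate set.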
Graph embeddings map nodes in a network to vectors that preserve neighborhood structure, enabling tasks such as node classification, link prediction, and community detection.
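A minimal sketch of the neighborhood-preserving idea, loosely in the spirit of DeepWalk: represent each node by its co-occurrence counts with other nodes over short random walks. The toy graph below (two triangles joined by a bridge edge) is an assumption for illustration; real methods such as DeepWalk and node2vec feed the walks to a skip-gram model rather than counting directly.

```python
import math
import random

random.seed(0)
graph = {  # two triangles joined by the edge c-d (toy example)
    "a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"],
    "d": ["c", "e", "f"], "e": ["d", "f"], "f": ["d", "e"],
}
nodes = sorted(graph)
index = {n: i for i, n in enumerate(nodes)}

def embed(num_walks=200, length=5):
    # One co-occurrence vector per node, indexed by the other nodes.
    vecs = {n: [0.0] * len(nodes) for n in nodes}
    for _ in range(num_walks):
        cur = random.choice(nodes)
        walk = [cur]
        for _ in range(length - 1):
            cur = random.choice(graph[cur])
            walk.append(cur)
        for i, n in enumerate(walk):  # count co-occurrences inside the walk
            for m in walk[:i] + walk[i + 1:]:
                vecs[n][index[m]] += 1.0
    return vecs

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

vecs = embed()
print(cosine(vecs["a"], vecs["b"]))  # same triangle: high similarity
print(cosine(vecs["a"], vecs["f"]))  # opposite triangle: lower similarity
```

Nodes in the same community co-occur on walks far more often, so their vectors end up close, which is the structure downstream classifiers and link predictors exploit.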
Image embeddings are feature vectors extracted from neural networks trained on large image datasets, used for similarity search, clustering, and transfer learning to downstream vision tasks.
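Similarity search over image embeddings can be sketched as nearest-neighbor retrieval by cosine similarity. The filenames and vectors below are hypothetical placeholders; in practice each vector would be extracted from a pretrained network, for example the penultimate layer of a ResNet.

```python
import math

# Hypothetical precomputed embeddings (in reality: network activations).
embeddings = {
    "sunset_1.jpg": [0.9, 0.1, 0.2],
    "sunset_2.jpg": [0.8, 0.2, 0.1],
    "forest_1.jpg": [0.1, 0.9, 0.3],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

def nearest(query_id, k=1):
    # Rank every other image by cosine similarity to the query embedding.
    q = embeddings[query_id]
    others = [i for i in embeddings if i != query_id]
    return sorted(others, key=lambda i: cosine(q, embeddings[i]),
                  reverse=True)[:k]

print(nearest("sunset_1.jpg"))  # -> ['sunset_2.jpg']
```

At scale, the same ranking is done with approximate nearest-neighbor indexes rather than an exhaustive scan, but the geometric principle is identical.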