Word Embeddings
Word embeddings are a type of word representation in which words with similar meanings have similar vector representations. They are a distributed representation for text, and are arguably one of the key breakthroughs behind the impressive performance of deep learning methods on challenging natural language processing problems. In this approach, words are represented by real-valued vectors in a predefined vector space: each word is mapped to one vector, and the vector values are learned in a way that resembles training a neural network, which is why the technique is often grouped with deep learning.
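To make this concrete, here is a minimal sketch of learning such vectors with gensim's Word2Vec. The library, toy corpus, and hyperparameters are illustrative assumptions, not something the text above prescribes.

```python
# Minimal sketch: learning word vectors with gensim's Word2Vec.
# (Library and hyperparameter choices are illustrative assumptions.)
from gensim.models import Word2Vec

# A toy corpus: each document is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "dog", "chased", "the", "cat"],
]

# Learn 50-dimensional real-valued vectors; each word in the
# vocabulary is mapped to exactly one vector.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=100)

print(model.wv["king"].shape)  # (50,) -- one real-valued vector per word
```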
Word embeddings are useful because they capture the context of a word in a document, semantic and syntactic similarity, and relations with other words.
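Continuing the sketch above (still an illustrative setup), similarity in meaning shows up as similarity between vectors, which can be queried directly:

```python
# Cosine similarity between two word vectors; on a realistic corpus,
# related words such as "king" and "queen" score higher than
# unrelated ones. (On the toy corpus above the numbers are noisy.)
print(model.wv.similarity("king", "queen"))
print(model.wv.similarity("king", "cat"))

# Nearest neighbours of a word in the embedding space.
print(model.wv.most_similar("king", topn=3))
```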
Word embeddings can be static or dynamic. Static word embeddings are fixed after training and do not change with context: every occurrence of a word receives the same vector, as in Word2Vec, GloVe, and fastText. Dynamic (contextual) embeddings, such as those produced by ELMo or BERT, instead compute a different vector for each occurrence of a word depending on the sentence it appears in.
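A hedged sketch of this contrast, using Hugging Face transformers with bert-base-uncased as the contextual model (a stand-in choice; the text does not name one):

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Contextual vector for `word` (assumed to be a single sub-token)."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = bert(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# A static embedding would give "bank" one fixed vector; a contextual
# model gives it different vectors in different sentences.
v1 = word_vector("he sat on the river bank", "bank")
v2 = word_vector("she deposited money at the bank", "bank")
print(torch.cosine_similarity(v1, v2, dim=0).item())  # noticeably below 1.0
```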