Word Embedding

Word Embedding is a class of techniques where individual words (tokens) are represented as numeric vectors in a lower-dimensional vector space. With each word mapped to a vector, words with similar meanings end up with similar representations. A minimal sketch of learning such a mapping follows below.
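As an illustration, here is a minimal sketch of learning word embeddings with gensim's Word2Vec (assuming gensim 4.x is installed). The toy corpus and parameter values are invented for demonstration, not recommendations.

from gensim.models import Word2Vec

# Tiny tokenized corpus; in practice you would use a large real corpus.
sentences = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["the", "dog", "chases", "the", "cat"],
    ["the", "cat", "chases", "the", "mouse"],
]

# Train embeddings: each word is mapped to a 50-dimensional vector.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, seed=42)

vector = model.wv["king"]                    # numeric vector for "king"
print(vector.shape)                          # (50,)
print(model.wv.similarity("king", "queen"))  # cosine similarity of two words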

TL;DR

Word Embedding maps words to numerical vectors.

[Figure: word_embedding.png]
In Word Embedding each word is represented by a compact vector of numerical values rather than a sparse, vocabulary-sized one-hot vector, so it is considered a dimensionality reduction technique. The geometric relationships between word vectors should reflect the semantic relationships between the words they represent: related words should point in similar directions, as the sketch below illustrates.
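To make the geometric claim concrete, here is a small NumPy sketch that compares hypothetical embedding vectors with cosine similarity. The 4-dimensional vectors are hand-picked for illustration; real embeddings are learned from data.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings: "cat" and "dog" are placed close together,
# "car" far away, to mimic what a trained model tends to produce.
cat = np.array([0.9, 0.8, 0.1, 0.0])
dog = np.array([0.8, 0.9, 0.2, 0.1])
car = np.array([0.1, 0.0, 0.9, 0.8])

print(cosine_similarity(cat, dog))  # high: semantically related words
print(cosine_similarity(cat, car))  # low: unrelated words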


Learning Material: