Word2Vec and Semantic Similarity using spaCy | NLP spaCy Series | Part 7

Word vectors – also called word embeddings – are numerical representations of individual words, learned so that words which appear in similar contexts in the language end up with similar vector values. In this way we can mathematically capture a word's meaning from its usage. For example, the word vector for “lion” will be closer in value to “cat” than to “dandelion”.
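As a quick illustration, here is a minimal sketch using spaCy's built-in `similarity()` method to compare these three words. It assumes a model that ships with word vectors, such as `en_core_web_md`, has been installed (the small `en_core_web_sm` model does not include real word vectors):

```python
import spacy

# Load a medium model with word vectors; install it first with:
#   python -m spacy download en_core_web_md
nlp = spacy.load("en_core_web_md")

tokens = nlp("lion cat dandelion")

# Compare every pair of tokens; similarity() returns the cosine
# similarity of the tokens' word vectors (higher = more similar)
for t1 in tokens:
    for t2 in tokens:
        print(f"{t1.text:<10} {t2.text:<10} {t1.similarity(t2):.4f}")
```

Running this, you should see that the similarity score for “lion” and “cat” is noticeably higher than for “lion” and “dandelion”, confirming the intuition above.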
