Description
We know what Word2Vec is and how word vectors are used in NLP tasks, but do we really know how the vectors are trained and what the previous approaches to training them were? Well, here is an…
Summary
- An intuitive understanding and explanation of the word2vec model.
- As the name suggests, the skip-gram model takes the current word as input and predicts the surrounding context words within a fixed window, skipping over the input word itself; hence the name skip-gram (see the sketch after this list).
- Due to the low computational complexity of the word2vec architectures, the models can be trained on huge corpora using the DistBelief distributed training framework, which speeds up the training procedure.
- There are many interesting applications of word2vec models in NLP tasks, such as solving word analogies of the form vec("king") - vec("man") + vec("woman") ≈ vec("queen") (see the example below).
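
To make the skip-gram idea concrete, here is a minimal sketch of how (center, context) training pairs could be generated from a sentence. The window size and whitespace tokenization are assumptions for illustration, not details taken from the paper:

```python
# Minimal sketch of skip-gram training-pair generation (illustrative only;
# the window size and tokenization here are assumptions).

def skip_gram_pairs(tokens, window=2):
    """Yield (center_word, context_word) pairs for skip-gram training."""
    pairs = []
    for i, center in enumerate(tokens):
        # The context is every word within `window` positions of the center
        # word; the center word itself is skipped.
        start = max(0, i - window)
        end = min(len(tokens), i + window + 1)
        for j in range(start, end):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
for center, context in skip_gram_pairs(sentence, window=2):
    print(center, "->", context)
```

Each pair becomes one training example: the model sees the center word and is asked to predict the context word, which is exactly the prediction task the bullet above describes.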
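As a sketch of one such application, the word-analogy example can be reproduced with the gensim library. This assumes gensim 4.x's `Word2Vec` API; the toy corpus and hyperparameters below are purely illustrative:

```python
# Sketch of a word2vec application: word analogies via gensim
# (assumes gensim 4.x; corpus and hyperparameters are illustrative only).
from gensim.models import Word2Vec

corpus = [
    ["king", "queen", "man", "woman", "royal", "palace"],
    ["man", "woman", "boy", "girl", "person"],
    ["king", "man", "prince", "queen", "woman", "princess"],
]

# sg=1 selects the skip-gram architecture described above.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1)

# The classic analogy: vec(king) - vec(man) + vec(woman) ≈ vec(queen).
# (On a toy corpus this small, the result is illustrative, not reliable;
# the published analogies were obtained from models trained on huge corpora.)
print(model.wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
```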