Word2Vec Research Paper Explained

By Medium - 2021-03-17

Description

We know what Word2Vec is and how word vectors are used in NLP tasks, but do we really know how they are trained, or what the previous approaches for training word vectors were? Well, here is an…

Summary

  • An intuitive understanding and explanation of the word2vec model.
  • As the name suggests, the skip-gram model takes the current word as input and predicts the surrounding context words within an N-word window, hence the name skip-gram.
  • Due to the word2vec model's lower computational complexity, models can be trained on huge corpora using DistBelief distributed training, which speeds up the training procedure.
  • There are many interesting applications of word2vec models in NLP tasks.
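To make the skip-gram idea above concrete, here is a minimal sketch (my own illustration, not code from the article or the paper) of how (center, context) training pairs are generated: for each current word, every other word within the window becomes a prediction target.

```python
# Hypothetical helper for illustration: builds skip-gram training pairs
# from a tokenized sentence. Window size and function name are assumptions.
def skipgram_pairs(tokens, window=2):
    """Return (center, context) pairs for each word in `tokens`."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)                  # clip window at sentence start
        hi = min(len(tokens), i + window + 1)    # clip window at sentence end
        for j in range(lo, hi):
            if j != i:                           # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
print(skipgram_pairs(sentence, window=1))
```

Each pair then serves as one training example: the model sees the center word and is asked to assign high probability to the paired context word.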


Topics

  1. NLP (0.19)
  2. Machine_Learning (0.06)
  3. Backend (0.05)

Similar Articles

The Model’s Shipped; What Could Possibly go Wrong

By Medium - 2021-02-18

In our last post we took a broad look at model observability and the role it serves in the machine learning workflow. In particular, we discussed the promise of model observability & model monitoring…