Understanding Transformers, the Programming Way

By Medium - 2020-10-19

Description

Transformers have become the de facto standard for NLP tasks nowadays. They started out in NLP, but they are now also used in Computer Vision and sometimes even to generate music. I am sure…

Summary

  • Because you can only understand it if you can program it: Transformers have become the de facto standard for NLP tasks nowadays.
  • This is because we don’t have batches yet, and the number of pad tokens inherently depends on the maximum sentence length in the particular batch (see the padding sketch after this list).
  • Since I don’t have a German translator at hand, I will use the next best thing to see how our model is performing.
  • We will discuss those advancements and how they came about in the upcoming post, where I will talk about BERT, one of the most popular NLP models that utilizes a Transformer at its core.
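
To make the padding point above concrete, here is a minimal PyTorch-style sketch (not the article's own code; PAD_IDX and collate_batch are illustrative names introduced here): shorter sentences are padded up to the length of the longest sentence in that particular batch, so the number of pad tokens changes from batch to batch.

import torch
from torch.nn.utils.rnn import pad_sequence

PAD_IDX = 0  # hypothetical index of the <pad> token in the vocabulary

def collate_batch(token_id_sequences):
    # Pad every sequence up to the length of the longest one in this batch.
    return pad_sequence(token_id_sequences, batch_first=True, padding_value=PAD_IDX)

batch = [
    torch.tensor([5, 8, 2]),        # 3 tokens
    torch.tensor([7, 3, 9, 4, 6]),  # 5 tokens -> longest sentence in this batch
    torch.tensor([1, 2]),           # 2 tokens
]

print(collate_batch(batch))
# tensor([[5, 8, 2, 0, 0],
#         [7, 3, 9, 4, 6],
#         [1, 2, 0, 0, 0]])
# Grouped with a longer sentence, the same inputs would receive more pad tokens.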

 

Topics

  1. NLP (0.3)
  2. Backend (0.21)
  3. Machine_Learning (0.2)

Similar Articles

K-fold Cross Validation with PyTorch

By MachineCurve - 2021-02-02

Explanations and code examples showing you how to use K-fold Cross Validation for Machine Learning model evaluation/testing with PyTorch.

Facebook’s Prophet + Deep Learning = NeuralProphet

By Medium - 2020-12-10

While learning about time series forecasting, sooner or later you will encounter the vastly popular Prophet model, developed by Facebook. It gained a lot of popularity because it provides…