AI 360: 01/03/2021. Unified Transformer, Sebastian Ruder, OpenAI's DALL-E, GLOM and StudioGAN

By lamaai - 2021-03-02
LAMA is the Language and Multi-modal AI lab at Imperial College London. Facebook AI Research (FAIR) proposes a multi-modal model called the Unified Transformer (UniT): a Transformer-based model jointly trained on 7 different tasks. The architecture, which achieves results comparable to task-specific Transformer-based models with a significantly reduced parameter set, uses two Transformer encoders and one Transformer decoder. Sebastian Ruder posts about recent advances in language model fine-tuning.
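As a rough illustration of the shared encoder-decoder idea behind UniT, here is a minimal plain-PyTorch sketch: two modality-specific Transformer encoders whose outputs are concatenated into one memory, a single shared Transformer decoder, and a small per-task head. All dimensions, the `MiniUniT` name, and the task setup are illustrative assumptions, not FAIR's actual implementation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the UniT idea: separate image and text Transformer
# encoders feed one shared Transformer decoder; a learned per-task query
# selects the task, and a per-task linear head produces the prediction.
class MiniUniT(nn.Module):
    def __init__(self, d_model=64, num_tasks=7, num_classes=10):
        super().__init__()
        self.image_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        self.text_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        # One learned query embedding and one output head per task.
        self.task_queries = nn.Embedding(num_tasks, d_model)
        self.heads = nn.ModuleList(
            [nn.Linear(d_model, num_classes) for _ in range(num_tasks)])

    def forward(self, image_feats, text_feats, task_id):
        # Encode each modality separately, then concatenate the memories.
        memory = torch.cat([self.image_encoder(image_feats),
                            self.text_encoder(text_feats)], dim=1)
        # A single task-specific query attends over the joint memory.
        query = self.task_queries.weight[task_id].expand(
            image_feats.size(0), 1, -1)
        out = self.decoder(query, memory)      # (batch, 1, d_model)
        return self.heads[task_id](out[:, 0])  # (batch, num_classes)

model = MiniUniT()
img = torch.randn(2, 5, 64)   # fake image-region features
txt = torch.randn(2, 7, 64)   # fake token embeddings
logits = model(img, txt, task_id=3)
print(logits.shape)  # torch.Size([2, 10])
```

Because the encoders and decoder are shared across all tasks and only the queries and heads are task-specific, most parameters are reused, which is what keeps the joint model far smaller than seven separate task-specific Transformers.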

 

Topics

  1. NLP (0.27)
  2. Machine_Learning (0.17)
  3. UX (0.09)

Similar Articles

pytorch-widedeep: deep learning for tabular data

By Medium - 2021-02-22

This is the third of a series of posts introducing pytorch-widedeep, a flexible package to combine tabular data with text and images (and that can also be used for "standard" tabular data alone). The…
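To make the combination concrete, here is a minimal plain-PyTorch sketch of the wide-and-deep idea that packages like pytorch-widedeep generalise: a linear ("wide") path over sparse categorical features plus an MLP ("deep") path over embeddings and continuous columns, summed into one logit. The sizes and the `WideDeep` class are illustrative assumptions, not the pytorch-widedeep API.

```python
import torch
import torch.nn as nn

# Illustrative wide-and-deep model for tabular data (not pytorch-widedeep's API):
# the "wide" part is a linear model over categorical ids (an EmbeddingBag with
# output size 1 acts as a sum of per-category weights); the "deep" part is a
# small MLP over dense embeddings plus continuous features.
class WideDeep(nn.Module):
    def __init__(self, n_cat=100, emb_dim=8, n_cont=4):
        super().__init__()
        self.wide = nn.EmbeddingBag(n_cat, 1, mode="sum")
        self.emb = nn.EmbeddingBag(n_cat, emb_dim, mode="mean")
        self.deep = nn.Sequential(
            nn.Linear(emb_dim + n_cont, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, cat_idx, cont):
        wide_out = self.wide(cat_idx)                    # (batch, 1)
        deep_in = torch.cat([self.emb(cat_idx), cont], dim=1)
        return wide_out + self.deep(deep_in)             # (batch, 1) logit

model = WideDeep()
cats = torch.randint(0, 100, (3, 5))   # 5 categorical ids per row
cont = torch.randn(3, 4)               # 4 continuous columns
out = model(cats, cont)
print(out.shape)  # torch.Size([3, 1])
```

Extra towers for text or image features could be added the same way, each contributing its own term to the summed logit.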

FastFormers: 233x Faster Transformers inference on CPU

By Medium - 2020-11-04

Since the birth of BERT and the Transformer models that followed, Transformers have dominated NLP in nearly every language-related task, whether Question Answering, Sentiment Analysis, Text Classification or Text…