mT5: A massively multilingual pre-trained text-to-text transformer

By arXiv.org - 2020-10-23

Description

The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified text-to-text format and scale to attain state-of-the-art results on a wide variety of English-language NLP tasks. In this paper, ...

Summary

  • The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified text-to-text format and scale to attain state-of-the-art results on a wide variety of English-language NLP tasks.
  • All of the code and model checkpoints used in this work are publicly available (a brief loading sketch follows this list).
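Because the text-to-text framing treats every task as mapping an input string to an output string, a minimal sketch of how the released checkpoints might be used is shown below. It assumes the Hugging Face Transformers library and the "google/mt5-small" checkpoint identifier, neither of which is stated in this summary, and the released checkpoints are pre-trained only, so they would normally be fine-tuned before producing useful outputs.

    from transformers import AutoTokenizer, MT5ForConditionalGeneration

    # Hypothetical checkpoint identifier: the summary only says the
    # checkpoints are publicly available, so the exact name is an assumption.
    model_name = "google/mt5-small"

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = MT5ForConditionalGeneration.from_pretrained(model_name)

    # Every task is cast as text-to-text: encode an input string, then
    # generate an output string. This pre-trained checkpoint has not been
    # fine-tuned, so the call only demonstrates the interface.
    inputs = tokenizer("Translate English to German: The house is small.",
                       return_tensors="pt")
    output_ids = model.generate(**inputs, max_length=32)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))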

 

Topics

  1. UX (0.38)
  2. Backend (0.15)
  3. Database (0.08)

Similar Articles

Code and Named Entity Recognition in StackOverflow

By arXiv.org - 2020-10-14

There is an increasing interest in studying natural language and computer code together, as large corpora of programming texts become readily available on the Internet. For example, StackOverflow currently ...