By arXiv.org - 2020-10-23
The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified
text-to-text format and scale to attain state-of-the-art results on a wide
variety of English-language NLP tasks. In this paper, ...
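The text-to-text idea the snippet refers to is easy to illustrate. Below is a minimal sketch using the Hugging Face transformers library; the checkpoint name, task prefix, and example sentence are illustrative assumptions, not taken from the paper above.

    from transformers import T5Tokenizer, T5ForConditionalGeneration

    # Illustrative public checkpoint (assumption), chosen only to show the format.
    tokenizer = T5Tokenizer.from_pretrained("t5-small")
    model = T5ForConditionalGeneration.from_pretrained("t5-small")

    # Every task is cast as text in, text out: the task is named by a plain-text prefix,
    # and the model emits the answer as ordinary text.
    inputs = tokenizer("translate English to German: The house is wonderful.",
                       return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))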
By arXiv.org - 2020-10-08
Multilingual pre-trained Transformers, such as mBERT (Devlin et al., 2019)
and XLM-RoBERTa (Conneau et al., 2020a), have been shown to enable effective
cross-lingual zero-shot transfer. However, t ...
By arXiv.org - 2020-10-15
Humans learn language by listening, speaking, writing, reading, and also via
interaction with the multimodal real world. Existing language pre-training
frameworks show the effectiveness of text-only ...
By arXiv.org - 2020-10-01
Benchmarks such as GLUE have helped drive advances in NLP by incentivizing
the creation of more accurate models. While this leaderboard paradigm has been
remarkably successful, a historical focus on p ...
By arXiv.org - 2020-10-06
Literary tropes, from poetry to stories, are at the crux of human imagination
and communication. Figurative language such as similes goes beyond plain
expressions to give readers new insights and inspi ...
By arXiv.org - 2020-10-13
In this paper, we tackle the problem of learning control policies for tasks
when provided with constraints in natural language. In contrast to instruction
following, language here is used not to speci ...