By arXiv.org - 2020-10-23
The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified
text-to-text format and scale to attain state-of-the-art results on a wide
variety of English-language NLP tasks. In this paper, ...
By arXiv.org - 2020-10-08
Multilingual pre-trained Transformers, such as mBERT (Devlin et al., 2019)
and XLM-RoBERTa (Conneau et al., 2020a), have been shown to enable
effective cross-lingual zero-shot transfer. However, t ...
By arXiv.org - 2020-10-06
Literary tropes, from poetry to stories, are at the crux of human imagination
and communication. Figurative language such as similes goes beyond plain
expressions to give readers new insights and inspi ...
By arXiv.org - 2020-10-13
Topic models are a useful analysis tool to uncover the underlying themes
within document collections. The dominant approach is to use probabilistic
topic models that posit a generative story, but in t ...
By arXiv.org - 2020-10-14
There is an increasing interest in studying natural language and computer
code together, as large corpora of programming texts become readily available
on the Internet. For example, StackOverflow curr ...
By arXiv.org - 2020-10-08
Whilst there has been growing progress in Entity Linking (EL) for general
language, existing datasets fail to address the complex nature of health
terminology in layman's language. Meanwhile, there is ...