By arXiv.org - 2020-10-08
Multilingual pre-trained Transformers, such as mBERT (Devlin et al., 2019)
and XLM-RoBERTa (Conneau et al., 2020a), have been shown to enable
effective cross-lingual zero-shot transfer. However, t ...
By arXiv.org - 2020-10-01
Benchmarks such as GLUE have helped drive advances in NLP by incentivizing
the creation of more accurate models. While this leaderboard paradigm has been
remarkably successful, a historical focus on p ...
By arXiv.org - 2020-10-14
There is an increasing interest in studying natural language and computer
code together, as large corpora of programming texts become readily available
on the Internet. For example, StackOverflow curr ...
By arXiv.org - 2020-10-06
Literary tropes, from poetry to stories, are at the crux of human imagination
and communication. Figurative language, such as similes, goes beyond plain
expressions to give readers new insights and inspi ...
By arXiv.org - 2020-10-08
While there has been growing progress in Entity Linking (EL) for general
language, existing datasets fail to address the complex nature of health
terminology in layman's language. Meanwhile, there is ...
By arXiv.org - 2020-10-13
In this paper, we tackle the problem of learning control policies for tasks
when provided with constraints in natural language. In contrast to instruction
following, language here is used not to speci ...