By arXiv.org -
2021-02-28
Modern natural language processing (NLP) methods employ self-supervised
pretraining objectives such as masked language modeling to boost the
performance of various application tasks. These pretraining ...
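As a concrete illustration of the masked-language-modeling objective this abstract mentions, here is a minimal sketch using the HuggingFace transformers fill-mask pipeline; the checkpoint name bert-base-uncased is an illustrative assumption, not one taken from the paper.

# Minimal masked-language-modeling demo (a sketch, not the paper's method).
# Assumes the HuggingFace `transformers` library is installed;
# `bert-base-uncased` is an illustrative checkpoint choice.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model predicts the token hidden behind [MASK] from its context,
# which is the self-supervised signal used during pretraining.
for prediction in fill_mask("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))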
By arXiv.org -
2020-10-23
The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified
text-to-text format and scale to attain state-of-the-art results on a wide
variety of English-language NLP tasks. In this paper, ...
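To make the "unified text-to-text format" concrete, here is a minimal sketch of how a T5-style model casts a task as text in, text out; it assumes the HuggingFace transformers library and the public t5-small checkpoint, neither of which the snippet itself names.

# Sketch of T5's text-to-text interface (illustrative, not the paper's
# exact setup). Assumes `transformers` and `sentencepiece` are installed.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Every task is expressed as text; the task itself is named in the prompt.
inputs = tokenizer("translate English to German: The house is wonderful.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))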
By arXiv.org -
2020-10-01
Benchmarks such as GLUE have helped drive advances in NLP by incentivizing
the creation of more accurate models. While this leaderboard paradigm has been
remarkably successful, a historical focus on p ...
By userinterviews -
2021-02-19
The third annual State of User Research report uncovers trends in UXR methods, tools, salaries, and remote work. Includes data from 525 user researchers.
By arXiv.org -
2020-10-08
Whilst there has been growing progress in Entity Linking (EL) for general
language, existing datasets fail to address the complex nature of health
terminology in layman's language. Meanwhile, there is ...
By arXiv.org -
2020-10-08
Multilingual pre-trained Transformers, such as mBERT (Devlin et al., 2019)
and XLM-RoBERTa (Conneau et al., 2020a), have been shown to enable
effective cross-lingual zero-shot transfer. However, t ...
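As a rough sketch of the cross-lingual zero-shot transfer these models enable, the snippet below applies an XLM-RoBERTa model fine-tuned for NLI to non-English text via the HuggingFace zero-shot-classification pipeline; the checkpoint name is an illustrative assumption, not the paper's setup.

# Sketch of cross-lingual transfer with a multilingual pre-trained
# Transformer: an XLM-RoBERTa NLI model scores text in other languages
# without language-specific task data. Assumes the HuggingFace
# `transformers` library; the checkpoint name is an illustrative choice.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="joeddav/xlm-roberta-large-xnli")

# German input scored against English label names.
result = classifier("Der neue Film war ein großartiges Erlebnis.",
                    candidate_labels=["positive", "negative"])
print(result["labels"][0], round(result["scores"][0], 3))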