Description
This post summarizes progress in 10 exciting and impactful directions in ML and NLP in 2020.
Summary
- Sebastian Ruder: the selection of areas and methods is heavily influenced by his own interests.
- Large models have been shown to learn a surprising amount of world knowledge from their pre-training data, which allows them to reproduce facts (Jiang et al.); a minimal probing sketch follows this list.
- Retrieval-augmented generation should be particularly useful for failure cases that have plagued generative neural models in the past, such as hallucination (Nie et al.); see the second sketch below.
- State-of-the-art models in NLP have achieved superhuman performance across many tasks.
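
A minimal sketch of how world knowledge in a pre-trained model can be probed with cloze-style queries (in the spirit of the factual-probing work the summary cites). It assumes the Hugging Face `transformers` library; the model name and query are illustrative, not from the original post.

```python
# Probe a pre-trained masked language model for factual knowledge
# using a cloze-style query. The model fills in the masked entity
# purely from knowledge absorbed during pre-training.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("The capital of France is [MASK].", top_k=3):
    print(f"{prediction['token_str']}: {prediction['score']:.3f}")
```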
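
To make the retrieval-augmented generation point concrete, here is a toy sketch of the core idea: retrieve evidence for a query and condition generation on it rather than on parametric memory alone. The corpus, the character-frequency "embedding", and the prompt format are stand-ins; a real system would pair a dense retriever with a seq2seq generator.

```python
# Toy retrieval-augmented generation loop: retrieve the passage most
# similar to the query, then build a generator prompt grounded in it.
import numpy as np

corpus = [
    "Paris is the capital and largest city of France.",
    "The Eiffel Tower was completed in 1889.",
    "Mount Everest is Earth's highest mountain.",
]

def embed(text: str) -> np.ndarray:
    # Toy embedding: normalized character-frequency vector.
    # A real system would use a learned dense retriever instead.
    vec = np.zeros(26)
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1
    return vec / (np.linalg.norm(vec) + 1e-8)

def retrieve(query: str, k: int = 1) -> list[str]:
    # Rank passages by cosine similarity to the query.
    scores = [embed(doc) @ embed(query) for doc in corpus]
    return [corpus[i] for i in np.argsort(scores)[::-1][:k]]

query = "What is the capital of France?"
evidence = retrieve(query)

# Conditioning the generator on retrieved text constrains its output
# to the evidence, which is what helps curb hallucination.
prompt = f"Context: {' '.join(evidence)}\nQuestion: {query}\nAnswer:"
print(prompt)
```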