Retrieval Augmented Generation: Streamlining the creation of intelligent natural language processing models

By Facebook - 2021-03-10

Description

Teaching computers to understand how humans write and speak, known as natural language processing or NLP, is one of the oldest challenges in AI research....

Summary

  • Teaching computers to understand how humans write and speak, known as natural language processing (NLP), is one of the oldest challenges in AI research.
  • We found that RAG uses its nonparametric memory to “cue” the seq2seq model into generating correct responses, essentially combining the flexibility of the “closed-book” or parametric-only approach with the performance of “open-book” or retrieval-based methods.
  • Combining a retrieval-based component with a generative component has advantages even in purely extractive tasks, such as the open-domain NaturalQuestions task.
  • Hugging Face’s Transformers has become a de facto standard in open source NLP, thanks to its low barrier to entry and coverage of state-of-the-art models, and it integrates with the new Datasets library to provide the indexed knowledge source that RAG relies on.
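The "cue" mechanism described above can be sketched in miniature: a retriever pulls relevant passages from an indexed knowledge source (the nonparametric memory), and the generator conditions on the query plus those passages. The code below is an illustrative toy, not the actual RAG implementation — real RAG uses a dense DPR retriever over a Wikipedia index and a BART seq2seq generator, loaded via Hugging Face's Transformers and Datasets libraries; the passage list, overlap scoring, and helper names here are assumptions for the sake of a self-contained example.

```python
from collections import Counter

# A tiny stand-in for the indexed knowledge source ("nonparametric memory").
PASSAGES = [
    "The Eiffel Tower is located in Paris, France.",
    "The Great Wall of China is visible from low Earth orbit.",
    "BART is a sequence-to-sequence model pretrained by denoising text.",
]

def retrieve(query, passages, k=1):
    """Rank passages by word overlap with the query.
    (Illustrative scoring only; real RAG uses dense DPR embeddings.)"""
    q = Counter(query.lower().split())
    scored = sorted(
        passages,
        key=lambda p: sum((q & Counter(p.lower().split())).values()),
        reverse=True,
    )
    return scored[:k]

def build_generator_input(query, retrieved):
    """Stub for the generation step: a real RAG model feeds the query
    concatenated with each retrieved passage into a seq2seq decoder,
    letting the retrieved text 'cue' the correct answer."""
    context = " ".join(retrieved)
    return f"question: {query} context: {context}"

top = retrieve("Where is the Eiffel Tower?", PASSAGES)
prompt = build_generator_input("Where is the Eiffel Tower?", top)
```

In the full system, the retrieval index is built once (e.g. with the Datasets library) and the generator is trained end-to-end, so the model learns both when to rely on its parametric knowledge and when to lean on retrieved evidence.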


Topics

  1. NLP (0.28)
  2. UX (0.05)
  3. Machine_Learning (0.04)

Similar Articles

FastFormers: 233x Faster Transformers inference on CPU

By Medium - 2020-11-04

Since the birth of BERT, Transformer models have dominated NLP in nearly every language-related task, whether it is Question-Answering, Sentiment Analysis, Text classification or Text…