facebook/bart-large-mnli · Hugging Face

By huggingface - 2020-10-23


Summary

  • NLI-based zero-shot text classification: Yin et al. proposed a method for using pre-trained NLI models as ready-made zero-shot sequence classifiers.
  • This method is surprisingly effective in many cases, particularly when used with larger pre-trained models like BART and RoBERTa.
  • See this blog post for a more expansive introduction to this and other zero-shot methods, and see the code snippets below for examples of using this model for zero-shot classification, both with Hugging Face's built-in pipeline and with native Transformers/PyTorch code.
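The NLI-based method above can be sketched without downloading the model: each candidate label is rewritten as an entailment hypothesis, the (premise, hypothesis) pair would be scored by an NLI model, and the entailment probability becomes the label score. The logits below are mock values standing in for real model output, so the exact numbers are illustrative only.

```python
import math

def hypothesis_for(label, template="This example is {}."):
    # Turn a candidate label into an NLI hypothesis sentence.
    return template.format(label)

def entailment_prob(contradiction_logit, entailment_logit):
    # Softmax over (contradiction, entailment), keeping the entailment mass.
    c = math.exp(contradiction_logit)
    e = math.exp(entailment_logit)
    return e / (c + e)

premise = "one day I will see the world"
candidate_labels = ["travel", "cooking", "dancing"]

# Mock (contradiction, entailment) logits per label -- in a real run these
# would come from feeding (premise, hypothesis_for(label)) through an NLI
# model such as facebook/bart-large-mnli.
mock_logits = {
    "travel": (-2.1, 3.4),
    "cooking": (1.8, -0.9),
    "dancing": (2.0, -1.5),
}

scores = {label: entailment_prob(*mock_logits[label]) for label in candidate_labels}
best = max(scores, key=scores.get)
print(best)  # label whose hypothesis gets the highest entailment probability
```

With the transformers library installed, the same idea is exposed directly as `pipeline("zero-shot-classification", model="facebook/bart-large-mnli")`, which accepts the text and a list of candidate labels and returns per-label scores.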


Topics

  1. NLP (0.28)
  2. Backend (0.13)
  3. Security (0.07)

Similar Articles

How to build a fraud detection solution

By Google Cloud Blog - 2021-03-03

In collaboration with our partner Quantiphi, we developed a smart analytics design pattern that enables you to build a scalable real-time fraud detection solution in one hour using serverless, no-ops ...