Summary
- NLI-based Zero-Shot Text Classification: Yin et al. proposed a method for using pre-trained NLI models as ready-made zero-shot sequence classifiers. The sequence to be classified is posed as the NLI premise, and a hypothesis is constructed from each candidate label (e.g. "This text is about politics."); the model's entailment score for that hypothesis is read as the probability that the label applies.
- This method is surprisingly effective in many cases, particularly when used with larger pre-trained models like BART and RoBERTa.
- See this blog post for a more expansive introduction to this and other zero-shot methods, and see the code snippets below for examples of using this model for zero-shot classification both with Hugging Face's built-in pipeline and with native Transformers/PyTorch code.
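
The following is a minimal sketch of the pipeline approach. The checkpoint name facebook/bart-large-mnli is an assumption for illustration; any model fine-tuned on an NLI task should work here.

```python
from transformers import pipeline

# Assumed checkpoint: facebook/bart-large-mnli (any NLI-fine-tuned model should work).
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

sequence_to_classify = "one day I will see the world"
candidate_labels = ["travel", "cooking", "dancing"]

result = classifier(sequence_to_classify, candidate_labels)
print(result["labels"])  # candidate labels sorted by score, most likely first
print(result["scores"])  # corresponding probabilities
```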
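
And here is a sketch of the same computation with native Transformers/PyTorch code, making the premise/hypothesis construction explicit. The label-index order assumed below (contradiction, neutral, entailment) matches MNLI-style heads but should be verified against model.config.id2label for whichever checkpoint is used.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "facebook/bart-large-mnli"  # assumed NLI checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
nli_model = AutoModelForSequenceClassification.from_pretrained(model_name)

premise = "one day I will see the world"
label = "travel"
hypothesis = f"This example is {label}."  # pose the candidate label as an NLI hypothesis

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = nli_model(**inputs).logits

# Assumes the MNLI label order [contradiction, neutral, entailment];
# verify with nli_model.config.id2label. Drop "neutral" and softmax over
# [contradiction, entailment]; the entailment probability is taken as the
# probability that the label applies to the sequence.
entail_contradiction_logits = logits[:, [0, 2]]
probs = entail_contradiction_logits.softmax(dim=1)
prob_label_is_true = probs[:, 1].item()
print(f"P({label!r} applies) = {prob_label_is_true:.3f}")
```

Repeating this for each candidate label (and normalizing across labels if exactly one should apply) is essentially what the pipeline does internally.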