Improving and Simplifying Pattern Exploiting Training

By arXiv.org - 2021-03-23

Description

Recently, pre-trained language models (LMs) have achieved strong performance when fine-tuned on difficult benchmarks like SuperGLUE. However, performance can suffer when there are very few labeled examples ...

Summary

  • Abstract: Recently, pre-trained language models (LMs) have achieved strong performance when fine-tuned on difficult benchmarks like SuperGLUE.
  • However, Pattern Exploiting Training (PET) uses task-specific unlabeled data.
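
The PET approach mentioned above reformulates a classification example as a cloze ("fill-in-the-blank") question that a masked language model can answer. The sketch below illustrates that idea; the specific pattern string and the `VERBALIZER` mapping are illustrative assumptions, not the exact choices from the paper.

```python
# Minimal sketch of PET-style input reformulation.
# The pattern and verbalizer here are hypothetical examples.

def to_cloze(premise: str, hypothesis: str) -> str:
    """Wrap an entailment pair in a cloze pattern with a [MASK] slot."""
    return f'"{premise}" ? [MASK], "{hypothesis}"'

# Verbalizer: maps each task label to a single token the LM can predict
# in place of [MASK].
VERBALIZER = {"entailment": "Yes", "contradiction": "No"}

example = to_cloze("Birds can fly.", "A sparrow can fly.")
# Training then pushes the LM to fill [MASK] with VERBALIZER[gold_label].
```

Because the task is posed in the same masked-word format the LM saw during pre-training, even a handful of labeled examples can provide a useful training signal.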


Topics

  1. Backend (0.3)
  2. Machine_Learning (0.25)
  3. NLP (0.19)
