An Empirical Study of Pre-trained Transformers for Arabic Information Extraction

By arXiv.org - 2020-10-08

Description

Multilingual pre-trained Transformers, such as mBERT (Devlin et al., 2019) and XLM-RoBERTa (Conneau et al., 2020a), have been shown to enable effective cross-lingual zero-shot transfer. However, t ...

Summary

  • Multilingual pre-trained Transformers, such as mBERT (Devlin et al., 2019) and XLM-RoBERTa (Conneau et al., 2020a), have been shown to enable effective cross-lingual zero-shot transfer.
  • We study GigaBERT's effectiveness on zero-shot transfer across four IE tasks. Our best model significantly outperforms mBERT, XLM-RoBERTa, and AraBERT (Antoun et al., 2020); a minimal sketch of this zero-shot setup follows below.
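The setup described above fine-tunes a multilingual encoder on English supervision and then applies it directly to Arabic text with no Arabic training data. The sketch below illustrates that zero-shot workflow for one of the IE tasks (NER) using the Hugging Face transformers library; it is not the paper's code, and the model name, label set, and example sentence are illustrative assumptions.

# Minimal sketch of cross-lingual zero-shot NER with a multilingual encoder.
# Assumptions: XLM-RoBERTa base as the encoder, a standard BIO tag set, and
# that fine-tuning on English NER data has already been done where indicated.
from transformers import AutoTokenizer, AutoModelForTokenClassification
import torch

model_name = "xlm-roberta-base"  # or "bert-base-multilingual-cased" for mBERT
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]  # assumed tag set

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(labels)
)

# ... fine-tune `model` on English NER data here (e.g. with the Trainer API) ...

# Zero-shot transfer: run the English-trained model on an Arabic sentence.
arabic_sentence = "ولد باراك أوباما في هاواي"  # "Barack Obama was born in Hawaii."
inputs = tokenizer(arabic_sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predictions = logits.argmax(dim=-1).squeeze(0).tolist()

# Print each subword token with its predicted tag.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, pred in zip(tokens, predictions):
    print(token, labels[pred])

In practice, the same pattern covers the other IE tasks the summary lists (sequence labeling or pair classification heads on top of the shared encoder); the point of the comparison in the paper is which pre-trained encoder transfers best to Arabic.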


Topics

  1. NLP (0.29)
  2. Machine_Learning (0.21)
  3. UX (0.12)

Similar Articles

Code and Named Entity Recognition in StackOverflow

By arXiv.org - 2020-10-14

There is an increasing interest in studying natural language and computer code together, as large corpora of programming texts become readily available on the Internet. For example, StackOverflow curr ...