China’s GPT-3? BAAI Introduces Superscale Intelligence Model ‘Wu Dao’

By Synced | AI Technology & Industry Review - 2021-03-23

Description

The Beijing Academy of Artificial Intelligence (BAAI) releases Wu Dao 1.0, China’s first large-scale pretraining model.

Summary

  • Since the May 2020 release of OpenAI’s GPT-3, AI researchers have embraced super-large-scale pretraining models.
  • The Wu Dao – Wen Lan model has reached SOTA performance on the Chinese public multimodal benchmark AIC-ICC, scoring 5 percent higher than the champion team on the image captioning task and 20 percent higher than the widely used UNITER model on the visual entailment task.
  • Wen Hui’s inverse prompting algorithm achieves close-to-human performance on Q&A and poetry generation, and Wen Hui is the first model able to generate classical Chinese poetry from modern themes (see the sketch after this list).
  • BAAI Research is currently in discussions with Sogou, 360, Alibaba, and Zhipu.
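
The summary only names the inverse prompting technique. As a rough, hedged sketch of the general idea (the function names, prompt template, and toy scorer below are illustrative assumptions, not BAAI’s code), each candidate generation is scored by how well the original prompt can be recovered from it, and candidates are ranked by that score:

```python
# Minimal sketch of inverse-prompting-style reranking (hypothetical names, not BAAI's code).
# Each candidate continuation is re-scored by how well it "points back" to the original
# prompt: we build an inverse prompt from the candidate and ask a language model for the
# log-likelihood of the original prompt given that inverse prompt.

from typing import Callable, List

# log_likelihood(context, target) -> summed log p(target | context),
# assumed to be backed by any autoregressive LM (e.g. a GPT-style model).
LogLik = Callable[[str, str], float]


def inverse_prompt_score(prompt: str, candidate: str, log_likelihood: LogLik) -> float:
    """Score a candidate by the likelihood of recovering the prompt from it."""
    inverse_prompt = f'"{candidate}" is about: '  # template is an illustrative assumption
    return log_likelihood(inverse_prompt, prompt)


def rerank(prompt: str, candidates: List[str], log_likelihood: LogLik) -> List[str]:
    """Order candidate generations by their inverse-prompting score, best first."""
    return sorted(
        candidates,
        key=lambda c: inverse_prompt_score(prompt, c, log_likelihood),
        reverse=True,
    )


if __name__ == "__main__":
    # Toy stand-in for an LM: rewards prompt words that appear in the inverse prompt.
    def toy_log_likelihood(context: str, target: str) -> float:
        words = target.split()
        return sum(1.0 for w in words if w in context) - len(words)

    prompt = "a poem about autumn in the mountains"
    candidates = [
        "Red leaves drift over the autumn mountains at dusk.",
        "The stock market closed higher on Tuesday.",
    ]
    for c in rerank(prompt, candidates, toy_log_likelihood):
        print(c)
```

In practice a score like this is typically folded into beam search while the text is being generated; the post-hoc reranking above is just the simplest way to show the scoring step.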


Topics

  1. NLP (0.24)
  2. Machine_Learning (0.12)
  3. Management (0.08)

Similar Articles

The Model’s Shipped; What Could Possibly Go Wrong

By Medium - 2021-02-18

In our last post we took a broad look at model observability and the role it serves in the machine learning workflow. In particular, we discussed the promise of model observability & model monitoring…