Google trained a trillion-parameter AI language model

By VentureBeat - 2021-01-12

Description

Researchers at Google claim to have trained a natural language model containing over a trillion parameters.

Summary

  • Parameters are the key to machine learning algorithms: they are the parts of a model learned from training data, and, broadly speaking, more parameters make for a more sophisticated model (a minimal sketch of how parameter counts are tallied appears after this list).
  • However, on one benchmark, the Stanford Question Answering Dataset (SQuAD), Switch-C scored lower (87.7) than Switch-XXL (89.6); the researchers attribute this to the opaque relationship between fine-tuning quality, computational requirements, and the number of parameters.
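
The article's "parameters" are the learnable weights of the network. Below is a minimal, hedged sketch (not from the article, and unrelated to Google's Switch models) using a tiny hypothetical PyTorch model to show what a parameter count actually measures and why trillion-parameter scale is notable.

```python
# Minimal sketch: counting the learnable parameters of a toy model.
# The layer shapes here are hypothetical, chosen only for illustration;
# they do not reflect the architecture of Switch-C or Switch-XXL.
import torch.nn as nn

toy_model = nn.Sequential(
    nn.Embedding(num_embeddings=32_000, embedding_dim=512),  # vocabulary lookup table
    nn.Linear(512, 2048),   # feed-forward expansion
    nn.ReLU(),
    nn.Linear(2048, 512),   # projection back down
)

# Every learnable weight and bias is a "parameter" adjusted during training.
total = sum(p.numel() for p in toy_model.parameters())
print(f"toy model parameters: {total:,}")  # ~18.5 million, far short of a trillion
```

In this sketch the embedding table alone accounts for roughly 16 million of the ~18.5 million weights; a trillion-parameter model like the one described in the article holds about 50,000 times more.
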


Topics

  1. NLP (0.3)
  2. Backend (0.13)
  3. Machine_Learning (0.08)

Similar Articles