Do NLP Models Cheat at Math Word Problems? Microsoft Research Says Even SOTA Models Rely on Shallow Heuristics

By Synced | AI Technology & Industry Review - 2021-03-18

Description

A Microsoft research team provides concrete evidence showing that existing NLP models cannot robustly solve even the simplest of math word problems, suggesting the hope that they might capably handle ...

Summary

  • “Yoshua recently turned 57.”
  • A child could likely figure this one out, and recent natural language processing (NLP) models have also shown an ability to achieve reasonably high accuracy on MWPs.
  • MWP tasks can be challenging as they require the machine to extract relevant information from natural language text and perform mathematical operations or reasoning to find the solution.
  • (Figure caption: examples of one-unknown arithmetic word problems.)
  • Researchers have recently begun applying machine learning to more complex MWPs, such as multiple-unknown linear word problems and those concerning geometry and probability (a minimal sketch of the one-unknown case follows below).
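
For readers unfamiliar with the task, the sketch below shows, in Python, what solving a one-unknown arithmetic MWP amounts to: pull the relevant quantities out of the text, infer the right operation, and evaluate it. The example problem, the regex-based extraction, and the hard-coded choice of operation are illustrative assumptions for this sketch, not the models or benchmark data discussed in the article.

    import re

    # Illustrative one-unknown arithmetic MWP (made up for this sketch,
    # not taken from the paper or its benchmark).
    problem = ("Jack had 8 pens and Mary gave him 5 more. "
               "How many pens does Jack have now?")

    # Step 1: extract the relevant quantities from the natural-language text.
    quantities = [int(n) for n in re.findall(r"\d+", problem)]  # [8, 5]

    # Step 2: choose the operation. It is hard-coded here; real solvers must
    # infer it from the wording, and keying on surface cues such as the word
    # "more" is exactly the kind of shallow heuristic the article describes.
    answer = quantities[0] + quantities[1]

    print(answer)  # 13

Even this toy case hints at why such probes matter: rephrasing "gave him 5 more" as "took 5 away" leaves the extracted numbers unchanged but flips the required operation, so a model leaning on surface cues alone can still land on the wrong equation.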

 

Topics

  1. NLP (0.31)
  2. Management (0.13)
  3. Machine_Learning (0.1)

Similar Articles

CLIP: Connecting Text and Images

By OpenAI - 2021-01-05

We’re introducing a neural network called CLIP which efficiently learns visual concepts from natural language supervision.

FastFormers: 233x Faster Transformers inference on CPU

By Medium - 2020-11-04

Since the birth of BERT, Transformers have dominated NLP in nearly every language-related task, whether it is Question-Answering, Sentiment Analysis, Text classification or Text…