Batch Normalization

By DeepAI - 2019-05-17

Description

Batch Normalization is a technique used in training neural networks that standardizes the inputs to a layer, a process called normalization.

Summary

  • What is Batch Normalization?
  • Normalizing layer inputs in this way significantly improves accuracy throughout the network.
  • To enhance the stability of a deep learning network, batch normalization transforms the output of the previous activation layer by subtracting the batch mean and then dividing by the batch’s standard deviation.
  • Batch normalization also adds two trainable weights per output, which is why it works well with gradient descent: the network can “denormalize” the data by changing just these two weights for each activation (see the code sketch after this list).

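In code, the forward computation the bullets describe is short. The following is a minimal sketch in Python with NumPy; the function name batch_norm_forward is illustrative, and gamma and beta follow the conventional notation for the two trainable weights rather than coming from the article itself:

import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    # x: activations of shape (batch_size, num_features)
    # gamma, beta: trainable scale and shift, each of shape (num_features,)
    mean = x.mean(axis=0)                    # batch mean, per feature
    var = x.var(axis=0)                      # batch variance, per feature
    x_hat = (x - mean) / np.sqrt(var + eps)  # subtract mean, divide by std (eps avoids division by zero)
    return gamma * x_hat + beta              # the two weights gradient descent can adjust to "denormalize"

# Example: a batch with non-standard scale comes out with ~zero mean and ~unit std
x = 10.0 * np.random.randn(32, 4) + 3.0
out = batch_norm_forward(x, np.ones(4), np.zeros(4))
print(out.mean(axis=0), out.std(axis=0))

With gamma initialized to ones and beta to zeros, the layer starts out as a pure standardization; during training, gradient descent is free to move these two weights so the layer can recover any scale and shift that helps the network.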

Topics

  1. Machine_Learning (0.58)
  2. Backend (0.27)
  3. Database (0.09)

Similar Articles

ResNets — Residual Blocks & Deep Residual Learning

By Medium - 2020-12-02

Deep Learning harnesses the power of Big Data by building deep neural architectures that try to approximate a function f(x) that can map an input, x, to its corresponding label, y. The Universal…

10 Deep Learning Terms Explained in Simple English

By datasciencecentral - 2020-12-27

Deep Learning is a new area of Machine Learning research that has been gaining significant media interest owing to the role it is playing in artificial intel…

Introduction to Active Learning

By KDnuggets - 2020-12-15

An extensive overview of Active Learning, explaining how it works and how it can assist with data labeling, as well as its performance and potential limitations.