Description
Batch Normalization is a deep learning technique that standardizes the inputs to a neural network layer across each mini-batch, a process called normalization.
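In symbols, using the standard formulation (the notation below is not from the source text), for a mini-batch of activations $x_1, \dots, x_m$ the transform computes:

```latex
\mu_B = \frac{1}{m} \sum_{i=1}^{m} x_i, \qquad
\sigma_B^2 = \frac{1}{m} \sum_{i=1}^{m} (x_i - \mu_B)^2, \qquad
\hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}}, \qquad
y_i = \gamma \hat{x}_i + \beta
```

Here $\gamma$ (scale) and $\beta$ (shift) are learnable parameters, and $\epsilon$ is a small constant for numerical stability.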
Summary
- Batch normalization standardizes the inputs to a layer for each mini-batch, which significantly improves training stability and accuracy throughout the network.
- To enhance the stability of a deep learning network, batch normalization normalizes the output of the previous activation layer by subtracting the batch mean and then dividing by the batch's standard deviation.
- Batch normalization also adds two trainable weights per output: a scale (gamma) and a shift (beta) applied to the normalized values. This is why it works well with gradient descent: the data can be "denormalized" when that reduces the loss, by changing just these two weights for each output rather than every weight in the layer (see the sketch below).
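The following is a minimal NumPy sketch of the forward pass described above; the function name, shapes, and epsilon value are illustrative, not from the source:

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch-normalize activations x of shape (batch_size, num_features)."""
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize: zero mean, unit variance
    return gamma * x_hat + beta            # learnable scale and shift

# Usage: 32 examples with 4 features each; gamma/beta start as the
# identity transform and would be updated by gradient descent in training.
x = np.random.randn(32, 4)
gamma, beta = np.ones(4), np.zeros(4)
y = batch_norm_forward(x, gamma, beta)
```

Note that because gamma and beta are initialized to 1 and 0, the layer starts out returning the plain normalized activations; training can then move these two weights to rescale or undo the normalization wherever that helps.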