Weight Initialization for Deep Learning Neural Networks

By Machine Learning Mastery - 2021-02-02

Description

Weight initialization is an important design choice when developing deep learning neural network models. Historically, weight initialization involved using small random numbers, although over the last ...

Summary

  • Historically, weight initialization involved using small random numbers, although over the last decade more specific heuristics have been developed that use information such as the type of activation function being used and the number of inputs to the node.
  • Both approaches were derived assuming that the activation function is linear; nevertheless, they have become the standard for nonlinear activation functions like Sigmoid and Tanh, though not for ReLU.
  • A short sketch of these heuristics is given after this summary.
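
The two heuristics summarized above can be sketched in a few lines of NumPy. This is a minimal illustration, not the article's complete example: the function names and the 10-input, 5-node layer are hypothetical, while the formulas follow the standard Xavier/Glorot (uniform in [-1/sqrt(n), 1/sqrt(n)]) and He (Gaussian with standard deviation sqrt(2/n)) schemes.

# Minimal sketch of two weight initialization heuristics; layer sizes
# and function names are hypothetical, for illustration only.
import numpy as np

def xavier_init(n_inputs, n_outputs, rng=None):
    # Xavier/Glorot: uniform in [-1/sqrt(n), 1/sqrt(n)], where n is the
    # number of inputs to the node; commonly paired with Sigmoid and Tanh.
    rng = rng or np.random.default_rng()
    limit = 1.0 / np.sqrt(n_inputs)
    return rng.uniform(-limit, limit, size=(n_inputs, n_outputs))

def he_init(n_inputs, n_outputs, rng=None):
    # He: Gaussian with standard deviation sqrt(2/n); the usual choice
    # for ReLU, which the Xavier scheme does not cover.
    rng = rng or np.random.default_rng()
    std = np.sqrt(2.0 / n_inputs)
    return rng.normal(loc=0.0, scale=std, size=(n_inputs, n_outputs))

# Hypothetical layer: 10 inputs feeding 5 nodes.
w = xavier_init(10, 5)
print(w.shape, float(w.min()), float(w.max()))

In both cases the scale shrinks as the number of inputs to the node grows, which keeps the variance of each node's weighted sum roughly constant from layer to layer.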


Topics

  1. Machine_Learning (0.51)
  2. NLP (0.1)
  3. Backend (0.05)
