“Adam” and friends

By Committed towards better future - 2021-03-13

Description

Who’s Adam? Why should we care about “his” friends?!

Summary

  • Each of Adam’s friends has contributed to Adam’s personality.
  • The gradient at a point A is the slope of the parabolic function at that point, and by calculating gradients we can find the steepest direction in which to move to minimise the value of the function.
  • Thus, with Momentum, if the momentum factor in eq-3 is β, then compared to SGD the new step is guided not only by the gradients but also by β times the old step.
  • Next, for each parameter we store a state referred to as param_state.
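The momentum update and per-parameter `param_state` described above can be sketched as follows. This is a minimal illustration, not the article's actual code: the function name, the index-based state keys, and the hyperparameter values are assumptions, and real implementations (e.g. PyTorch's SGD) key the state by the parameter object itself.

```python
# Minimal sketch of SGD with momentum, assuming the update rule from the
# summary: new_step = beta * old_step + gradient.

def sgd_momentum_step(params, grads, param_state, lr=0.1, beta=0.9):
    """Update parameters in place; param_state stores each parameter's last step."""
    for i, (p, g) in enumerate(zip(params, grads)):
        # Retrieve the stored step ("momentum buffer") for this parameter.
        old_step = param_state.get(i, 0.0)
        # The new step is guided by the gradient plus beta times the old step.
        step = beta * old_step + g
        param_state[i] = step
        params[i] = p - lr * step
    return params

# Usage: minimise f(x) = x^2, whose gradient at x is 2x.
params = [5.0]
state = {}
for _ in range(200):
    grads = [2.0 * params[0]]
    sgd_momentum_step(params, grads, state)
```

With β = 0, this reduces to plain SGD; with β close to 1, past steps dominate and the iterate overshoots and oscillates before settling, which is the behaviour the momentum factor trades off.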


Topics

  1. Machine_Learning (0.23)
  2. Backend (0.15)
  3. Stock (0.1)

Similar Articles

NoSQL: Updating Data In A MongoDB Database

By Code Wall - 2019-02-10

In this last article about MongoDB, we are going to complete our knowledge about how to write queries using this technology. In particular, we are going to focus on how […]

Q-Learning Algorithm: From Explanation to Implementation

By Medium - 2020-12-13

In my today’s medium post, I will teach you how to implement the Q-Learning algorithm. But before that, I will first explain the idea behind Q-Learning and its limitation. Please be sure to have some…