Description
In this tutorial, you'll learn what the stochastic gradient descent algorithm is, how it works, and how to implement it with Python and NumPy.
Summary
- Stochastic gradient descent is an optimization algorithm often used in machine learning applications to find the model parameters that correspond to the best fit between predicted and actual outputs.
- The learning rate determines how large each update step is.
- A lower learning rate prevents the parameter vector from making large jumps, so it stays closer to the global optimum, though a rate that is too low slows convergence.
- The best regression line is f(x) = 5.63 + 0.54x.
- As in the previous examples, this result heavily depends on the learning rate.
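The points above can be sketched in a few lines of Python and NumPy. This is a minimal illustration, not the tutorial's implementation: it fits a line y = b0 + b1 * x by updating the parameters one observation at a time, with the learning rate scaling every step. The sample data here are hypothetical and noise-free (chosen so the algorithm can settle on an exact optimum), not the data behind the f(x) = 5.63 + 0.54x result.

```python
import numpy as np

def sgd_linear_regression(x, y, learning_rate=0.1, n_epochs=1000, seed=0):
    """Fit y ~ b0 + b1 * x by stochastic gradient descent.

    One parameter update per observation; the learning rate scales
    the size of every update step.
    """
    rng = np.random.default_rng(seed)
    b0, b1 = 0.0, 0.0  # start both parameters at zero
    for _ in range(n_epochs):
        # Visit the observations in a fresh random order each epoch
        for i in rng.permutation(len(x)):
            error = b0 + b1 * x[i] - y[i]           # prediction error
            b0 -= learning_rate * 2 * error         # d(error^2)/d(b0)
            b1 -= learning_rate * 2 * error * x[i]  # d(error^2)/d(b1)
    return b0, b1

# Hypothetical, noise-free data on the line y = 2 + 3x, so SGD can
# converge to the exact optimum (real data would leave residual noise).
x = np.linspace(0.0, 1.0, 20)
y = 2.0 + 3.0 * x
b0, b1 = sgd_linear_regression(x, y)
print(f"f(x) = {b0:.2f} + {b1:.2f}x")  # -> f(x) = 2.00 + 3.00x
```

With too large a learning rate the per-sample updates overshoot and diverge; with too small a rate the intercept in particular takes many more epochs to converge, which is the sensitivity the summary refers to.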