Gradient Descent — An Explanation
Gradient Descent is one of the most widely used optimization algorithms out there. Its ease of understanding and its applicability to a wide range of ML algorithms make it an attractive choice. How does it work though? Let’s find out!
We will go through the sections listed below to understand this algorithm. By the end of this article you will have a solid understanding of what the algorithm is and what it does.
- What is Gradient Descent?
- How does it work?
- Key Parameter
- Understanding the result of the algorithm
1. What is Gradient Descent?
The formal definition of gradient descent is as follows:
Gradient Descent is an optimization algorithm for finding a local minimum of a differentiable function.
In simple words, gradient descent is an algorithm that finds the point where the error of the function to which it is applied is close to zero. In ML, gradient descent is widely used to find the values of the coefficients of a cost function at which the error is close to zero. By doing this, we can find the optimal values for the coefficients that minimize the overall cost.
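To make this concrete, here is a minimal sketch in Python. The cost function, learning rate, and iteration count below are illustrative assumptions chosen for this example, not something prescribed by the algorithm itself. It minimizes a simple quadratic cost J(w) = (w − 3)², whose minimum sits at w = 3:

```python
# Minimal gradient descent sketch: minimize J(w) = (w - 3)^2.
# The cost function, starting point, learning rate, and number of
# steps are all illustrative assumptions for this example.

def cost(w):
    return (w - 3) ** 2

def gradient(w):
    # Derivative of (w - 3)^2 with respect to w.
    return 2 * (w - 3)

w = 0.0              # arbitrary starting point
learning_rate = 0.1  # step size (the key parameter discussed later)
for step in range(50):
    # Move in the direction opposite to the gradient (downhill).
    w = w - learning_rate * gradient(w)

print(w)        # ~3.0: the coefficient value that minimizes the cost
print(cost(w))  # ~0.0: the error is close to zero
```

Each iteration nudges the coefficient a small step downhill, so the cost shrinks toward its minimum rather than being solved for in one shot.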
2. How does it work?
To understand how the algorithm works, we first need to understand what a gradient is. In mathematical terms, a gradient is the slope of a function. To put it in simple terms, a gradient measures how much the output of a function changes when you change its inputs by a small amount.
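One way to see this numerically is with a finite-difference approximation. The helper below is a hypothetical illustration written for this article, not part of any library:

```python
def numerical_gradient(f, x, h=1e-6):
    # Approximate the slope of f at x: how much f's output changes
    # per unit change in x, measured over a tiny step h.
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** 2
print(numerical_gradient(f, 3.0))  # ~6.0: the slope of x^2 at x = 3 is 2x = 6
```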