
Appendix - Gradient Descent

Gradient descent is an algorithm that numerically estimates where a function outputs its lowest values. Instead of solving for the minimum with symbols, gradient descent approximates a local minimum with numbers. For a simple function like f(x) = x^2 - 4x, we could solve symbolically by setting ∇f = 0 and find that x = 2 minimizes f(x). Alternatively, gradient descent could approximate that minimum numerically, returning something like x ≈ 1.99999967. When a function has hundreds, thousands, or millions of features, manipulating symbols isn't feasible, but gradient descent still provides estimates no matter how complex the function is.
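The idea above can be sketched in a few lines of code: repeatedly step opposite the gradient until the estimate settles near the minimum. This is a minimal illustration, not the page's own implementation; the function name, learning rate, and step count are choices made for this example.

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=1000):
    """Repeatedly step x in the direction opposite the gradient."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)
    return x

# For f(x) = x^2 - 4x, the gradient is f'(x) = 2x - 4.
# The true minimum is at x = 2; the numeric estimate lands very close.
x_min = gradient_descent(lambda x: 2 * x - 4, x0=0.0)
print(x_min)  # approximately 2
```

The same loop works unchanged for functions of many variables: replace the scalar `x` with a vector and `grad` with the vector of partial derivatives, and each iteration still moves the estimate downhill.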

[Image: gradient descent in the general case]