SGD Classifier and Regressor

A gradient is the vector of partial derivatives of a function with respect to its inputs. More specifically, a gradient measures how the error changes as each weight changes (similar to a slope function). The higher the gradient, the steeper the slope and the faster a model can learn. However, if the slope is zero (or approaches zero), the model stops learning or learns at a very slow rate, which results in an approximation of the minimum.
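The behavior described above can be sketched with a minimal one-dimensional gradient descent loop. The loss function and learning rate here are illustrative choices, not taken from any particular library:

```python
def gradient(w):
    """Derivative of the toy loss f(w) = (w - 3)**2 with respect to w."""
    return 2 * (w - 3)

def gradient_descent(w, learning_rate=0.1, steps=50):
    """Repeatedly step opposite to the slope of the loss."""
    for _ in range(steps):
        w = w - learning_rate * gradient(w)
    return w

# Far from the minimum the slope is steep, so updates are large and
# learning is fast; as w approaches 3 the gradient shrinks toward zero,
# updates stall, and we are left with an approximation of the minimum.
print(round(gradient_descent(w=10.0), 4))  # → 3.0001
```

Note that the update subtracts the gradient: moving against the slope decreases the error, which is why a steeper slope yields a bigger step.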
