The gradient at any layer of an Artificial Neural Network (ANN) is the product of the gradients of the layers that come after it, computed via the chain rule during backpropagation.

Notes:

- In Machine Learning, a gradient is the derivative of a function with more than one input variable. Known as the slope of a function in mathematical terms, the gradient measures how the error changes with respect to each weight.
- Getting backpropagation to behave well requires smooth gradients, that is, the slope should not change abruptly as you take small steps in any direction.
- Vanishing Gradients and Exploding Gradients are two common issues that arise from this repeated multiplication.
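The repeated multiplication described above can be sketched numerically. This is a minimal illustration, not a real network: it assumes a hypothetical 20-layer network where each layer contributes one local derivative, and it evaluates the sigmoid derivative at a fixed pre-activation of 0 for simplicity.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # maxes out at 0.25, when x = 0

DEPTH = 20  # assumed number of layers, for illustration only

# Vanishing gradient: multiplying 20 sigmoid derivatives (each <= 0.25)
# drives the gradient reaching the earliest layer toward zero.
vanishing_grad = 1.0
for _ in range(DEPTH):
    vanishing_grad *= sigmoid_derivative(0.0)

# Exploding gradient: if each layer also multiplies by a large weight
# (here a hypothetical constant weight of 5.0), the product blows up.
exploding_grad = 1.0
for _ in range(DEPTH):
    exploding_grad *= 5.0 * sigmoid_derivative(0.0)

print(vanishing_grad)  # tiny value, effectively zero
print(exploding_grad)  # large value, grows with depth
```

The two loops show why early layers train slowly (their updates are scaled by a near-zero gradient) or diverge (updates scaled by a huge gradient), depending on the activation function and weight magnitudes.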
