---
tags:
- AI/Algorithms/ANN
aliases:
- Backprop
- Backward Propagation of Errors
- Back-Propagation
---

The Backpropagation algorithm is used in Artificial Neural Networks (ANN) to find the minimum of the error function in weight space using the Delta Rule (a Gradient Descent rule). The weights that minimize the error function are then considered a solution to the learning problem.
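In symbols, the Delta Rule update for a single weight can be sketched as follows (the notation $w_{ij}$, $\eta$, and $E$ is assumed here, not taken from this note):

$$
\Delta w_{ij} = -\eta \, \frac{\partial E}{\partial w_{ij}}
$$

where $E$ is the error function, $\eta$ is the learning rate, and the negative sign moves each weight downhill along the gradient of the error.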

Backpropagation makes training deep models computationally tractable: it computes derivatives far more cheaply than naive approaches, which makes training with gradient descent much faster. It requires every computation step to be mathematically differentiable, meaning that for any small change in a parameter we can calculate the corresponding change in the model's error (loss).

TL;DR

The backpropagation algorithm computes the gradient of the loss function with respect to each weight by applying the chain rule.
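The chain-rule idea can be sketched on a single sigmoid neuron with squared-error loss (the function names and parameter values below are illustrative, not from the note):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward_backward(w, b, x, t):
    """One neuron, one example: forward pass, then chain-rule backward pass."""
    # Forward pass: pre-activation, activation, squared error.
    z = w * x + b
    y = sigmoid(z)
    loss = (y - t) ** 2
    # Backward pass: multiply local derivatives from the loss back to each parameter.
    dL_dy = 2.0 * (y - t)      # d(loss)/dy
    dy_dz = y * (1.0 - y)      # sigmoid'(z) expressed via its output
    dL_dz = dL_dy * dy_dz      # chain rule: dL/dz = dL/dy * dy/dz
    dL_dw = dL_dz * x          # dz/dw = x
    dL_db = dL_dz              # dz/db = 1
    return loss, dL_dw, dL_db
```

A quick sanity check is to compare these analytic gradients against a finite-difference estimate of the loss, which should agree to several decimal places.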

Backpropagation-based training proceeds as follows:

- Compute the error (loss) function on the training data.
- Check whether the error is at a minimum.
- If not, update the parameters (weights and biases) using the computed gradients.
- Repeat until the error is minimized.
- Once the error is minimal, the model is ready to make predictions.
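The steps above can be sketched as a training loop for a single sigmoid neuron (a minimal illustration; the model, learning rate, and stopping tolerance are assumptions, not from the note):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.5, tol=1e-2, max_epochs=10000):
    """Train a hypothetical one-weight sigmoid neuron by gradient descent."""
    w, b = 0.0, 0.0
    total_loss = float("inf")
    for epoch in range(max_epochs):
        total_loss = 0.0
        for x, t in data:
            # 1. Forward pass: compute the error for this example.
            y = sigmoid(w * x + b)
            total_loss += (y - t) ** 2
            # 2. Backward pass: gradient of the loss via the chain rule.
            dL_dz = 2.0 * (y - t) * y * (1.0 - y)
            # 3. Update the parameters (Delta Rule / gradient descent step).
            w -= lr * dL_dz * x
            b -= lr * dL_dz
        # 4. Check whether the error is minimal; if so, stop repeating.
        if total_loss < tol:
            break
    return w, b, total_loss
```

On a toy dataset such as `[(0.0, 0.0), (1.0, 1.0)]`, the loop drives the total squared error below the tolerance and then stops, mirroring the "repeat until the error is minimized" step.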
