Backpropagation

Backpropagation (backprop) makes training deep models computationally tractable: it computes the derivatives needed by gradient descent far more efficiently than naive approaches, which greatly speeds up training. It depends on every step of the computation being mathematically differentiable, meaning that for any small change in a parameter we can calculate the corresponding change in the model's error, or loss.
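
As a minimal illustration (not from the original text, and using arbitrary example values): for a single neuron y = w*x + b with squared-error loss L = (y - t)^2, the chain rule gives dL/dw = 2*(y - t)*x. Backprop applies this rule step by step through the whole computation; the sketch below checks the analytic derivative against a finite-difference estimate, showing how a small change in the parameter w produces a predictable change in the loss.

```python
def loss(w, b, x, t):
    y = w * x + b          # forward pass: prediction
    return (y - t) ** 2    # squared-error loss

# Arbitrary example values, for illustration only
w, b, x, t = 0.5, 0.1, 2.0, 1.0

# Chain rule (what backprop computes): dL/dw = dL/dy * dy/dw = 2*(y - t) * x
y = w * x + b
grad_w = 2 * (y - t) * x

# Finite-difference check: nudge w slightly and observe the change in loss
eps = 1e-6
numeric_grad_w = (loss(w + eps, b, x, t) - loss(w - eps, b, x, t)) / (2 * eps)

print(f"analytic dL/dw = {grad_w:.6f}")
print(f"numeric  dL/dw = {numeric_grad_w:.6f}")
```

The two estimates agree closely, which is exactly what differentiability buys us: the analytic gradient tells us, without re-running the model many times, how the loss responds to a small change in each parameter.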