Exploding Gradients

Exploding gradients occur when large error gradients accumulate during training, producing very large updates to a neural network's weights. The network becomes unstable and training may fail to converge. In extreme cases, the weight values grow until they overflow numerically, and the loss becomes NaN.
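The mechanics can be sketched numerically. Below is a minimal, hypothetical example (the matrix size, scale, and depth are illustrative, not from the original text): backpropagation through a deep linear chain multiplies the gradient by the layer's weight matrix at every step, so when that matrix amplifies vectors, the gradient norm grows roughly geometrically with depth. The last lines show norm-based gradient clipping, a common mitigation that rescales the gradient when its norm exceeds a threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
depth = 50
# Hypothetical deep linear chain: the backprop Jacobian at each layer is W^T.
# A scale noticeably above 1 makes W amplify vectors, so gradients explode.
W = rng.normal(scale=1.5, size=(4, 4))

grad = np.ones(4)          # gradient arriving at the output layer
norms = []
for _ in range(depth):
    grad = W.T @ grad      # backprop multiplies by W^T at every layer
    norms.append(np.linalg.norm(grad))

# The gradient norm grows roughly geometrically across layers.
print(f"norm after layer 1:  {norms[0]:.3e}")
print(f"norm after layer {depth}: {norms[-1]:.3e}")

# Gradient clipping: rescale the gradient when its norm exceeds a threshold.
max_norm = 1.0
norm = np.linalg.norm(grad)
if norm > max_norm:
    grad = grad * (max_norm / norm)
print(f"norm after clipping: {np.linalg.norm(grad):.3f}")
```

Clipping does not remove the underlying instability, but it caps the size of any single weight update, which is often enough to keep training from diverging.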