Learning Rate

Learning Rate (commonly denoted η or α) is a Hyperparameter used in Optimization algorithms that determines the step size at each iteration, i.e. the pace at which the algorithm updates its parameter estimates. In training Artificial Neural Networks (ANNs), it regulates how much the Weights change with respect to the gradient of the loss function.

Notes:

  • Large Learning Rate:
    • Accelerates training.
    • Can overshoot and miss the minimum of the loss function, or even diverge.
  • Small Learning Rate:
    • Slows the training process.
    • Can find a more precise minimum.
    • The solution can get stuck in a local minimum, preventing the weights from updating further.
  • The learning rate controls how quickly the model adapts to the problem.
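The trade-offs above can be seen in a minimal sketch of gradient descent on a toy one-dimensional loss, f(w) = (w - 3)², whose minimum is at w = 3 (the loss function, function name, and learning-rate values here are illustrative assumptions, not from any particular library):

```python
def gradient_descent(lr, steps=50, w=0.0):
    """Toy gradient descent on f(w) = (w - 3)^2, minimized at w = 3."""
    for _ in range(steps):
        grad = 2 * (w - 3)  # derivative of (w - 3)^2
        w -= lr * grad      # update step scaled by the learning rate
    return w

small = gradient_descent(lr=0.01)  # small lr: still far from 3 after 50 steps
good = gradient_descent(lr=0.1)    # moderate lr: converges close to 3
large = gradient_descent(lr=1.1)   # large lr: each step overshoots and diverges
```

With this quadratic loss each update multiplies the distance to the minimum by (1 - 2·lr), so lr = 0.01 shrinks the error slowly, lr = 0.1 shrinks it quickly, and lr = 1.1 makes the factor exceed 1 in magnitude, so the iterates move further from the minimum at every step.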