Softmax

This function calculates the probability of each class over all possible target classes. The probabilities sum to one, and the class with the highest probability is taken as the predicted class.

[Image: activation-functions-softmax.png]
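
In symbols, for a vector of scores z = (z_1, …, z_K), the i-th output is softmax(z)_i = exp(z_i) / Σ_j exp(z_j). The short NumPy sketch below (the scores are made-up example values) implements this with the usual max-subtraction trick to avoid overflow, and shows that the outputs sum to one and that the largest of them marks the predicted class.

    import numpy as np

    def softmax(z):
        # Subtracting the maximum does not change the result but avoids overflow in exp.
        exps = np.exp(z - np.max(z))
        return exps / np.sum(exps)

    logits = np.array([2.0, 1.0, 0.1])   # illustrative scores for three classes
    probs = softmax(logits)
    print(probs)             # approximately [0.659 0.242 0.099]
    print(probs.sum())       # 1.0 (up to floating-point error)
    print(np.argmax(probs))  # 0 -> the class with the highest probability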


Notes:

  • In multi-class classification, the softmax activation function is most commonly used in the last layer of the neural network.
  • Softmax is suited to single-label, multi-class problems; for multi-label classification or regression, other output activations (such as a sigmoid per label, or a linear output) are usually used instead.
  • Its output mimics a one-hot encoded label more closely than raw score values do.
  • Softmax is differentiable, meaning we can calculate how much each output element will change, given a small change in any of the input elements (see the sketch after this list).
  • Using absolute (modulus) values to force the scores to be positive would lose the sign information; the exponential keeps every output positive while preserving the ordering of the inputs.
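
As a companion to the differentiability note, the sketch below (again with made-up inputs) builds the full Jacobian of the softmax, ∂s_i/∂z_j = s_i(δ_ij − s_j), which quantifies how much each output changes for a small change in each input.

    import numpy as np

    def softmax_jacobian(z):
        # Softmax of z (stabilised by subtracting the max), then its Jacobian:
        # d s_i / d z_j = s_i * (delta_ij - s_j).
        s = np.exp(z - np.max(z))
        s /= s.sum()
        return np.diag(s) - np.outer(s, s)

    J = softmax_jacobian(np.array([2.0, 1.0, 0.1]))  # illustrative inputs
    print(J.shape)        # (3, 3)
    print(J.sum(axis=0))  # each column sums to ~0, because the outputs always sum to one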