Cross Entropy

Cross Entropy measures the difference between two probability distributions defined over the same random variable or set of events.
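For a true distribution p and a predicted distribution q over the same events x, it is defined as

    H(p, q) = - Σ_x p(x) · log q(x)

Cross entropy equals the entropy of p plus the KL divergence from p to q, so it is smallest when q matches p exactly; this is why it is commonly used as a loss function for classifiers.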

Definitions of Entropy

Entropy is the level of disorder or uncertainty in a dataset or model.
High entropy means outcomes are hard to predict: the data has high variance and carries a lot of information and/or noise, which makes datasets with high entropy harder to model accurately.
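Formally, for a discrete distribution p, entropy is H(p) = - Σ_x p(x) · log p(x). A minimal NumPy sketch (the helper name entropy() is illustrative, not from the text) comparing a fair coin with a biased one:

    import numpy as np

    def entropy(p):
        """H(p) = -sum(p * log2(p)), with 0 * log 0 treated as 0."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                      # drop zero-probability events
        return -np.sum(p * np.log2(p))

    print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximum uncertainty
    print(entropy([0.9, 0.1]))  # biased coin: ~0.47 bits, far more predictable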


Types of cross entropy:

  • Binary cross-entropy: used when there are exactly two classes; each label is a single probability for the positive class.
  • Categorical cross-entropy: used for both binary and multi-class tasks. This type of cross entropy requires the labels to be one-hot encoded.
  • Sparse categorical cross-entropy: used for both binary and multi-class tasks. This type requires each label to be a single integer class index (0, 1, …, n). It is faster than categorical cross-entropy because the labels never need to be one-hot encoded (see the sketch after this list).
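All three variants compute the same quantity; they differ only in how the labels are encoded. A minimal NumPy sketch (all names are illustrative) showing that one-hot and integer labels yield the same loss, plus the binary case:

    import numpy as np

    probs = np.array([[0.7, 0.2, 0.1],   # predicted class probabilities
                      [0.1, 0.8, 0.1]])  # (one row per example)

    one_hot = np.array([[1, 0, 0],       # labels for categorical cross-entropy
                        [0, 1, 0]])
    sparse = np.array([0, 1])            # same labels as integer class indices

    # Categorical: sum over the one-hot vector (mostly multiplications by zero).
    cat_loss = -np.mean(np.sum(one_hot * np.log(probs), axis=1))

    # Sparse: index the true class directly, no one-hot encoding needed.
    sparse_loss = -np.mean(np.log(probs[np.arange(len(sparse)), sparse]))

    print(cat_loss, sparse_loss)         # both ~0.2899

    # Binary: one probability per example for the positive class.
    y, p_hat = np.array([1, 0]), np.array([0.9, 0.2])
    bce = -np.mean(y * np.log(p_hat) + (1 - y) * np.log(1 - p_hat))
    print(bce)                           # ~0.1643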
