---
tags:
- AI/EvaluationMetrics
aliases:
- Log Loss
- Logistic Loss
---

Cross Entropy measures the difference between two probability distributions for a given random variable or set of events.
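As a minimal sketch of that definition, the cross entropy between a true distribution `p` and a predicted distribution `q` over the same events is `H(p, q) = -Σᵢ pᵢ · log(qᵢ)` (the function name below is illustrative, not from any particular library):

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i * log(q_i) between two
    discrete probability distributions over the same events."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# One-hot true distribution against a model's predicted distribution:
p = [1.0, 0.0, 0.0]
q = [0.7, 0.2, 0.1]
loss = cross_entropy(p, q)  # equals -log(0.7) ≈ 0.357
```

When `p` is one-hot, the sum reduces to the negative log probability the model assigned to the true class, which is why cross entropy is also called log loss.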

Important

Entropy is the level of disorder or uncertainty in a dataset or model. A dataset with high entropy carries more uncertainty, which makes it harder to train an accurate model on it.

Types of cross entropy:

- Binary cross-entropy: often used when there are exactly two classes.
- Categorical cross-entropy: used for both binary and multi-class tasks. This type of cross entropy requires the label to be encoded as a categorical (one-hot) vector.
- Sparse categorical cross-entropy: used for both binary and multi-class tasks. This type of cross entropy requires the label to be an integer class index (0, 1, ..., n). It is faster than categorical cross-entropy because it avoids building one-hot vectors.
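The three variants above can be sketched in plain Python (the function names are illustrative; libraries such as Keras or PyTorch expose their own APIs for these losses):

```python
import math

def binary_cross_entropy(y, p):
    # y is the true label (0 or 1); p is the predicted probability of class 1
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def categorical_cross_entropy(one_hot, probs):
    # label encoded as a one-hot vector, e.g. [0, 1, 0]
    return -sum(t * math.log(q) for t, q in zip(one_hot, probs) if t > 0)

def sparse_cross_entropy(label, probs):
    # label is an integer class index; no one-hot vector is built
    return -math.log(probs[label])
```

Note that the categorical and sparse variants compute the same quantity; they differ only in how the label is encoded, which is where the speed advantage of the sparse form comes from.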

Notes:

- Cross-entropy loss is commonly used as the loss function for classification tasks.
