tags:
- AI/EvaluationMetrics
aliases:
- Log Loss
- Logistic Loss
Cross Entropy measures the difference between two probability distributions over the same random variable or set of events: the larger the cross entropy, the more the predicted distribution diverges from the true one.
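As a minimal sketch (plain Python, names are illustrative), cross entropy between a true distribution `p` and a predicted distribution `q` is H(p, q) = -Σ p(x) · log q(x):

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) * log(q(x)), in nats.

    p: true distribution, q: predicted distribution (same support).
    Terms where p(x) = 0 contribute nothing, so they are skipped.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [1.0, 0.0, 0.0]          # true one-hot label
q_good = [0.9, 0.05, 0.05]   # confident, correct prediction
q_bad = [0.1, 0.8, 0.1]      # confident, wrong prediction
print(cross_entropy(p, q_good))  # ≈ 0.105 (low loss)
print(cross_entropy(p, q_bad))   # ≈ 2.303 (high loss)
```

A better prediction yields a lower cross entropy, which is why it works as a loss function.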
Entropy is the level of uncertainty in a random variable's outcomes: it is highest when all outcomes are equally likely and zero when the outcome is certain.
High entropy means outcomes are hard to predict, so each observation carries a lot of information and/or noise; a high-entropy target is harder to model accurately, but the difficulty comes from that irreducible uncertainty, not from entropy being inherently bad for modelling.
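A small sketch of Shannon entropy, H(p) = -Σ p(x) · log₂ p(x), showing that the uniform (most uncertain) distribution has the highest entropy:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_x p(x) * log2(p(x)), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit (maximum for 2 outcomes)
print(entropy([0.9, 0.1]))   # biased coin: ≈ 0.469 bits
print(entropy([1.0, 0.0]))   # certain outcome: 0.0 bits
```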
Types of cross entropy:
- Binary cross entropy (log loss): two classes, labels in {0, 1}
- Categorical cross entropy: multi-class, one-hot encoded labels
- Sparse categorical cross entropy: multi-class, integer class labels
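The binary case above can be sketched as follows (a hedged illustration, not a library implementation; predictions are clipped away from 0 and 1 to avoid log(0)):

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean log loss: -(1/N) * sum(y*log(p) + (1-y)*log(1-p))."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)  # clip for numerical stability
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return -total / len(y_true)

print(binary_cross_entropy([1, 0, 1], [0.9, 0.1, 0.8]))  # ≈ 0.145
```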
Notes:
Related: