Entropy
Entropy quantifies the uncertainty of a random variable.
Description
Entropy measures the average information needed to encode an event drawn from a probability distribution. For a discrete distribution p it is H(p) = -Σ_x p(x) log p(x); with base-2 logarithms it is measured in bits.
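As a minimal sketch of this formula (the function name, the base-2 convention, and the example distributions are illustrative assumptions, not part of the original text):

```python
import math

def entropy(p, base=2):
    """Shannon entropy of a discrete distribution given as probabilities."""
    # Terms with p(x) = 0 contribute nothing, so they are skipped.
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # ~0.469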
Cross-entropy
Cross-entropy compares two distributions. It is typically framed as the average information needed to encode an event using a code optimized for a distribution q, when the events are actually drawn from the true distribution p: H(p, q) = -Σ_x p(x) log q(x).
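A minimal sketch of the definition above (the function name and the example distributions are assumptions for illustration):

```python
import math

def cross_entropy(p, q, base=2):
    """Cross-entropy H(p, q) = -sum_x p(x) * log q(x)."""
    return -sum(pi * math.log(qi, base) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # true distribution
q = [0.9, 0.1]   # mismatched reference distribution
print(cross_entropy(p, p))   # equals the entropy of p: 1.0
print(cross_entropy(p, q))   # ~1.737, >= H(p) whenever q differs from p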
Cross-entropy is straightforward to use as a loss function for estimation: it measures how well a model's predicted class probabilities match the true labels. In the field of classification trees, this is referred to as either cross-entropy loss or log loss, as in the sketch below.
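A minimal sketch of binary log loss (the function name, the natural-log convention, and the clipping constant eps are illustrative assumptions):

```python
import math

def log_loss(y_true, y_prob, eps=1e-15):
    """Average cross-entropy (log loss) for binary labels and predicted probabilities."""
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)   # clip to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Confident, correct predictions give a small loss; confident mistakes are punished heavily.
print(log_loss([1, 0, 1], [0.9, 0.1, 0.8]))   # ~0.145
```

The clipping step reflects a common practical choice: without it, a predicted probability of exactly 0 or 1 on a mismatched label would make the loss infinite.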
