Cross-Entropy
Cross-entropy measures the average number of bits required to identify an event drawn from a true distribution p when using a coding scheme optimised for a different distribution q. It is minimised, and equals the entropy of p, exactly when q = p.
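As a minimal sketch of this idea, the snippet below computes H(p, q) = -Σ p(x) log2 q(x) for two small discrete distributions; the example coin distributions are illustrative, not from the original text:

```python
import math

def cross_entropy(p, q):
    """Average bits to encode events drawn from p with a code optimised for q.

    H(p, q) = -sum_x p(x) * log2(q(x)); outcomes with p(x) = 0 contribute nothing.
    """
    return -sum(px * math.log2(qx) for px, qx in zip(p, q) if px > 0)

# Encoding a fair coin (p) with a code tuned for a biased coin (q).
p = [0.5, 0.5]
q = [0.9, 0.1]
print(cross_entropy(p, p))  # 1.0 bit: the matched code achieves the entropy of p
print(cross_entropy(p, q))  # ~1.737 bits: the mismatched code costs extra bits
```

The gap between the two values is the extra cost of using the wrong code (the KL divergence from p to q).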
Entropy measures the uncertainty in a random variable's possible outcomes.
It is highest when there are many equally likely outcomes. As you introduce more …