Cross entropy
In information theory, the cross entropy between two probability distributions p and q over the same set of events measures the average number of bits needed to identify an event drawn from p when the coding scheme is optimized for q rather than for p; formally, H(p, q) = −Σ_x p(x) log q(x). It is widely applied in machine learning, especially as a loss function in classification problems, where minimizing cross entropy is equivalent to maximum likelihood estimation.
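As a minimal sketch of the definition, the following Python function computes the cross entropy of two discrete distributions given as probability lists (the function name and the base-2 logarithm, giving a result in bits, are choices made here for illustration):

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) * log2(q(x)), in bits.

    Terms with p(x) == 0 contribute nothing and are skipped,
    following the convention 0 * log 0 = 0.
    """
    return -sum(px * math.log2(qx) for px, qx in zip(p, q) if px > 0)

# When q equals p, cross entropy reduces to the entropy of p:
# for a fair coin this is 1 bit.
print(cross_entropy([0.5, 0.5], [0.5, 0.5]))  # 1.0

# When q differs from p, extra bits are needed on average;
# cross entropy is never smaller than the entropy of p.
print(cross_entropy([0.9, 0.1], [0.5, 0.5]))  # 1.0, versus H(p) ≈ 0.469
```

The gap between H(p, q) and the entropy H(p) is the Kullback-Leibler divergence, which is why minimizing cross entropy against a fixed target distribution also minimizes that divergence.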