Introduction to Entropy

Learn the basics of entropy and how to calculate it.

What does entropy show?

Another approach to understanding the importance of various features is the measure of entropy. Entropy is a quantity developed in information theory that measures the predictability of an event. An event with high entropy is hard to predict, while one with low entropy is easy to predict.

For an intuitive understanding of this concept, consider a coin toss where the coin has a 50/50 chance of landing on heads or tails. This event has high entropy because there is no way to predict the outcome with a probability better than 0.5. If the coin were biased to land on heads 90% of the time, the entropy would be lower, because predicting heads would be right most of the time. Extending this example to features: a high-entropy feature splits the population of observations roughly in half, while a low-entropy feature splits the population in a much more skewed manner.
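To make the calculation concrete, the standard information-theoretic definition is Shannon entropy, H(X) = -Σ p(x) log₂(p(x)), summed over the possible outcomes x and measured in bits. Below is a minimal Python sketch of this formula applied to the fair and biased coins above; the entropy helper is illustrative, not from any particular library.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)).
    Outcomes with zero probability contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: maximally unpredictable, so entropy is at its peak (1 bit).
print(entropy([0.5, 0.5]))  # 1.0

# Biased coin (90% heads): easier to predict, so entropy drops.
print(entropy([0.9, 0.1]))  # ~0.469
```

Note that the fair coin yields exactly 1 bit, the maximum for a two-outcome event, while the 90/10 coin yields roughly 0.47 bits, matching the intuition that skewed distributions are more predictable.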
