Entropy Explained!
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent in the variable's possible outcomes.
For example, consider the entropy of a coin toss, where the probability of landing on heads is p and on tails is 1 - p. Its entropy is H(p) = -p log2(p) - (1 - p) log2(1 - p).
The entropy is maximized at p = 1/2 (maximum uncertainty, since both outcomes are equally likely), where it equals 1 bit.
The entropy is zero bits when p = 0 or p = 1 (this is sometimes referred to as unity): there is no uncertainty at all, no freedom of choice, and no information, because the outcome is certain. Other values of p give entropies between zero and one bits.
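As a quick check, here is a minimal Python sketch (not part of the cited article) that evaluates the binary entropy formula for a few values of p; it reports 1 bit at p = 0.5 and 0 bits at p = 0 and p = 1.

```python
import math

def binary_entropy(p):
    """Entropy, in bits, of a coin that lands heads with probability p."""
    if p == 0 or p == 1:
        return 0.0  # the outcome is certain, so there is no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}  ->  H = {binary_entropy(p):.3f} bits")
```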
Source: https://en.wikipedia.org/wiki/Entropy_(information_theory)