The uncertainty of a random variable X is measured by its entropy H(X), where p(x) is the probability density of X:

H(X) = −∫ p(x) log p(x) dx   (in the discrete case, −∑ₓ p(x) log p(x))
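A minimal sketch of the discrete case in Python, assuming p is given as a list of probabilities and taking logs base 2 so the result is in bits (the helper name is ours, not from the text):

    import math

    def entropy(p):
        # H(X) = -sum over x of p(x) * log2 p(x); terms with p(x) = 0 contribute 0
        return -sum(q * math.log2(q) for q in p if q > 0)

    print(entropy([0.5, 0.5]))   # a fair coin: 1.0 bit
    print(entropy([0.9, 0.1]))   # a biased coin: ~0.469 bits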
The uncertainty of a random variable X given knowledge of another random variable Y is measured by the conditional entropy H(X ∣ Y)

The uncertainty of a pair of random variables X, Y is measured by the joint entropy H(X, Y), which satisfies the chain rule:

H(X, Y) = H(X) + H(Y ∣ X) = H(Y) + H(X ∣ Y)
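Continuing the sketch for a discrete pair (X, Y) whose joint pmf p(x, y) is given as a 2-D table (rows index x, columns index y), with an assert checking the chain rule above; the function names and the example table are ours:

    import math

    def entropy(p):
        return -sum(q * math.log2(q) for q in p if q > 0)

    def joint_entropy(pxy):
        # H(X, Y): the joint pmf flattened into a single distribution
        return entropy([q for row in pxy for q in row])

    def cond_entropy_y_given_x(pxy):
        # H(Y | X) = -sum over x, y of p(x,y) * log2( p(x,y) / p(x) )
        px = [sum(row) for row in pxy]
        return -sum(q * math.log2(q / px[i])
                    for i, row in enumerate(pxy) for q in row if q > 0)

    pxy = [[0.25, 0.25],
           [0.40, 0.10]]
    px = [sum(row) for row in pxy]

    # Chain rule: H(X, Y) = H(X) + H(Y | X)
    assert abs(joint_entropy(pxy) - (entropy(px) + cond_entropy_y_given_x(pxy))) < 1e-9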
Given two random variables X and Y, the amount of information that each one of them provides about the other is the mutual information MI(X; Y):

MI(X; Y) = H(X) + H(Y) − H(X, Y)
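A sketch of this identity, reusing the entropy and joint_entropy helpers from the block above; the marginals p(x) and p(y) are the row and column sums of the joint table:

    def mutual_information(pxy):
        # MI(X; Y) = H(X) + H(Y) - H(X, Y)
        px = [sum(row) for row in pxy]           # marginal p(x): row sums
        py = [sum(col) for col in zip(*pxy)]     # marginal p(y): column sums
        return entropy(px) + entropy(py) - joint_entropy(pxy)

    # Independent variables share no information ...
    print(mutual_information([[0.25, 0.25],
                              [0.25, 0.25]]))    # 0.0

    # ... while perfectly correlated ones share everything: MI = H(X) = 1 bit
    print(mutual_information([[0.5, 0.0],
                              [0.0, 0.5]]))      # 1.0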