2010 Jan 20;2009:642524. doi: 10.1155/2009/642524

Table 1.

Concepts of entropy and mutual information (MI) as defined by Shannon's theory of information.

Concepts of Shannon's theory of information and their descriptions:

H(X) = -\sum_x p(x) \log_2 p(x)
    The uncertainty of a random variable X is measured by its entropy H(X); p(x) is the probability density of X.

H(X|Y) = -\sum_{x,y} p(x,y) \log_2 p(x|y)
    The uncertainty of a random variable X given knowledge of another random variable Y is measured by the conditional entropy H(X|Y).

H(X,Y) = -\sum_{x,y} p(x,y) \log_2 p(x,y)
    The uncertainty of a pair of random variables (X, Y) is measured by the joint entropy H(X,Y).

H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
    The chain rule relating the joint, marginal, and conditional entropies.

MI(X;Y) = \sum_{x,y} p(x,y) \log_2 [ p(x,y) / (p(x) p(y)) ]
    Given two random variables X and Y, the amount of information that each one provides about the other is the mutual information MI(X;Y).

MI(X;Y) = H(X) + H(Y) - H(X,Y)
    Mutual information expressed in terms of the marginal and joint entropies.
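
To make these definitions concrete, the following is a minimal Python sketch (not part of the original article) that estimates H(X), H(Y), H(X,Y), and MI(X;Y) in bits from paired samples using a simple histogram (plug-in) estimate of the joint distribution. The function name entropy_and_mi, the bin count, and the correlated Gaussian example data are illustrative assumptions, not quantities from the source.

    import numpy as np

    def entropy_and_mi(x, y, bins=8):
        """Estimate H(X), H(Y), H(X,Y), and MI(X;Y) in bits from paired samples."""
        # Joint probability table from a 2-D histogram of the samples
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        p_xy = joint / joint.sum()
        p_x = p_xy.sum(axis=1)   # marginal distribution of X
        p_y = p_xy.sum(axis=0)   # marginal distribution of Y

        def H(p):
            p = p[p > 0]                      # treat 0 * log2(0) as 0
            return -np.sum(p * np.log2(p))

        h_x, h_y, h_xy = H(p_x), H(p_y), H(p_xy.ravel())
        mi = h_x + h_y - h_xy                 # MI(X;Y) = H(X) + H(Y) - H(X,Y)
        return h_x, h_y, h_xy, mi

    # Example: two correlated Gaussian variables
    rng = np.random.default_rng(0)
    x = rng.normal(size=5000)
    y = 0.8 * x + 0.6 * rng.normal(size=5000)
    h_x, h_y, h_xy, mi = entropy_and_mi(x, y)
    print(f"H(X)={h_x:.3f}  H(Y)={h_y:.3f}  H(X,Y)={h_xy:.3f}  MI={mi:.3f} bits")

The printed values illustrate the identity in the last table row: the estimated MI(X;Y) equals H(X) + H(Y) - H(X,Y) by construction, and it is larger the more strongly X and Y are statistically dependent.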