Sci Rep. 2020 Oct 22;10:18074. doi: 10.1038/s41598-020-75029-1

Figure 15.

This figure depicts the relationship between the entropy H(·) and the mutual information I(·,·) of two variables X and Y. I(X,Y) measures the amount of information that one variable contains about the other. (a) I(X,Y) equals zero if and only if X and Y are statistically independent; and (b) I(X,Y) = H(X) − H(X|Y) = H(Y) − H(Y|X), which corresponds to the reduction in the entropy of one variable due to knowledge of the other. Hence, I(X,Y) takes values in the interval 0 ≤ I(X,Y) ≤ min{H(X), H(Y)}; the larger the value of I(X,Y), the more strongly the two variables are related. This figure has been created using the package TikZ (version 0.9f 2018-11-19) in LaTeX, available at https://www.ctan.org/pkg/tikz-cd.
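
As a complement to the caption, the following is a minimal Python sketch (not part of the paper; the function names and the example joint distribution are illustrative assumptions) that computes I(X,Y) for two discrete variables and numerically verifies the identity I(X,Y) = H(X) − H(X|Y) and the bound 0 ≤ I(X,Y) ≤ min{H(X), H(Y)} stated above.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability terms contribute 0."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """I(X,Y) = H(X) + H(Y) - H(X,Y) for a joint distribution pxy[i, j]."""
    px = pxy.sum(axis=1)  # marginal distribution of X
    py = pxy.sum(axis=0)  # marginal distribution of Y
    return entropy(px) + entropy(py) - entropy(pxy)

# Hypothetical joint distribution over two binary variables.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

I = mutual_information(pxy)

# Identity (b): I(X,Y) = H(X) - H(X|Y), with H(X|Y) = H(X,Y) - H(Y).
H_x_given_y = entropy(pxy) - entropy(py)
assert np.isclose(I, entropy(px) - H_x_given_y)

# Bound: 0 <= I(X,Y) <= min{H(X), H(Y)}.
assert 0.0 <= I <= min(entropy(px), entropy(py)) + 1e-12

print(f"I(X,Y) = {I:.4f} bits")  # positive here, since X and Y are correlated
```

For a product distribution (e.g. pxy = np.outer(px, py)), the same computation returns I(X,Y) = 0, matching property (a) in the caption.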