Author manuscript; available in PMC: 2013 Nov 7.
Published in final edited form as: Phys Biol. 2012 Aug 7;9(4):10.1088/1478-3975/9/4/045011. doi: 10.1088/1478-3975/9/4/045011

Figure 2. (A) Schematic of a communication channel.


A basic communication channel can be described by an input random variable S connected by a channel to an output random variable R, such that the outcome of R depends on S subject to the distorting influence of noise. In information theory the channel's internal complexity can be treated as a "black box", since the internal details are fully captured by the joint distribution between R and S. (B) Entropy of a Bernoulli random variable as a function of its success probability p. This concave-down curve illustrates that entropy is maximal when both outcomes are equally probable (p = 0.5) and minimal when the outcome is predetermined (p = 0 or 1).
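The entropy curve in panel (B) follows directly from the binary entropy function H(p) = −p log₂ p − (1 − p) log₂(1 − p). A minimal sketch (the function name `binary_entropy` is ours, not from the source):

```python
import math

def binary_entropy(p):
    """Entropy, in bits, of a Bernoulli random variable with success probability p."""
    if p == 0 or p == 1:
        return 0.0  # outcome is predetermined: no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy peaks at 1 bit when both outcomes are equally likely (p = 0.5)
# and falls to 0 as the outcome becomes certain (p = 0 or 1).
print(binary_entropy(0.5))  # 1.0
print(binary_entropy(0.0))  # 0.0
print(round(binary_entropy(0.1), 3))
```

Evaluating the function on a grid of p values reproduces the concave-down shape shown in the figure.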