Proc Natl Acad Sci U S A. 2016 May 6;113(21):5841–5846. doi: 10.1073/pnas.1520969113

Fig. 1.

Information theory determines the capacity of systems of specific interactions. (A) A model system of locks (black), each binding with energy s to its specific key (gray) via some specific interaction. (B) As the number N of lock–key pairs increases, noncognate locks and keys inevitably begin to resemble each other as they fill the finite space of all possible components (boxes), whether the lock–key pairs are designed optimally or chosen at random (circles). Consequently, the mutual information I between bound locks and keys rises with N for small N but reaches a point of diminishing returns at N = N_C; because off-target binding energies rise rapidly, I can no longer increase and, for randomly chosen pairs, typically decreases. The largest achievable value of I is the capacity C. (C) The capacity C can be estimated from the distribution ρ(Δ) of the gap Δ = w − s between the off-target binding energy w and the on-target binding energy s for randomly generated lock–key pairs. Among the three distinct ρ(Δ) shown, the blue distributions have the same effective gap Δ̂, defined by βΔ̂ = log ∫ dΔ e^{βΔ} ρ(Δ). (C, Inset) I(N) (compare Eq. 3) is the same for the blue distributions, which, despite being markedly different in shape, share the same Δ̂; this effective gap captures the essential aspects of crosstalk.
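The effective gap Δ̂ in (C) is an exponential (Boltzmann-weighted) average over the gap distribution, so it can be estimated directly from samples of Δ. The sketch below is a minimal illustration, assuming β = 1 and two Gaussian gap distributions chosen here for demonstration (they are not distributions from the paper); they are tuned to differ in shape while sharing the same Δ̂, mirroring the blue curves in Fig. 1C. For a Gaussian with mean μ and standard deviation σ, the exponential average gives Δ̂ = μ + βσ²/2.

```python
import numpy as np

def effective_gap(gaps, beta=1.0):
    """Estimate the effective gap: Δ̂ = (1/β) log⟨e^{βΔ}⟩ over sampled gaps Δ."""
    return np.log(np.mean(np.exp(beta * gaps))) / beta

rng = np.random.default_rng(0)
beta = 1.0

# Illustrative (hypothetical) gap distributions with equal Δ̂ = μ + βσ²/2:
# narrow Gaussian: μ = −3.0, σ = 0.5  →  Δ̂ = −3.0 + 0.125 = −2.875
narrow = rng.normal(-3.0, 0.5, size=1_000_000)
# wide Gaussian:   μ = −4.0, σ = 1.5  →  Δ̂ = −4.0 + 1.125 = −2.875
wide = rng.normal(-4.0, 1.5, size=1_000_000)

print(effective_gap(narrow, beta))  # ≈ −2.875
print(effective_gap(wide, beta))    # ≈ −2.875
```

Because the average is exponentially weighted, the rare large-Δ tail of ρ(Δ) dominates Δ̂, which is why distributions of very different shape can still produce identical crosstalk behavior and identical I(N).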