Author manuscript; available in PMC: 2022 Apr 1.
Published in final edited form as: Trends Cancer. 2021 Feb 20;7(4):335–346. doi: 10.1016/j.trecan.2020.12.013

Table 1.

Definitions, Meanings, and Applications of Information Theory Concepts to the Study of the Immune System and Immuno-Oncology

| Concept | Mathematical definition | Meaning | Application | Refs |
| --- | --- | --- | --- | --- |
| Entropy, maximum entropy | H(X) = −∑_{x∈X} P(x) log P(x) | System organization/disorganization | T-cell receptor diversity; information capacity; EGFR/Akt signaling; carcinogenesis | [28]; [35]; [29]; [28–30] |
| Mutual information | I(X;Y) = H(X) − H(X\|Y), where H(X\|Y) = −∑_{x∈X, y∈Y} P(x,y) log [P(x,y)/P(y)] | Information shared between variables | Biomarker cellular patterns; T-cell activation and spatial organization in lymph nodes; cytokines and protein interaction networks; machine learning | [31]; [32]; [33,34,36]; [44] |
| Cross and relative entropy | CE(P‖Q) = ∑_{x∈X} P(x) log [P(x)/Q(x)] − ∑_{x∈X} P(x) log P(x) | Information shared between distributions of variables | Biomarker identification; comparison of transcriptional states (e.g., CD8+ vs. CD4+); machine learning | [40]; [42]; [60–63]; [41] |
| Channel capacity^a | C = sup_{P_X(x)} I(X;Y) | Maximum rate at which information can be reliably transmitted over a communication channel | JAK/STAT signaling; cytokine signaling, gene expression; intrinsic and extrinsic noise in signaling | [34]; [16,53]; [54,58] |
| Information transfer and flow^b | dx/dt = F(x,t); dH/dt = E(∇·F) | Transfer of information between correlated or uncorrelated variables over time | Spatial and temporal dynamics; information flow in dynamic systems | [60–63]; [73] |
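The entropy and mutual information rows above can be checked numerically. A minimal NumPy sketch, using a toy joint distribution (illustrative probabilities, not data from the cited studies), verifies the table's identity I(X;Y) = H(X) − H(X|Y):

```python
import numpy as np

# Toy joint distribution P(x, y) over two binary variables (illustrative
# values only): rows index x, columns index y.
P = np.array([[0.30, 0.10],
              [0.15, 0.45]])

Px = P.sum(axis=1)  # marginal P(x)
Py = P.sum(axis=0)  # marginal P(y)

def entropy(p):
    """Shannon entropy H = -sum p log2 p, skipping zero-probability states."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X = entropy(Px)
# Conditional entropy H(X|Y) = -sum_{x,y} P(x,y) log2[P(x,y)/P(y)]
H_X_given_Y = -np.sum(P * np.log2(P / Py))
# Mutual information via the table's identity I(X;Y) = H(X) - H(X|Y)
I_XY = H_X - H_X_given_Y

# Cross-check against the direct form I = sum P(x,y) log2[P(x,y)/(P(x)P(y))]
I_direct = np.sum(P * np.log2(P / np.outer(Px, Py)))
assert np.isclose(I_XY, I_direct)
print(f"H(X) = {H_X:.4f} bits, H(X|Y) = {H_X_given_Y:.4f} bits, "
      f"I(X;Y) = {I_XY:.4f} bits")
```

Conditioning on Y can only reduce the entropy of X, so I(X;Y) is nonnegative and measures exactly the information the two variables share.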
^a Sup is the supremum, or least upper bound, of the mutual information I(X;Y) over the marginal distribution P_X(x).

^b F is the vector field of a dynamical system (dx/dt = F(x,t)); dH/dt is the time evolution of the entropy H, equal to the expectation (E) of the divergence (∇·) of the vector field F.
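The cross-entropy definition in the table decomposes as CE(P‖Q) = D(P‖Q) + H(P), i.e., relative entropy (KL divergence) plus the Shannon entropy of P. A short sketch with two toy categorical distributions (illustrative values only, e.g., state frequencies in two cell populations) confirms the identity:

```python
import numpy as np

# Two toy categorical distributions P and Q over four states
# (illustrative values, not from the cited studies).
P = np.array([0.50, 0.25, 0.15, 0.10])
Q = np.array([0.30, 0.30, 0.20, 0.20])

H_P = -np.sum(P * np.log2(P))        # Shannon entropy H(P)
KL_PQ = np.sum(P * np.log2(P / Q))   # relative entropy D(P||Q)
CE_PQ = -np.sum(P * np.log2(Q))      # cross entropy CE(P||Q)

# Table identity: CE(P||Q) = sum P log(P/Q) - sum P log P = D(P||Q) + H(P)
assert np.isclose(CE_PQ, KL_PQ + H_P)
print(f"H(P) = {H_P:.4f}, D(P||Q) = {KL_PQ:.4f}, CE(P||Q) = {CE_PQ:.4f} bits")
```

Because D(P‖Q) ≥ 0 with equality only when P = Q, the cross entropy is always at least H(P); the excess quantifies how far the two distributions (e.g., two transcriptional states) diverge.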