Table 1.
Concept | Mathematical definition | Meaning | Application | Refs |
---|---|---|---|---|
Entropy, maximum entropy | H(X) = −∑_{x∈X} P(x) log P(x) | System organization/disorganization | T-cell receptor diversity; information capacity; EGFR/Akt signaling; carcinogenesis | [28]; [35]; [29]; [28–30] |
Mutual information | I(X;Y) = H(X) − H(X\|Y) | Information shared between variables | Biomarker cellular patterns; T-cell activation and spatial organization in lymph nodes; cytokines and protein interaction networks; machine learning | [31]; [32]; [33,34]; [36]; [44] |
Cross and relative entropy | H(P,Q) = −∑_x P(x) log Q(x); D_KL(P‖Q) = ∑_x P(x) log [P(x)/Q(x)] | Information shared between distributions of variables | Biomarker identification; comparison of transcriptional states (e.g., CD8+ vs. CD4+); machine learning | [40]; [42]; [60–63]; [41] |
Channel capacity^a | C = sup_{P_X(x)} I(X;Y) | Maximum rate at which information can be reliably transmitted over a communication channel | JAK/STAT signaling; cytokine signaling and gene expression; intrinsic and extrinsic noise in signaling | [34]; [16,53]; [54,58] |
Information transfer and flow^b | dH/dt = E[∇·F] | Transfer of information between correlated or uncorrelated variables over time | Spatial and temporal dynamics; information flow in dynamic systems | [60–63]; [73] |
^a sup denotes the supremum, or least upper bound, of the mutual information I(X;Y) taken over the marginal (input) distribution P_X(x).
^b F is the vector field of a dynamical system (dx/dt = F); dH/dt, the time evolution of the entropy H, equals the expectation E of the divergence ∇·F of the vector field F.
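To make the first three rows concrete, the following is a minimal Python sketch (not taken from the cited works) of entropy, mutual information, and relative entropy for discrete distributions. The function names and the toy joint distribution are illustrative assumptions.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum_x P(x) log2 P(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                               # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def mutual_information(pxy):
    """I(X;Y) = H(X) - H(X|Y), computed from a joint table P(x, y)
    via the equivalent identity I = H(X) + H(Y) - H(X,Y)."""
    pxy = np.asarray(pxy, dtype=float)
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)  # marginals P(x), P(y)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

def kl_divergence(p, q):
    """Relative entropy D_KL(P || Q) = sum_x P(x) log2 [P(x)/Q(x)]."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                               # terms with P(x) = 0 contribute 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Toy joint distribution over two binary variables (illustrative only)
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(entropy(pxy.sum(axis=1)))                 # H(X) = 1.0 bit
print(mutual_information(pxy))                  # ~0.278 bits shared between X and Y
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))    # ~0.531 bits
```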
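The channel capacity C = sup_{P_X(x)} I(X;Y) is rarely available in closed form. A standard way to compute it numerically for a discrete memoryless channel is the Blahut–Arimoto iteration; the sketch below is an assumed implementation (the cited studies do not prescribe this code), and `channel_capacity` and its parameters are hypothetical names.

```python
import numpy as np

def channel_capacity(W, tol=1e-12, max_iter=10_000):
    """Blahut-Arimoto estimate of C = sup_{P_X} I(X;Y), in bits.

    W[x, y] holds the channel law P(y|x); each row must sum to 1.
    """
    W = np.asarray(W, dtype=float)
    r = np.full(W.shape[0], 1.0 / W.shape[0])    # input guess P_X, start uniform
    for _ in range(max_iter):
        q = r[:, None] * W                       # posterior P(x|y), up to normalization
        q /= q.sum(axis=0, keepdims=True)
        with np.errstate(divide="ignore"):       # 0 log 0 treated as 0 below
            log_q = np.where(W > 0, np.log(q), 0.0)
        r_new = np.exp((W * log_q).sum(axis=1))  # r(x) proportional to prod_y q(x|y)^W(x,y)
        r_new /= r_new.sum()
        if np.abs(r_new - r).max() < tol:
            r = r_new
            break
        r = r_new
    py = r @ W                                   # output marginal P(y)
    joint = r[:, None] * W                       # joint P(x, y)
    mask = joint > 0
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.log2(W / py)
    return float((joint[mask] * ratio[mask]).sum()), r

# Sanity check: the binary symmetric channel with crossover p has C = 1 - H_b(p).
p = 0.1
bsc = np.array([[1 - p, p],
                [p, 1 - p]])
C, r_opt = channel_capacity(bsc)
print(C)       # ~0.5310 bits, matching 1 - H_b(0.1)
print(r_opt)   # optimal input distribution is uniform: [0.5, 0.5]
```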
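Footnote b's relation dH/dt = E[∇·F] can be checked in the simplest case: for a linear flow dx/dt = Ax acting on a Gaussian density, ∇·F = tr(A) is constant, so the entropy should change at exactly that rate. A small verification sketch follows, with an arbitrary illustrative A and initial covariance (assumptions, not values from the cited studies).

```python
import numpy as np
from scipy.linalg import expm

# Under dx/dt = A x, a Gaussian stays Gaussian with covariance
# S(t) = expm(A t) @ S0 @ expm(A t).T, and differential entropy
# H = 0.5 * log((2 pi e)^n det S). Footnote b predicts dH/dt = trace(A).

A = np.array([[-0.5, 0.3],             # arbitrary illustrative drift matrix
              [0.0, -0.2]])
S0 = np.array([[1.0, 0.2],             # arbitrary initial covariance
               [0.2, 2.0]])

def gaussian_entropy(S):
    n = S.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(S))

def cov_at(t):
    M = expm(A * t)
    return M @ S0 @ M.T

dt = 1e-5
dH_dt = (gaussian_entropy(cov_at(dt)) - gaussian_entropy(cov_at(0.0))) / dt
print(dH_dt)          # ~ -0.7 by finite difference
print(np.trace(A))    # -0.7: entropy shrinks at rate trace(A), as predicted
```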