Patterns. 2021 Apr 9;2(4):100226. doi: 10.1016/j.patter.2021.100226

Table 1.

Summary table of algorithms and tools for time series cancer modeling

| Section | Algorithm or technique | Description |
| --- | --- | --- |
| 2 | Lyapunov exponents (λL) | A set of exponents that quantify the rate at which nearby trajectories of a dynamical system separate. If λL > 0, a chaotic attractor may be present (sketch below). |
| 2 | Frequency spectra | The decomposition of a signal into its frequency components. Algorithms such as the fast Fourier transform (FFT) convert a time series trace into a frequency spectrum. A broad frequency spectrum in the time trace of a signal may be an indicator of chaotic dynamics (sketch below). |
| 2 | Fractal dimension | A statistical index of self-similarity and complexity. The box-counting algorithm, wavelet-based methods, or multifractal analysis (when more than one fractal dimension exists) are used to compute it (sketch below). |
| 4 | Master equation | A description of gene expression dynamics as the time evolution of a probability distribution. The Fokker-Planck equation (an approximation of the master equation) and the Gillespie algorithm (which samples stochastic trajectories consistent with it) are used in the stochastic modeling of cell fate transitions (sketch below). |
| 4 | Boolean networks | Discrete models of biological networks, most often applied to gene expression network dynamics, in which each gene is either on or off; continuous analogs also exist (sketch below). |
| 5 | Reaction-diffusion equations | Mathematical models of pattern formation in chemical systems. Applications include tumor pattern formation and cancer stem cell differentiation. Chaotic attractors may emerge in these equations in a regime referred to as chemical turbulence. |
| 6 | Computational simulations | Numerically simulating the differential equations that govern gene expression or protein oscillation dynamics can help identify chaotic attractors. Simulations must be paired with experimental data to infer such complex patterns. |
| 7 | Network science | Network visualization tools built on machine learning pipelines are discussed; these combine various algorithms to infer regulatory networks from single-cell datasets. |
| 8 | Convergent cross mapping (CCM) | A method of attractor reconstruction based on Whitney's and Takens's embedding theorems (time-delay coordinate embedding) (sketch below). |
| 9 | Entropy | An information-theoretic measure, widely used in network science, that quantifies chaotic flows in dynamical systems (sketch below). |
| 9 | Waddington landscape reconstruction | General methods for reverse-engineering the transcriptomic state space as an energy (epigenetic) landscape are discussed; scEpath is presented as an example of such algorithms. |
| 10 | Deep learning neural networks | Multi-layered artificial neural networks; among the most powerful machine learning algorithms for detecting complex patterns in empirical datasets. |
| 10 | Recurrent neural networks (RNNs) | Artificial neural networks suited to time series analysis. Reservoir computing, a type of RNN, can be used to estimate the Lyapunov exponents of large-scale empirical datasets, from which chaotic attractors can be reconstructed. |
| 11 | Kolmogorov complexity, K(s) | K(s) is the length of the shortest program that produces the data object s. Lossless compression algorithms and the block decomposition method (BDM) are used to approximate K(s); BDM is among the most robust available estimates of a network's complexity. Algorithmic information dynamics is the branch of computational science that uses K(s) approximation algorithms to study complex networks (sketch below). |
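
The following are minimal, illustrative Python sketches of several techniques from Table 1. They are not the implementations used in the paper; all model choices, parameters, and helper names are assumptions made for illustration. First, the Lyapunov exponent row: for the logistic map x → r·x·(1 − x), the exponent can be computed as the long-run average of ln|f′(x)| along a trajectory, and a positive value indicates exponential divergence of nearby trajectories (at r = 4 the theoretical value is ln 2).

```python
import numpy as np

def logistic_lyapunov(r=4.0, x0=0.2, n_iter=100_000, n_discard=1_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    as the long-run average of ln|f'(x_n)| = ln|r*(1 - 2*x_n)|."""
    x = x0
    for _ in range(n_discard):                  # discard transients
        x = r * x * (1.0 - x)
    logs = np.empty(n_iter)
    for i in range(n_iter):
        logs[i] = np.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return float(logs.mean())

if __name__ == "__main__":
    lam = logistic_lyapunov()
    print(f"lambda_L ~ {lam:.3f} (theory for r = 4: ln 2 ~ 0.693; lambda_L > 0 suggests chaos)")
```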
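Frequency spectra row: a sketch contrasting the narrow spectrum of a periodic signal with the broadband spectrum of a chaotic one, using the FFT; the test signals and the "bins holding 50% of total power" summary are arbitrary illustrative choices.

```python
import numpy as np

def power_spectrum(x, dt=1.0):
    """Return frequencies and power of a (mean-removed) time series via the FFT."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                      # drop the DC component
    power = np.abs(np.fft.rfft(x)) ** 2   # power at each non-negative frequency
    freqs = np.fft.rfftfreq(len(x), d=dt)
    return freqs, power

if __name__ == "__main__":
    n = 4096
    t = np.arange(n)
    periodic = np.sin(2 * np.pi * 200 * t / n)   # single-frequency (narrow-band) signal
    chaotic = np.empty(n)                        # logistic map at r = 4 (broadband)
    x = 0.2
    for i in range(n):
        x = 4.0 * x * (1.0 - x)
        chaotic[i] = x
    for name, sig in (("periodic", periodic), ("chaotic", chaotic)):
        _, p = power_spectrum(sig)
        # number of frequency bins needed to account for half of the total power:
        # a broad spectrum (a possible marker of chaos) needs many bins
        cum = np.sort(p)[::-1].cumsum() / p.sum()
        print(f"{name}: {int(np.searchsorted(cum, 0.5)) + 1} bins hold 50% of the power")
```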
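Fractal dimension row: a box-counting sketch on a synthetic fractal (a Sierpinski triangle generated by the chaos game, chosen because its dimension log 3 / log 2 ≈ 1.585 is known); the point count and box sizes are arbitrary.

```python
import numpy as np

def box_counting_dimension(points, box_sizes):
    """Estimate the fractal dimension of a 2-D point set by box counting:
    count occupied boxes N(eps) at each box side eps, then fit log N ~ D * log(1/eps)."""
    counts = []
    for eps in box_sizes:
        boxes = np.floor(points / eps).astype(int)      # box index of every point
        counts.append(len({tuple(b) for b in boxes}))   # number of distinct occupied boxes
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

if __name__ == "__main__":
    # build a Sierpinski triangle with the chaos game (known dimension log 3 / log 2)
    rng = np.random.default_rng(0)
    vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
    p = rng.random(2)
    pts = np.empty((100_000, 2))
    for i in range(len(pts)):
        p = (p + vertices[rng.integers(3)]) / 2.0
        pts[i] = p
    d = box_counting_dimension(pts, box_sizes=[1/8, 1/16, 1/32, 1/64, 1/128])
    print(f"estimated D ~ {d:.2f}  (theory: log 3 / log 2 ~ 1.585)")
```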
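Master equation row: a Gillespie simulation of a hypothetical one-species birth-death model of gene expression (transcription at rate k, degradation at rate γ per molecule); the rate constants are illustrative.

```python
import numpy as np

def gillespie_birth_death(k=10.0, gamma=1.0, t_max=50.0, m0=0, seed=0):
    """Gillespie stochastic simulation of a minimal gene expression model.
    Reactions: 0 -> mRNA at rate k; mRNA -> 0 at rate gamma*m.
    Generates one stochastic sample path of the corresponding master equation."""
    rng = np.random.default_rng(seed)
    t, m = 0.0, m0
    times, counts = [t], [m]
    while t < t_max:
        a1, a2 = k, gamma * m           # reaction propensities
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)  # waiting time until the next reaction
        if rng.random() < a1 / a0:      # choose which reaction fires
            m += 1
        else:
            m -= 1
        times.append(t)
        counts.append(m)
    return np.array(times), np.array(counts)

if __name__ == "__main__":
    times, counts = gillespie_birth_death()
    # at stationarity the copy number is Poisson-distributed with mean k/gamma = 10
    print(f"mean mRNA copy number after burn-in: {counts[times > 10].mean():.1f} (expected ~10)")
```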
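Boolean network row: a toy three-gene synchronous network; the update rules are invented for illustration, and the exhaustive scan of initial states is only feasible for small networks.

```python
from itertools import product

# Toy 3-gene synchronous Boolean network (hypothetical rules, for illustration):
#   A' = NOT C   (C represses A)
#   B' = A       (A activates B)
#   C' = A AND B (A and B jointly activate C)
def update(state):
    a, b, c = state
    return (int(not c), a, int(a and b))

def canonical(cycle):
    """Rotate a cycle so it starts at its lexicographically smallest state."""
    i = cycle.index(min(cycle))
    return tuple(cycle[i:] + cycle[:i])

def find_attractor(state):
    """Iterate the synchronous update until a state repeats; return the cycle reached."""
    seen, trajectory = {}, []
    while state not in seen:
        seen[state] = len(trajectory)
        trajectory.append(state)
        state = update(state)
    return canonical(trajectory[seen[state]:])

if __name__ == "__main__":
    # exhaustively scan all 2^3 initial states and collect the distinct attractors
    attractors = {find_attractor(init) for init in product([0, 1], repeat=3)}
    for a in attractors:
        kind = "fixed point" if len(a) == 1 else f"limit cycle of length {len(a)}"
        print(f"{kind}: {a}")
```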
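Convergent cross mapping row: a stripped-down cross-mapping sketch on two coupled logistic maps in which X drives Y but not the reverse (parameters are illustrative); a full CCM analysis would additionally test convergence of skill with library size and exclude temporally adjacent neighbours.

```python
import numpy as np

def embed(x, E=2, tau=1):
    """Time-delay (Takens) embedding: row i is [x_i, x_{i+tau}, ..., x_{i+(E-1)tau}]."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[j * tau : j * tau + n] for j in range(E)])

def cross_map_skill(source, target, E=2, tau=1):
    """Simplex-style cross mapping: predict `target` from the shadow manifold of
    `source`. High skill suggests that `target` influences `source` (the CCM logic)."""
    M = embed(source, E, tau)
    t_idx = np.arange(len(M)) + (E - 1) * tau     # time stamp of each manifold point
    truth = target[t_idx]
    d = np.linalg.norm(M[:, None, :] - M[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                   # never use a point to predict itself
    preds = np.empty(len(M))
    for i in range(len(M)):
        nn = np.argsort(d[i])[: E + 1]            # E+1 nearest neighbours on the manifold
        w = np.exp(-d[i, nn] / max(d[i, nn[0]], 1e-12))
        w /= w.sum()
        preds[i] = w @ truth[nn]                  # weighted average of neighbour targets
    return float(np.corrcoef(preds, truth)[0, 1])

if __name__ == "__main__":
    # coupled logistic maps: X drives Y, Y does not drive X (illustrative parameters)
    n = 1000
    x, y = np.empty(n), np.empty(n)
    x[0], y[0] = 0.4, 0.2
    for t in range(n - 1):
        x[t + 1] = x[t] * (3.8 - 3.8 * x[t])                  # autonomous chaotic driver
        y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.32 * x[t])    # Y is forced by X
    x, y = x[100:], y[100:]                                   # drop transients
    print(f"cross-map X from M_Y (expected high):  {cross_map_skill(y, x):.2f}")
    print(f"cross-map Y from M_X (expected lower): {cross_map_skill(x, y):.2f}")
```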
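Entropy row: the Shannon entropy of a discrete state-occupancy distribution, as one concrete instance of the information-theoretic measures mentioned; the example distributions are arbitrary.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum p_i log2 p_i of a discrete distribution (in bits)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * log 0 is taken as 0
    return float(-(p * np.log2(p)).sum())

if __name__ == "__main__":
    # a flow spread uniformly over many states has higher entropy than a peaked one
    print(f"peaked  : {shannon_entropy([0.97, 0.01, 0.01, 0.01]):.2f} bits")
    print(f"uniform : {shannon_entropy([0.25, 0.25, 0.25, 0.25]):.2f} bits")
```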
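Kolmogorov complexity row: a compression-based upper bound on K(s) using zlib; compressed length captures only statistical regularity, which is the limitation the block decomposition method is meant to address. The strings compared are illustrative.

```python
import random
import zlib

def compressed_size(s: bytes) -> int:
    """Length of the zlib-compressed string: a crude upper bound on K(s)."""
    return len(zlib.compress(s, 9))

if __name__ == "__main__":
    random.seed(0)
    n = 10_000
    periodic = b"AB" * (n // 2)                                 # highly regular string
    random_s = bytes(random.getrandbits(8) for _ in range(n))   # incompressible string
    for name, s in (("periodic", periodic), ("random", random_s)):
        print(f"{name}: {len(s)} bytes -> {compressed_size(s)} bytes compressed")
```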