PLOS Computational Biology. 2013 Mar 28;9(3):e1002965. doi: 10.1371/journal.pcbi.1002965

The Fidelity of Dynamic Signaling by Noisy Biomolecular Networks

Clive G Bowsher 1,*, Margaritis Voliotis 1, Peter S Swain 2,*
Editor: Jason M Haugh
PMCID: PMC3610653  PMID: 23555208

Abstract

Cells live in changing, dynamic environments. To understand cellular decision-making, we must therefore understand how fluctuating inputs are processed by noisy biomolecular networks. Here we present a general methodology for analyzing the fidelity with which different statistics of a fluctuating input are represented, or encoded, in the output of a signaling system over time. We identify two orthogonal sources of error that corrupt perfect representation of the signal: dynamical error, which occurs when the network responds on average to other features of the input trajectory as well as to the signal of interest, and mechanistic error, which occurs because biochemical reactions comprising the signaling mechanism are stochastic. Trade-offs between these two errors can determine the system's fidelity. By developing mathematical approaches to derive dynamics conditional on input trajectories we can show, for example, that increased biochemical noise (mechanistic error) can improve fidelity and that both negative and positive feedback degrade fidelity, for standard models of genetic autoregulation. For a group of cells, the fidelity of the collective output exceeds that of an individual cell and negative feedback then typically becomes beneficial. We can also predict the dynamic signal for which a given system has highest fidelity and, conversely, how to modify the network design to maximize fidelity for a given dynamic signal. Our approach is general, has applications to both systems and synthetic biology, and will help underpin studies of cellular behavior in natural, dynamic environments.

Author Summary

Cells do not live in constant conditions, but in environments that change over time. To adapt to their surroundings, cells must therefore sense fluctuating concentrations and ‘interpret’ the state of their environment to see whether, for example, a change in the pattern of gene expression is needed. This task is achieved via the noisy computations of biomolecular networks. But what levels of signaling fidelity can be achieved and how are dynamic signals encoded in the network's outputs? Here we present a general technique for analyzing such questions. We identify two sources of signaling error: dynamic error, which occurs when the network responds to features of the input other than the signal of interest; and mechanistic error, which arises because of the inevitable stochasticity of biochemical reactions. We show analytically that increased biochemical noise can sometimes improve fidelity and that, for genetic autoregulation, feedback can be deleterious. Our approach also allows us to predict the dynamic signal for which a given signaling network has highest fidelity and to design networks to maximize fidelity for a given signal. We thus propose a new way to analyze the flow of information in signaling networks, particularly for the dynamic environments expected in nature.

Introduction

Cells are continuously challenged by extra- and intracellular fluctuations, or ‘noise’ [1]–[3]. We are only starting to unravel how fluctuating inputs and dynamic interactions with other stochastic, intracellular systems affect the behavior of biomolecular networks [4]–[9]. Such knowledge is, however, essential for studying the fidelity of signal transduction [10], [11] and therefore for understanding and controlling cellular decision-making [12]. Indeed, successful synthetic biology requires quantitative predictions of the effects of fluctuations at the single-cell level, both in static and dynamic environments [13]. Furthermore, sophisticated responses to signals that change over time are needed for therapeutics that involve targeted delivery of molecules by microbes [14], [15] or the reprogramming of immune cells [16]. Here we begin to address these challenges by developing a general framework for analyzing the fidelity with which dynamic signals are represented by, or ‘encoded’ in, the output of noisy biomolecular networks.

Results

Two types of fidelity loss in dynamic signaling

For cellular signaling to be effective, it should maintain sufficient fidelity. We wish to quantify the extent to which the current output of an intracellular biochemical network, Inline graphic, can represent a particular feature of a fluctuating input (Fig. 1). This signal of interest, Inline graphic, is generally a function of the history of the input, denoted Inline graphic. By its history, we mean the value of the input Inline graphic at time Inline graphic and at all previous times. The signal Inline graphic could be, for example, the level of the input at time Inline graphic or a time average of the input over a time window in the most recent past. The output of the signaling network, Inline graphic, is able to perfectly represent the signal Inline graphic if Inline graphic can be inferred exactly from Inline graphic at all times, Inline graphic. The system then has zero fidelity error. However, for a stochastic biochemical mechanism, a given value of Inline graphic will map to multiple possible values of the output, Inline graphic.

Figure 1. The dynamics of the protein output can result in a faithful representation of the current biological environment.


We consider a 2-stage model of gene expression [22]. The extracellular environment or input, Inline graphic, gives the current rate of transcription and the signal of interest Inline graphic. We model Inline graphic as either a 2-state Markov chain with equal switching rates between states (the states each have unconditional probability of Inline graphic) (A&C); or as proportional to a Poissonian birth-death process for a transcriptional activator (B&D; proportionality constant of 0.025). The transformed signals Inline graphic (in red, lower panels) are a perfect representation of Inline graphic, although protein levels Inline graphic (in blue) are not. Inline graphic, the lifetime Inline graphic of Inline graphic equals 1 hr, and the translation rate Inline graphic. Degradation rates of mRNA and protein are chosen to maximize the fidelity, Eq. 7. The units for Inline graphic are chosen so that its variance equals one.

We will assume that the conditional mean, Inline graphic, is an invertible function of Inline graphic: it takes different values for any two values of Inline graphic. It is then a perfect representation of Inline graphic. The output Inline graphic will, however, usually be different from Inline graphic and have a fidelity error, defined as the difference between Inline graphic and Inline graphic. The notation Inline graphic is read as Inline graphic conditioned on, or given, the value of the variable Inline graphic at time Inline graphic. We use Inline graphic, as for example in Inline graphic, to denote averaging over all random variables except those given in the conditioning. Therefore Inline graphic is itself a random variable: it is a function of the random variable Inline graphic (we give a summary of the properties of conditional expectations in the SI).

Many response functions, Inline graphic, in biochemistry and physiology (for example, Hill functions) satisfy the requirement of invertibility or can be made to do so by defining Inline graphic appropriately—for example, when a response exactly saturates for all input values above a threshold, those values can be grouped to form a single input state. Furthermore, we know from the properties of conditional expectations that Inline graphic is closer to Inline graphic in terms of mean squared fidelity error than to any other representation (function) of Inline graphic (SI).

The difference between the conditional expectations Inline graphic and, for example, Inline graphic is important. The former, Inline graphic, is the average value of the output at time Inline graphic given a particular history of the input Inline graphic. It will often coincide with the deterministic (macroscopic) solution when the same input trajectory is applied to the network. The output Inline graphic shows random variation around this average, Inline graphic, for identical realisations of the trajectory of Inline graphic. By contrast, Inline graphic is the average value of Inline graphic given that the trajectory of Inline graphic up to time Inline graphic ends at the value Inline graphic. By the properties of conditional expectations, this is also the average value of Inline graphic over all trajectories ending in the value Inline graphic: that is, Inline graphic Inline graphic. These mathematical definitions are illustrated diagrammatically in Fig. 2.

Figure 2. Dynamical error as the difference between two conditional expectations.


To illustrate, we consider a 2-stage model of gene expression with the input, Inline graphic, equal to the current rate of transcription, and the signal of interest Inline graphic. We model Inline graphic as a 2-state Markov chain and show simulated trajectories of the protein output, Inline graphic, corresponding to four different input trajectories, Inline graphic. These input trajectories (or histories) all end at time Inline graphic in the state Inline graphic (not shown) and differ according to their times of entry into that state (labelled Inline graphic on the time axis; Inline graphic is off figure). Inline graphic (black lines) is the average value of Inline graphic at time Inline graphic given a particular history of the input Inline graphic: the random deviation of Inline graphic around this average is the mechanistic error Inline graphic (shown at time Inline graphic for the first realisation of Inline graphic). Inline graphic is the average or mean value of Inline graphic given that the trajectory of Inline graphic ends in the state Inline graphic at time Inline graphic. Inline graphic (red line) can be obtained by averaging the values of Inline graphic over all histories of Inline graphic ending in Inline graphic. The mean is less than the mode of the distribution for Inline graphic because of the distribution's long tail. Inline graphic, not shown, is obtained analogously. The dynamical error, Inline graphic, is the difference between Inline graphic and Inline graphic and is shown here for the first trajectory, Inline graphic. Fig. 3B shows data from an identical simulation model (all rate parameters here as detailed in Fig. 3B).

We distinguish between two types of error that reduce fidelity between Inline graphic and Inline graphic.

Dynamical error

becomes significant when the response time of the signaling network is comparable to or longer than the timescale on which the signal of interest, Inline graphic, fluctuates. On average, the output Inline graphic then responds to other features of the input history as well as to Inline graphic. We define the dynamical error therefore as the difference between the average level of the output given a particular history of the input, Inline graphic, and the average level of the output given the signal of interest (a function of Inline graphic):

dynamical error = E[output | history of the input] − E[output | signal of interest]  (1)

The magnitude (variance) of the dynamical error is equal to Inline graphic, [7].

For example, if the signal of interest is the current value of the input, Inline graphic, then Inline graphic records a catch-up error if the network still ‘remembers’ (is still responding to) previous values of the input (Fig. 3). Since Inline graphic will generally be different for different input trajectories, it will generally differ from Inline graphic (which is an average over all input trajectories that end at Inline graphic, Fig. 2).

Figure 3. As the protein lifetime decreases, a trade-off between dynamical and mechanistic error determines fidelity.


We consider a 2-stage model of gene expression with the input, Inline graphic, equal to the current rate of transcription, and the signal of interest Inline graphic. (A) The magnitude of the relative fidelity errors as a function of the protein degradation rate, Inline graphic (from Eqs. 11, 12 and 13), using a logarithmic axis. (B–D) Simulated data with Inline graphic as in Fig. 1A. The units for Inline graphic are chosen so that its variance equals one in each case (hence Inline graphic and Inline graphic). Pie charts show the fractions of the protein variance due to the mechanistic (m) and dynamical (d) errors and to the transformed signal. The latter equals Inline graphic. In B, the relative protein lifetime, Inline graphic, is higher than optimal (Inline graphic) and fidelity is 2.2; in C, Inline graphic is optimal (Inline graphic) and fidelity is 10.1; and in D, Inline graphic is lower than optimal (Inline graphic) and fidelity is 5.3. Dynamical error, Inline graphic, is the difference between Inline graphic (black) and the faithfully transformed signal Inline graphic (red), and decreases from B to D, while mechanistic error increases. The lower row shows the magnitudes of the relative dynamical error (black) and relative mechanistic error (orange). All rate parameters are as in Fig. 1 A&C with Inline graphic, unless otherwise stated.

We can write the dynamical error as

(Eq. 2)

If fluctuations in Inline graphic are slower than the response time of the system, then Inline graphic will be effectively constant over the ‘portion’ of its history detected by the output and the first term becomes zero because Inline graphic. We note that the magnitude (variance) of Inline graphic is always non-zero if the magnitude of this first term is non-zero because the two terms in Eq. 2 are uncorrelated (Methods). The second term quantifies the difference between the average effect on the output, Inline graphic, exerted by the history of the signal of interest and the average effect on the output exerted by the history of the input. This term would be non-zero, for example, if the input Inline graphic consists of multiple ligands that influence Inline graphic, perhaps because of cross-talk between signaling pathways, but the signal of interest is only a function of the history of one of those ligands. This second term is zero, however, for the systems we will consider.

Mechanistic error

is generated by the inherent stochasticity of the biochemical reactions that comprise the signaling network. We define mechanistic error as the deviation of the current value of the output from its average value given a particular history of the input:

mechanistic error = output − E[output | history of the input]  (3)

Inline graphic departs from its average (given the realised input history) because of biochemical stochasticity (Fig. 2). The magnitude of mechanistic error is given by Inline graphic, which equals Inline graphic.

Mechanistic error is related to intrinsic noise. Intrinsic variation measures the expected variation in Inline graphic given the history of all the extrinsic variables [7], [8]. Extrinsic variables describe the influence of the rest of the cell and of the extracellular environment on, say, expression of a gene of interest [17] and would include, for example, levels of ATP and ribosomes as well as extracellular signals such as the input Inline graphic. The magnitude of the mechanistic error measures, however, the expected variation in Inline graphic given the history of just one extrinsic variable, the input Inline graphic. Mechanistic variation therefore also includes the effects of fluctuations in the levels of ATP and ribosomes on the signalling mechanism and is always greater than or equal to the intrinsic variation.

We then define the fidelity error, Inline graphic, to be the sum of these two errors:

fidelity error = dynamical error + mechanistic error  (4)

which has zero mean, as do Inline graphic and Inline graphic. Fig. 1 shows fluctuating protein output levels, Inline graphic, for a network that has high fidelity (small errors) for the signal of interest, in this case the current state of the environment, Inline graphic.

Orthogonal signal and error components

We can decompose the output Inline graphic into the sum of the faithfully transformed or transmitted signal, Inline graphic, the dynamical error, and the mechanistic error:

output = E[output | signal of interest] + dynamical error + mechanistic error  (5)

for all times Inline graphic. Eq. 5 is an orthogonal decomposition of the random variable Inline graphic—each pair of random variables on the right-hand side has zero correlation (Methods). The variance of Inline graphic therefore satisfies

Var(output) = Var(E[output | signal of interest]) + Var(dynamical error) + Var(mechanistic error)  (6)

where the magnitude of the fidelity error is given by Inline graphic, which is Inline graphic because of the orthogonality. This magnitude of the fidelity error is also equal to the expected conditional variance of the output, Inline graphic. We note that we can generalize this decomposition, and thus extend our approach, for example, to study different components of the mechanistic error (Methods).
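
As a concrete illustration of Eqs. 5 and 6, the following minimal Python sketch (a toy model with assumed parameter values, not an analysis from this study) simulates an output that is Poisson-distributed given a two-value input history, takes the signal of interest to be the current input value, and verifies numerically that the variance of the output splits into the three orthogonal components.

# Toy Monte Carlo check of the orthogonal decomposition (Eqs. 5 and 6):
# Var(output) = Var(transformed signal) + Var(dynamical error) + Var(mechanistic error).
# The 'history' is (previous input, current input); the signal of interest is the
# current input; the output is Poisson given the history. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x_prev = rng.integers(0, 2, n)           # earlier input value
x_now = rng.integers(0, 2, n)            # current input value = signal of interest
cond_mean = 5 + 10 * x_now + 4 * x_prev  # E[output | input history]
z = rng.poisson(cond_mean)               # mechanistic (reaction) noise

# Conditional expectations estimated by averaging within groups of the data.
e_z_hist = np.zeros(n)
for a in (0, 1):
    for b in (0, 1):
        idx = (x_prev == a) & (x_now == b)
        e_z_hist[idx] = z[idx].mean()
e_z_sig = np.where(x_now == 1, z[x_now == 1].mean(), z[x_now == 0].mean())

signal = e_z_sig                 # faithfully transformed signal, E[output | signal]
dyn = e_z_hist - e_z_sig         # dynamical error (Eq. 1)
mech = z - e_z_hist              # mechanistic error (Eq. 3)

print("Var(output)       :", z.var())
print("sum of components :", signal.var() + dyn.var() + mech.var())
print("fidelity (Eq. 7)  :", signal.var() / (dyn.var() + mech.var()))

Because the conditioning on the signal is nested within the conditioning on the history, the three empirical components sum to the output variance, and the ratio printed last is the fidelity of Eq. 7.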

To compare signaling by different biochemical mechanisms, we normalize Inline graphic by the square root of its variance, writing Inline graphic, and define the fidelity as a signal-to-noise ratio:

fidelity = Var(E[output | signal of interest]) / Var(fidelity error), with the output normalized by its standard deviation  (7)

for some signal of interest, Inline graphic. Eq. 7 is dimensionless and a monotonically decreasing function of Inline graphic. Indeed, we have shown that the maximal mutual information between Inline graphic and Inline graphic across all possible signal distributions is bounded below by a decreasing function of Inline graphic (and so an increasing function of our fidelity), for a suitable choice of distribution of the signal Inline graphic and when Inline graphic is an invertible function of Inline graphic [7].

Comparing biochemical systems using the fidelity measure is equivalent to comparison based on the magnitude of the fidelity error, Inline graphic, where Inline graphic and the error is measured in units of the standard deviation of the output. Eq. 7 is maximized when Inline graphic is minimized. One minus the magnitude of the fidelity error is the fraction of the variance in the output that is generated by the signal of interest. In information theoretic approaches, normalizing the output by its standard deviation is also important, because the normalization allows determination of the number of ‘unique’ levels of output that can be distinguished from one another despite the stochasticity of the output, at least for Gaussian fluctuations [18].

When Inline graphic and Inline graphic have a bivariate Gaussian distribution, the instantaneous mutual information, Inline graphic, is monotonically related to the fidelity and exactly equal to Inline graphic [7], where Inline graphic denotes the correlation coefficient. Also in this Gaussian case, Inline graphic is equal to the minimum mean squared error (normalised by Inline graphic) between Inline graphic and the linear, optimal estimate, Inline graphic. (This is the optimal ‘filter’ when only the current output Inline graphic is available, although typically a filter such as the Wiener filter would employ the entire history of Inline graphic up to time Inline graphic.) Gaussian models of this sort for biochemical signalling motifs were considered in [19], with instantaneous mutual information expressed in terms of a signal-to-noise ratio equivalent (for their models) to the fidelity of Eq. 7. Such Gaussian models (if taken literally, rather than used to provide a lower bound on the information capacity [19]) would imply that the input-output relation, Inline graphic, is linear and that Inline graphic does not depend on Inline graphic (by the properties of the multivariate normal distribution). Our approach requires neither assumption.
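
For intuition, the short sketch below evaluates these Gaussian-case relations numerically, using the standard bivariate-Gaussian identities (assumed here, not quoted from the paper): the transformed signal accounts for a fraction rho^2 of the output variance, so the fidelity is rho^2/(1 − rho^2) and the instantaneous mutual information equals 0.5 ln(1 + fidelity), a monotonically increasing function of the fidelity.

# Bivariate-Gaussian illustration (standard identities, assumed here): for correlation
# rho between signal and output, Var(E[Z|S]) = rho^2 * Var(Z) and the fidelity-error
# variance is (1 - rho^2) * Var(Z), so fidelity = rho^2 / (1 - rho^2) and the
# instantaneous mutual information is -0.5*ln(1 - rho^2) = 0.5*ln(1 + fidelity).
import numpy as np

for rho in (0.5, 0.9, 0.99):
    fidelity = rho**2 / (1 - rho**2)
    mi = -0.5 * np.log(1 - rho**2)        # in nats
    print(f"rho={rho:5}  fidelity={fidelity:7.2f}  MI={mi:6.3f}  0.5*ln(1+fidelity)={0.5*np.log1p(fidelity):6.3f}")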

Whenever Inline graphic is a linear function of Inline graphic, that is Inline graphic for constants Inline graphic and Inline graphic, we consider Inline graphic to be the gain for the signal of interest Inline graphic [19]. The fidelity then depends on the ratio of the squared gain to the fidelity error and is given by Inline graphic.

The dynamic signal with maximum fidelity for a given input process

Suppose that the input process Inline graphic is given and we want to choose from among all functions or statistics of the input history that ‘signal of interest’, Inline graphic, for which the network achieves the highest fidelity. An immediate implication of Eq. 7 is that it identifies the signal of interest with the highest fidelity. Since Inline graphic Inline graphic, the dynamical error is zero when

signal of interest = E[output | history of the input]  (8)

from Eq. 1. This choice of Inline graphic therefore maximizes fidelity for all signaling networks: it minimizes the magnitude of the fidelity error (Eq. 6), because Inline graphic and Inline graphic do not depend on Inline graphic. The variance of Inline graphic only changes with the biochemistry of the network and the input process. We will give an example of such a signal of interest that maximizes fidelity in Eq. 9.

Analyzing networks with fluctuating inputs

Methods of analysis of stochastic systems with dynamic inputs are still being developed. We argue that deriving expectations of network components conditional upon the histories of stochastic inputs is a powerful approach. We have developed three methods to determine components of Eqs. 5 and 6 (SI):

  1. An exact analytical method, applicable to linear cascades and feedforward loops, based on the observation that moments calculated from a chemical master equation with propensities that are the appropriate functions of time are conditional moments, where the conditioning is on the history of the inputs at time Inline graphic and on the initial conditions.

  2. A Langevin method that can include non-linearities, requires stationary dynamics, and whose accuracy as an approximation improves as typical numbers of molecules grow.

  3. A numerical method, applicable to arbitrary biomolecular networks and signals of interest—based on a modification of the Gillespie algorithm allowing time-varying, stochastic propensities—that uses a ‘conjugate’ reporter to estimate the mechanistic error [7] and a simulated sample from the distribution of the signal-output pair, Inline graphic, to estimate the conditional means, Inline graphic.

We note that our methods require that the inputs can be modeled as exogenous processes that are unaffected by interactions with the biochemistry of the signaling network (a distinction emphasised in [20]). By an exogenous process we mean one whose future trajectory is independent, given its own history, of the history of the biochemical system. This model for an input is reasonable, for example, when the input is the level of a regulatory molecule, such as a transcription factor, that has relatively few binding sites in the cell.
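
To illustrate the numerical method in the simplest setting, the sketch below (our own illustrative code with assumed rate values and variable names, not the implementation used for the figures) simulates the two-stage gene-expression model driven by an exogenous two-state (telegraph) input. Because that input is itself a Markov chain unaffected by the network, its switching can be included as additional reactions in a standard Gillespie simulation, which then produces exact sample paths of the input, mRNA and protein.

# Minimal stochastic-simulation sketch (not the authors' code): a two-stage
# gene-expression model whose transcription propensity is set by an exogenous
# two-state (telegraph) input. The input switching is embedded as two extra
# reactions, so ordinary Gillespie sampling gives exact joint trajectories.
# Parameter values (k_on, k_off, x_low, x_high, nu1, d0, d1) are illustrative.
import numpy as np

def simulate(t_end, k_on=1.0, k_off=1.0, x_low=2.0, x_high=20.0,
             nu1=5.0, d0=2.0, d1=1.0, seed=0):
    rng = np.random.default_rng(seed)
    t, x_state, m, p = 0.0, 0, 0, 0           # input state (0/1), mRNA, protein
    times, inputs, proteins = [0.0], [x_low], [0]
    while t < t_end:
        x = x_high if x_state else x_low      # current transcription rate
        props = np.array([
            k_on if x_state == 0 else k_off,  # input switches state
            x,                                # transcription (rate set by input)
            d0 * m,                           # mRNA degradation
            nu1 * m,                          # translation
            d1 * p,                           # protein degradation
        ])
        total = props.sum()
        t += rng.exponential(1.0 / total)
        r = rng.choice(5, p=props / total)
        if r == 0:
            x_state = 1 - x_state
        elif r == 1:
            m += 1
        elif r == 2:
            m -= 1
        elif r == 3:
            p += 1
        else:
            p -= 1
        times.append(t)
        inputs.append(x_high if x_state else x_low)
        proteins.append(p)
    return np.array(times), np.array(inputs), np.array(proteins)

t, x, z = simulate(t_end=50.0)
print("final input rate:", x[-1], " final protein count:", z[-1])

Repeating such simulations while sharing the same input realisation between two copies of the network (a ‘conjugate’ reporter) yields the mechanistic error, as used in the third method above.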

Analyzing signal representation by gene expression

Transcriptional regulation is a primary means by which cells alter gene expression in response to signals [21]. We now provide an exact, in-depth analysis of a two-stage model of gene expression [22] where the fluctuating input, Inline graphic, is the rate (or propensity) of transcription and the signal of interest, Inline graphic, equals the current value of the input, Inline graphic. For example, Inline graphic may be proportional to the extracellular level of a nutrient or the cytosolic level of a hormone regulating a nuclear hormone receptor.

The cellular response should account for not only the current biological state of Inline graphic but also future fluctuations. If we consider an input that is a Markov process, future fluctuations depend solely on the current value Inline graphic, and the cell would need only to ‘track’ the current state as effectively as possible and then use the representation in protein levels to control downstream effectors. These ideas are related to those underlying predictive information [23], [24].

Our analysis requires only the stationary mean and variance of the input Inline graphic and that Inline graphic has exponentially declining ‘memory’ (SI). Consequently, the autocorrelation function of Inline graphic is a single exponential with autocorrelation time Inline graphic (the lifetime of fluctuations in Inline graphic). Examples include a birth-death process or a two-state Markov chain. We can generalize using, for example, weighted sums of exponentials to flexibly model the autocorrelation function of Inline graphic.
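
For example, a symmetric two-state (telegraph) input is one such process. The sketch below (an illustrative check with assumed switching rates, not part of our analysis) simulates it on a fine time grid and confirms that its autocorrelation decays as a single exponential whose autocorrelation time is the inverse of the total switching rate.

# Illustrative check: a two-state Markov (telegraph) input has a single-exponential
# autocorrelation function with autocorrelation time 1/(k_on + k_off).
# Switching rates are assumed values; time is discretised on a fine grid.
import numpy as np

rng = np.random.default_rng(4)
k_on = k_off = 0.5                       # switching rates (per hr), so tau_X = 1 hr
dt, n = 0.01, 2_000_000                  # grid spacing (hr) and number of steps

flips = rng.random(n) < k_on * dt        # symmetric rates: same flip probability in either state
x = np.cumsum(flips) % 2                 # input state (0 or 1) over time
x = x - x.mean()                         # centre for the autocorrelation estimate

for lag in (0, 50, 100, 200):            # lags of 0, 0.5, 1 and 2 hr
    empirical = np.mean(x[: n - lag] * x[lag:]) / np.mean(x * x)
    theory = np.exp(-(k_on + k_off) * dt * lag)
    print(f"lag = {dt * lag:3.1f} hr   empirical = {empirical:5.3f}   exp(-lag/tau_X) = {theory:5.3f}")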

Solving the ‘conditional’ master equation with a time-varying rate of transcription, we find that the conditionally expected protein level is a double weighted ‘sum’ of past levels of the signal Inline graphic (SI):

(Eq. 9)

(where for simplicity the equation is stated for the case of zero initial mRNA and protein). We denote the rate of translation per molecule of mRNA by Inline graphic, the rate of mRNA degradation per molecule by Inline graphic, and the rate of degradation of protein per molecule by Inline graphic. The most recent history of the input Inline graphic exerts the greatest impact on the current expected output, with the memory of protein levels for the history of the input determined by the lifetimes of mRNA and protein molecules. Eq. 9 gives the signal of interest, Inline graphic (a function of the history of the fluctuating transcription rate), that gene expression transmits with the highest fidelity to protein levels (see Eq. 8). Notice that the current value of the input, Inline graphic, cannot be recovered exactly from Inline graphic, which is therefore not a perfect representation of Inline graphic.
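
In practice, the conditional mean in Eq. 9 can be evaluated numerically for any given input trajectory. The sketch below uses illustrative parameter values, with nu1, d0 and d1 as our labels for the translation, mRNA degradation and protein degradation rates, and assumes (as noted earlier, and exact for this linear cascade) that the conditional mean given the input history obeys the deterministic rate equations, integrated here by a simple Euler scheme from zero initial mRNA and protein.

# Numerical sketch (illustrative parameters): for the linear two-stage model the
# conditional mean given the input history follows the deterministic rate equations
# dm/dt = x(t) - d0*m and dz/dt = nu1*m - d1*z, evaluated for one input trajectory.
import numpy as np

def conditional_mean_protein(t_grid, x_of_t, nu1=5.0, d0=2.0, d1=1.0):
    m, z = 0.0, 0.0                      # zero initial mRNA and protein
    means = np.zeros_like(t_grid)
    for i in range(1, len(t_grid)):
        dt = t_grid[i] - t_grid[i - 1]
        x = x_of_t(t_grid[i - 1])        # transcription rate at the current time
        m += dt * (x - d0 * m)
        z += dt * (nu1 * m - d1 * z)
        means[i] = z
    return means

# Example: input switches from a low to a high transcription rate at t = 5.
t_grid = np.linspace(0.0, 20.0, 20001)
x_of_t = lambda t: 2.0 if t < 5.0 else 20.0
ez = conditional_mean_protein(t_grid, x_of_t)
print("E[Z_t | history] at the switch (t = 5) and at t = 20:",
      round(ez[5000], 2), round(ez[-1], 2))

For a piecewise-constant input this reproduces the double exponentially weighted sum over the input's past described by Eq. 9.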

We find, by contrast, that Inline graphic is an invertible, linear function of Inline graphic:

(Eq. 10)

when the dynamics reach stationarity, and that the stationary unconditional mean is Inline graphic (SI). Notice that Inline graphic does not converge for large Inline graphic to the average ‘steady-state’ solution for a static Inline graphic, but depends on Inline graphic. The discrepancy between Eqs. 9 and 10 results in dynamical error with non-zero magnitude (Fig. 3B).

Using our solutions for the conditional moments, we can calculate the variance components of Eq. 6 (SI). For the faithfully transformed signal, when Inline graphic, we have

(Eq. 11)

where Inline graphic is the ratio of the lifetime of mRNA to the lifetime of fluctuations in Inline graphic, and Inline graphic is the ratio of the lifetime of protein to the lifetime of fluctuations in Inline graphic. The magnitude of the dynamical error is in this case proportional to Eq. 11

(Eq. 12)

and the magnitude of the mechanistic error satisfies

(Eq. 13)

When the autocorrelation time of Inline graphic becomes large (Inline graphic and Inline graphic tending to zero), the dynamical error Inline graphic therefore vanishes (Eq. 12). In this limit, the output effectively experiences a constant input Inline graphic during the time ‘remembered’ by the system.

To gain intuition about the effect of relative lifetimes on the fidelity of signaling, we first suppose the mechanistic error is small relative to Inline graphic. Eq. 7 then becomes simply Inline graphic if protein lifetime is large relative to mRNA lifetime, Inline graphic (as expected for many genes in budding yeast [25]). The fidelity thus improves as the protein lifetime decreases relative to the lifetime of fluctuations in Inline graphic, and the output is able to follow more short-lived fluctuations in the signal. This observation is only true, however, for negligible mechanistic error.

Tradeoffs between errors can determine signaling fidelity

It is the aggregate behavior of dynamical and mechanistic errors as a fraction of the total variance of the output that determines signaling fidelity, Eq. 7. Effective network designs must sometimes balance trade-offs between the two types of error.

Increasing biochemical noise can enhance signaling fidelity

Predicting changes in fidelity requires predicting whether changes in the magnitude of the dynamical error relative to Inline graphic, denoted Inline graphic, either dominate or are dominated by changes in the magnitude of the mechanistic error relative to Inline graphic, denoted Inline graphic. For example, shorter protein lifetimes can decrease the absolute value of both the dynamical error and the mechanistic error (the output has a lower mean—Eq. 13). We calculated for all parameter space the sensitivities of the magnitude of the two (relative) errors with respect to changes in the protein lifetime, Inline graphic (using Eqs. 11, 12, and 13). We found that although the relative magnitude of the dynamical error decreases with shorter protein lifetime, the relative magnitude of the mechanistic error increases. The sign of the overall effect on the relative fidelity error can therefore be positive or negative (Fig. 3A), and consequently fidelity is maximized by a particular protein lifetime, Inline graphic (Fig. 3B–D).

Similar trade-offs have been observed before in signal transduction. For example, tuning the protein's degradation rate can also maximize the instantaneous mutual information, at least for Gaussian models [19]. As the protein degradation rate increases, although the fidelity error Inline graphic decreases, there is a trade-off because the gain also decreases. In our model the gain, Inline graphic (Eq. 10), is decreasing in Inline graphic and we observe the same tradeoff.

Further, the trade-off between the two relative errors has some similarities with trade-offs that occur with Wiener filtering [26]. There, however, the entire output history is used to optimally estimate (or reconstruct) the signal of interest. In contrast, we consider representation of Inline graphic only by the current output Inline graphic.

The rule-of-thumb that increasing stochasticity or noise in signaling mechanisms reduces signaling fidelity is broken in this example. Such statements typically ignore the effect of dynamical error, but here reductions in relative dynamical error can more than compensate for gains in relative mechanistic error. Both errors should be included in the analysis.

Feedback can harm signaling fidelity

Intuitively we might expect that feedback can improve signaling fidelity because feedback affects response times. For example, autoregulation affects the mean time to initiate transcription: it is reduced by negative autoregulation [27] and increased by positive autoregulation [28]. We introduce autoregulation into our model of gene expression, interpreting again Inline graphic as proportional to the fluctuating level of a transcriptional activator and allowing the protein Inline graphic to bind to its own promoter. For negative feedback, the rate of transcription becomes Inline graphic; for positive feedback, it becomes Inline graphic, with Inline graphic the rate of transcription from the active promoter (SI). We impose Inline graphic so that the transcription rate increases with Inline graphic for a given Inline graphic. Increasing Inline graphic increases the strength of the feedback in both cases. We note that other models of autoregulation may give different conclusions, and that the transcription rate depends linearly on Inline graphic in our models.

We let the signal of interest Inline graphic again be Inline graphic. To proceed we calculate the sensitivities of the magnitudes of the fidelity errors using our Langevin method with the input an Ornstein-Uhlenbeck process. We determine their signs with respect to changes in feedback strength by randomly sampling a biophysically plausible parameter space (SI). As we sample, the parameter space governing fluctuations of Inline graphic is also explored. We find excellent agreement between our Langevin and numerical, simulation-based approaches (SI). Since we calculate sensitivities, we are examining the effect of changing feedback strength, Inline graphic, while holding other network parameters constant. This process imitates both the incremental change often expected during evolution and the way that network properties tend to be manipulated experimentally. When comparing the fidelity error of the signal representations for different Inline graphic using Eq. 7, we implicitly normalise the variance of the output to one in order to ensure fair comparison.
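
For completeness, an Ornstein-Uhlenbeck input of this kind can be generated with the standard exact one-step update; the parameters below are illustrative, not those of our sampled parameter space.

# Sketch (standard exact discretisation, illustrative parameters): an
# Ornstein-Uhlenbeck input, i.e. an exponentially autocorrelated, stationary
# Gaussian process with autocorrelation time tau and stationary variance sigma2.
import numpy as np

def ou_path(n_steps, dt, mean=10.0, tau=1.0, sigma2=1.0, seed=3):
    rng = np.random.default_rng(seed)
    a = np.exp(-dt / tau)                # one-step autocorrelation
    x = np.empty(n_steps)
    x[0] = mean + np.sqrt(sigma2) * rng.standard_normal()
    for i in range(1, n_steps):
        x[i] = mean + a * (x[i - 1] - mean) \
               + np.sqrt(sigma2 * (1 - a**2)) * rng.standard_normal()
    return x

x = ou_path(n_steps=10_000, dt=0.01)
print("sample mean and variance:", round(x.mean(), 2), round(x.var(), 2))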

Consider first the static case where the fluctuations in Inline graphic are sufficiently slow relative to the timescales of the transduction mechanism that the input is effectively constant (Inline graphic with fixed Inline graphic). As expected (Eq. 1), Inline graphic converges to zero as Inline graphic. With a static input, negative autoregulation is expected to reduce the variances of the response, Inline graphic, for each value of the input [29]. The mechanistic variance is therefore expected to decrease, and does so in all models sampled as Inline graphic increases. We can show analytically (SI) that the suppression of mean levels also decreases the variance of the conditional mean, the ‘signal’ variance Inline graphic, and so the total variance of the output decreases. We find that the decrease in mechanistic variance cannot outweigh the decreased signal variance, and the fidelity always decreases with increasing feedback (increasing Inline graphic). Such a reduction in information transfer through negative feedback has recently been observed experimentally [10]. For positive autoregulation, the mechanistic variance increases with Inline graphic, which dominates any increase in the signal variance observed at low values of Inline graphic. Relative mechanistic error again rises and fidelity therefore decreases.

For a static Inline graphic, therefore, neither negative nor positive autoregulation improves signaling fidelity. As the strength of feedback becomes large, the transcriptional propensity tends to zero for negative feedback and to the constant Inline graphic for positive feedback (with fixed positive Inline graphic), and the propensities for different Inline graphic become indistinguishable as functions of Inline graphic (SI). Signaling is correspondingly compromised in both cases.

These findings essentially still hold when the input is dynamic. For negative autoregulation, all three components of the output variance decrease with Inline graphic. The relative dynamical error decreases with Inline graphic, but this decrease is typically outweighed by an increase in the relative mechanistic error, and the overall fidelity deteriorates (Inline graphic of cases sampled and Fig. 4). Any reduction in fidelity error, Inline graphic, was negligible (the difference from the fidelity error when Inline graphic was always less than Inline graphic). We note that this conclusion is in contradistinction to the finding (using a linear Gaussian model) that negative feedback does not affect information transfer between entire input and output trajectories [30]. For positive feedback, both the mechanistic variance and the relative mechanistic error increase with Inline graphic (for all models sampled). This mechanistic effect dominates the relative dynamical error, which can change non-monotonically with Inline graphic, and fidelity again deteriorates.

Figure 4. Increasing the strength of negative feedback decreases fidelity.


We consider a 2-stage model of gene expression with the signal of interest Inline graphic, and with Inline graphic proportional to the level of a transcriptional activator. We simulate Inline graphic as in Fig. 1A. Upper row compares the time course of the protein output (blue) to the faithfully transformed signal (red), Inline graphic. Lower row shows the distributions for the output, Inline graphic, that correspond to each of the two possible values of the input, Inline graphic (low and high). Vertical lines indicate the means of the distributions. Pie charts show the fractions of the variance of each (conditional) distribution due to dynamical (d) and mechanistic (m) error, weighted by the probability of the input state: summing these gives the overall magnitude (variance) of the dynamical and mechanistic errors. (A) No feedback (Inline graphic), fidelity equals 2.4. (B) Intermediate feedback (Inline graphic), fidelity equals 2.0. (C) Strong feedback (Inline graphic), fidelity equals 1.3. As the strength of feedback increases, the underlying state of the input is more difficult to infer (the conditional distributions overlap more) because increasing (relative) mechanistic error dominates the decreasing (relative) dynamical error. Note the decrease in the (relative) dynamical error when Inline graphic is in its high state (yellow conditional distribution) because stronger negative feedback gives faster initiation of transcription. Transcription propensities are given by Inline graphic, and all parameters except Inline graphic are as in Fig. 3B.

Our results are consistent with the intuition that, although negative feedback reduces the absolute mechanistic error (fewer molecules) and absolute dynamical error (faster response times), negative feedback also decreases the dynamic range of the output. The fidelity therefore does not improve because the output distributions corresponding to each value of Inline graphic, despite being tighter, are also located closer together (Fig. 4). Positive feedback acts in the opposite way, with increasing variance in the (conditional) output distributions overwhelming any increase in the dynamic range of the output.

To explore what happens when the effect of feedback on the dynamic range is directly controlled, we investigated the effect of varying Inline graphic in our negative feedback model while simultaneously altering the translation rate (Inline graphic) to hold the system's ‘gain’ constant (SI). In our model, the faithfully transformed signal is a linear function of Inline graphic: Inline graphic, where Inline graphic is the gain. If only Inline graphic is varied and the translation rate kept fixed, then the gain is always less than the gain when Inline graphic is zero. The signal variance or ‘dynamic range’, Inline graphic, is equal to Inline graphic, which is also therefore held constant as we vary Inline graphic at constant gain. The fidelity is Inline graphic.

For static signals, we again find the fidelity almost always decreases with increasing negative feedback strength, Inline graphic: the absolute mechanistic error now increases with increasing Inline graphic, presumably because of the decreased rate of translation. For dynamic signals we find, for the vast majority of cases, an optimal feedback strength, Inline graphic, above and below which fidelity deteriorates. With increased Inline graphic, although the absolute mechanistic error increases, the absolute dynamical error decreases, when we compare randomised initial parameterisations with the Inline graphic that maximises fidelity. When Inline graphic decreases compared to its initial value, these errors have the opposite behavior. At constant gain, the tradeoff between dynamical and mechanistic error is thus still observed, as is the harmful effect of too strong a negative feedback.

Combining outputs from multiple cells improves fidelity

When a physiological response corresponds to the average output of multiple cells, the magnitude of the mechanistic error is that for a single cell divided by the number of cells in the group (for identical and independent cells receiving the same input). This reduction arises because the magnitude of the mechanistic error is now the variance of the average mechanistic error of the cells in the group. The dynamical error, Eq. 1, however, is the same as the dynamical error of each individual cell: expectations of the average response equal the expectations of the response of each single cell when the cells are identical. Therefore the fidelity for any signal of interest, Inline graphic, increases if the average or aggregate output of a group of cells is used (SI). Measuring the collective response of small groups of cells, Cheong et al. indeed found that information capacity increased significantly compared to that of a single cell [10], and averaging of individual cellular responses is believed to increase the precision of gene expression during embryonic development [31].
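
The effect can be seen in a toy calculation (a sketch with assumed numbers, not the model of Fig. 5): identical cells receive the same input history and differ only through conditionally independent mechanistic noise, so averaging their outputs divides the mechanistic variance by the number of cells while the shared dynamical error is untouched, and the fidelity of the collective output rises.

# Toy sketch: averaging the outputs of N identical, conditionally independent cells
# that see the same input divides the mechanistic variance by N but leaves the
# (shared) dynamical error unchanged, so the collective fidelity is higher.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_cells = 100_000, 100
s = rng.integers(0, 2, n_trials)            # signal of interest: current input state
x_prev = rng.integers(0, 2, n_trials)       # earlier input value (shared by all cells)
cond_mean = 5 + 10 * s + 4 * x_prev         # E[output | input history], same for every cell

single = rng.poisson(cond_mean)                                        # one cell's output
group = rng.poisson(cond_mean, size=(n_cells, n_trials)).mean(axis=0)  # group-average output

def fidelity(out):
    e_out_sig = np.where(s == 1, out[s == 1].mean(), out[s == 0].mean())
    return e_out_sig.var() / (out - e_out_sig).var()    # Var(signal) / Var(fidelity error)

print("single-cell fidelity:", round(fidelity(single), 2))
print("100-cell fidelity:   ", round(fidelity(group), 2))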

Although negative feedback reduces relative dynamical error, it increases relative mechanistic error in individual cells. At the level of the collective response of multiple cells, the deleterious effect on mechanistic error is attenuated (Fig. 5). Using a population of 100 independent and identical cells we find that adding negative feedback now improves fidelity in the majority of cases, with moderate reductions in (relative) fidelity error (Inline graphic) for our parameter space. Adding positive feedback never significantly improves overall fidelity (all observed reductions Inline graphic). Furthermore, negative feedback can often significantly reduce the number of cells needed to achieve the same fidelity as, say, 100 cells that lack feedback (less than 10 cells are needed Inline graphic of the time and less than 50 cells Inline graphic of the time when sampling from our parameter space).

Figure 5. The fidelity of the collective response of a group of cells exceeds that of a single cell.


We consider a 2-stage model of gene expression with the signal of interest Inline graphic, and with Inline graphic proportional to the level of a transcriptional activator and modeled as an Ornstein-Uhlenbeck process. The unconditional distribution of Inline graphic is therefore Gaussian. Pie charts show fractions of the protein variance due to the mechanistic (m) and dynamical (d) errors and are computed using our Langevin method (SI). (A) For a single cell with negative autoregulation (Inline graphic), fidelity is low and equal to 0.2, with a dominant mechanistic error. (B) For 100 identical and independent cells (given the input's history), with negative autoregulation (Inline graphic): fidelity between Inline graphic and the average protein output for the group is higher and equal to 3.5. All parameters as in Fig. 3B except Inline graphic.

Designing dynamic networks in synthetic biology

Our framework naturally adapts to the scenario of controlling a network output to approach a desired ‘target’ response when, for example, the cell's environment changes. Combined with model search procedures for synthetic design [32], it is a promising approach to the design of synthetic biomolecular networks. If the target response is given by Inline graphic, which is a function of the input history, then to guide the design process, we can decompose the error Inline graphic analogously to Eq. 5 and find an equivalent to Eq. 6, a dissection of the network performance into orthogonal components (SI).

Discussion

Cells use the information conveyed by signaling networks to regulate their behavior and make decisions. Not all features of the input trajectory will, however, be relevant for a particular decision, and we define the fidelity between the output of the network and a signal of interest, Inline graphic, which is a function of the input trajectory. Information encoded in upstream fluctuations must eventually either be lost or encoded in current levels of cellular constituents. We have therefore focused on the fidelity with which Inline graphic is represented by the current output, Inline graphic.

Using an orthogonal decomposition of the network's output into the faithfully transformed signal and error terms, we are able to identify two sources of error – dynamical and mechanistic. We assume the transformed signal, Inline graphic, to be an invertible function of Inline graphic. The aggregate behavior of the two types of error determines the signaling fidelity, and ignoring either may cause erroneous conclusions. We interpret Inline graphic as the current cellular estimate or ‘readout’ of the faithfully transformed signal. The magnitude of the fidelity error relative to the variance in Inline graphic, Eq. 7, is a dimensionless measure of the quality of that estimate since Inline graphic. Furthermore, we have shown that Inline graphic is related to the mutual information between the input and output [7].

To apply our approach experimentally, we can use microfluidic technology to expose cells to the same controlled but time-varying input in the medium [33], and a fluorescent reporter to monitor the network output, Inline graphic. This reporter could measure, for example, a level of gene expression or the extent of translocation of a transcription factor. The transformed signal, Inline graphic, and its variance (for a given probability distribution of the input process) can then be estimated with sufficient amounts of data by monitoring Inline graphic in each cell and Inline graphic in the microfluidic medium. We can determine the mechanistic error by measuring the average squared difference between the output of one cell and that of another (because the outputs of two cells are conjugate given the history of the input [7]), and hence determine the dynamical error by applying Eq. 6.
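
The following toy sketch (assumed distributions, not experimental data) illustrates why paired measurements suffice: two outputs that are independent and identically distributed given the same input history satisfy E[(difference)^2]/2 = expected conditional variance of a single output, which is the magnitude of the mechanistic error, so no estimate of the conditional mean is needed.

# Toy illustration of the conjugate-reporter estimate of mechanistic error:
# for two outputs that are conditionally independent given the same input history,
# E[(Z - Z')^2] / 2 equals E[Var(Z | input history)]. Distributions are assumed
# for illustration (gamma-distributed conditional mean, Poisson outputs).
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
cond_mean = rng.gamma(shape=4.0, scale=5.0, size=n)   # E[Z | input history] for each pair
z1 = rng.poisson(cond_mean)                           # reporter in one cell
z2 = rng.poisson(cond_mean)                           # the same reporter in a second cell

mech_from_pairs = 0.5 * np.mean((z1 - z2) ** 2)       # estimate from paired differences
mech_direct = np.mean(cond_mean)                      # Poisson: Var(Z | history) = conditional mean
print(round(mech_from_pairs, 2), round(mech_direct, 2))   # the two estimates agree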

Our analysis is complementary to one based on information theory and the entire distribution of input and output [7]. Without making strong assumptions about the network and the input, calculation of mutual information is challenging for dynamic inputs. Previous work has considered either the mutual information between entire input and output trajectories with a Gaussian joint distribution of input and output [19], [34], or the ‘instantaneous’ mutual information between input and output at time Inline graphic [19] (applicable in principle to non-Gaussian settings). Our approach, however, depends only on conditional moments and avoids the need to fully specify the distribution of the input process, which is often poorly characterized.

The environments in which cells live are inherently dynamic and noisy. Here we have developed mathematical techniques to quantify how cells interpret and respond to fluctuating signals given their stochastic biochemistry. Our approach is general and will help underpin studies of cellular behavior in natural, dynamic environments.

Methods

Orthogonality of transformed signal, dynamical error and mechanistic error

Define Inline graphic, the transformed signal with zero mean. Then the signal and error components of Eq. 5 are pairwise uncorrelated:

Cov(transformed signal, dynamical error) = Cov(transformed signal, mechanistic error) = Cov(dynamical error, mechanistic error) = 0  (14)

Orthogonal decomposition of a random variable based on a filtration

Eq. 5 is a special case of the following general decomposition for any random variable (with finite expectation), here denoted Inline graphic. Consider a filtration, or increasing sequence of conditioning ‘information sets’, Inline graphic, where Inline graphic and Inline graphic. Let Inline graphic for Inline graphic, and let Inline graphic. Then the decomposition

(Eq. 15)

satisfies Inline graphic for all Inline graphic since the sequence Inline graphic is a martingale difference sequence with respect to the filtration (SI). Therefore, Inline graphic.

Supporting Information

Text S1

The complete supporting information is provided as Text S1.

(PDF)

Funding Statement

We acknowledge support from a Medical Research Council and Engineering and Physical Sciences Council funded Fellowship in Biomedical Informatics (CGB) and a Scottish Universities Life Sciences Alliance chair in Systems Biology (PSS). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Eldar A, Elowitz MB (2010) Functional roles for noise in genetic circuits. Nature 467: 167–173.
2. Perkins TJ, Swain PS (2009) Strategies for cellular decision-making. Mol Syst Biol 5: 326.
3. Balázsi G, van Oudenaarden A, Collins JJ (2011) Cellular decision making and biological noise: from microbes to mammals. Cell 144: 910–25.
4. Shahrezaei V, Ollivier JF, Swain PS (2008) Colored extrinsic fluctuations and stochastic gene expression. Mol Syst Biol 4: 196.
5. Bowsher CG (2010) Stochastic kinetic models: Dynamic independence, modularity and graphs. Annals of Statistics 38 (4): 2242–81.
6. Bowsher CG (2011) Information processing by biochemical networks: a dynamic approach. J R Soc Interface 8: 186–200.
7. Bowsher CG, Swain PS (2012) Identifying information flow and sources of variation in biochemical networks. Proc Natl Acad Sci USA 109: E1320–E1328.
8. Hilfinger A, Paulsson J (2011) Separating intrinsic from extrinsic fluctuations in dynamic biological systems. Proc Natl Acad Sci USA 108: 12167–72.
9. Hu B, Kessler D, Rappel WJ, Levine H (2011) Effects of Input Noise on a Simple Biochemical Switch. Phys Rev Lett 107: 1–5.
10. Cheong R, Rhee A, Wang CJ, Nemenman I, Levchenko A (2011) Information Transduction Capacity of Noisy Biochemical Signaling Networks. Science 334: 354–358.
11. Libby E, Perkins TJ, Swain PS (2007) Noisy information processing through transcriptional regulation. Proc Natl Acad Sci USA 104: 7151–6.
12. Kobayashi TJ (2010) Implementation of Dynamic Bayesian Decision Making by Intracellular Kinetics. Phys Rev Lett 104: 1–4.
13. Purnick P, Weiss R (2009) The second wave of synthetic biology: from modules to systems. Nat Rev Mol Cell Biol 10: 410–22.
14. Steidler L (2000) Treatment of Murine Colitis by Lactococcus lactis Secreting Interleukin-10. Science 289: 1352–1355.
15. Anderson JC, Clarke EJ, Arkin AP, Voigt CA (2006) Environmentally controlled invasion of cancer cells by engineered bacteria. J Mol Biol 355: 619–27.
16. Shaw T, Martin P (2009) Epigenetic reprogramming during wound healing: loss of polycomb-mediated silencing may enable upregulation of repair genes. EMBO Rep 10: 881–886.
17. Swain PS, Elowitz MB, Siggia ED (2002) Intrinsic and extrinsic contributions to stochasticity in gene expression. Proc Natl Acad Sci USA 99: 12795–800.
18. Detwiler PB, Ramanathan S, Sengupta A, Shraiman BI (2000) Engineering aspects of enzymatic signal transduction: photoreceptors in the retina. Biophysical Journal 79: 2801–2817.
19. Tostevin F, ten Wolde PR (2010) Mutual information in time-varying biochemical systems. Phys Rev E 81: 1–15.
20. Tănase-Nicola S, Warren P, ten Wolde PR (2006) Signal detection, modularity, and the correlation between extrinsic and intrinsic noise in biochemical networks. Phys Rev Lett 97: 68102.
21. Brivanlou AH, Darnell JE (2002) Signal transduction and the control of gene expression. Science 295: 813–8.
22. Thattai M, van Oudenaarden A (2001) Intrinsic noise in gene regulatory networks. Proc Natl Acad Sci USA 98: 8614–9.
23. Bialek W, Nemenman I, Tishby N (2001) Predictability, complexity, and learning. Neural Comput 13: 2409–63.
24. Nemenman I (2012) Information theory and adaptation. In: Wall M, editor, Quantitative biology: from molecular to cellular systems. Boca Raton, Florida: CRC Press.
25. Shahrezaei V, Swain PS (2008) Analytical distributions for stochastic gene expression. Proc Natl Acad Sci USA 105: 17256–61.
26. Wiener N (1975) Extrapolation, Interpolation, and Smoothing of Stationary Time Series. The MIT Press.
27. Rosenfeld N, Elowitz MB, Alon U (2002) Negative Autoregulation Speeds the Response Times of Transcription Networks. J Mol Biol 323: 785–793.
28. Maeda YT, Sano M (2006) Regulatory Dynamics of Synthetic Gene Networks with Positive Feedback. J Mol Biol 359: 1107–1124.
29. Voliotis M, Bowsher CG (2012) The magnitude and colour of noise in genetic negative feedback systems. Nucleic Acids Res 40: 7084–95.
30. de Ronde W, Tostevin F, ten Wolde PR (2010) Effect of feedback on the fidelity of information transmission of time-varying signals. Phys Rev E 82: 031914.
31. Erdmann T, Howard M, ten Wolde PR (2009) Role of spatial averaging in the precision of gene expression patterns. Phys Rev Lett 103: 258101.
32. Barnes CP, Silk D, Sheng X, Stumpf MPH (2011) Bayesian design of synthetic biological systems. Proc Natl Acad Sci USA 108: 15190–15195.
33. Hersen P, McClean MN, Mahadevan L, Ramanathan S (2008) Signal processing by the HOG MAP kinase pathway. Proc Natl Acad Sci USA 105: 7165–7170.
34. Tostevin F, ten Wolde PR (2009) Mutual Information between Input and Output Trajectories of Biochemical Networks. Phys Rev Lett 102: 1–4.
