PLOS Computational Biology. 2013 Jan 31; 9(1): e1002888. doi: 10.1371/journal.pcbi.1002888

Maximizing the Information Content of Experiments in Systems Biology

Juliane Liepe 1,#, Sarah Filippi 1,#, Michał Komorowski 2, Michael P H Stumpf 1,3,*
Editor: Andrey Rzhetsky
PMCID: PMC3561087  PMID: 23382663

Abstract

Our understanding of most biological systems is in its infancy. Learning their structure and intricacies is fraught with challenges, and is often side-stepped in favour of studying the function of different gene products in isolation from their physiological context. Constructing and inferring global mathematical models from experimental data is, however, central to systems biology. Different experimental setups provide different insights into such systems. Here we show how we can combine concepts from Bayesian inference and information theory in order to identify experiments that maximize the information content of the resulting data. This approach allows us to incorporate preliminary information; it is global, rather than constrained to some local neighbourhood in parameter space; and it readily yields information on parameter robustness and confidence. We develop the theoretical framework and apply it to a range of exemplary problems that highlight how we can improve experimental investigations into the structure and dynamics of biological systems and their behavior.

Author Summary

For most biological signalling and regulatory systems we still lack reliable mechanistic models. And where such models exist, e.g. in the form of differential equations, we typically have only rough estimates for the parameters that characterize the biochemical reactions. In order to improve our knowledge of such systems we require better estimates for these parameters, and here we show how a judicious choice of experiments, based on a combination of simulations and information-theoretical analysis, can help us. Our approach builds on the available, frequently rudimentary, information and identifies which experimental set-up provides the most additional information about all of the parameters, or about individual parameters. We also consider the related but subtly different problem of which experiments need to be performed in order to decrease the uncertainty about the behaviour of the system under altered conditions. We develop the theoretical framework in the necessary detail before illustrating its use and applying it to the repressilator model, the regulation of Hes1 and signal transduction in the Akt pathway.

Introduction

Mathematical models of biomolecular systems are by design and necessity abstractions of a much more complicated reality [1], [2]. In mathematics, and the theoretical sciences more generally, such abstraction is seen primarily as a virtue which allows us to capture the essential features or defining mechanisms underlying the workings of natural systems and processes. But while qualitative agreement between even very simple models and very complex systems is easily achieved, formally assessing whether a given model is indeed good (or even just useful) is notoriously difficult. These difficulties are exacerbated in no small measure for many of the most important and topical research areas in biology [3]–[5]. The regulatory, metabolic and signalling processes involved in cell-fate and other biological decision-making processes are often only indirectly observable; moreover, when studied in isolation their behavior can often be markedly altered compared to the experimentally more challenging in vivo contexts [6]. The so-called “inverse problem” — to learn, construct or infer mathematical or mechanistic models from experimental data — is often considered (see e.g. Brenner [7]) to be one of the major problems facing systems biologists.

These challenges have prompted the development of novel statistical and inferential tools, required to construct (or improve) mathematical models of such systems. We can loosely group these methods into (i) those aimed at reconstructing network models [8]–[10] (using correlations or statistical dependencies in observed datasets), (ii) methods to estimate (biochemical reaction) rate parameters of models describing the dynamics of biological systems [11]–[13], and (iii) approaches that allow us to rank or discern between different candidate models/hypotheses [14], [15]. The first set of challenges is typically faced when dealing with new systems where little information is known, and where network-inference algorithms offer a convenient way of generating novel mechanistic hypotheses from data. Here we address the second point. In particular, we start from a model that describes how the abundances of a set of molecular entities, $x$, change with time, $t$; the rate of change in $x$ over time is typically described in terms of (ordinary, partial or stochastic) differential equation systems,

$$\frac{dx}{dt} = f_{\varepsilon}(x, t; \theta),$$

where $x$ is an $n$-dimensional vector describing the system's state and $\theta$ is a $k$-dimensional vector containing the model parameters. Finally, $\varepsilon$ denotes the particular experimental setup under which data are collected. This dependence is generally tacitly ignored but, as we will show below, explicitly incorporating the experimental approach (and the fact that different experimental choices are typically available) into the model and any down-stream statistical analysis allows us to develop strategies that yield more detailed insights into biological systems, and better models thereof. The aims of the present study are to develop experimental design strategies that allow us to infer the (unknown or only poorly known) model parameters, $\theta$, and to reduce the uncertainty in the predicted model behavior.

Inferential tools have been developed that, given some observed biological data and a suitable mathematical candidate model, provide us with parameters that best describe the system's dynamics. Unfortunately, obtaining reliable parameter estimates for dynamical systems is plagued with difficulties [16], [17]. Usually sparse and notoriously noisy data are fitted using models with a large number of parameters [18]. As a result, over-parameterized models tend to fit the noisy data but lose confidence in their predictions. Conventional fitting approaches to such data routinely fail to capture this complexity and underestimate the uncertainty in the estimated parameters, which substantially increases the uncertainty in predictions of model behavior.

We use $\mathcal{E}$ to denote the total set of available experimental assays that could be used to probe a system in a given situation. These might, for example, include knock-out or knock-down mutants, transcriptomic or proteomic assays, different time-courses or different environmental conditions (or both), etc. Here we remain very flexible as to which type of experimental setup is included in $\mathcal{E}$, but merely acknowledge that it is rarely possible to probe all important aspects of a system simultaneously. Instead, different techniques require different sample preparations etc., and therefore separate experiments. Here we also account for the possibility that for two different experimental set-ups, $\varepsilon$ and $\varepsilon'$, the mathematical model may differ; therefore the dependence on $\varepsilon$ is made explicit in our notation $f_{\varepsilon}$. For example, if a species $x_i$ is knocked out in experiment $\varepsilon$, we can ignore any terms referring to it when modelling $f_{\varepsilon}$.

Performing different experiments is costly, however, in terms of both money and time, and not all experiments are equally informative. Ideally we would like to perform only those experiments which yield substantial and relevant information. We regard any information that decreases our uncertainty about model parameters or model predictions as relevant. As we will show below, what constitutes substantial information is then easily and naturally resolved. We will show, for example, that experimental interventions differ in the amount of information they provide about model parameters. Equally, some experiments provide insights that are more useful for making predictions about system behavior than others. It may seem surprising that we consider parameter inference and prediction of output separately, but this merely reflects the fact that not all parameters contribute equally to system output: varying some parameters will have a huge impact on the output, while varying other parameters will lead to negligible changes. By making the reduction in uncertainty of predicted model behavior the target of experimental design we explicitly acknowledge this.

Experimental design in systems biology differs from classical experimental design. The latter theory was first developed at a time when the number of alternative hypotheses was smaller than the amount of available data and replicates [19]. Systems biology, on the other hand, is hypothesis-rich, and data rarely suffice to decide unambiguously in favour of one model. Moreover, for dynamical systems, as a host of recent studies have demonstrated, generally fewer than half of the parameters are tightly constrained by experimental data [16], [17]. Together these two challenges have given rise to a number of approaches aimed at improving our ability to develop mechanistic models of such systems. Here we meld concepts from Bayesian inference and information theory to guide experimental investigations into biological systems, in order to arrive at better parameter estimates and better model predictions.

Several authors have used the information-theoretical framework, in particular the expected gain in Shannon information, to assess the information content of an experiment [20]–[24]. Although the methodology of Bayesian experimental design is well established, its applicability has been computationally limited to small models involving only a few free parameters. Its use in systems biology has only recently become possible as a result of increased computational resources. Vanlier et al. proposed an approach that uses the Bayesian predictive distribution to assess the predictive power of experiments [25]. Huan and Marzouk used a framework similar to ours in that it maximizes mutual information via Monte Carlo approximation to find optimal experiments [26]; but they focus only on parameter inference and ignore prediction, and they apply their method only to systems with a small number of parameters. Here we demonstrate how such an approach can be utilized to analyse multi-parameter models described by ordinary differential equations (ODEs) with regard to both parameter inference and prediction of system behavior. The latter is especially useful when one aims to predict the outcome of an experiment which is too laborious or impossible to perform. Our approach improves on previous methods [25]–[32] in a number of ways: first, we are able to incorporate, but do not require, preliminary experimental data; second, it is a global approach that is not limited to some neighbourhood in parameter space, unlike approaches solely based on e.g. the Fisher information [29], [33]; third, we obtain comprehensive statistical predictions (including confidence, sensitivity and robustness assessments if desired); and finally, we are very flexible in the type of information that we seek to optimize.

Below we first develop the theoretical concepts before demonstrating the use (and usefulness) of the Bayesian experimental design approach in the context of a number of biological systems that exemplify the set of problems encountered in practice. In order to demonstrate the practical applicability of our approach we investigate two simple models (repressilator and Hes1 systems), as well as a complex signalling pathway (AKT) with experimentally measured dynamics.

Results

Information content of experimental data

To achieve their full functionality mathematical models require parameter values that generally need to be inferred from experimental data. The extraction of this information is, however, a nontrivial task and is further compounded by the need to assess the statistical confidence of parameter estimates. In the Bayesian framework for example, we seek to evaluate the conditional probability distribution, $p(\theta \mid D)$, which relates to the prior knowledge $\pi(\theta)$ and the distribution of data, $p(D \mid \theta)$, given parameters, $\theta$, via Bayes' formula

$$p(\theta \mid D) = \frac{p(D \mid \theta)\,\pi(\theta)}{p(D)}. \qquad (1)$$

The posterior probability density function, $p(\theta \mid D)$, describes the probability of finding a parameter $\theta$ in the volume element $d\theta$ of parameter space, given the data, the model and the prior information. From this distribution we can obtain all relevant information about the parameters: how sensitive the solution is to varying them individually or together; how correlated different parameters are with one another; if there are non-linear dependencies among the parameters; and the level of precision with which each parameter is known in light of the data. A highly readable and enlightening account of the Bayesian formalism is given e.g. in [34]. Finding the posterior, $p(\theta \mid D)$, is usually achieved by means of powerful (if costly) computational algorithms such as Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) methods, which also exist in the approximate Bayesian computation (ABC) framework [15], [35], [36].

Rather than providing a single parameter estimate the posterior distribution allows us to assess how well a parameter is constrained by data (see Figure 1 A). More formally, we measure the uncertainty about a parameter information-theoretically in terms of the Shannon entropy [37],

$$H(\Theta) = -\int \pi(\theta)\,\log \pi(\theta)\, d\theta \qquad (2)$$

for the prior and

$$H(\Theta \mid D_{\varepsilon} = d) = -\int p(\theta \mid d, \varepsilon)\,\log p(\theta \mid d, \varepsilon)\, d\theta \qquad (3)$$

for the posterior. The information gained by collecting data $d$ can then be expressed as $H(\Theta) - H(\Theta \mid D_{\varepsilon} = d)$. The output of the experiment, however, is in turn “random” with distribution $p(d \mid \varepsilon)$, and therefore the average posterior uncertainty is

$$H(\Theta \mid D_{\varepsilon}) = -\int p(d \mid \varepsilon) \int p(\theta \mid d, \varepsilon)\,\log p(\theta \mid d, \varepsilon)\, d\theta\, dd, \qquad (4)$$

which leads to the average information gain, called the mutual information between $\Theta$ and $D_{\varepsilon}$,

$$I(\Theta; D_{\varepsilon}) = H(\Theta) - H(\Theta \mid D_{\varepsilon}). \qquad (5)$$

When faced with different experimental setups, $\varepsilon$, and hence different datasets, $D_{\varepsilon}$, choosing the set(s) which maximize $I(\Theta; D_{\varepsilon})$ will provide the best insights into the system via improved parameter estimates [20], [38], [39]. This observation is the basis of our experimental design methodology, which consists of computing the mutual information $I(\Theta; D_{\varepsilon})$ for every experiment $\varepsilon \in \mathcal{E}$ and selecting the experiment resulting in the highest mutual information (see Methods for computational details). Once the chosen experiment has been carried out, the new data are used to update the model and the posterior distribution of the parameters (see Figure 1 B).
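The selection step itself is simple once the mutual information can be estimated by simulation. The following minimal sketch (not the authors' code; sample_prior, simulate and estimate_mutual_information are hypothetical placeholders for the model-specific pieces described in the Methods) loops over the candidate experiments and returns the one with the largest estimated mutual information:

```python
def select_experiment(candidate_experiments, sample_prior, simulate,
                      estimate_mutual_information, n_samples=1000):
    """Score every candidate experiment by its estimated I(Theta; Y_eps)
    and return the most informative one together with all scores."""
    scores = {}
    for experiment in candidate_experiments:
        # draw parameters from the prior and simulate the corresponding outputs
        thetas = [sample_prior() for _ in range(n_samples)]
        outputs = [simulate(theta, experiment) for theta in thetas]
        scores[experiment] = estimate_mutual_information(thetas, outputs)
    best = max(scores, key=scores.get)
    return best, scores
```

Once the selected experiment has been performed, the resulting data are used to update the posterior, which then serves as the prior for the next round of selection (Figure 1 B).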

Figure 1. Information content of experimental data and flow chart of the experimental design method.


(A) The regions of plausible parameter values for three different experiments. Each ellipse defines the set of parameters which are commensurate with the output of one experiment. In this example, the data from one of the experiments lead to the most precise inference of the parameters. The parameters which explain the outputs of all three experiments lie at the intersection of the three ellipses. (B) Flowchart of the experimental design method. Given a mathematical model of the biological system, a set of experiments and the target information — which can be either a set of parameters to infer or a description of the experiment to predict — the Bayesian experimental design method determines the experiment to carry out. Once the experiment has been performed, the experimental data are used to provide the target information and to improve the model. Thereafter, the process can be iterated to select further experiments in order to improve the accuracy of the target information. (C) Link between the total and conditional entropies and the mutual information of the experimental data and the parameters.

Given the importance of the predictive role of mathematical modelling it is also of interest to reduce the uncertainty of model predictions; intriguingly and perhaps counterintuitively — but demonstrably and provably (see below and Supplementary Material) — better parameter estimates are not necessarily required for better, more certain model predictions. Therefore, instead of focussing on parameter inference we can directly seek to identify the experimental condition $\varepsilon$ which minimizes the uncertainty in the trajectories predicted for a target condition $\varepsilon^{*}$. Analogously to the previous case, minimizing the uncertainty in predictions for $\varepsilon^{*}$ means maximizing the mutual information between the outputs $Y_{\varepsilon^{*}}$ and $Y_{\varepsilon}$ (see Methods):

$$I(Y_{\varepsilon^{*}}; Y_{\varepsilon}) = H(Y_{\varepsilon^{*}}) - H(Y_{\varepsilon^{*}} \mid Y_{\varepsilon}). \qquad (6)$$

Below we use three examples of different complexity to show how this combination of rigorous Bayesian and information theoretical frameworks allows us to design/choose optimal experimental setups for parameter/model inference and prediction, respectively.

Experiment selection for parameter inference

To investigate the potential of our experimental design method for parameter estimation we first apply it to the repressilator model, a popular toy model for gene regulatory systems [40]. It consists of three genes connected in a feedback loop, where each gene encodes the repressor protein for the next gene in the loop (see Figure 2 A and B).

Figure 2. The repressilator model (as described in [40]) and the set of possible experiments.


(A) Illustration of the original repressilator model. The model consists of 3 mRNA species (coloured wavy lines, labelled Inline graphic, Inline graphic and Inline graphic) and their corresponding proteins (circles shown in the same colour, with labels Inline graphic, Inline graphic and Inline graphic). The 3 regulatory DNA regions are not modelled explicitly but are included in the mRNA production process; they are shown for illustration purposes only. (B) The ordinary differential equations which describe the evolution of the concentrations of the mRNAs and proteins over time. (C) Four potential modifications of the wild-type model. For each experimental intervention the modified parameters are listed (colours are as in A). The modifications of the wild-type model consist of decreasing one or several of the parameters of the system: in sets Inline graphic, Inline graphic and Inline graphic, the regime of the parameter Inline graphic is changed; in sets Inline graphic, Inline graphic and Inline graphic, respectively, parameters Inline graphic, Inline graphic and Inline graphic are modified for only one gene, which breaks the symmetry of the system.
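As a concrete illustration, the wild-type dynamics of Figure 2 B can be simulated in a few lines. The sketch below uses the standard dimensionless form of the repressilator equations from [40], with parameters for basal transcription, maximal transcription, the Hill coefficient and the protein/mRNA decay ratio; the exact parameterisation used in this study is the one given in Figure 2 B, so the names and values here should be treated as illustrative assumptions only.

```python
import numpy as np
from scipy.integrate import odeint

def repressilator(state, t, alpha0, alpha, n, beta):
    """Standard repressilator ODEs; state = (m1, m2, m3, p1, p2, p3)."""
    m, p = state[:3], state[3:]
    # each mRNA is repressed by the protein encoded by the previous gene in the loop
    dm = -m + alpha / (1.0 + np.roll(p, 1) ** n) + alpha0
    # each protein tracks its own mRNA and is degraded
    dp = -beta * (p - m)
    return np.concatenate([dm, dp])

times = np.linspace(0, 100, 1000)
y0 = np.array([0.0, 0.0, 0.0, 1.0, 2.0, 3.0])            # arbitrary initial condition
trajectory = odeint(repressilator, y0, times, args=(1.0, 1000.0, 2.0, 5.0))
```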

To infer the four parameters of this model we propose five sets of possible experiments: the original repressilator model (set Inline graphic), which is described in Figure 2 A and corresponds to the ordinary differential equations in Figure 2 B, and four modifications of the original model, see Figure 2 C. The four suggested modifications are hypothetical, but can be linked to potential experiments. For example, a decrease in the parameter Inline graphic corresponds to a decrease of the basal transcription rate, which could be achieved with inhibitors or by site-directed mutation of the corresponding transcription factor binding site. The proposed modifications can lead to different dynamics, and this in turn can lead to a higher mutual information between the parameters and the resulting mRNA and protein trajectories, which are here the output of the system. The information content increases as the differences between the outputs resulting from different parameter values increase. In Figure S1 we illustrate the link between the increase in mutual information and the dynamics of the system for three different parameter regimes.

To determine which experiment to carry out we compute the mutual information between the parameter prior distribution and the system output via Monte Carlo estimation. We use uniform priors over fixed ranges for each of the four parameters. Figure 3 shows that experiments Inline graphic and Inline graphic have the highest mutual information, i.e. carrying out those experiments will decrease the uncertainty in the parameter estimates most. To confirm this we simulate data for all five experiments using the parameter vector Inline graphic. The simulated data are shown in Figure S2. Based on these data we perform parameter inference using an approximate Bayesian computation approach [41] for each experiment separately and compare the posterior distributions shown in Figure 3. We observe that using the data generated from set Inline graphic (the original repressilator system) only two parameters, Inline graphic and Inline graphic, can be inferred with confidence. By contrast, the data generated by sets Inline graphic and Inline graphic allow us to estimate all four parameters. In addition, for each experiment we compute the reduction of uncertainty from the prior to the posterior distribution. The results are consistent with those obtained using the mutual information and confirm that we should choose experiment Inline graphic or Inline graphic for parameter inference. In practice not all molecular species may be experimentally accessible, and it is therefore also of interest to decide which species carries most information about the parameters. We can estimate the mutual information between the parameters and each species independently; for experimental set Inline graphic, for example, we observe that mRNAs Inline graphic and Inline graphic as well as protein Inline graphic carry equally high information. The mutual information for each mRNA and each protein is plotted in Figure S3.

Figure 3. Experiment choice for parameter inference in the repressilator model.


Top: the mutual information between the parameters and the output of each set of experiments (in dark green), and the entropy difference between the prior distribution and the posterior distribution, the latter based on data obtained from simulation of the system for each experiment (in light green). The error bars on the mutual information barplots show the variance of the mutual information estimates over Inline graphic independent simulations. Bottom: for each set of experiments we show the histogram of the marginal posterior distribution of every parameter. The red line indicates the true parameter value.

Sometimes we are interested in estimating only some of the parameters, e.g. those that have a direct physiological meaning or are under experimental control. To investigate this aspect we consider the Hes1 transcription factor, which plays a number of important roles, including in cell differentiation and the segmentation of vertebrate embryos. Oscillations observed in the Hes1 system [42] might be connected with the formation of spatial patterns during development. The Hes1 oscillator can be modelled by a simple three-component ODE model [43] as shown in Figure 4 A. This model contains four parameters, Inline graphic, Inline graphic, Inline graphic and Inline graphic, and three species: Hes1 mRNA, Inline graphic, Hes1 nuclear protein, Inline graphic, and Hes1 cytosolic protein, Inline graphic. It is possible to measure either the mRNA (using real-time PCR) or the total cellular Hes1 protein concentration Inline graphic (using Western blots). We investigate whether protein or mRNA measurements provide more information about the model parameters. Thus we estimate the mutual information between mRNA and parameters, and between protein and parameters. Figure 4 B shows that mRNA measurements carry more information about all of the parameters.

Figure 4. Experiment selection for parameter inference in the Hes1 model.


(A) Diagram of the Hes1 model and the associated ordinary differential equations. The model shows a cell (green box) with its nucleus (white circle). Hes1 mRNA (wavy line with label Inline graphic) is produced inside the nucleus and translated into Hes1 cellular protein (Inline graphic), which is then transported into the nucleus and becomes nuclear Hes1 protein (Inline graphic). Inline graphic regulates the production of Inline graphic. (B) The mutual information between each parameter and the output of the system for, respectively, mRNA measurements (left) and total cellular Hes1 protein measurements (right). The barplots represent the mean and the variance over Inline graphic repetitions of the Monte Carlo estimation. (C) Estimate of the difference between the entropy of the parameter prior and the entropy of the posterior distribution given one dataset. The dataset results from a simulation of the Hes1 system. The parameter Inline graphic was set to 0.03 min−1, as experimentally determined by [42].

This can again be further substantiated by simulations. We perform parameter inference based on such simulated data (simulated data are shown in Figure S5) and compute the difference between the entropy of the prior and that of the resulting posterior distribution. The results shown in Figure 4 C are consistent with the predictions based on mutual information: mRNA measurements carry more information for parameter inference. Interestingly, however, although the mutual information computation indicates that the protein measurements should contain more information about parameter Inline graphic than about the other parameters, this is not confirmed by the difference in entropy result for this simulated data set. This divergence is due to the fact that the mutual information measures the amount of information contained on average over all the possible behaviours of the system, whereas Figure 4 C represents the decrease in entropy from the prior to the posterior distribution given specific data. The differences in entropy for other data sets simulated using different parameter regimes are thus in better agreement with the mutual information results. We confirm this in Figure S6 where we show the results of the same analysis based on a different simulated data set.

Experiment selection for prediction

We next focus on a scenario where we aim to predict the behaviour of a biological system [44] under conditions for which it is not possible to obtain direct measurements. We consider as an example the phosphorylation of Akt and of the ribosomal binding protein S6 in response to an epidermal growth factor (EGF) signal. Figure 5 A shows the pathway of interest: EGF binding to its receptor EGFR leads to phosphorylation of EGFR and to a signalling cascade which results in the phosphorylation of Akt (pAkt), which in turn can activate downstream signalling cascades and leads to the phosphorylation of S6 (pS6); a corresponding mathematical model is shown in Figure S7 [45].

Figure 5. The EGF-dependent AKT pathway and an initial dataset.


(A) Diagram of the model of the EGF-dependent AKT pathway. Epidermal growth factor (EGF, red triangle) is a stimulus for a signalling cascade which results in the phosphorylation (green circle) of Akt (blue square) and S6 (purple square). EGF binds to the EGF membrane receptor EGFR (orange), which is generated from a pro-EGFR. The binding results in the phosphorylation of the receptor, which consequently leads to the activation of downstream cascades (thick black circle). This simplified model was shown to capture the experimentally determined dynamics [45]. (B) An impulse input of EGF over Inline graphic seconds with an intensity of Inline graphic ng/ml (top) and the resulting time course of phosphorylated EGF receptor (pEGFR), phosphorylated Akt (pAKT) and phosphorylated S6 (pS6) in response to this stimulus (bottom). Data were provided by the authors of [45].

We are interested in predicting the dynamics under multiple pulsed stimuli with EGF in the presence of background noise, as shown in Figure 6 A. We consider five pulses of intensity Inline graphic ng/ml and length Inline graphic seconds, spaced by Inline graphic seconds, with additive background noise. This input is difficult to realize in an experimental system (let alone in an animal or clinical setting). Using an initial data set, see Figure 5 B, we can infer the system parameters using ABC SMC. The resulting fit to the data using the inferred parameters is shown in Figure S8. From the resulting posterior distribution we then sample Inline graphic parameter combinations and simulate the model with the five-pulse stimulus in order to predict the time courses of phosphorylated EGF receptor (pEGFR), phosphorylated Akt and phosphorylated S6; based on just the estimated parameters these predictions are, however, associated with high uncertainty, see Figure 6 B.

Figure 6. Experiment selection for prediction in the EGF-dependent AKT pathway.


(A) The noisy five-impulse EGF input signal: the five pulses are of intensity Inline graphic ng/ml and length Inline graphic seconds, spaced by Inline graphic seconds, with an additive background noise which is the absolute value of Gaussian white noise of variance Inline graphic. (B) The predicted time courses of the proteins pEGFR, pAKT and pS6 under the noisy five-impulse EGF input signal, based on the initial dataset. (C) The mutual information between the time courses of the three species of interest under the noisy five-impulse EGF input signal and the time courses of the species under each of the following 12 possible experiments: an impulse stimulus of length Inline graphic seconds with five possible intensities (Inline graphic, Inline graphic, Inline graphic, Inline graphic and Inline graphic ng/ml), a step stimulus of length Inline graphic minutes with four possible intensities (Inline graphic, Inline graphic, Inline graphic and Inline graphic ng/ml) and a ramp stimulus of length Inline graphic minutes with three possible final intensities (Inline graphic, Inline graphic and Inline graphic ng/ml). (D) The predicted time courses of the proteins pEGFR, pAKT and pS6 under the noisy five-impulse EGF input signal, based on the outcome of the step stimulus with intensity Inline graphic ng/ml, which is the experiment with the highest mutual information. The scale of the y-axis differs between (B) and (D). (E) The posterior distribution of two parameters (EGFR turnover and EGFR initial condition) when using the initial dataset alone (left) and when using the initial dataset together with the outcome of the step stimulus with intensity Inline graphic ng/ml (right). The scale of the EGFR turnover axis is the prior range in the left panel, whereas it is Inline graphic times smaller in the right panel.

To obtain better predictions we can use data from other experiments measuring the time courses of the three species of interest for experimentally more straightforward input signals, chosen from among 12 possible stimuli: impulse, step or ramp stimuli with different EGF concentrations (see Figure 6). To determine which of those inputs would result in the most reliable predictions we compute the mutual information between the time courses for the different potential experimental inputs and the time course under the target five-impulse noisy stimulus. We incorporate initial information about the model parameters by computing the mutual information based on the posterior distribution inferred above. Figure 6 C shows that a step stimulus of intensity Inline graphic ng/ml has the highest mutual information and therefore most reduces the uncertainty of the predicted behavior under our target stimulus pattern.

In Figure 6 D we show that this does indeed yield much improved predictions compared to the initial prediction. This reduction in uncertainty about predicted model behaviour results from the difference in the posterior distributions obtained under different stimulus regimes; by focussing on predictive ability we focus implicitly on data that are informative about those parameters that will affect the system behaviour most under the target (5-pulse) stimulus. The posterior distributions are shown in Figure 6 E for two parameters, the EGFR turnover and the EGFR initial concentration, which appear to be essential for the prediction of the evolution of Akt/S6 phosphorylation patterns under the five-pulse stimulus. These two parameters were not inferred using the initial dataset alone, whereas adding the outcome of the step-stimulus experiment suggested by our method constrains them to the required precision.

This ability to predict the time courses extends to much greater signal distortion, and even with a noise level of Inline graphic percent of the signal intensity we find that our experimental design method yields similar improvements in the predictions. This additional analysis of the new input signal is shown in Figures S10 and S11. We observe that the direct target (the EGF receptor) as well as activated Akt (pAKT) efficiently filter out the noise but capture the five pulses; EGFR activation quickly returns to the base level in response to the higher-frequency background noise. This indicates that there might be a constant low concentration of activated EGFR (pEGFR). The activation of S6, however, has very different characteristics and is far less robust to noise; the noise is amplified, as can be observed in the pS6 time course. This might suggest that the downstream molecule pS6 reacts to a signal with a longer time delay. Moreover, pS6 does not have time to relax to its baseline between the five pulses, which leads to incremental signal amplification. This behavior fits with the low-pass filter characteristics previously described [45]. In further support, earlier studies [46] found that a downstream molecule can be more sensitive to an upstream activator than the direct target molecule of the activator. This might explain why the activation of EGFR and Akt is more robust to noise than that of the downstream molecule, S6.

Discussion

We have found that maximizing the mutual information between our target information — here either model parameter values or predictions of system behaviour — and the (simulated) output of potentially available experiments offers a means of arriving at optimally informative experiments. The experiments that are chosen from a set of candidates are always those that add most to existing knowledge: they are, in fact, the experiments that most challenge our current understanding of a system.

This framework has a number of advantages. First, we can cheaply simulate any experimental set-up that can in principle be implemented; second, using simulations allows us to propagate the model dynamics and to quantify rigorously the amount of (relevant) information that is generated by any given experimental design; third, our information measure gives us a means of meaningfully comparing different designs; finally, our approach can be used to design experiments sequentially — our preferred route, as this enables us to update our knowledge of a system iteratively along the way — or in parallel, i.e. selecting more than one experiment at a time. Previous methods have taken a more local approach [25], [29], [30], [47] that relied on initial parameter guesses and often on data; our approach also readily incorporates different stimulus patterns [48].

Here we have focussed on designing experiments that increase our ability to estimate model parameters and to predict model behaviour. The latter depends on model parameters in a very subtle way: not all parameters affect system output equally and under all conditions. Target conditions could, for example, include clinical settings which are generally not experimentally amenable (at least in early-stage research); here the current approach offers a rationale for designing [49] therapeutic interventions into complex systems based on investigations of suitable model systems. In this study we provide an approach to choose the optimal experiment out of a finite and discrete set of possible experiments. Experimental design over a continuous set of experiments requires a different approach, as shown for example in [50].

With an optimal design we can overcome the problems of sloppy parameters [16] (which are, of course, dependent on the experimental intervention chosen [17]) and can narrow down the posterior probability intervals of parameters. We would like to reiterate, however, the importance of considering joint distributions rather than merely the marginal probabilities of (or the confidence regions associated with) individual parameters: parameters of dynamical systems tend to show high levels of correlation (i.e. we can vary them simultaneously in a way that does not affect the output of the system — at least in some areas of parameter space) and their posterior probability distributions often deviate from normality (which also motivated the use of information theoretical measures which can deal with non-linearities and non-Gaussian probability distributions).

We tested and applied our approach in three different contexts: while the repressilator serves as a toy model with hypothetical data and experiments, the Hes1 transcription regulatory system and the EGF-induced Akt pathway are relevant biological systems. The question addressed in the Hes1 system, namely which molecular species to measure, is a common question in laboratories. The study of the Akt system not only demonstrates how to choose experiments for the prediction of system behaviour, but is also an example of stimulus design.

The approach we have presented here also has the potential to make model discrimination or model checking the target of our analysis [48], [51], for example by choosing experimental designs that maximize our ability to distinguish between competing alternative hypotheses or models. All of this is straightforwardly reconciled in the Bayesian framework, which also naturally lends itself to iterative procedures where “today's posterior” is “tomorrow's prior” and models are understood increasingly better as new, more informative data are systematically being generated.

Methods

Information theoretic design criteria

Our aim is to choose an experiment $\varepsilon$ from a set of candidate experimental setups, $\mathcal{E}$, which either reduces the uncertainty about model parameters or the uncertainty of the outcome of a particular condition $\varepsilon^{*}$ for which data are impossible or difficult to obtain. In the information-theoretic framework, these two goals boil down to determining an experiment $\varepsilon$ which contains maximal information about the parameters or about the desired predictions for condition $\varepsilon^{*}$. To present these two goals in more detail, we first review some concepts of information theory [34]. We define the entropy $H(X)$ of a random variable $X$, which measures the uncertainty of the random variable,

$$H(X) = -\int p(x)\,\log p(x)\, dx, \qquad (7)$$

and the mutual information $I(X; Y)$ between two random variables $X$ and $Y$, which is the reduction of the uncertainty about $X$ that knowing $Y$ provides,

$$I(X; Y) = \int\!\!\int p(x, y)\,\log \frac{p(x, y)}{p(x)\,p(y)}\, dx\, dy, \qquad (8)$$

where $p(x, y)$ is the joint probability density function of $X$ and $Y$, while $p(x)$ and $p(y)$ are the marginal probability density functions. We denote by $\mathbb{E}_{X}$ the expectation with respect to the probability distribution of $X$. Here we follow the convention where capital letters stand for random variables while lower-case letters stand for particular realizations of a random variable.

Reducing uncertainty in model parameters

We first consider the task of choosing an experiment that will on average provide most information about the model parameters, measured through the reduction in their respective uncertainties. In the information-theoretic language, as used by Lindley [20] and later by Sebastiani and Wynn [52], the initial (prior) uncertainty is given by the entropy $H(\Theta)$ of the prior distribution $\pi(\theta)$, which after data $y$ have been collected (in experimental setup $\varepsilon$) gives rise to the entropy $H(\Theta \mid Y_{\varepsilon} = y)$ of the posterior distribution $p(\theta \mid y, \varepsilon)$. The information gained about the parameter by collecting the data $y$ is then $H(\Theta) - H(\Theta \mid Y_{\varepsilon} = y)$. On average, however, the decrease of uncertainty about $\Theta$ after data are collected in an experiment $\varepsilon$ is given by the mutual information $I(\Theta; Y_{\varepsilon}) = H(\Theta) - H(\Theta \mid Y_{\varepsilon})$. Therefore, in order to reduce the parameters' uncertainties one should choose an experiment that maximizes the mutual information between $\Theta$ and $Y_{\varepsilon}$.

Here we specifically consider models such that the output is of the form

$$y = g(\theta, \varepsilon) + \eta, \qquad (9)$$

where $g$ is a deterministic function and $\eta$ an uncorrelated, zero-mean Gaussian random variable with variance $\sigma^{2}$ (our approach is readily extended to stochastic systems). In such a model maximization of the mutual information $I(\Theta; Y_{\varepsilon})$ is equivalent to maximization of the entropy $H(Y_{\varepsilon})$. This observation, described first in [52], results directly from the fact that the mutual information $I(\Theta; Y_{\varepsilon})$ can be written as the difference between $H(Y_{\varepsilon})$ and $H(Y_{\varepsilon} \mid \Theta)$ and that

$$H(Y_{\varepsilon} \mid \Theta) = -\int \pi(\theta) \int p(y \mid \theta, \varepsilon)\,\log p(y \mid \theta, \varepsilon)\, dy\, d\theta$$

does not depend on the experiment $\varepsilon$. Indeed, Equation (9) implies that $p(y \mid \theta, \varepsilon)$ is the probability density of the experimental noise. Therefore maximization of $I(\Theta; Y_{\varepsilon})$ is equivalent to maximization of $H(Y_{\varepsilon})$. However, this is only the case for the mutual information between the output $Y_{\varepsilon}$ of an experiment $\varepsilon$ and the full parameter vector $\Theta$. Whenever we are interested in the increase of information about only one component of the parameter vector, or in reducing uncertainty about an experimental outcome, we need to use the mutual information and not the entropy.
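To spell out why the conditional entropy is experiment-independent (a standard property of differential entropy under a deterministic shift, not an additional assumption of the method): writing $p(y \mid \theta, \varepsilon) = p_{\eta}\big(y - g(\theta, \varepsilon)\big)$ and substituting $u = y - g(\theta, \varepsilon)$ gives

$$H(Y_{\varepsilon} \mid \Theta) = -\int \pi(\theta) \int p_{\eta}\big(y - g(\theta, \varepsilon)\big)\,\log p_{\eta}\big(y - g(\theta, \varepsilon)\big)\, dy\, d\theta = -\int p_{\eta}(u)\,\log p_{\eta}(u)\, du = H(\eta),$$

so that $I(\Theta; Y_{\varepsilon}) = H(Y_{\varepsilon}) - H(\eta)$, and maximizing the mutual information over $\varepsilon$ is indeed the same as maximizing $H(Y_{\varepsilon})$.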

Reducing uncertainty in an experimental outcome

Similar reasoning leads us to a criterion for selecting an experiment $\varepsilon$ that reduces uncertainty about predictions for the system output under a different set of conditions or experiment $\varepsilon^{*}$. Choosing $\varepsilon$ to maximize $I(Y_{\varepsilon^{*}}; Y_{\varepsilon})$ leads to an experiment that on average reduces the uncertainty of predictions for condition $\varepsilon^{*}$ most. This can be seen by rewriting (8) as

$$I(Y_{\varepsilon^{*}}; Y_{\varepsilon}) = H(Y_{\varepsilon^{*}}) - H(Y_{\varepsilon^{*}} \mid Y_{\varepsilon}). \qquad (10)$$

Estimation of the mutual information

The mutual information for models of type (9) can be estimated using Monte Carlo simulations [26], [53]. We first focus on the mutual information between the parameters $\Theta$ and the output $Y_{\varepsilon}$ of an experiment $\varepsilon$, which can be written as a function of the prior distribution $\pi(\theta)$, the probability of the output given the parameter, $p(y \mid \theta, \varepsilon)$, and the evidence $p(y \mid \varepsilon)$ as follows:

$$I(\Theta; Y_{\varepsilon}) = \int\!\!\int \pi(\theta)\, p(y \mid \theta, \varepsilon)\,\log \frac{p(y \mid \theta, \varepsilon)}{p(y \mid \varepsilon)}\, dy\, d\theta. \qquad (11)$$

Drawing a sample $\theta^{(1)}, \ldots, \theta^{(N)}$ from the prior distribution $\pi(\theta)$ we obtain a Monte Carlo estimate,

$$I(\Theta; Y_{\varepsilon}) \approx \frac{1}{N} \sum_{n=1}^{N} \log \frac{p(y^{(n)} \mid \theta^{(n)}, \varepsilon)}{p(y^{(n)} \mid \varepsilon)}, \qquad (12)$$

where for all $n$, $y^{(n)}$ is an output of the system for the parameter $\theta^{(n)}$. For models of type (9), $p(y \mid \theta, \varepsilon)$ is the probability density function of a Gaussian distribution with mean $g(\theta, \varepsilon)$ and variance $\sigma^{2}$ evaluated at $y$. To compute the quantity in (12) we have to estimate the evidence $p(y^{(n)} \mid \varepsilon)$, which can be done via Monte Carlo simulation: given an $M$-sample $\tilde{\theta}^{(1)}, \ldots, \tilde{\theta}^{(M)}$ drawn independently from the prior distribution $\pi(\theta)$, we have

$$p(y^{(n)} \mid \varepsilon) \approx \frac{1}{M} \sum_{m=1}^{M} p(y^{(n)} \mid \tilde{\theta}^{(m)}, \varepsilon). \qquad (13)$$

Combining equations (12) and (13), we obtain the following estimate of the mutual information between the parameters $\Theta$ and the output $Y_{\varepsilon}$:

$$I(\Theta; Y_{\varepsilon}) \approx \frac{1}{N} \sum_{n=1}^{N} \log \frac{p(y^{(n)} \mid \theta^{(n)}, \varepsilon)}{\frac{1}{M} \sum_{m=1}^{M} p(y^{(n)} \mid \tilde{\theta}^{(m)}, \varepsilon)}. \qquad (14)$$
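For illustration, estimator (14) can be transcribed almost line by line into code. The sketch below assumes the Gaussian observation model (9) with a scalar noise variance and a flat (vector-valued) model output, and uses hypothetical sample_prior and simulate helpers standing in for the prior and for the deterministic model output $g(\theta, \varepsilon)$; it is a simplified illustration rather than the GPU-parallelized implementation used in this study.

```python
import numpy as np
from scipy.stats import multivariate_normal

def estimate_mi(sample_prior, simulate, experiment, sigma2, n_outer=1000, n_inner=1000):
    """Monte Carlo estimate of I(Theta; Y_eps), following Eq. (14)."""
    # outer sample: (theta^(n), y^(n)) pairs drawn from pi(theta) p(y | theta, eps)
    thetas = [sample_prior() for _ in range(n_outer)]
    means = np.array([simulate(th, experiment) for th in thetas])        # g(theta^(n), eps)
    # inner sample used to estimate the evidence p(y | eps)
    means_tilde = np.array([simulate(sample_prior(), experiment) for _ in range(n_inner)])

    log_ratios = np.empty(n_outer)
    for n in range(n_outer):
        y = means[n] + np.sqrt(sigma2) * np.random.randn(means[n].size)  # noisy output y^(n)
        # numerator: log p(y^(n) | theta^(n), eps), a Gaussian centred at g(theta^(n), eps)
        log_num = multivariate_normal.logpdf(y, mean=means[n], cov=sigma2)
        # denominator: log of the Monte Carlo evidence estimate, Eq. (13)
        log_liks = np.array([multivariate_normal.logpdf(y, mean=m, cov=sigma2)
                             for m in means_tilde])
        log_den = np.logaddexp.reduce(log_liks) - np.log(n_inner)
        log_ratios[n] = log_num - log_den
    return log_ratios.mean()
```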

Similarly, we can estimate the mutual information between any single component $\Theta_{i}$ of the $k$-dimensional parameter vector, $\Theta$, and the output $Y_{\varepsilon}$. We denote by $\Theta_{-i}$ the $(k-1)$-dimensional vector containing all the components of $\Theta$ except the $i$-th one. Integrating over $\Theta_{-i}$, the mutual information between $\Theta_{i}$ and $Y_{\varepsilon}$ is equal to

$$I(\Theta_{i}; Y_{\varepsilon}) = \int\!\!\int \pi(\theta_{i})\, p(y \mid \theta_{i}, \varepsilon)\,\log \frac{p(y \mid \theta_{i}, \varepsilon)}{p(y \mid \varepsilon)}\, dy\, d\theta_{i}$$

and can be estimated through Monte Carlo simulation by

$$I(\Theta_{i}; Y_{\varepsilon}) \approx \frac{1}{N} \sum_{n=1}^{N} \log \frac{p(y^{(n)} \mid \theta_{i}^{(n)}, \varepsilon)}{p(y^{(n)} \mid \varepsilon)}. \qquad (15)$$

As previously, the evidence $p(y^{(n)} \mid \varepsilon)$ can be estimated from a sample drawn from the prior without great difficulty. On the other hand, the estimation of the numerator $p(y^{(n)} \mid \theta_{i}^{(n)}, \varepsilon)$ requires an additional integration over $\Theta_{-i}$. We then have

$$p(y^{(n)} \mid \theta_{i}^{(n)}, \varepsilon) \approx \frac{1}{M} \sum_{m=1}^{M} p(y^{(n)} \mid \bar{\theta}^{(n,m)}, \varepsilon), \qquad (16)$$

where for each $n$, $\bar{\theta}^{(n,1)}, \ldots, \bar{\theta}^{(n,M)}$ is a sample drawn from $\pi(\theta)$ under the constraint that $\bar{\theta}_{i}^{(n,m)} = \theta_{i}^{(n)}$. Putting all the terms together, we obtain the following estimate of the mutual information between $\Theta_{i}$ and $Y_{\varepsilon}$: given a sample $(\theta^{(n)}, y^{(n)})_{n=1,\ldots,N}$ drawn from $\pi(\theta)\, p(y \mid \theta, \varepsilon)$ and an $M$-sample $\bar{\theta}^{(n,1)}, \ldots, \bar{\theta}^{(n,M)}$ such that for all $n$ and $m$, $\bar{\theta}_{i}^{(n,m)} = \theta_{i}^{(n)}$ while the remaining components are drawn from the prior,

$$I(\Theta_{i}; Y_{\varepsilon}) \approx \frac{1}{N} \sum_{n=1}^{N} \log \frac{\frac{1}{M} \sum_{m=1}^{M} p(y^{(n)} \mid \bar{\theta}^{(n,m)}, \varepsilon)}{\frac{1}{M} \sum_{m=1}^{M} p(y^{(n)} \mid \tilde{\theta}^{(m)}, \varepsilon)}. \qquad (17)$$

To finish, we consider the estimation of the mutual information between the outputs of the system for two different experiments $\varepsilon$ and $\varepsilon^{*}$. We have

$$I(Y_{\varepsilon^{*}}; Y_{\varepsilon}) = \int\!\!\int p(y^{*}, y)\,\log \frac{p(y^{*}, y)}{p(y^{*})\, p(y)}\, dy^{*}\, dy, \qquad\text{with}\qquad p(y^{*}, y) = \int \pi(\theta)\, p(y^{*} \mid \theta, \varepsilon^{*})\, p(y \mid \theta, \varepsilon)\, d\theta,$$

where we used the fact that $Y_{\varepsilon^{*}}$ and $Y_{\varepsilon}$ are independent conditionally on the parameter $\Theta$. This equation leads to a Monte Carlo estimate based on an $N$-sample $\theta^{(1)}, \ldots, \theta^{(N)}$ drawn from the prior, given by

$$I(Y_{\varepsilon^{*}}; Y_{\varepsilon}) \approx \frac{1}{N} \sum_{n=1}^{N} \log \frac{\frac{1}{N} \sum_{m=1}^{N} p(y^{*(n)} \mid \theta^{(m)}, \varepsilon^{*})\, p(y^{(n)} \mid \theta^{(m)}, \varepsilon)}{\left(\frac{1}{N} \sum_{m=1}^{N} p(y^{*(n)} \mid \theta^{(m)}, \varepsilon^{*})\right) \left(\frac{1}{N} \sum_{m=1}^{N} p(y^{(n)} \mid \theta^{(m)}, \varepsilon)\right)},$$

where $y^{(n)}$ and $y^{*(n)}$ are outputs of the system simulated under experiments $\varepsilon$ and $\varepsilon^{*}$, respectively, for the parameter $\theta^{(n)}$.
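The estimator for two experiments can be sketched in the same style as the one for parameters; the joint density in the numerator exploits the conditional independence of the two outputs given the parameter. Again, sample_prior and simulate are hypothetical placeholders, and the same prior sample is reused for the joint and the marginal terms.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

def estimate_predictive_mi(sample_prior, simulate, eps, eps_star, sigma2,
                           n_outer=500, n_inner=500):
    """Monte Carlo estimate of I(Y_eps*; Y_eps) between the outputs of two experiments."""
    outer = [sample_prior() for _ in range(n_outer)]
    inner = [sample_prior() for _ in range(n_inner)]
    g = np.array([simulate(th, eps) for th in inner])            # g(theta^(m), eps)
    g_star = np.array([simulate(th, eps_star) for th in inner])  # g(theta^(m), eps*)

    terms = np.empty(n_outer)
    for n, th in enumerate(outer):
        # one joint draw (y^(n), y*^(n)) simulated from the same parameter
        y = simulate(th, eps) + np.sqrt(sigma2) * np.random.randn(g.shape[1])
        y_star = simulate(th, eps_star) + np.sqrt(sigma2) * np.random.randn(g_star.shape[1])
        ll = np.array([mvn.logpdf(y, mean=m, cov=sigma2) for m in g])
        ll_star = np.array([mvn.logpdf(y_star, mean=m, cov=sigma2) for m in g_star])
        # log p(y*, y): average of the product of the two likelihoods over the prior sample
        log_joint = np.logaddexp.reduce(ll + ll_star) - np.log(n_inner)
        # log p(y*) + log p(y): product of the two marginal evidence estimates
        log_marg = (np.logaddexp.reduce(ll) - np.log(n_inner)) + \
                   (np.logaddexp.reduce(ll_star) - np.log(n_inner))
        terms[n] = log_joint - log_marg
    return terms.mean()
```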

Technical details

We implemented the algorithms in Inline graphic. The numerical solutions of the models were obtained using the ordinary differential equation solver of the package Inline graphic [54], which allows for parallel implementation on graphics processing units (GPUs). The algorithm to obtain the Monte Carlo estimates was also parallelized using GPUs. For the repressilator, we assume measurement noise Inline graphic with variance Inline graphic. To estimate the mutual information between the output and the parameters, we use Inline graphic and Inline graphic. The mutual information between the output of the system and each parameter in the Hes1 example is computed using Inline graphic and Inline graphic. For the AKT model, the variance of the measurement noise is equal to Inline graphic, Inline graphic, and Inline graphic.

Approximate Bayesian Computation (ABC)

Once data $D$ have been collected, we use an approximate Bayesian computation framework [55], [56] to infer the posterior parameter distribution $p(\theta \mid D)$. This is a simulation-based method which mainly consists of sampling the parameter space from a prior distribution $\pi(\theta)$, simulating the system for each sampled parameter (often called a particle) and selecting the particles such that the simulated data are less than some maximal distance away from the observed data. Those particles define an estimate of the posterior distribution given the observed data:

$$\hat{p}(\theta \mid D) \;\propto\; \pi(\theta)\,\Pr\!\big(\Delta(D^{*}, D) \le \delta \mid \theta\big),$$

where $D^{*}$ denotes data simulated from the model for parameter $\theta$, $\Delta$ is a distance function and $\delta$ the maximal accepted distance.

Specifically, we use an ABC scheme based on sequential Monte Carlo, which has been developed for likelihood-free parameter inference in deterministic and stochastic systems [41]. We use the implementation of this method provided in the package Inline graphic [57].
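To illustrate the basic idea in code (a plain rejection sampler, not the sequential Monte Carlo scheme of [41] actually used here, which is considerably more efficient), with hypothetical sample_prior, simulate and distance helpers:

```python
import numpy as np

def abc_rejection(observed, sample_prior, simulate, distance, tolerance, n_particles=1000):
    """Plain ABC rejection sampling: keep prior draws whose simulated data
    lie within `tolerance` of the observed data; the accepted particles
    approximate the posterior p(theta | D)."""
    accepted = []
    while len(accepted) < n_particles:
        theta = sample_prior()               # propose a particle from the prior
        simulated = simulate(theta)          # simulate the model for this particle
        if distance(simulated, observed) <= tolerance:
            accepted.append(theta)
    return np.array(accepted)
```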

Estimation of the entropy

The estimation of the entropy was performed only to test and confirm our experimental choice, which is based on the Monte Carlo estimation of the mutual information. For each experiment $\varepsilon$ we compute the difference between the entropy $H(\Theta)$ of the prior distribution $\pi(\theta)$ and the entropy $H(\Theta \mid D_{\varepsilon} = d)$ of the posterior distribution $p(\theta \mid d, \varepsilon)$. Both entropies are approximated using histogram-based estimators. This discretization of the parameter space leads to a change of scale in the entropy measure. This explains why the scales of the differences between estimated entropies and the estimated mutual information differ, despite the fact that the mutual information $I(\Theta; D_{\varepsilon})$ is the expectation over all possible data $d$ of the difference between $H(\Theta)$ and $H(\Theta \mid D_{\varepsilon} = d)$. It is also well known that such a histogram approach leads to a biased estimate of the entropy [58]. However, since the bias only depends on the number of bins and the sample size, we can compare the estimation results among experiments as long as these algorithm parameters are kept the same.
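A minimal sketch of this check for a single parameter (the plug-in histogram estimator; bin number, range and sample size have to be kept identical across experiments so that the bias cancels when estimates are compared) could look as follows; the repressilator case uses the same idea with a multidimensional histogram:

```python
import numpy as np

def histogram_entropy(samples, bins, value_range):
    """Plug-in entropy estimate (in nats) from a 1-D histogram of samples."""
    counts, _ = np.histogram(samples, bins=bins, range=value_range)
    p = counts / counts.sum()
    p = p[p > 0]                              # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p))

def entropy_reduction(prior_samples, posterior_samples, bins=20):
    """Difference H(prior) - H(posterior) for one parameter, computed on a common grid."""
    value_range = (prior_samples.min(), prior_samples.max())
    return (histogram_entropy(prior_samples, bins, value_range)
            - histogram_entropy(posterior_samples, bins, value_range))
```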

Technical details

To compute the entropy for each experiment $\varepsilon$ in the repressilator example we compute a 4-dimensional histogram to discretise the posterior distribution (over all 4 model parameters), using Inline graphic bins for each dimension and resulting in a total of Inline graphic bins. We use the Inline graphic package Inline graphic [59] to estimate the entropy. For the Hes1 model we computed histograms over the marginal posterior distributions in order to measure the entropy of each parameter separately. Here we used Inline graphic bins.

Experimental data

The experimental data sets used to investigate the Akt model were collected and published by the lab of S. Kuroda. The data are normalised Western blot measurements as described in [45].

Supporting Information

Figure S1

Information content of different parameter regimes. The mutual information Inline graphic depends on the dynamics of the system given the prior over the system parameters: the more the dynamics for different parameter values differ from each other, the higher the information content. To visualize this we compute the mutual information between one parameter and the output of the system for three different regimes in the repressilator example. Noting that Inline graphic is a bifurcation parameter and Inline graphic is a Hopf bifurcation point, we choose three different prior regimes for Inline graphic: Inline graphic, Inline graphic and Inline graphic. We keep the remaining three parameters constant: Inline graphic, Inline graphic and Inline graphic. For these three priors we estimate the mutual information Inline graphic and represent the dynamics of the output of the system. We observe that the dynamics resulting from the first prior regime are the most diverse and that Inline graphic therefore has the highest value (Inline graphic) compared to the remaining two parameter regimes (Inline graphic for Inline graphic and Inline graphic for Inline graphic). Shown is the bifurcation diagram for the parameter Inline graphic with its stable (solid lines) and unstable (dashed lines) states. The estimation of the mutual information was performed for the 3 different parameter regimes. For illustration we plot the mean (dark blue), the 25 and 75 percentiles (blue) and the 5 and 95 percentiles (light blue) of trajectories simulated with 10000 parameter sets, where Inline graphic is uniformly sampled and the remaining parameters are kept constant (Inline graphic, Inline graphic and Inline graphic).

(TIFF)

Figure S2

The simulated evolution of the mRNA and protein concentration in the repressilator model for each experimental setup. The parameter vector used for simulations is Inline graphic. The colours correspond to those in Figure 2. The dots represent the simulated data and the lines correspond to the mean of the species for Inline graphic parameters sampled from the posterior distribution computed using ABC SMC.

(TIFF)

Figure S3

Mutual information between the parameters and each species (the Inline graphic mRNA and Inline graphic protein measurements) for experiment Inline graphic in the repressilator model.

(TIFF)

Figure S4

The posterior distribution given the data represented in Figure S2. Each subfigure (A to E) corresponds to an experiment (Inline graphic to Inline graphic). In each subfigure, the diagonal represents the marginal posterior distribution for each parameter and the off-diagonal elements show the correlations between pairs of parameters.

(TIFF)

Figure S5

Simulated trajectories of the mRNA and protein concentrations (dots). The parameter used for the simulation is Inline graphic. The lines represent the Inline graphic and Inline graphic percentiles of the species abundances for Inline graphic parameters sampled from the posterior distribution computed using ABC SMC.

(TIFF)

Figure S6

(A) Simulated trajectories of the mRNA and protein concentrations (dots) for the parameter Inline graphic. The lines represent the Inline graphic and Inline graphic percentiles of the species abundances for Inline graphic parameters sampled from the posterior distribution computed using ABC SMC. (B) Estimates of the differences between the entropies of the prior and the posteriors.

(TIFF)

Figure S7

Ordinary differential equations which describe the dynamics of the 11 species of the AKT model. The model contains Inline graphic parameters denoted Inline graphic, Inline graphic. The concentrations of the following species (in this order) are denoted by Inline graphic, Inline graphic: EGF, EGFR, pEGFR, pEGFR-AKT, AKT, pAKT, S6, pAKT-S6, pS6, pro-EGFR and EGF-EGFR.

(TIFF)

Figure S8

The time course of phosphorylated EGF receptor (pEGFR), phosphorylated Akt (pAKT) and phosphorylated S6 (pS6) in response to an impulse input of EGF over Inline graphic seconds with an intensity of Inline graphic ng/ml (dots). Data are Western blot measurements, as described in [45]. The lines represent the Inline graphic and Inline graphic percentiles of the evolution of the species for Inline graphic parameters sampled from the posterior distribution computed using ABC SMC.

(TIFF)

Figure S9

The time course of phosphorylated EGF receptor (pEGFR), phosphorylated Akt (pAKT) and phosphorylated S6 (pS6) in response to a step input of EGF over Inline graphic minutes with an intensity of Inline graphic ng/ml (dots). Data are Western blot measurements generated and published in [45]. The lines represent the Inline graphic and Inline graphic percentiles of the evolution of the species for Inline graphic parameters sampled from the posterior distribution computed using ABC SMC.

(TIFF)

Figure S10

(A) A noisy five-impulse EGF input signal: the five pulses are of intensity Inline graphic ng/ml and length Inline graphic seconds, spaced by Inline graphic seconds, with an additive background noise which is the absolute value of Gaussian white noise of variance Inline graphic. (B) The mutual information between the time courses of the three species of interest under the noisy input signal represented in (A) and the time courses of the species under each of the following 12 possible experiments: an impulse stimulus of length Inline graphic seconds with five possible intensities (Inline graphic, Inline graphic, Inline graphic, Inline graphic and Inline graphic ng/ml), a step stimulus of length Inline graphic minutes with four possible intensities (Inline graphic, Inline graphic, Inline graphic and Inline graphic ng/ml) and a ramp stimulus of length Inline graphic minutes with three possible final intensities (Inline graphic, Inline graphic and Inline graphic ng/ml).

(TIFF)
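The mutual information values in panel (B) of Figure S10 quantify how informative each candidate experiment is about the system's behaviour under the noisy reference input. As a minimal sketch, and not the Monte Carlo estimator developed in the paper, a mutual information value can be approximated once each simulated outcome is reduced to a scalar summary (for instance the peak pS6 level), the two summaries being paired because they are generated from the same parameter draw; all names and numbers below are purely illustrative.

import numpy as np

def histogram_mutual_information(x, y, bins=20):
    # plug-in estimate (in nats) of I(X;Y) from paired scalar samples
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))

rng = np.random.default_rng(2)
# paired summaries: outcome under the noisy reference input vs. under a candidate step stimulus,
# each pair simulated with the same parameter vector drawn from the posterior
reference_summary = rng.normal(0.0, 1.0, size=10000)
candidate_summary = 0.8 * reference_summary + rng.normal(0.0, 0.6, size=10000)

mi_estimate = histogram_mutual_information(reference_summary, candidate_summary)

Plug-in histogram estimators of this kind are biased for moderate sample sizes, which is why more careful estimators are discussed in [58], [59].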

Figure S11

The predicted time courses of the proteins pEGFR, pAKT and pS6 under the noisy Inline graphic-impulse EGF input signal with high-intensity noise shown in Figure S10 A. In the left panel the prediction is based on the initial dataset alone, whereas in the right panel it is additionally based on the outcome of the step stimulus with intensity Inline graphic ng/ml, which is the experiment with the highest mutual information. Note that the scale of the y-axis differs between panels.

(TIFF)

Funding Statement

The authors gratefully acknowledge funding from the Wellcome Trust (through a PhD studentship to JL; www.wellcome.ac.uk), the MRC (through a Biocomputing Research Fellowship to SF; www.mrc.ac.uk) and the BBSRC (to MK and MPHS; BB/G020434/1, www.bbsrc.ac.uk). MPHS is a Royal Society Wolfson Research Merit Award Holder (www.royalsociety.org). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

  • 1. Bruggeman F, Westerhoff H (2007) The nature of systems biology. TRENDS in Microbiology 15: 45–50. [DOI] [PubMed] [Google Scholar]
  • 2. Nurse P, Hayles J (2011) The cell in an era of systems biology. Cell 144: 850–854. [DOI] [PubMed] [Google Scholar]
  • 3. Silver P, Way J (2007) Molecular systems biology in drug development. Clin Pharmacol Ther 82: 586–590. [DOI] [PubMed] [Google Scholar]
  • 4. Spiller D, Wood C, Rand D, White M (2010) Measurement of single-cell dynamics. Nature 465: 736–745. [DOI] [PubMed] [Google Scholar]
  • 5. Del Sol A, Balling R, Hood L, Galas D (2010) Diseases as network perturbations. Current Opinion in Biotechnology 21: 566–571. [DOI] [PubMed] [Google Scholar]
  • 6. Liepe J, Taylor H, Barnes CP, Huvet M, Bugeon L, et al. (2012) Calibrating spatio-temporal models of leukocyte dynamics against in vivo live-imaging data using approximate Bayesian computation. Integrative biology 4: 335–345. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7. Brenner S (2010) Sequences and consequences. Phil Trans Biol Sci 365: 207–212. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8. Beal M, Falciani F, Ghahramani Z, Rangel C, Wild D (2005) A Bayesian approach to reconstructing genetic regulatory networks with hidden factors. Bioinformatics 21: 349–356. [DOI] [PubMed] [Google Scholar]
  • 9. Sachs K, Perez O, Pe'er D, Lauffenburger D, Nolan G (2005) Causal protein-signaling networks derived from multiparameter single-cell data. Science 308: 523–529. [DOI] [PubMed] [Google Scholar]
  • 10. Lèbre S, Becq J, Devaux F, Stumpf MPH, Lelandais G (2010) Statistical inference of the time-varying structure of gene-regulation networks. BMC Systems Biology 4: 130. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11. Mendes P, Kell D (1998) Non-linear optimization of biochemical pathways: applications to metabolic engineering and parameter estimation. Bioinformatics 14: 869–883. [DOI] [PubMed] [Google Scholar]
  • 12. Reinker S, Altman RM, Timmer J (2006) Parameter estimation in stochastic biochemical reactions. IEE Proc Syst Biol 153: 168–178. [DOI] [PubMed] [Google Scholar]
  • 13. Kreutz C, Timmer J (2009) Systems biology: experimental design. FEBS Journal 276: 923–942. [DOI] [PubMed] [Google Scholar]
  • 14. Vyshemirsky V, Girolami M (2008) Bayesian ranking of biochemical system models. Bioinformatics 24: 833. [DOI] [PubMed] [Google Scholar]
  • 15. Toni T, Stumpf MPH (2010) Simulation-based model selection for dynamical systems in systems and population biology. Bioinformatics 26: 104–110. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16. Gutenkunst RN, Waterfall JJ, Casey FP, Brown KS, Myers CR, et al. (2007) Universally sloppy parameter sensitivities in systems biology models. PLoS Comput Biol 3: 1871–1878. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17. Erguler K, Stumpf M (2011) Practical limits for reverse engineering of dynamical systems: a statistical analysis of sensitivity and parameter inferability in systems biology models. Mol BioSyst 7: 1593–1602. [DOI] [PubMed] [Google Scholar]
  • 18. Wilkinson D (2009) Stochastic modelling for quantitative description of heterogeneous biological systems. Nature Reviews Genetics 10: 122–133. [DOI] [PubMed] [Google Scholar]
  • 19. Cox D (2006) Principles of Statistical Inference. Cambridge: Cambridge University Press.
  • 20. Lindley DV (1956) On a measure of the information provided by an experiment. Ann Math Statistics 27: 986–1005. [Google Scholar]
  • 21. Stone M (1959) Application of a measure of information to the design and comparison of regression experiments. The Annals of Mathematical Statistics 30: 55–70. [Google Scholar]
  • 22. DeGroot M (1962) Uncertainty, information, and sequential experiments. The Annals of Mathematical Statistics 33: 404–419. [Google Scholar]
  • 23. DeGroot M (1986) Concepts of information based on utility. In: Daboni L, editor. Recent Developments in the Foundations of Utility and Risk Theory. Dordrecht, Reidel: Springer. pp. 265–275.
  • 24. Bernardo J (1979) Expected information as expected utility. The Annals of Statistics 7: 686–690. [Google Scholar]
  • 25. Vanlier J, Tiemann C, Hilbers P, van Riel N (2012) A Bayesian approach to targeted experiment design. Bioinformatics 28: 1136–42. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26. Huan X, Marzouk Y (2012) Simulation-based optimal Bayesian experimental design for nonlinear systems. Journal of Computational Physics 232: 288–317. [Google Scholar]
  • 27. Kutalik Z, Cho KH, Wolkenhauer O (2004) Optimal sampling time selection for parameter estimation in dynamic pathway modeling. Biosystems 75: 43–55. [DOI] [PubMed] [Google Scholar]
  • 28. Casey FP, Baird D, Feng Q, Gutenkunst RN, Waterfall JJ, et al. (2007) Optimal experimental design in an epidermal growth factor receptor signalling and down-regulation model. IET Systems Biology 1: 190–202. [DOI] [PubMed] [Google Scholar]
  • 29. Chu Y, Hahn J (2008) Integrating parameter selection with experimental design under uncertainty for nonlinear dynamic systems. Aiche Journal 54: 2310–2320. [Google Scholar]
  • 30. Bandara S, Schloeder J, Eils R, Bock H, Meyer T (2009) Optimal experimental design for parameter estimation of a cell signaling model. PLoS Comp Biol 5: e1000558. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31. Apgar JF, Witmer DK, White FM, Tidor B (2010) Sloppy models, parameter uncertainty, and the role of experimental design. Molecular BioSystems 6: 1890. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32. Vanlier J, Tiemann C, Hilbers P, van Riel N (2012) An integrated strategy for prediction uncertainty analysis. Bioinformatics 28: 1130–5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33. Komorowski M, Costa MJ, Rand DA, Stumpf MPH (2011) Sensitivity, robustness, and identifiability in stochastic chemical kinetics models. Proc Natl Acad Sci USA 108: 8645–8650. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34. MacKay DJC (2003) Information Theory, Inference and Learning Algorithms. Cambridge University Press.
  • 35. Marjoram P, Molitor J, Plagnol V, Tavaré S (2003) Markov chain Monte Carlo without likelihoods. Proc Natl Acad Sci U S A 100: 15324–15328. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36. Sisson SA, Fan Y, Tanaka MM (2007) Sequential Monte Carlo without likelihoods. Proc Natl Acad Sci U S A 106: 16889. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37. Shannon CE (1948) A mathematical theory of communication. Bell System Technical Journal 27: 379–423, 623–656. [Google Scholar]
  • 38. Chaloner K, Verdinelli I (1995) Bayesian experimental design: A review. Statist Sci 10: 273–304. [Google Scholar]
  • 39. Clyde MA (2001) Experimental design: A Bayesian perspective. In: Smelser, editor. International Encyclopedia of the Social and Behavioral Sciences. New York: Elsevier Science.
  • 40. Elowitz M, Leibler S (2000) A synthetic oscillatory network of transcriptional regulators. Nature 403: 335–338. [DOI] [PubMed] [Google Scholar]
  • 41. Toni T, Welch D, Strelkowa N, Ipsen A, Stumpf M (2009) Approximate Bayesian computation scheme for parameter inference and model selection in dynamical systems. Journal of the Royal Society Interface 6: 187–202. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42. Hirata H, Yoshiura S, Ohtsuka T, Bessho Y, Harada T, et al. (2002) Oscillatory expression of the bHLH factor Hes1 regulated by a negative feedback loop. Science 298: 840–843. [DOI] [PubMed] [Google Scholar]
  • 43. Silk D, Kirk P, Barnes C, Toni T, Rose A, et al. (2011) Designing attractive models via automated identification of chaotic and oscillatory dynamical regimes. Nature Communications 2: 489. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44. Bazil JN, Buzzard GT, Rundell AE (2012) A global parallel model based design of experiments method to minimize model output uncertainty. Bull Math Biol 74: 688–716. [DOI] [PubMed] [Google Scholar]
  • 45. Fujita K, Toyoshima Y, Uda S, Ozaki Y, Kubota H, et al. (2010) Decoupling of receptor and downstream signals in the Akt pathway by its low-pass filter characteristics. Science Signaling 3: ra56. [DOI] [PubMed] [Google Scholar]
  • 46. Toyoshima Y, Kakuda H, Fujita K, Uda S, Kuroda S (2012) Sensitivity control through attenuation of signal transfer efficiency by negative regulation of cellular signalling. Nature Communications 3: 743. [DOI] [PubMed] [Google Scholar]
  • 47. Hengl S, Kreutz C, Timmer J, Maiwald T (2007) Data-based identifiability analysis of non-linear dynamical models. Bioinformatics 23: 2612–2618. [DOI] [PubMed] [Google Scholar]
  • 48. Apgar JF, Toettcher JE, Endy D, White FM, Tidor B (2008) Stimulus design for model selection and validation in cell signaling. PLoS Comp Biol 4: e30. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49. Barnes C, Silk D, Sheng X, Stumpf M (2011) Bayesian design of synthetic biological systems. Proc Natl Acad Sci U S A 108: 15190–15195. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50. Bazil J, Buzzard G, Rundell A (2012) A global parallel model based design of experiments method to minimize model output uncertainty. Bulletin of mathematical biology 74: 1–29. [DOI] [PubMed] [Google Scholar]
  • 51. Mélykúti B, August E, Papachristodoulou A, El-Samad H (2010) Discriminating between rival biochemical network models: three approaches to optimal experiment design. BMC Systems Biology 4: 38. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52. Sebastiani P, Wynn H (2002) Maximum entropy sampling and optimal bayesian experimental design. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 62: 145–157. [Google Scholar]
  • 53. Berg BA (2004) Markov Chain Monte Carlo Simulations and Their Statistical Analysis. World Scientific Publishing.
  • 54. Zhou Y, Liepe J, Sheng X, Stumpf M, Barnes C (2011) GPU accelerated biochemical network simulation. Bioinformatics 27: 874–876. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55. Pritchard JK, Rosenberg NA (1999) Use of unlinked genetic markers to detect population stratification in association studies. Am J Hum Genet 65: 220–228. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56. Beaumont MA, Zhang W, Balding DJ (2002) Approximate Bayesian computation in population genetics. Genetics 162: 2025–2035. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57. Liepe J, Barnes C, Cule E, Erguler K, Kirk P, et al. (2010) ABC-SysBio–approximate Bayesian computation in Python with GPU support. Bioinformatics 26: 1797–1799. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58. Paninski L (2003) Estimation of entropy and mutual information. Neural Computation 15: 1191–1253. [Google Scholar]
  • 59. Hausser J, Strimmer K (2009) Entropy inference and the James-Stein estimator, with application to nonlinear gene association networks. J Mach Learn Res 10: 1469–1484. [Google Scholar]
