PLOS Comput Biol. 2014 Jul 17;10(7):e1003522. doi: 10.1371/journal.pcbi.1003522

Poisson-Like Spiking in Circuits with Probabilistic Synapses

Rubén Moreno-Bote 1,2,*
Editor: Yasser Roudi
PMCID: PMC4102400  PMID: 25032705

Abstract

Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property of being Poisson-like over a broad range of firing rates, from virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex.

Author Summary

Neurons in cortex fire irregularly and in an irreproducible way under repeated presentations of an identical stimulus. Where does this spiking variability come from? One unexplored possibility is that cortical variability originates from the amplification of a particular type of noise that is present throughout cortex: synaptic failures. In this paper we show that probabilistic synapses are sufficient to produce cortical-like firing over several orders of magnitude in firing rate. Moreover, the resulting variability displays the property that the variance of the spike counts is proportional to the mean for every cell in the network, the so-called Poisson-like firing, a well-known property of sensory cortical responses. We finally argue that, far from being harmful, probabilistic synapses allow networks to sample neuronal states and sustain probabilistic population codes. Therefore, synaptic noise is not only a robust mechanism for the type of variability found in cortex, but it also provides cortical circuits with computational properties to perform probabilistic inference under noisy and ambiguous stimulation.

Introduction

Cortical neurons respond to repeated presentations of the same stimulus in a remarkably idiosyncratic way, and no two identical responses are ever observed [1], [2], [3], [4]. Although the spike count responses are on average reproducible (ibid.), they display high variability. It is well established that in evoked conditions the variance of the spike count over time windows of a few hundred milliseconds is closely proportional to the mean spike count, which in turn implies that the Fano factor –the variance to mean ratio– is approximately constant as a function of firing rate [2], [3], [5], [6], [7]. Approximate Fano factor constancy is not only found for a small range of evoked firing rates; rather, it holds over the whole observable dynamic firing range of cortical neurons, which spans from a few to hundreds of spikes per second [2], [5], [7], [8]. In addition, Fano factor constancy is not just a property of the distribution over a neuronal population: every single neuron in the population displays Fano factor constancy over its whole dynamical range [3], [7]. This single-cell, whole-dynamical-range property is referred to here as Poisson-like firing, in analogy to the Poisson process, whose Fano factor is rate-independent.

Theoretical neuronal network models often invoke a balance between excitatory and inhibitory inputs to describe the high spiking variability observed in cortex, a mechanism that leads to complex or chaotic firing behavior that is Poisson-like at low firing rates [3], [9], [10], [11]. These networks can also be adapted to display bistable dynamics to model working memory tasks, and it has been shown that they can generate Poisson-like firing during persistent activity even at moderately high firing rates [12], [13], [14], [15], [16], [17]. Relatively less attention, however, has been paid to studying the origin of Poisson-like variability over a full continuum (non-discrete) of firing rates ranging from a few to hundreds of spikes per second [18], as observed experimentally in sensory areas [3], [9], [10], [11]. As we show below, although balanced excitatory and inhibitory networks are well suited to generate Poisson-like firing at low rates, balanced networks fire with low variability as their firing rate increases continuously unless connectivity parameters are fine-tuned [12] or the inputs to the network are themselves Poisson-like [18]. Introducing Poisson-like inputs to obtain Poisson-like outputs is a valid solution to the problem of how variability is generated in cortex. However, this solution might seem unsatisfactory because it does not address the problem of how and where Poisson-like inputs originate in the first place. Moreover, the notion of Poisson-like inputs to sensory cortical areas lacks strong experimental support, because LGN spike train inputs to V1 display Fano factors decreasing by two-fold or more as a function of firing rate [19], [20]. In summary, although there is a solid understanding of how Poisson-like variability arises from the chaotic balanced dynamics of neuronal networks at low rates [3], [9], [10], [11], or in discrete high-rate persistent states [12], [13], [14], [15], the mechanisms underlying single-cell Poisson-like variability over a broad continuum of rates have not yet been elucidated.

Other sources of noise in neuronal networks that have so far been largely neglected might be responsible for cortical spiking variability, in particular at high firing regimes. A well-documented source of variability in the central nervous system is synaptic transmission failures [21], [22], [23]. Synaptic vesicle recovery and release have complex time-history dependences [24], [25], [26], [27], but at the finest level synaptic transmission is fundamentally probabilistic [21], [22]. In this paper, we show that amplification of synaptic noise generated by realistically small postsynaptic potentials through recurrent connections is sufficient to generate Poisson-like spiking over several orders of magnitude in firing rate. Other variability-inducing mechanisms, such as random synaptic delays or intrinsic spike generation jittering [28], [29], constitute negligible sources of spiking variability at high rates.

Results

A single neuron case

To understand under what conditions Poisson-like firing can be generated, we simulated a single leaky integrate-and-fire neuron with various types of input white noise. We first considered the case where the input variance is constant as the mean input rises (Fig. 1, dashed lines). In correspondence with the responses of sensory cells to stimuli of increasing intensity (e.g. contrast in V1 [5], [30]), boosting the mean input drive of the neuron increases its firing rate and mean membrane potential (Fig. 1a,d). Although the Fano factor is close to one at low input drive (Fig. 1b, dashed line), when the input drive rises above the threshold current $I_{th}$ (vertical line) the Fano factor drops to very small values. This in turn implies that the Fano factor decreases to low values at high rates (>50 Hz) (Fig. 1c, dashed line). At low rates the mean current is below the threshold current (sub-threshold regime) and spiking is induced by membrane potential fluctuations (Fig. 1e, light green trace), while at high rates the mean current is above the threshold current (supra-threshold regime) and spiking is mainly induced by voltage threshold crossings around the mean membrane potential trajectory (dark green trace), leading to a very regular spike train with a low Fano factor.

Figure 1. Poisson-like output firing requires Poisson-like inputs in a single neuron.


(a) Firing rate as a function of mean input drive with constant input noise (dashed lines) and Poisson-like input noise, $\sigma^2 \propto \mu$ (solid lines). (b,c) Fano factor as a function of mean input drive (b) and firing rate (c). The vertical red line indicates the threshold current $I_{th}$, defined as the minimal current that elicits firing in the absence of input noise. (d) Mean membrane potential as a function of mean input drive. (e) Membrane potential traces corresponding to the colored dots in the previous panels: low (light green) and high (dark green) firing rate with constant input noise, and low (light blue) and high (dark blue) firing rate with Poisson-like input noise. The low and high firing rate conditions were chosen such that the firing rates were comparable for the two input noise types.

As constant input noise was not able to keep the Fano factor high at high rates, we next considered the scenario where the input variance grows in proportion to the mean input current ($\sigma^2 \propto \mu$). This manipulation corresponds to the case where the inputs are Poisson-like, because a rate-independent Fano factor in the input spike trains implies proportionality between input variance and mean [31]. In this scenario, the output Fano factor remains approximately constant even at very high rates (>100 Hz) (Fig. 1c, solid line). As in the constant input noise case, at low rates spikes are induced by small membrane potential excursions around the mean membrane potential trajectory that sporadically reach the spiking threshold (Fig. 1e, light blue trace). However, unlike the constant input noise case, at high rates the membrane potential undergoes large fluctuations that cause bursts of spikes followed by silent periods (dark blue trace), leading to high spiking variability even at elevated firing rates.
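A minimal simulation sketch of this comparison is given below. It uses generic leaky integrate-and-fire parameters and noise amplitudes chosen for illustration only (they are not the values used for Fig. 1), and simply contrasts a fixed input variance with a variance that grows in proportion to the mean drive:

```python
import numpy as np

def simulate_lif(mu, sigma, T=200.0, dt=0.1, n_trials=200,
                 tau_m=20.0, v_rest=-70.0, v_th=-52.0, v_reset=-60.0):
    """Simulate a leaky integrate-and-fire neuron driven by white noise
    and return the spike count of each trial (time in ms, voltage in mV).
    All parameter values are illustrative placeholders."""
    counts = np.zeros(n_trials)
    for k in range(n_trials):
        v, n_spikes = v_rest, 0
        for _ in range(int(T / dt)):
            noise = sigma * np.sqrt(dt) * np.random.randn()
            v += dt * (-(v - v_rest) + mu) / tau_m + noise / tau_m
            if v >= v_th:
                v = v_reset
                n_spikes += 1
        counts[k] = n_spikes
    return counts

# Constant input noise vs. Poisson-like noise (variance grows with the drive).
for mu in (10.0, 20.0, 40.0):                        # mean drive above rest, mV
    const = simulate_lif(mu, sigma=10.0)             # constant noise amplitude
    poiss = simulate_lif(mu, sigma=np.sqrt(10.0 * mu))  # sigma^2 proportional to mu
    for name, c in (("constant", const), ("poisson-like", poiss)):
        rate = c.mean() / 0.2                        # spikes/s over a 200-ms window
        fano = c.var() / max(c.mean(), 1e-12)
        print(f"mu={mu:5.1f}  {name:12s}  rate={rate:6.1f} Hz  FF={fano:4.2f}")
```

With the constant noise the trial-to-trial count variability collapses once the drive becomes supra-threshold, whereas the drive-proportional noise keeps the Fano factor elevated, mirroring the dashed versus solid curves of Fig. 1c.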

The two scenarios described above are a priori possible in recurrent networks. If the Fano factor of the afferent spike trains to a neuron in the network decays with firing rate as $FF_{\rm in} \propto 1/\nu$ (as in the dashed line of Fig. 1c), the input noise becomes approximately constant because $\sigma^2 \propto FF_{\rm in}\,\nu \approx \mathrm{const}$ [31]. In this scenario we find that the Fano factor of the output spike train is $FF_{\rm out} \propto 1/\nu$, and therefore the Fano factor displays the same firing-rate scaling as that of the inputs. If, in contrast, the Fano factor of the afferent spike trains is constant with firing rate, $FF_{\rm in} \approx \mathrm{const}$, then the input noise becomes Poisson-like because $\sigma^2 \propto \nu$. In this scenario the Fano factor of the output spike train is approximately independent of firing rate, $FF_{\rm out} \approx \mathrm{const}$ (solid line in Fig. 1c), like the Fano factor of the inputs, and it again displays the same firing-rate scaling as that of the inputs. Therefore the two scenarios are potentially self-consistent, in the sense that the same type of variability that is introduced in the inputs is recovered in the outputs.

Breakdown of Fano factor constancy

We sought to determine what type of neuronal variability is self-consistent and stable in recurrent networks, that is, whether the Poisson-like input noise scenario or the constant input noise scenario described above is stable in a recurrent network. We simulated a balanced recurrent spiking network [9], [10], [32] that generated strong excitatory and inhibitory currents. We also stimulated the network with external inputs. The external inputs were designed to be non-Poisson-like because the central question is whether Poisson-like variability can be self-generated by neuronal networks when the external inputs are not in the same Poisson-like family (Fig. 2a, top). The results shown below correspond to networks with non-Poisson-like inputs modeled with constant variance to enforce the experimental constraint that the input Fano factor decreases with firing rate [19], [20].

Figure 2. Approximate Fano factor constancy with probabilistic synapses.


(a) Scheme of a balanced recurrent network with excitatory and inhibitory neurons driven by non-Poisson-like inputs. Bottom: the network is endowed with probabilistic synaptic transmission. The scheme shows how a presynaptic spike train generates stochastic currents on several postsynaptic neurons. (b) Mean firing rate for the excitatory (red) and inhibitory (green) populations for a network with probabilistic synapses and noiseless inputs (solid lines) and for a network without probabilistic noise and with constant input noise (dashed lines), as a function of the mean input drive. (c–d) Spike count variance and Fano factor as a function of firing rate. Open circles correspond to mean values, and black dots correspond to individual neurons. Line and color codes are as in panel b. (e) Raster plots of 20 randomly selected excitatory and inhibitory neurons for the high firing rate network corresponding to the point marked in blue in panels b–d. Center: sample traces of excitatory and inhibitory current leading to the net input current (black), magnified on the right. The yellow line corresponds to zero net current, and the blue trace shows the membrane potential of a randomly selected excitatory neuron. (f) Coefficient of variation of the ISIs, $CV$, as a function of the mean ISI. (g) Distribution of ISIs for the selected neuron. (h) Auto-correlogram (ACG) of the spike train of that neuron.

As the input drive increases, the mean firing rates (Fig. 2b, dashed lines) of the excitatory (red) and inhibitory (green) populations increase accordingly. At low firing rates the Fano factor is high (Fig. 2c,d, dashed lines), a standard property of neuronal networks in the balanced regime [9], [10], [32]. However, the Fano factor drops monotonically to very low values as the mean population rate increases (above 50 spikes per second). As long as the network holds a single stable state (see Methods), the breakdown of approximate Fano factor constancy at high rates occurs regardless of the connectivity matrix of the network, including sparsely, densely and fully connected networks (see below); it also occurs regardless of the overall connection strength, that is, the type of synaptic strength scaling used as the network becomes large, and regardless of the intensity of the constant noise. If the network is multi-stable, transitions between different states can exist, but conditioned on each state the Fano factor is very low. These results are shown analytically for an even broader family of spiking neuronal networks in the Methods (see eqs. (11) and (22)). In particular, for sparse and randomly connected balanced networks the dynamics displays elevated Fano factors at low rates [9], [10], [33], but at high rates (>50 spikes per second) the Fano factors fall off (Fig. 3a). This is because the neurons in the network enter the supra-threshold regime, in which firing is mainly induced by mean membrane potential threshold crossings; as a result, variability becomes progressively lower as the firing rate increases (Fig. 3a). The same results hold when the network is randomly but densely connected and when the network is fully connected (see Fig. 2d, dashed lines). In addition, if the value of the reset membrane potential is raised, variability increases at low rates, but the Fano factor does not remain steady at high rates (Fig. 3b). In conclusion, the previous analysis shows that under a broad range of situations and network designs, the constant input noise scenario (corresponding to a Fano factor that decreases monotonically with firing rate) is the only stable scenario in recurrent networks.

Figure 3. Sparse connectivity, high reset voltage or deterministic STD does not necessarily produce Poisson-like firing.


(a) Sparse and randomly connected networks display low spiking variability at high rates. (b) Raising the reset membrane potential of the neurons increases the Fano factor at low rates but does not generate Poisson-like firing over a broad range of firing rates. (c) Networks with deterministic STD fail to generate Poisson-like variability at high firing rates. (d) Networks with random spike jittering display low firing variability at high rates. (e) Exact analytical predictions for networks with probabilistic synapses without STD (blue lines) for the firing rate (left) and the Fano factor of the spike counts (right). Red and green points correspond to simulation results for excitatory and inhibitory neurons, respectively. Blue solid lines correspond to theoretical predictions.

Poisson-like variability in networks with probabilistic synapses

The previous results suggest that, at least at high rates, additional sources of variability are required to account for Poisson-like spiking. A well-known source of noise in cortex is probabilistic synaptic transmission. Neurotransmitter release at a synapse upon arrival of an action potential is fundamentally stochastic [21], [22] and thus results in spiking variability (Fig. 2a, bottom). However, it is not obvious that this source of noise can account for most of the spiking variability observed in cortex. The average number of contacts that a cortical neuron makes on a postsynaptic target is 2–6 [22] and synaptic release is independent across contacts. Therefore, synaptic noise could be mostly averaged out, leaving very little room for a contribution to spiking variability. Whether strong amplification of synaptic noise can be achieved with realistic neurophysiological parameters and whether probabilistic synapses can give rise to Poisson-like variability is unknown.

We studied a balanced recurrent network with probabilistic synapses where the probability that an action potential generates a post-synaptic current undergoes stochastic short-term depression (STD) (see Methods). The network can generate high spiking variability over its full dynamical range when the connections are sufficiently strong, even when the external input to the network is noiseless (Fig. 2c, solid lines; see also Fig. 2e, left panel). In the network, a presynaptic spike caused postsynaptic potentials between 0.2 and 1 mV on average, within the neurophysiological range [34], [35]. Therefore, weak noise that is independent across synaptic contacts can be amplified by strong synapses to generate large fluctuations at the spiking level. The network was not only able to generate high variability, but the Fano factor was approximately constant over at least two orders of magnitude in firing rate (Fig. 2d, solid lines). Importantly, the Fano factor was not only constant on average over the population (open circles), but also individually for each neuron (black dots) as a function of firing rate.

The neurons' Fano factors increase with the strength of the recurrent connections, but in all cases they remain constant at high rates (shown analytically for general neuronal networks in the Methods). The Fano factor was high and sustained over a broad region of synaptic strength scaling factors and input drives (Fig. 4, top panel), but this region vanished at moderately high input drives when the network lacked probabilistic synapses (lower panel). These results hold when synapses display STD dynamics, as long as synaptic transmission does not saturate over a very broad range of firing rates [23], [36] (see Methods). When STD is modeled without stochastic release [26], [27], the Fano factor decreases monotonically to very low values at high rates (Fig. 3c). Although a high reset potential and STD are required in some models of delayed persistent activity to generate highly variable binary attractor states [13], these mechanisms do not guarantee high variability over a broad continuum of rates, as shown above (Fig. 3b,c). Finally, Poisson-like variability also holds for a stochastic model of synaptic transmission without STD (Fig. 3e; see eqs. (11) and (22) in Methods). In summary, the probabilistic nature of synaptic transmission is sufficient to robustly generate Poisson-like firing at high rates.
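The following sketch illustrates the kind of simulation behind these results. It is not the paper's model: it uses a small, fully connected E/I network of leaky integrate-and-fire neurons with arbitrary illustrative parameters, models probabilistic transmission as an independent Bernoulli draw per contact and per spike (the simplified, STD-free synapse described in the Methods), and, for brevity, the comparison network simply removes the transmission noise rather than adding constant-variance external noise as in Fig. 2:

```python
import numpy as np

def run_network(mu_ext, prob_synapses=True, p_release=0.5,
                N_E=160, N_I=40, T=10000.0, dt=0.5, tau_m=20.0,
                theta=20.0, v_reset=0.0, g=10.0, rng=None):
    """Sketch of a fully connected balanced E/I LIF network in which each
    presynaptic spike is transmitted to each target with probability
    p_release (Bernoulli synaptic failures). Returns the mean rate and the
    median spike-count Fano factor (100-ms windows). Parameters are
    illustrative only."""
    rng = rng or np.random.default_rng(0)
    N = N_E + N_I
    J = np.zeros((N, N))
    J[:, :N_E] = g * 0.2 / np.sqrt(N_E)           # excitatory weights
    J[:, N_E:] = -g * 0.8 / np.sqrt(N_I)          # inhibitory weights
    np.fill_diagonal(J, 0.0)
    if not prob_synapses:
        J = J * p_release                         # same mean drive, no failures
    v = rng.uniform(v_reset, theta, N)
    n_steps = int(T / dt)
    win = int(100.0 / dt)                         # 100-ms counting windows
    counts = np.zeros((n_steps // win, N))
    for t in range(n_steps):
        spiked = v >= theta
        v[spiked] = v_reset
        counts[min(t // win, counts.shape[0] - 1)] += spiked
        if prob_synapses:                         # Bernoulli draw per contact
            transmit = rng.random((N, N)) < p_release
            rec_input = (J * transmit) @ spiked
        else:
            rec_input = J @ spiked
        v += dt * (-v + mu_ext) / tau_m + rec_input
    rates = counts.mean(0) / 0.1                  # spikes/s
    fano = counts.var(0) / np.maximum(counts.mean(0), 1e-12)
    return rates.mean(), np.median(fano)

for mu in (25.0, 50.0, 100.0):
    for prob in (True, False):
        r, ff = run_network(mu, prob_synapses=prob)
        print(f"drive={mu:6.1f}  prob_synapses={prob!s:5s}  "
              f"rate={r:6.1f} Hz  median FF={ff:4.2f}")
```

With transmission failures enabled, the spike-count Fano factors stay elevated as the drive (and hence the rate) increases, whereas the failure-free comparison network fires almost deterministically at high drive.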

Figure 4. Poisson-like variability from probabilistic synapses does not require fine-tuning of the parameters.


The plots display the iso-Fano factor lines on the synaptic scaling factor g vs. input drive plane for a network with (top) and without (bottom) probabilistic synapses. The region for which the Fano factor is high and sustained (shaded area) is broad for a network with probabilistic synapses, but this region vanishes at moderately high rates for a network without probabilistic synapses. Network parameters are as in Fig. 2. The shaded areas are defined as the areas of the planes with Fano factors lying between 0.8 and 1.2.

At elevated rates (blue dots in Fig. 2b–d), as is the case at low rates, the network generates strong excitatory and inhibitory currents (Fig. 2e, red and green traces in the middle panel, respectively) that approximately cancel, leading to a balanced net input current (black trace) that wanders around zero (yellow line). However, the net input current is on average above zero and close to the threshold current. For the rightmost point on the solid lines of Fig. 2b–d, the mean current is supra-threshold. As shown for single neurons (see Fig. 1), supra-threshold currents or currents around threshold generate low Fano factors unless the input noise is Poisson-like. Here, at elevated rates the net current is close to or within the supra-threshold regime, and therefore the network must have spontaneously generated a current that is in the Poisson-like family (see next section).

Further support for our theory arises from the puzzling dependences of other statistical measures of variability on firing rate. It is well established that the coefficient of variation ($CV$) of the inter-spike-intervals (ISIs) of cortical neurons (the s.d. to mean ratio) decreases at high firing rates [3], [4]. The rate dependence of the $CV$ seems to be at odds with Fano factor constancy at the same high firing rates. In fact, if spike trains were renewal processes, one would expect the Fano factor to equal $CV^2$. Although renewal point processes with absolute refractory periods could explain the drop of the $CV$ at very high rates [4], they cannot explain why the Fano factor does not decay at high rates in the same way. In our recurrent networks with probabilistic synapses, the $CV$ increases with the mean ISI (Fig. 2f), implying that it decreases as a function of firing rate. This is so even though the Fano factor remains approximately constant over the whole range of firing rates, particularly at high rates (see Fig. 2d). The reason for this behavior is that the network dynamics generates temporal correlations that make the spike trains non-renewal, with experimentally consistent ISI distributions and auto-correlation functions (Fig. 2g,h). Therefore, a network effect that cannot be understood at the single-neuron level simultaneously gives rise to approximate Fano factor constancy and to the drop of the $CV$ at high rates (equivalently, at short mean ISIs). Finally, although no fit of the experimental data was performed, the dependence of the $CV$ on the mean ISI followed well the values and the mean-ISI dependence previously reported [4], with values close to one above mean ISIs of 30 ms and a reduction in variability down to around 0.6 at mean ISIs shorter than 20 ms.
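The distinction between the two measures can be illustrated with synthetic spike trains. The sketch below is not derived from the network model (the gamma orders and rate-modulation statistics are arbitrary choices); it compares a renewal train, for which the Fano factor approaches $CV^2$, with a non-renewal train whose rate is slowly modulated, for which the ISI $CV$ stays low while the spike-count Fano factor remains close to one:

```python
import numpy as np
rng = np.random.default_rng(1)

def ff_and_cv(spike_times, t_max, win=0.1):
    """Fano factor of spike counts in fixed windows and CV of the ISIs."""
    edges = np.arange(0.0, t_max + win, win)
    counts, _ = np.histogram(spike_times, edges)
    isis = np.diff(spike_times)
    return counts.var() / counts.mean(), isis.std() / isis.mean()

t_max = 200.0
# Renewal train: gamma-distributed ISIs (order 4) at ~100 Hz -> FF ~ CV^2 < 1.
isis = rng.gamma(shape=4.0, scale=1.0 / (4 * 100.0), size=30000)
renewal = np.cumsum(isis)
renewal = renewal[renewal < t_max]

# Non-renewal train: regular (gamma) firing whose rate varies across 1-s
# epochs -> low ISI CV, but spike-count variance tracks the mean (FF ~ 1).
spikes, t0 = [], 0.0
while t0 < t_max:
    rate = rng.gamma(shape=11.0, scale=9.0)        # slowly varying rate ~100 Hz
    ep_isis = rng.gamma(shape=4.0, scale=1.0 / (4 * rate), size=int(2 * rate))
    times = t0 + np.cumsum(ep_isis)
    spikes.append(times[times < t0 + 1.0])
    t0 += 1.0
nonrenewal = np.concatenate(spikes)

for name, train in (("renewal", renewal), ("non-renewal", nonrenewal)):
    ff, cv = ff_and_cv(train, t_max)
    print(f"{name:12s}  CV={cv:4.2f}  CV^2={cv*cv:4.2f}  FF={ff:4.2f}")
```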

Mechanism for Poisson-like variability

To understand how Poisson-like firing arises from networks with probabilistic synapses, we used a simplified network model where the transmission probability is time-independent (Fig. 5a; see Methods). A neuron in the network (pre) receives a barrage of spikes per presynaptic neuron with spike count $n_{\rm in}$ and variance $\sigma^2_{\rm in}$, and generates an output spike train whose spike count $n$ has variance $\sigma^2_n$. This spike train in turn evokes post-synaptic currents (PSCs) with variance $\sigma^2_{\rm out}$ on postsynaptic cells (post). The evoked PSCs are replicas of the same presynaptic spike train that have been diluted by a fraction $p$, corresponding to the probability of synaptic transmission.

Figure 5. The mechanism for Poisson-like variability in a network with probabilistic synapses.


(a) Scheme of the transformation between the input variance $\sigma^2_{\rm in}$ of the spike counts of the presynaptic spike trains and the output variance $\sigma^2_{\rm out}$ of the post-synaptic currents in an open-loop network with probabilistic synapses. (b) Precise balancing of two competing forces in a closed-loop network: the integration step tends to lower spiking variance, while the probabilistic synaptic step increases spiking variability. (c) The output variance (solid red line) and the variance of the spike train, $\sigma^2_n$ (dashed), increase linearly as a function of input variance for fixed input firing rates. The solid line is vertically shifted with respect to the dashed line due to the increase of variance by probabilistic synapses, which is uniform across input variances. The equilibrium point of the network (red point) corresponds to the state where the input and output variances match. (d) The equilibrium point moves linearly with firing rate because the vertical shift induced by probabilistic synapses increases linearly with rate. (e) The spike count variance increases linearly with rate, leading to Fano factor constancy.

Two competing forces affect the variability of the spike trains and of the series of PSCs (Fig. 5b). The first is the integration step of the neuron, which tends to lower the variance; the second is the probabilistic synaptic step, which increases it. These two forces have to cancel out precisely when the network reaches equilibrium, because at equilibrium $\sigma^2_{\rm in}$ should equal $\sigma^2_{\rm out}$. More precisely, it can be shown (see Methods) that $\sigma^2_n$ is proportional to the input variance $\sigma^2_{\rm in}$ at fixed input firing rate (dashed red line, Fig. 5c), and that the effect of probabilistic synapses is to shift this line upwards regardless of the value of the input variance (solid red line). The point at which the input and output variances are the same (red dot) corresponds to the equilibrium state of the network. The crucial question is how this equilibrium point depends on the rate of the network. If the firing rate of the network increases, the crossing point moves to higher values linearly with the population rate (Fig. 5d). Because the spike count variance is proportional to the output variance at equilibrium (see dashed line in Fig. 5c), the spike count variance increases linearly with the population rate (Fig. 5e). Therefore, the ratio between variance and mean of the spike count is constant in this network, leading to Fano factor constancy. The same Poisson-like generation mechanism takes place in more biophysically realistic networks (Fig. 2), and holds exactly for networks of spiking neurons with probabilistic synapses with constant transmission probability (Fig. 3e; see Methods).
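A toy numerical illustration of this fixed-point argument is given below. It abstracts the two steps of Fig. 5b into a linear map in which the integration step shrinks the variance by a gain $a < 1$ and the probabilistic synaptic step adds a term proportional to the rate; the gains $a$ and $b$ are arbitrary illustrative numbers, not quantities derived from the network model:

```python
import numpy as np

def variance_fixed_point(rate, a=0.6, b=0.4, n_iter=50):
    """Iterate the closed-loop variance map sigma2 -> a*sigma2 + b*rate.
    'a' is the contracting gain of the integration step; 'b*rate' is the
    rate-proportional variance injected by probabilistic synapses."""
    sigma2 = 0.0
    for _ in range(n_iter):
        sigma2 = a * sigma2 + b * rate
    return sigma2            # converges to b * rate / (1 - a)

for rate in (5.0, 20.0, 80.0, 320.0):
    var = variance_fixed_point(rate)
    print(f"rate={rate:6.1f}  equilibrium variance={var:8.2f}  "
          f"variance/rate={var / rate:4.2f}")   # constant ratio -> constant FF
```

Because the injected term grows linearly with the rate while the contraction factor does not, the equilibrium variance is proportional to the rate, which is precisely the Fano factor constancy sketched in Fig. 5d,e.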

This mathematical exercise (see details in Methods) shows that the presence of both excitation and inhibition is not strictly necessary for Poisson-like variability, since it is possible to obtain high sustained variability in large networks with pure excitation and sufficiently weak synapses. However, although balancing strong excitation with inhibition is not required per se for Poisson-like variability at moderately high rates, the presence of both excitation and inhibition is required to avoid runaway excitation in networks with more realistically strong excitatory synapses [9], [37] (see Fig. 2).

Finally, other types of noise, such as random synaptic delays [1] with arbitrary distributions or ion-channel noise that randomly jitters the timing of the evoked action potentials [38], [39], do not lead to Poisson-like variability (Fig. 3d; see also Methods). Although random synaptic delays, a broad static distribution of synaptic delays [40], or spike jittering can improve the stability of Poisson-like variability, these types of noise are not sufficiently amplified by recurrent neuronal networks at high rates, and therefore they constitute a negligible source of noise.

Membrane potential and synaptic conductance fluctuations

Because probabilistic synapses introduce multiplicative noise at the synaptic level, the membrane potential and synaptic conductances of neurons must show characteristic statistical signatures. We studied these statistical properties in recurrent networks of conductance-based spiking neurons in the high-conductance regime [11], [41]. We found that the standard deviation of the membrane potential is approximately constant as a function of firing rate (Fig. 6a) for networks both with (full line) and without (dashed) probabilistic synapses, consistent with experimental observations [5], [30]. The fact that the constancy of the standard deviation of the membrane potential naturally arises in neuronal networks can be used as a justification for Gaussian rectification models of single-cell spiking variability, in which the standard deviation is assumed to be constant with firing rate [5]. Both excitatory and inhibitory synaptic conductances increased linearly with rate (Fig. 6b). Interestingly, the Fano factor (FF, variance to mean ratio) of the conductances was approximately constant as a function of firing rate for networks with probabilistic synapses, but it was much smaller and decreased rapidly with firing rate for networks without probabilistic synapses (Fig. 6c). These results show that the size constancy of the membrane potential fluctuations arises as a result of the shunting effect of the mean conductances on the conductance fluctuations [11], [41], [42], while the FF constancy of the synaptic conductances is a natural consequence of the multiplicative noise introduced by probabilistic synapses. The FF constancy of synaptic conductances is an experimentally testable prediction of our theory.

Figure 6. Theoretical predictions: Fano factor constancy of synaptic conductances.


(a) The standard deviation of the membrane potential is approximately constant as a function of firing rate for networks with (full line) and without (dashed) probabilistic synapses. (b) The mean excitatory (red) and inhibitory (green) conductances increase linearly with firing rate. (c) The Fano factor of the synaptic conductances (FF, variance to mean ratio) for a network with probabilistic synapses is constant as a function of the firing rate (full lines), indicating that the variance of the conductance is proportional to the mean conductance. The FF of the synaptic conductances for a network without probabilistic synapses is lower than in the previous case and strongly decreases with firing rate (dashed lines). For all panels, open circles correspond to mean values and black dots correspond to sampled neurons. Error bars represent s.e.m.

Discussion

We have shown that spiking networks endowed with probabilistic synapses lead to Poisson-like variability over several orders of magnitude in firing rate, in line with extensive experimental observations in sensory areas [2], [3], [4], [6], [7]. Poisson-like spiking variability naturally arises from the multiplicative nature of synaptic noise and its amplification through strong recurrent connections, and it does not require fine-tuning of the network parameters. The multiplicative noise implies that the size of the membrane potential fluctuations is relatively constant with firing rate, while the size of the synaptic conductance fluctuations grows in proportion to their means.

Other sources of variability could also contribute to evoked cortical spiking variability. Experimentally uncontrolled external variables might artificially introduce spiking variability that is not a property of the system per se. However, even when eye movements are controlled [6] or the eyes are paralyzed [43], or when the statistical properties of the stimulus are fixed [44], cortical neurons still respond with high, Poisson-like variability at all recorded firing rates. Photon noise and intrinsic receptor noise can partly explain cortical spiking variability [45], but at high stimulus intensities this variability is unlikely to represent a major contribution. Internal variables such as attention and arousal might also be at play, but even when they are controlled experimentally, spiking responses are still highly variable [46], [47]. Therefore, the hypothesis that variability is intrinsically generated by neuronal networks with probabilistic synapses is favored over less specific alternatives, in view of the very little explanatory power that uncontrolled external or internal variables have on the type of spiking variability observed in cortex.

At the mechanistic level, a balance between strong excitatory and inhibitory inputs that sets the membrane potential below threshold has become the prevalent model for high cortical spiking variability [3], [9], [10]. Experimental evidence supports the view that cortical networks are in the balanced regime [48]. In evoked conditions, sensory stimulation drives individual cortical neurons to a state where the mean membrane potential increases with contrast and the firing rate is high [5], [30]. As shown above (see Fig. 1), in this condition Fano factors are low even in the balanced regime unless the input spike trains are themselves Poisson-like, raising the question of how input Poisson-like variability is generated in the first place and whether this type of input is realistic. Precise cancellation of the input currents can potentially clamp the membrane potential to a value below threshold for a broad range of firing rates, but this exquisite cancellation requires fine-tuning of the network parameters for very large networks [12]. It has also been suggested that to produce in vivo high spiking variability, presynaptic spikes need to be synchronous [49], but it was unknown how the large input variability caused by synchrony can be generated in recurrent networks. Previous models have also explored the role of synaptic noise in neuronal computations [50], [51], [52], [53], in up-down state transitions [54] and in the spiking variability of single cells or pairs of cells [50], [55], [56], [57], but the role of probabilistic synapses in Poisson-like variability in large recurrent networks, or over a broad continuum of firing rates, had not been studied. As we have shown here, probabilistic synapses generate multiplicative noise that is amplified by recurrent connections without fine-tuning of the network parameters. This mechanism provides the large multiplicative input fluctuations that guarantee Poisson-like spiking over several orders of magnitude in firing rate.

Probabilistic synapses also have the potential to explain at the mechanistic level the origin of high spiking variability in a much broader context than the one we have considered here. High activity states during delayed persistent activity in working memory tasks are characterized by high spiking variability [16], [17]. Although bistable attractor networks have been shown to display high variability both at spontaneous and at moderately high rate persistent activity states [12], [13], [14], [15], the contribution of probabilistic synapses to spiking variability in these networks has not been studied. We have shown that probabilistic synapses stabilize Poisson-like firing over a broad continuum of firing rates in single-attractor networks because they introduce multiplicative noise. Clearly, probabilistic synapses also have the potential to account by themselves for the high variability observed during persistent activity in working memory tasks. Therefore, in future studies it will be important to elucidate the role of probabilistic synapses in the spiking variability and stability of working memory states in bistable attractor networks.

It has been recently shown that stimulus onset reduces the average Fano factor across a broad variety of cortical areas and conditions [58], a reduction that is specific to the transition from spontaneous to evoked activity. This reduced variability has been hypothesized to arise from the redirection of the system to a particular state configuration during stimulation [58]. It is important to realize, however, that despite the reduction of variability relative to spontaneous activity, the responses in evoked conditions are still highly variable and the Fano factor is approximately constant with the neuron's firing rate, as shown by many previous studies [2], [3], [4], [6], [7]. Simulated neuronal networks based on balanced inputs with weak multi-attractor states can account for the finding that variability is reduced at stimulus onset [12], [33], [59], [60], but they leave unanswered why the Fano factor remains approximately constant over a broad range of firing rates in evoked conditions. In general, as in the specific networks studied in those works, increasing the input will eventually drive neurons into the supra-threshold regime, where firing is due to quasi-deterministic membrane potential threshold crossings and Fano factors decrease with increasing firing rate (see Fig. 1b). As we have demonstrated, balanced neuronal networks with probabilistic synapses can generate Fano factor constancy over a wide range of firing rates in evoked conditions, even in the supra-threshold regime, because synaptic noise is multiplicatively scaled up with firing rate.

Finally, injecting noise into the brain with probabilistic synapses might seem harmful at first glance. It can therefore appear that we have presented a “solution” to the Poisson-like variability problem only to “create” a new one: boosting neuronal variability. However, noisy systems can have an advantage over deterministic systems in detecting sub-threshold stimuli [61], learning more quickly [62], and displaying larger memory capacity [63]. Injecting noise through probabilistic synapses is particularly relevant in view of the new computational capabilities that neuronal networks with Poisson-like firing acquire, allowing neuronal codes to be in the appropriate format to perform optimal cue combination [64] and to sample cortical states over the whole dynamical range [65]. Therefore, synaptic noise is not only a robust and sufficient mechanism for the type of variability found in cortex, but it can also provide cortical circuits with computational tools to perform probabilistic inference under noisy and ambiguous conditions.

Methods

Spiking network with probabilistic synapses

We consider a network of leaky integrate-and-fire (LIF) neurons with $N$ cells, $N_E$ of which are excitatory and $N_I$ of which are inhibitory [10], [66], [67], [68]. The membrane potential of neuron $i$ below the spiking threshold obeys

$C_m \frac{dV_i(t)}{dt} = -g_L \left( V_i(t) - V_L \right) + I_i(t)$   (1)

where $C_m$ is the membrane capacitance, $g_L$ is the passive leak conductance and $V_L$ is the resting potential. The membrane time constant of the neuron is defined as $\tau_m = C_m / g_L$. The neuron emits a spike when the membrane potential reaches the threshold $V_{th}$, after which the potential is reset to $V_{reset}$. The total synaptic current $I_i(t)$ delivered to the neuron is

$I_i(t) = \sum_{j \neq i} \sum_k J_{ij,k} \sum_m s_{ij,k}(t_{j,m}) \, f(t - t_{j,m} - \Delta) + \mu_i + \sigma_i \, \xi_i(t)$   (2)

where the first term corresponds to the currents generated by other cells in the network, while the last two terms correspond to the current generated by sources external to the network. $J_{ij,k}$ is the connectivity strength of contact $k$ between the presynaptic neuron $j$ and the postsynaptic neuron $i$. We typically consider 2–6 contacts per pair of connected neurons. Auto-synapses are not included in the network, i.e. $J_{ii,k} = 0$ for all $i$. The sum over $m$ runs over the spike times of each presynaptic cell $j$, denoted $t_{j,m}$, $m = 1, 2, \ldots$. Each spike from neuron $j$ can potentially generate a stereotyped current after a delay $\Delta$ on the postsynaptic cell, proportional to the synaptic kernel $f(t)$, such that

$\int_0^{\infty} f(t) \, dt = 1$   (3)

With this choice, the total charge injected into the neuron due to a presynaptic spike at time $t_{j,m}$ is determined by $J_{ij,k} \, s_{ij,k}(t_{j,m})$, where $s_{ij,k}(t_{j,m})$ is the synaptic variable that specifies the amount of neurotransmitter released at contact $k$ between postsynaptic neuron $i$ and presynaptic neuron $j$ at the time of the presynaptic spike $t_{j,m}$. The dynamics of the synaptic variables is described at the end of this section. As the synaptic kernel, we choose

$f(t) = \frac{1}{\tau_s} \, e^{-t/\tau_s}$ for $t \ge 0$, and $f(t) = 0$ otherwise,   (4)

where $\tau_s$ is the synaptic decay time constant of the postsynaptic current, which can be excitatory, with time constant $\tau_E$, or inhibitory, with time constant $\tau_I$. The external current consists of a deterministic component (mean drive) $\mu_i$ and a white noise process $\sigma_i \, \xi_i(t)$ with variance $\sigma_i^2$. Here $\xi_i(t)$ is a white noise process with zero mean and unit variance, independent across neurons, i.e. $\langle \xi_i(t) \rangle = 0$ and $\langle \xi_i(t) \, \xi_j(t') \rangle = \delta_{ij} \, \delta(t - t')$, where $\delta_{ij}$ is the Kronecker delta and $\delta(t)$ is the Dirac delta function.

We endowed the synapses with a probabilistic transmission model in which a synapse successfully evokes a postsynaptic current with a fixed probability upon presynaptic spike arrival if a vesicle is ready to be released, and replenishment of the vesicle is stochastic, with an exponential distribution over time [50]. This model is based on deterministic models of short-term depression (STD) in vitro [26], [27]. We further modified the deterministic in vitro STD models to incorporate the lack of depression observed at high rates in vivo [23], [25], [36] and in some neuronal populations in vitro [69], as follows: upon vesicle release, a new vesicle is immediately ready to be released with the same probability, but with a lower neurotransmitter load. This model effectively creates a lower bound on synaptic efficacy, allowing for non-saturating responses at high firing rates. Other biophysical implementations of non-saturation and stochastic release at high rates (∼100–200 Hz) are also possible (such as, simply, very fast vesicle replenishment times), but the results of our work do not depend on the particularities of this implementation. Neuronal networks with STD models lacking the experimentally motivated non-saturating synapses cannot fire above ∼50 Hz, due to synaptic exhaustion with standard in vitro parameters, a firing rate range in which Poisson-like variability is commonly observed in sensory areas [2], [4], [8]. Full details of the model are given next.

We first specify the dynamics of the synaptic variable $s_{ij,k}(t_{j,m})$, defined as the amount of neurotransmitter released at each synaptic contact $k$ between the postsynaptic neuron $i$ and the presynaptic neuron $j$ at the arrival of the $m$-th spike from neuron $j$, at time $t_{j,m}$. Associated with this variable, there is a neurotransmitter availability variable $x_{ij,k}(t)$ that specifies how much neurotransmitter is ready for release at any time $t$.

The stochastic model of synaptic transmission at each synaptic contact is as follows, independently across contacts: (1) upon arrival of a spike at time $t_{j,m}$, a vesicle from a readily releasable pool fuses with the membrane and releases its content $x_{ij,k}(t_{j,m})$ with probability $p$. If neurotransmitter is released, the synaptic variable equals the amount of neurotransmitter released by the vesicle, $s_{ij,k}(t_{j,m}) = x_{ij,k}(t_{j,m})$, and $s_{ij,k}(t_{j,m}) = 0$ otherwise. (2) Immediately after release, a vesicle from a readily releasable pool with a low neurotransmitter load becomes available, carrying an amount of neurotransmitter $x_{ij,k} = x_{\min}$, with $x_{\min} < 1$. (3) The time it takes for this vesicle to be replaced by a vesicle with a full neurotransmitter load, $x_{ij,k} = 1$, is a random variable following an exponential distribution with mean $\tau_{rec}$.

With the above choice of the maximum value $x_{\max} = 1$, the synaptic strength at each contact is quantified by $J_{ij,k}$. The dynamics of the probabilistic synapses was simulated as follows: for each synaptic contact that has been partially depleted to the value $x_{\min}$, a random, exponentially distributed time is generated as $\Delta t = -\tau_{rec} \ln u$, where $u$ is uniformly distributed in the interval [0,1]. Once this time has elapsed, the synaptic contact is replenished to its maximum value $x_{\max} = 1$. When $x_{\min} > 0$, synaptic transmission is permitted immediately after a successful transmission; however, the amount of neurotransmitter that can be immediately released is smaller than the maximum allowed value. The choice of $x_{\min}$ also ensures that the currents generated by the network do not saturate below 20–50 Hz.
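A minimal sketch of this release rule for a single contact is given below. The parameter values (p, x_min, tau_rec) are placeholders for illustration, not the values used in the simulations:

```python
import numpy as np

class ProbabilisticSynapse:
    """One synaptic contact with stochastic release and stochastic,
    non-saturating short-term depression, following the rules above.
    p, x_min and tau_rec are illustrative free parameters."""
    def __init__(self, p=0.5, x_min=0.4, tau_rec=200.0, rng=None):
        self.p, self.x_min, self.tau_rec = p, x_min, tau_rec
        self.rng = rng or np.random.default_rng()
        self.x = 1.0                  # neurotransmitter available for release
        self.t_replenish = -np.inf    # time at which x returns to 1

    def presynaptic_spike(self, t):
        """Return the synaptic variable s for a spike arriving at time t (ms)."""
        if t >= self.t_replenish:
            self.x = 1.0              # full-load vesicle has been recovered
        if self.rng.random() < self.p:
            s = self.x                # successful release of the available load
            self.x = self.x_min       # low-load vesicle becomes available at once
            self.t_replenish = t - self.tau_rec * np.log(self.rng.random())
            return s
        return 0.0                    # transmission failure

# Drive one contact with a regular 100-Hz presynaptic train.
syn = ProbabilisticSynapse(rng=np.random.default_rng(2))
spike_times = np.arange(0.0, 1000.0, 10.0)          # ms
released = np.array([syn.presynaptic_spike(t) for t in spike_times])
print("fraction of successful releases:", (released > 0).mean())
print("mean released load on success:  ", released[released > 0].mean())
```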

Network of non-leaky integrate-and-fire neurons with probabilistic synapses

We start by describing a recurrent network with non-leaky integrate-and-fire neurons (nLIF) and probabilistic synapses. In an nLIF neuron the leak term (see eq. (1)) has been dropped and the voltage obeys

$\frac{dV_i(t)}{dt} = I_i(t)$   (5)

For simplicity, in this expression we have taken the membrane capacitance $C_m = 1$. We also normalize the spiking threshold $\theta_i$ such that the reset membrane potential is defined to be at zero. The nLIF neuron is an excellent approximation to a LIF neuron when inputs are strong and the firing rate is high, precisely the situation where Poisson-like firing breaks down in LIF networks. Therefore, showing that networks of nLIF neurons with probabilistic synapses give rise to Poisson-like firing implies that the same property holds for LIF networks with probabilistic synapses. In the main text we show that the qualitative results derived for networks of nLIF neurons also apply to networks of LIF neurons.

The total synaptic current $I_i(t)$ delivered to the neuron is

$I_i(t) = \sum_{j \neq i} \sum_k J_{ij,k} \sum_m s_{ij,k}(t_{j,m}) \, f(t - t_{j,m} - \Delta) + \mu_i + \sigma_i \, \xi_i(t)$   (6)

identical to eq. (2), except that we use a simplified model of the probabilistic synapse. Specifically, each synaptic variable $s_{ij,k}$ in eq. (6) becomes one with probability $p$ upon spike arrival, and is zero otherwise, independently across contacts and time. Therefore, in this model the temporal dynamics of the synapses is neglected, but the probabilistic nature of synaptic transmission is preserved. The theory presented below is valid for any synaptic kernel with the properties described in eq. (3). It is worth emphasizing that neglecting the temporal dynamics of the synapses modifies the precise values of the steady-state firing rates and Fano factors, but the qualitative results on Poisson-like variability naturally extend to the more realistic case with synaptic dynamics. In the next section we compute exactly the mean and the covariance of the spike counts across neurons in the network, which are required to show that an nLIF neuronal network with probabilistic synapses displays exactly Poisson-like variability.

First, we rewrite eqs. (5) and (6) in a more convenient form that highlights the effect of membrane potential resetting. Since the effect of a spike emitted by neuron $i$ is to decrease its membrane potential instantaneously from the threshold to the reset value, eq. (5) can be expressed as

$\frac{dV_i(t)}{dt} = I_i(t) - \theta_i \sum_m \delta(t - t_{i,m})$   (7)

where $t_{i,m}$ denotes the spike times of neuron $i$. Eqs. (5) and (6) can be rewritten in matrix notation as

$\frac{d\mathbf{V}(t)}{dt} = \mathbf{I}^{rec}(t) - \boldsymbol{\Theta} \, \mathbf{S}(t) + \boldsymbol{\mu} + \boldsymbol{\Sigma} \, \boldsymbol{\xi}(t)$   (8)

where $\boldsymbol{\Theta}$ and $\boldsymbol{\Sigma}$ are diagonal matrices with entries $\theta_i$ and $\sigma_i$. In the expression, $\mathbf{V}$, $\mathbf{I}^{rec}$ (the recurrent part of the total current) and $\mathbf{S}$ are vectors with $i$-th components $V_i$,

$I^{rec}_i(t) = \sum_{j \neq i} \sum_k J_{ij,k} \sum_m s_{ij,k}(t_{j,m}) \, f(t - t_{j,m} - \Delta)$   (9)

and $S_i(t) = \sum_m \delta(t - t_{i,m})$, respectively, while $\boldsymbol{\mu}$ and $\boldsymbol{\xi}(t)$ are the vectors of mean drives $\mu_i$ and white noise processes $\xi_i(t)$. The advantage of using eq. (8) instead of eq. (5) is that the non-linear resetting mechanism of the cells is transformed into a term indistinguishable from self-inhibition or negative feedback.

We now compute the mean spike counts of the neurons in the network. In the following we assume that there is a single attractor state of the system. We start by taking expected values (over all realizations of the white noise processes and initial conditions of the network leading to the same set of active neurons) on both sides of eq. (8) to obtain

$\frac{d \langle \mathbf{V}(t) \rangle}{dt} = \langle \mathbf{I}^{rec}(t) \rangle - \boldsymbol{\Theta} \, \langle \mathbf{S}(t) \rangle + \boldsymbol{\mu}$   (10)

Since the average membrane potential does not change in the stationary regime if the firing rates of the neurons are positive, the l.h.s. of the equation is zero. Noting that the $s_{ij,k}$ are random variables independent of the spike times, both across contacts and across synapses, and using that $\langle \mathbf{S}(t) \rangle = \boldsymbol{\nu}$, where $\boldsymbol{\nu}$ is the population firing rate vector, we find that if the rates are non-negative, eq. (10) is equivalent to the following constraint on the population firing rate vector

$\mathbf{W} \, \boldsymbol{\nu} + \boldsymbol{\mu} = 0$   (11)

where the effective connectivity matrix $\mathbf{W}$ has diagonal entries $W_{ii} = -\theta_i$ and off-diagonal entries $W_{ij} = p \sum_k J_{ij,k}$. The matrix $\mathbf{W}$ explicitly shows the self-inhibitory effect of the reset mechanism. If $\mathbf{W}$ is invertible, eq. (11) can be readily solved to give an expression for the population firing rate

$\boldsymbol{\nu} = -\mathbf{W}^{-1} \boldsymbol{\mu}$   (12)

Eq. (11), written more generally to include cases where the firing rates can be zero, becomes

$\boldsymbol{\nu} = \boldsymbol{\Theta}^{-1} \left[ \mathbf{W}_0 \, \boldsymbol{\nu} + \boldsymbol{\mu} \right]_+$   (13)

where $[\,\cdot\,]_+$ is the linear rectified function ($[x]_+ = x$ if $x > 0$, and $[x]_+ = 0$ otherwise) and $\mathbf{W}_0$ denotes the off-diagonal part of $\mathbf{W}$. Although we have assumed the presence of a single attractor, this equation allows for multiple solutions in general. In those cases, multi-stability develops in the network, and each state obeys an equation like eq. (11) in which the connectivity matrix $\mathbf{W}$ is the original one with the columns and rows corresponding to the inactive neurons removed. The firing properties described below hold for each state, but in addition stochastic transitions between the states are possible.

Eq. (11) expresses the required balance between excitatory (E), inhibitory (I) and external inputs in the stationary regime. We can rewrite this equation for the simple but illustrative case where all neurons in population $\alpha$ connect with all neurons but themselves in population $\beta$ with strength $J_{\alpha\beta}$, the mean input currents to the E and I populations are $\mu_E$ and $\mu_I$, respectively, the spiking threshold of all neurons is $\theta$, and there is a single contact per neuron pair with the same transmission probability $p$ for all synapses. In this case eq. (11) is equivalent to the set of linear equations

$p N_E J_{EE} \, \nu_E + p N_I J_{EI} \, \nu_I + \mu_E - \theta \, \nu_E = 0$
$p N_E J_{IE} \, \nu_E + p N_I J_{II} \, \nu_I + \mu_I - \theta \, \nu_I = 0$   (14)

where $\nu_E$ and $\nu_I$ are the firing rates of the E and I neurons. The first (second) equation describes the mean input currents to an E (I) neuron, and it shows that the mean excitatory, inhibitory and external currents and the effective self-inhibitory current proportional to $\theta \nu_E$ ($\theta \nu_I$), arising from the reset, cancel precisely, in such a way that their sum is exactly zero. Therefore, in the stationary regime the network settles into a set of firing rates that satisfies precisely this balance equation [9]. Note that if the connectivity strengths are large, the cancellation mainly occurs between a large excitatory mean drive and a large inhibitory mean drive. Fig. 2e shows how the mean currents are dynamically cancelled, leading to a state of low firing rates.
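As a concrete illustration, the balance equations (14) can be solved directly. The sketch below uses arbitrary population sizes and synaptic strengths (with the inhibitory strengths entered as negative numbers), chosen only so that the solution has positive rates, and also shows that scaling the external drives scales the rates proportionally:

```python
import numpy as np

# Illustrative (not the paper's) parameters: synaptic strengths J[post,pre],
# population sizes, release probability p and threshold theta.
N_E, N_I, p, theta = 800, 200, 0.5, 20.0
J = {"EE": 0.05, "EI": -0.25, "IE": 0.05, "II": -0.25}

def balanced_rates(mu_E, mu_I):
    """Solve eq. (14): recurrent plus external currents balance the
    self-inhibitory reset term theta*nu for each population."""
    A = np.array([[p * N_E * J["EE"] - theta, p * N_I * J["EI"]],
                  [p * N_E * J["IE"],         p * N_I * J["II"] - theta]])
    mu = np.array([mu_E, mu_I])
    return np.linalg.solve(A, -mu)        # (nu_E, nu_I), from W*nu + mu = 0

for scale in (1.0, 2.0, 4.0):             # scaling the drive scales the rates
    nu_E, nu_I = balanced_rates(10.0 * scale, 8.0 * scale)
    print(f"drive x{scale:3.1f}:  nu_E={nu_E:6.2f}  nu_I={nu_I:6.2f}")
```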

As a next step, it is useful to show that the average activity over realizations of the white noise process equals its temporal average. We first integrate eq. (8) from time zero to a long time $T$ to obtain

$\mathbf{V}(T) - \mathbf{V}(0) = \int_0^T \mathbf{I}^{rec}(t) \, dt - \boldsymbol{\Theta} \, \mathbf{n}(T) + \boldsymbol{\mu} \, T + \boldsymbol{\Sigma} \int_0^T \boldsymbol{\xi}(t) \, dt$   (15)

where $\mathbf{n}(T)$ is the population spike count vector in the time window $T$, with components $n_i(T)$, $i = 1, \ldots, N$. The integrated current has components

$\int_0^T I^{rec}_i(t) \, dt = \sum_{j \neq i} \sum_k J_{ij,k} \, M_{ij,k}(T)$   (16)

where $M_{ij,k}(T)$ is the number of successful synaptic transmissions at contact $k$ from neuron $j$ to neuron $i$. This number is a random variable that depends on the actual number of spikes from neuron $j$, $n_j(T)$, as

$M_{ij,k}(T) = p \, n_j(T) + \sqrt{p (1-p) \, n_j(T)} \; z_{ij,k}$   (17)

where $z_{ij,k}$ is a normally distributed variable with zero mean and unit variance, independent across synaptic contacts. The first term in the equation corresponds to the average number of successful spike transmissions at the synaptic contact $k$ from neuron $j$ to neuron $i$, while the second term corresponds to the fluctuations of the number of successful transmissions around the mean, which become Gaussian for long $T$. Note that the mean and variance of the Gaussian are proportional to the mean and variance of a Bernoulli process with success probability $p$. Strict equality in eq. (17) holds for $T \rightarrow \infty$.
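The Gaussian approximation in eq. (17), as written above, can be checked directly against the exact binomial transmission statistics; the numbers below are arbitrary illustrative choices:

```python
import numpy as np
rng = np.random.default_rng(3)

p, n_j = 0.5, 400            # release probability and presynaptic spike count
samples = 100000

# Exact model: each of the n_j spikes is transmitted independently with prob p.
binomial = rng.binomial(n_j, p, size=samples)

# Gaussian approximation of eq. (17): M = p*n_j + sqrt(p*(1-p)*n_j) * z.
gaussian = p * n_j + np.sqrt(p * (1 - p) * n_j) * rng.standard_normal(samples)

for name, m in (("binomial", binomial), ("gaussian", gaussian)):
    print(f"{name:9s}  mean={m.mean():7.2f}  var={m.var():7.2f}")
```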

Now it is easy to extract useful information from eq. (15). For long $T$, the terms proportional to $T$ and to the spike counts $n_j(T)$ dominate. Therefore, for $T \rightarrow \infty$ and positive spike counts, eq. (15) reduces to

$\mathbf{W} \, \mathbf{n}(T) + \boldsymbol{\mu} \, T = 0$   (18)

This equation is identical to eq. (11) when divided by $T$, hence showing the equivalence between realization and temporal averages.

Finally, we compute the covariance matrix of the spike counts across pairs of neurons using the previous equations. We start by splitting the recurrent term $\int_0^T \mathbf{I}^{rec}(t) \, dt$ in eq. (15) into deterministic and fluctuating terms and lumping together the normally distributed random variables across the contacts of each synapse. It can be shown that eq. (15) can be rewritten more conveniently as

$\mathbf{V}(T) - \mathbf{V}(0) = \mathbf{W} \, \mathbf{n}(T) + \boldsymbol{\mu} \, T + \mathbf{Z} \, \sqrt{\mathbf{n}(T)} + \boldsymbol{\Sigma} \int_0^T \boldsymbol{\xi}(t) \, dt$   (19)

where $\mathbf{Z}$ is a random matrix with entries

$Z_{ij} = \sqrt{p (1-p) \sum_k J_{ij,k}^2} \; z_{ij}$

($z_{ij}$ are i.i.d. normally distributed variables with zero mean and unit variance), and $\sqrt{\mathbf{n}(T)}$ is a vector whose components are the square roots of the spike counts up to time $T$, $\sqrt{n_i(T)}$, $i = 1, \ldots, N$. Taking expectations on both sides of eq. (19) we find

$\langle \mathbf{V}(T) \rangle - \langle \mathbf{V}(0) \rangle = \mathbf{W} \, \langle \mathbf{n}(T) \rangle + \boldsymbol{\mu} \, T$   (20)

Subtracting eq. (20) from eq. (19), and defining the membrane potential and spike count fluctuations as $\delta \mathbf{V}(T) = \mathbf{V}(T) - \langle \mathbf{V}(T) \rangle$ and $\delta \mathbf{n}(T) = \mathbf{n}(T) - \langle \mathbf{n}(T) \rangle$, respectively, we obtain

$\delta \mathbf{V}(T) - \delta \mathbf{V}(0) = \mathbf{W} \, \delta \mathbf{n}(T) + \mathbf{Z} \, \sqrt{\mathbf{n}(T)} + \boldsymbol{\Sigma} \int_0^T \boldsymbol{\xi}(t) \, dt$   (21)

This equation is the basis for obtaining the covariance matrix of the spike counts. Solving for $\delta \mathbf{n}(T)$ in eq. (21), we find after some laborious algebra that, to leading order in $T$, the covariance matrix of the spike counts can be written as

$\mathrm{Cov}\left( \mathbf{n}(T) \right) = \mathbf{W}^{-1} \left( \mathbf{D} + \boldsymbol{\Sigma} \boldsymbol{\Sigma}^{tr} \, T \right) \left( \mathbf{W}^{-1} \right)^{tr}$   (22)

where $\mathbf{D}$ is a rate-dependent diagonal matrix with entries

$D_{ii} = p (1-p) \, T \sum_{j \neq i} \sum_k J_{ij,k}^2 \, \nu_j$   (23)

and “tr” stands for matrix transpose.

We note that in eq. (22) the noise introduced by probabilistic synaptic transmission is multiplicative in the rate (see eq. (23)), and that it enters as a diagonal matrix that is further amplified and transformed by the recurrent connections. These results fully and exactly describe the first- and second-order statistics of firing in nLIF recurrent networks, opening in turn the door to studying correlations in spiking recurrent networks with probabilistic synapses.

The probabilistic synaptic model that we have considered so far does not include variability in the amplitude of the synaptic strength. It is possible to include this source of variability in the present formalism by replacing $p (1-p) \, J_{ij,k}^2$ in eq. (23) by

$p (1-p) \, J_{ij,k}^2 + p \, \sigma_{J,ij,k}^2$   (24)

where $\sigma_{J,ij,k}^2$ is the variance of $J_{ij,k}$ across successful synaptic transmissions at each contact, and by replacing $\mathbf{W}$ in eq. (22) by a matrix with diagonal entries $-\theta_i$ and off-diagonal entries $p \sum_k J_{ij,k}$.

The expressions for the firing rate and for the covariance of the spike counts, eqs. (11) and (22), have been derived for the case where the delays are fixed and there is no jittering in the generation of the spikes. However, it is possible to show that the same expressions hold when the delays and jitters are random with finite first- and second-order moments. This shows that the noise introduced by random synaptic delays and spike generation jittering constitutes a negligible source of noise in nLIF networks.

From the equation for the mean firing rates, eq. (12), and the expression for the covariance matrix of the spike counts, eq. (22), it follows that the Fano factor is constant across firing rates. If the input drive is scaled by a factor Inline graphic, Inline graphic, then according to eq. (12) the firing rates are scaled up by the same factor, Inline graphic. Similarly, if the noise from external sources is small, the covariance matrix of the spike counts is approximately scaled up by the same factor, Inline graphic, because the noise introduced by probabilistic synapses is multiplicative. Since the Fano factor is defined as the ratio of the spike count variance to its mean, Inline graphic, scaling up the input drive can modulate the firing rate of individual neurons over several orders of magnitude while their Fano factor remains constant. This shows that Poisson-like variability arises in spiking networks with probabilistic synapses by virtue of the multiplicative nature of synaptic noise.
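
A minimal numerical check of this scaling argument, with placeholder numbers (the gain, noise constant and drives below are not taken from the model): when the spike count variance is proportional to its mean, the Fano factor stays flat while the rate varies over orders of magnitude.

import numpy as np

drives = np.array([1.0, 3.0, 10.0, 30.0, 100.0])   # input drives (placeholder)
gain = 5.0          # spikes/s per unit drive (placeholder)
noise_const = 1.2   # proportionality constant of multiplicative noise (placeholder)
T = 2.0             # counting window in seconds

rates = gain * drives                     # rates scale with the drive
mean_counts = rates * T
var_counts = noise_const * mean_counts    # multiplicative synaptic noise: var proportional to mean

fano = var_counts / mean_counts
print("rates (sp/s):", rates)
print("Fano factors:", fano)              # constant, independent of the rate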

Mechanism for Poisson-like variability

In this section we provide details for Fig. 3 in the main text. Let us assume that the output spike train of the presynaptic neuron, described by the spike count Inline graphic, has mean count Inline graphic and variance Inline graphic, where Inline graphic is the firing rate of the neuron and Inline graphic is the length of the time window. When this spike train passes through a probabilistic synapse with probability Inline graphic of successful transmission, a sequence of PSCs is generated with mean count Inline graphic and variance Inline graphic. Note that the output firing rate has been diluted by a fraction Inline graphic, and that the variance contains two terms arising from a doubly stochastic process: the first term comes from the extra variability introduced by probabilistic synaptic transmission, while the second term is a diluted version of the presynaptic spike train variability. It is crucial to realize that the first term is proportional to the firing rate in the network, while the second term is rate-independent. To close the loop, we need to specify how the input mean count and variability are transformed into the mean spike count and variability of Inline graphic. Assuming a homogeneous network of Inline graphic neurons with connectivity strength Inline graphic, a LIF neuron with threshold Inline graphic generates a spike train with mean count Inline graphic and variance Inline graphic, where Inline graphic is the mean external input drive to the neurons in the network. To derive the expressions for the mean and variance, we have assumed that spike trains across neurons are approximately independent and that the firing rate is high. Finally, using the relationship between Inline graphic and both the rate and Inline graphic, and the fact that input and output variances must be equal in a self-consistent recurrent network, we arrive at the expression Inline graphic. Note that this expression predicts that the spiking variability of the neurons is Poisson-like, because it is proportional to the firing rate Inline graphic in the network. Note also that the Fano factor increases with the connectivity strength, and with the number of connections per neuron at fixed connectivity strength.
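
The variance decomposition used in this argument follows from the law of total variance for Bernoulli thinning of a spike count. The sketch below verifies it by Monte Carlo; the presynaptic count is taken to be Poisson for simplicity, and nu, T and p are placeholder values rather than the model's parameters.

import numpy as np

rng = np.random.default_rng(2)

nu, T = 20.0, 2.0       # presynaptic rate (sp/s) and window (s), placeholder values
p = 0.5                 # transmission probability (placeholder)
n_trials = 200_000

# Presynaptic spike count: Poisson here for simplicity (mean = variance = nu*T)
pre_counts = rng.poisson(nu * T, size=n_trials)

# Each presynaptic spike is transmitted independently with probability p
post_counts = rng.binomial(pre_counts, p)

# Law of total variance: Var = p*(1-p)*E[N] + p^2*Var[N]
predicted_var = p * (1 - p) * nu * T + p**2 * (nu * T)
print("mean transmitted count:", post_counts.mean(), "expected:", p * nu * T)
print("var  transmitted count:", post_counts.var(), "expected:", predicted_var)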

Network and stimulus parameters

Finally, we provide details about the parameters used in each figure. The parameters for Fig. 1 were as follows. A single neuron obeying eqs. (1)-(2) was simulated. The membrane capacitance, leak conductance and leak potential were Inline graphic, Inline graphic and Inline graphic, respectively. With those choices, the membrane time constant was Inline graphic. The spiking threshold and reset membrane potentials were set at Inline graphic and Inline graphic, respectively [11]. For the case with constant input noise (dashed lines), the standard deviation of the noise was set at Inline graphic. For the case of Poisson-like inputs (solid lines), the variance of the noise grew proportionally to the mean input drive Inline graphic as Inline graphic, where we chose Inline graphic.
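
A minimal sketch of this single-neuron simulation, assuming a standard Euler-integrated leaky integrate-and-fire neuron; all parameter values below (time constant, threshold, reset, drive and noise constants) are illustrative placeholders rather than the exact values used for the figure.

import numpy as np

rng = np.random.default_rng(3)

# Illustrative placeholder parameters (not the figure's exact values)
tau_m = 20e-3                                  # membrane time constant (s)
E_L, V_th, V_reset = -70e-3, -50e-3, -60e-3    # leak, threshold and reset potentials (V)
dt, T_sim = 0.1e-3, 10.0                       # integration step and simulation time (s)

def simulate_lif(mu, sigma):
    """One-step Euler LIF driven by a mean drive mu (in volts, i.e. R*I)
    and Gaussian white noise of amplitude sigma (V per sqrt(s))."""
    V, n_spikes = E_L, 0
    for _ in range(int(T_sim / dt)):
        V += (-(V - E_L) + mu) * dt / tau_m + sigma * np.sqrt(dt) * rng.standard_normal()
        if V >= V_th:
            V = V_reset
            n_spikes += 1
    return n_spikes / T_sim                    # firing rate in spikes/s

mu = 25e-3   # placeholder mean drive (V)
k = 0.02     # placeholder constant: noise variance grows in proportion to mu
print("rate, constant noise     :", simulate_lif(mu, sigma=20e-3))
print("rate, Poisson-like noise :", simulate_lif(mu, sigma=np.sqrt(k * mu)))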

The parameters of the network with probabilistic synapses described in Fig. 2 (solid lines) were as follows. A total of 2000 neurons were simulated, 80% of which were excitatory and 20% inhibitory. The connectivity was all-to-all. The membrane capacitance, leak conductance, leak potential, and resting potential were as in Fig. 1, while Inline graphic. The connectivity strength Inline graphic of contact k between presynaptic neuron j in population E and postsynaptic neuron i in population E took the value Inline graphic, between E and I neurons Inline graphic, between I and E neurons Inline graphic, and between I and I neurons Inline graphic. There was a fixed number of 4 contacts between each pair of neurons. With these values, a successfully transmitted presynaptic excitatory spike generates an EPSP of 0.66 mV in E postsynaptic neurons [34], [35]. The synaptic decay time constants of excitatory and inhibitory PSCs (see eq. (4)) were Inline graphic and Inline graphic, respectively. The probability of release was Inline graphic [22], the recovery time constant of vesicles took the value Inline graphic [26], and the minimum vesicle neurotransmitter fractional load was set at Inline graphic [25], [69], identical for all contacts. The mean input current Inline graphic in eq. (2) was the same for all neurons, and the input variance Inline graphic was taken to be zero. Simulations were run for 200 s with a one-step Euler method with time step Inline graphic. Fano factors were computed using time windows of 2 s. None of the results presented depend critically on the values of the parameters chosen.
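
Fano factors over fixed, non-overlapping counting windows can be estimated as in the following sketch; the spike train here is synthetic (a homogeneous Poisson train), used only to check that the estimator returns a value near one.

import numpy as np

def fano_factor(spike_times, t_max, window=2.0):
    """Variance-to-mean ratio of spike counts in non-overlapping windows."""
    edges = np.arange(0.0, t_max + window, window)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts.var() / counts.mean()

# Synthetic check: a 200-s Poisson spike train at 10 spikes/s should give F close to 1.
rng = np.random.default_rng(4)
t_max, rate = 200.0, 10.0
n_spikes = rng.poisson(rate * t_max)
spike_times = np.sort(rng.uniform(0.0, t_max, size=n_spikes))
print("estimated Fano factor:", fano_factor(spike_times, t_max))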

For the network without probabilistic synapses in Fig. 2 (dashed lines), the parameters were as above except for the following. Neurons were driven by noise with a constant standard deviation Inline graphic. We set Inline graphic and Inline graphic, so that probabilistic synapses and STD temporal dynamics were absent. To produce rates comparable to those in the network with probabilistic synapses, we compensated for the larger Inline graphic by halving all synaptic strengths.

In Fig. 3a the parameters were as in Fig. 2 for the network without probabilistic synapses. The network connectivity was random and sparse, with every neuron in the network receiving connections from a small fraction, Inline graphic, of randomly chosen presynaptic excitatory and inhibitory neurons. The strength of the connections was as before. In Fig. 3b, the parameters were as in Fig. 2 for the network without probabilistic synapses, except that the reset potential was set at a higher value, Inline graphic. In Fig. 3c the parameters were as in Fig. 2 for the network with probabilistic synapses, but with Inline graphic and Inline graphic. In Fig. 3d, the parameters were as in Fig. 3c, with the addition of a random delay for each spike (spike jitter), drawn independently and uniformly between 0 and 10 ms. In Fig. 3e the parameters were again as in Fig. 2 for the network with probabilistic synapses, except that Inline graphic, Inline graphic and the synaptic weights were halved.

In Fig. 4 the parameters were identical to those in Fig. 2 when the synaptic scaling factor g is 1. For scaling factors different from one, all connectivity strengths of the network in Fig. 2 were multiplied by g, keeping all other parameters fixed.

For the conductance-based network with probabilistic synapses in Fig. 6, the parameters were as in Fig. 1, with Inline graphic and Inline graphic. Synaptic currents were modeled as Inline graphic, where Inline graphic (k = E,I) is the synaptic conductance and the reversal potentials are Inline graphic and Inline graphic. The synaptic conductances follow an equation identical to eq. (2) with Inline graphic, Inline graphic, Inline graphic and Inline graphic. Probabilistic synapses without STD were studied with Inline graphic. For the network without probabilistic synapses (Inline graphic), all synaptic strengths were reduced three-fold to keep firing rates close to those of the network with probabilistic synapses. In addition, external constant noise with standard deviation Inline graphic was added to each neuron. Both networks contained 100 neurons, 80 excitatory and 20 inhibitory, with a single contact between each pair of neurons. The synaptic decay time constants of excitatory and inhibitory PSCs (see eq. (4)) were Inline graphic and Inline graphic, respectively. External current-based inputs were excitatory and identical for all neurons.
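
A minimal sketch of the conductance-based synaptic input described above, assuming exponentially decaying conductances and the usual sign convention I_syn = -g_E*(V - E_E) - g_I*(V - E_I); the reversal potentials, decay time constants and conductance increments below are placeholder values, not those of the figure.

# Placeholder parameters for a conductance-based synaptic input
E_E, E_I = 0.0, -80e-3           # excitatory / inhibitory reversal potentials (V)
tau_E, tau_I = 5e-3, 10e-3       # conductance decay time constants (s)
dt = 0.1e-3                      # integration step (s)

def step_conductances(g_E, g_I, spikes_E, spikes_I, dg_E=1e-9, dg_I=2e-9):
    """One Euler step of exponentially decaying conductances, incremented by
    the number of successfully transmitted E and I spikes in this step."""
    g_E += -g_E * dt / tau_E + dg_E * spikes_E
    g_I += -g_I * dt / tau_I + dg_I * spikes_I
    return g_E, g_I

def synaptic_current(V, g_E, g_I):
    """Assumed sign convention: I_syn = -g_E*(V - E_E) - g_I*(V - E_I)."""
    return -g_E * (V - E_E) - g_I * (V - E_I)

# Example step: one transmitted excitatory spike, no inhibitory spike, at V = -60 mV
g_E, g_I = step_conductances(0.0, 0.0, spikes_E=1, spikes_I=0)
print("I_syn (A):", synaptic_current(-60e-3, g_E, g_I))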

Acknowledgments

I thank Alexandre Pouget, Jan Drugowitsch, Adam Kohn, Alex Reyes and Michael Graupner for inspiring discussions.

Funding Statement

This work is supported by a Ramón y Cajal Spanish Award and by the Marie Curie FP7-PEOPLE-2010-IRG grant PIRG08-GA-2010-276795. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Faisal AA, Selen LP, Wolpert DM (2008) Noise in the nervous system. Nat Rev Neurosci 9: 292–303.
2. Tolhurst DJ, Movshon JA, Dean AF (1983) The statistical reliability of signals in single neurons in cat and monkey visual cortex. Vision Res 23: 775–785.
3. Shadlen MN, Newsome WT (1998) The variable discharge of cortical neurons: implications for connectivity, computation, and information coding. J Neurosci 18: 3870–3896.
4. Softky WR, Koch C (1993) The highly irregular firing of cortical cells is inconsistent with temporal integration of random EPSPs. J Neurosci 13: 334–350.
5. Carandini M (2004) Amplification of trial-to-trial response variability by neurons in visual cortex. PLoS Biol 2: E264.
6. Gur M, Beylin A, Snodderly DM (1997) Response variability of neurons in primary visual cortex (V1) of alert monkeys. J Neurosci 17: 2914–2920.
7. Geisler WS, Albrecht DG (1997) Visual cortex neurons in monkeys and cats: detection, discrimination, and identification. Vis Neurosci 14: 897–919.
8. Shadlen MN, Britten KH, Newsome WT, Movshon JA (1996) A computational analysis of the relationship between neuronal and behavioral responses to visual motion. J Neurosci 16: 1486–1510.
9. van Vreeswijk C, Sompolinsky H (1996) Chaos in neuronal networks with balanced excitatory and inhibitory activity. Science 274: 1724–1726.
10. Amit DJ, Brunel N (1997) Model of global spontaneous activity and local structured activity during delay periods in the cerebral cortex. Cereb Cortex 7: 237–252.
11. Destexhe A, Rudolph M, Pare D (2003) The high-conductance state of neocortical neurons in vivo. Nat Rev Neurosci 4: 739–751.
12. Renart A, Moreno-Bote R, Wang XJ, Parga N (2007) Mean-driven and fluctuation-driven persistent activity in recurrent networks. Neural Comput 19: 1–46.
13. Barbieri F, Brunel N (2007) Irregular persistent activity induced by synaptic excitatory feedback. Front Comput Neurosci 1: 5.
14. Roudi Y, Latham PE (2007) A balanced memory network. PLoS Comput Biol 3: 1679–1700.
15. Hansel D, Mato G (2013) Short-term plasticity explains irregular persistent activity in working memory tasks. J Neurosci 33: 133–149.
16. Compte A, Constantinidis C, Tegner J, Raghavachari S, Chafee MV, et al. (2003) Temporally irregular mnemonic persistent activity in prefrontal neurons of monkeys during a delayed response task. J Neurophysiol 90: 3441–3454.
17. Shinomoto S, Sakai Y, Funahashi S (1999) The Ornstein-Uhlenbeck process does not reproduce spiking statistics of neurons in prefrontal cortex. Neural Comput 11: 935–951.
18. Lerchner A, Ursta C, Hertz J, Ahmadi M, Ruffiot P, et al. (2006) Response variability in balanced cortical networks. Neural Comput 18: 634–659.
19. Sestokas AK, Lehmkuhle S (1988) Response variability of X- and Y-cells in the dorsal lateral geniculate nucleus of the cat. J Neurophysiol 59: 317–325.
20. Sadagopan S, Ferster D (2012) Feedforward origins of response variability underlying contrast invariant orientation tuning in cat visual cortex. Neuron 74: 911–923.
21. Allen C, Stevens CF (1994) An evaluation of causes for unreliability of synaptic transmission. Proc Natl Acad Sci U S A 91: 10380–10383.
22. Branco T, Staras K (2009) The probability of neurotransmitter release: variability and feedback control at single synapses. Nat Rev Neurosci 10: 373–383.
23. Borst JG (2010) The low synaptic release probability in vivo. Trends Neurosci 33: 259–266.
24. Dobrunz LE, Stevens CF (1999) Response of hippocampal synapses to natural stimulation patterns. Neuron 22: 157–166.
25. Zucker RS, Regehr WG (2002) Short-term synaptic plasticity. Annu Rev Physiol 64: 355–405.
26. Markram H, Tsodyks M (1996) Redistribution of synaptic efficacy between neocortical pyramidal neurons. Nature 382: 807–810.
27. Abbott LF, Varela JA, Sen K, Nelson SB (1997) Synaptic depression and cortical gain control. Science 275: 220–224.
28. Mainen ZF, Sejnowski TJ (1995) Reliability of spike timing in neocortical neurons. Science 268: 1503–1506.
29. Stiefel KM, Englitz B, Sejnowski TJ (2013) Origin of intrinsic irregular firing in cortical interneurons. Proc Natl Acad Sci U S A 110: 7886–7891.
30. Anderson JS, Lampl I, Gillespie DC, Ferster D (2000) The contribution of noise to contrast invariance of orientation tuning in cat visual cortex. Science 290: 1968–1972.
31. Moreno R, de la Rocha J, Renart A, Parga N (2002) Response of spiking neurons to correlated inputs. Phys Rev Lett 89: 288101.
32. Lerchner A, Sterner G, Hertz J, Ahmadi M (2006) Mean field theory for a balanced hypercolumn model of orientation selectivity in primary visual cortex. Network 17: 131–150.
33. Rajan K, Abbott LF, Sompolinsky H (2010) Stimulus-dependent suppression of chaos in recurrent neural networks. Phys Rev E Stat Nonlin Soft Matter Phys 82: 011903.
34. Mason A, Nicoll A, Stratford K (1991) Synaptic transmission between individual pyramidal neurons of the rat visual cortex in vitro. J Neurosci 11: 72–84.
35. Komatsu Y, Nakajima S, Toyama K, Fetz EE (1988) Intracortical connectivity revealed by spike-triggered averaging in slice preparations of cat visual cortex. Brain Res 442: 359–362.
36. Boudreau CE, Ferster D (2005) Short-term depression in thalamocortical synapses of cat primary visual cortex. J Neurosci 25: 7179–7190.
37. Brunel N (2000) Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. J Comput Neurosci 8: 183–208.
38. Faisal AA, White JA, Laughlin SB (2005) Ion-channel noise places limits on the miniaturization of the brain's wiring. Curr Biol 15: 1143–1149.
39. Faisal AA, Laughlin SB (2007) Stochastic simulations on the reliability of action potential propagation in thin axons. PLoS Comput Biol 3: e79.
40. Mattia M, Del Giudice P (2002) Mean-field population dynamics of spiking neurons with random synaptic delays. Artificial Neural Networks - ICANN 2002: 111–116.
41. Moreno-Bote R, Parga N (2005) Membrane potential and response properties of populations of cortical neurons in the high conductance state. Phys Rev Lett 94: 088103.
42. Borg-Graham LJ, Monier C, Fregnac Y (1998) Visual input evokes transient and strong shunting inhibition in visual cortical neurons. Nature 393: 369–373.
43. Kohn A, Smith MA (2005) Stimulus dependence of neuronal correlation in primary visual cortex of the macaque. J Neurosci 25: 3661–3673.
44. Britten KH, Shadlen MN, Newsome WT, Movshon JA (1992) The analysis of visual motion: a comparison of neuronal and psychophysical performance. J Neurosci 12: 4745–4765.
45. Baylor DA, Lamb TD, Yau KW (1979) Responses of retinal rods to single photons. J Physiol 288: 613–634.
46. Cohen MR, Maunsell JH (2009) Attention improves performance primarily by reducing interneuronal correlations. Nat Neurosci 12: 1594–1600.
47. Mitchell JF, Sundberg KA, Reynolds JH (2009) Spatial attention decorrelates intrinsic activity fluctuations in macaque area V4. Neuron 63: 879–888.
48. London M, Roth A, Beeren L, Hausser M, Latham PE (2010) Sensitivity to perturbations in vivo implies high noise and suggests rate coding in cortex. Nature 466: 123–127.
49. Stevens CF, Zador AM (1998) Input synchrony and the irregular firing of cortical neurons. Nat Neurosci 1: 210–217.
50. de la Rocha J, Moreno R, Parga N (2004) Correlations modulate the non-monotonic response of a neuron with short-term plasticity. Neurocomputing 58: 313–319.
51. de la Rocha J, Parga N (2005) Short-term synaptic depression causes a non-monotonic response to correlated stimuli. J Neurosci 25: 8416–8431.
52. Goldman MS, Maldonado P, Abbott LF (2002) Redundancy reduction and sustained firing with stochastic depressing synapses. J Neurosci 22: 584–591.
53. Li C, Zheng Q (2010) Synchronization of the small-world neuronal network with unreliable synapses. Phys Biol 7: 036010.
54. Mejias JF, Kappen HJ, Torres JJ (2010) Irregular dynamics in up and down cortical states. PLoS One 5: e13651.
55. Rosenbaum R, Josic K (2011) Mechanisms that modulate the transfer of spiking correlations. Neural Comput 23: 1261–1305.
56. Reich S, Rosenbaum R (2013) The impact of short term synaptic depression and stochastic vesicle dynamics on neuronal variability. J Comput Neurosci 35: 39–53.
57. Rosenbaum R, Rubin JE, Doiron B (2013) Short-term synaptic depression and stochastic vesicle dynamics reduce and shape neuronal correlations. J Neurophysiol 109: 475–484.
58. Churchland MM, Yu BM, Cunningham JP, Sugrue LP, Cohen MR, et al. (2010) Stimulus onset quenches neural variability: a widespread cortical phenomenon. Nat Neurosci 13: 369–378.
59. Litwin-Kumar A, Doiron B (2012) Slow dynamics and high variability in balanced cortical networks with clustered connections. Nat Neurosci 15: 1498–1505.
60. Deco G, Hugues E (2012) Neural network mechanisms underlying stimulus driven variability reduction. PLoS Comput Biol 8: e1002395.
61. Riani M, Simonotto E (1994) Stochastic resonance in the perceptual interpretation of ambiguous figures: A neural network model. Phys Rev Lett 72: 3120–3123.
62. Seung HS (2003) Learning in spiking neural networks by reinforcement of stochastic synaptic transmission. Neuron 40: 1063–1073.
63. Fusi S, Drew PJ, Abbott LF (2005) Cascade models of synaptically stored memories. Neuron 45: 599–611.
64. Ma WJ, Beck JM, Latham PE, Pouget A (2006) Bayesian inference with probabilistic population codes. Nat Neurosci 9: 1432–1438.
65. Moreno-Bote R, Knill DC, Pouget A (2011) Bayesian sampling in visual perception. Proc Natl Acad Sci U S A 108: 12491–12496.
66. Brunel N, Wang XJ (2003) What determines the frequency of fast network oscillations with irregular neural discharges? I. Synaptic dynamics and excitation-inhibition balance. J Neurophysiol 90: 415–430.
67. Kriener B, Tetzlaff T, Aertsen A, Diesmann M, Rotter S (2008) Correlations and population dynamics in cortical networks. Neural Comput 20: 2185–2226.
68. Vogels TP, Abbott LF (2009) Gating multiple signals through detailed balance of excitation and inhibition in spiking networks. Nat Neurosci 12: 483–491.
69. Reyes AD (2011) Synaptic short-term plasticity in auditory cortical circuits. Hear Res 279: 60–66.
