Abstract
Experimental results suggest that there are two distinct mechanisms of inhibition in cortical neuronal networks: subtractive and divisive inhibition. They modulate the input-output function of their target neurons either by increasing the input that is needed to reach maximum output (subtractive) or by reducing the gain and the maximum output itself (divisive). However, the role of these mechanisms in the dynamics of the network is poorly understood. We introduce a novel population model and numerically investigate the influence of divisive inhibition on network dynamics. Specifically, we focus on the transitions from a state of regular oscillations to a state of chaotic dynamics via period-doubling bifurcations. The model with divisive inhibition exhibits a universal transition rate to chaos (Feigenbaum behavior). In contrast, in an equivalent model without divisive inhibition, transition rates to chaos are not bounded by the universal constant (non-Feigenbaum behavior). This non-Feigenbaum behavior, when only subtractive inhibition is present, is linked to the interaction of bifurcation curves in the parameter space. Indeed, a search of the parameter space showed that such interactions are impossible when divisive inhibition is included. Therefore, divisive inhibition prevents non-Feigenbaum behavior and, consequently, any abrupt transition to chaos. The results suggest that divisive inhibition in neuronal networks could play a crucial role in keeping the states of order and chaos well separated and in preventing the onset of pathological neural dynamics.
I. INTRODUCTION
Neurons can be understood as information processing units that transform synaptic input into a spike train output. This transformation is often described by an input-output function, which can be experimentally measured. Recent experiments have demonstrated that different inhibitory mechanisms can modulate this function [1, 2]. These inhibitory mechanisms can be considered either subtractive or divisive based on the modulation that is applied to the postsynaptic neurons. The subtractive modulation shifts the sigmoidal input-output function to higher inputs (hyperpolarizing effect), whereas the divisive modulation decreases the slope of the function (also termed the neuronal gain) [3]. Recent studies demonstrated that the two types of modulation are applied to cortical pyramidal neurons by two distinct inhibitory populations: dendrite-targeting interneurons provide the subtractive inhibition, whereas divisive inhibition is provided by soma-targeting interneurons [2, 4]. Additionally, the connectivity patterns between these populations were revealed in a recent anatomical study of the neocortex, where it was shown that the dendrite-targeting interneurons inhibit the soma-targeting interneurons but not the other way around [5].
In particular, the role of divisive inhibition (i.e., gain control) has been explored by experimental as well as computational studies, as it is a nonlinear effect that introduces more complex functionality into the system. Gain control has been shown to be crucial in human vision [6], sensory processing [7, 8], gaze direction [9], selective attention [10], and motor processing [11]. In simulated networks of neurons, divisive inhibition was shown to prevent several problems (proximity to unstable behavior, sensitivity of the dynamics to connectivity parameters, and slow reaction to fast fluctuating input) caused by its absence [12]. Additionally, it was shown that including divisive inhibition can improve the storage capacity of neuronal networks without compromising their dynamical stability [13]. In terms of network dynamics, the divisive modulation was found to regulate the duration of the active and silent phases during rhythmic bursting activity [14, 15].
Despite these findings, there are still open questions about the role of divisive inhibition in the overall network dynamics. In particular, its effect on the transition between different network states is unclear. In the present computational work, transitions from low-amplitude oscillations to high-amplitude paroxysmal oscillations are of interest. A high-amplitude paroxysmal oscillation in local neocortical networks is a model of hypersynchronous activity, which indicates pathological dynamics, as, for example, in epilepsy [16]. Experimental studies investigated the role of different elements of neocortical networks and their interaction with the thalamus in the generation of such oscillations [17, 18]. These oscillations closely resemble the spike-wave complexes that characterize the pathological activity during seizures. Although the interaction between neocortex and thalamus enhances spike-wave complexes, it was shown that the thalamus is not necessary for the generation or propagation of these paroxysmal oscillations in neocortex [17, 19, 20]. Computational studies have used limit cycles or chaotic attractors to model these paroxysmal oscillations [21–25]. Although chaoticity is not always found in this type of pathological activity, chaotic attractors usually exhibit high-amplitude, complex behavior that resembles such paroxysmal activity. Moreover, a three-dimensional chaotic attractor has the same dimensionality as the seizure attractors reconstructed from EEG signals [26]. A review of seizure dynamics and of the types of attractors used to model this type of activity can be found in Ref. [27]. In the present work, the transition to paroxysmal oscillations is modeled as a transition from a low-amplitude limit cycle (order) to a high-amplitude chaotic attractor (chaos) in a model of local neocortical networks, which were shown to exhibit such activity even in isolation. Furthermore, a recent study on the macroscopic behavior of spiking neural networks verified the coexistence of order and chaos in local networks [28].
One intensively studied route to chaos is via a so-called cascade of period-doubling bifurcations [29]. In a period-doubling bifurcation, a limit cycle is replaced by a new periodic orbit with double the period of the original orbit. Period-doubling bifurcations are well documented in complex neural systems, both theoretically [30, 31] and experimentally [32–34]. Additionally, cascades of period-doubling bifurcations are often found preceding the onset of paroxysmal or irregular behavior in these studies. Interestingly, the cascades of period-doubling bifurcations can occur at a constant transition rate, first described by Feigenbaum [35]. The Feigenbaum constant has since been found to apply universally in many dissipative systems in nature [36–38]. The significance of Feigenbaum universality, and particularly of the Feigenbaum constant [35], is that it provides a prediction for the onset of chaos in parameter space. A system complying with Feigenbaum universality has a well-defined relative boundary between order and chaos, whereas a system with non-Feigenbaum behavior can exhibit abrupt transitions between the two states. Non-Feigenbaum behavior is still an open field of study. Some classes of such behavior have already been characterized, mainly in discrete dynamical systems [39–41]. Examples of this behavior in continuous dynamical systems remain poorly understood.
Previous studies of neural population dynamics reported period-doubling transitions [24, 30, 31, 42]. However, none of them, to our knowledge, focused on the influence of inhibitory mechanisms on the period-doubling cascades. The primary aim of this study is to apply bifurcation theory and investigate the role of divisive inhibition in a neural mass model as it undergoes a period-doubling cascade leading to chaos. In particular, we explore how the system behaves in relation to Feigenbaum universality in two cases: when the model includes divisive inhibition and when it uses only subtractive inhibition.
II. MODELS AND METHODS
A. Modeling framework
As in previous studies, neural mass models are used in this work, giving an abstract and macroscopic description of neocortical networks composed of excitatory and inhibitory populations. The dynamics arise from the interaction of these populations, which is expressed by a set of ordinary differential equations (ODEs). In particular, the model used here is an extended version of the spatially localized Wilson-Cowan model [43], which is primarily used to model the oscillatory behavior of neural systems [44–46]. This type of model is conceptually simple and well studied and can be easily analyzed using bifurcation theory [47–49]. Therefore it is an ideal choice for investigating how abstract concepts like subtractive or divisive inhibition can change the network's behavior at the population level. The model introduced here can be considered a generalization of the Wilson-Cowan model [43]. This generalization can be used to model not only subtractive inhibition, as featured in the classic model, but also divisive inhibition. In order to achieve this, we consider the excitation and external inputs to be the drivers of the network, whereas the inhibition is used only to modulate the sigmoidal input-output functions of all the units in the network. Distinguishing between drivers and modulators in the network is inspired by the Sherman-Guillery proposition [50]. This separation of drivers from modulators comes in contrast to the way that inhibition was previously modeled: as a subtraction from the input, that is, as a negative driver. An equivalent result can be obtained instead by shifting the input-output function to higher inputs (see Fig. 1, bottom left), that is, by subtractive modulation of the input-output function. This displacement represents the subtractive inhibition in the proposed model. Similarly, the divisive inhibition can be modeled as a gain control mechanism that decreases the slope and maximum output of the input-output functions (see Fig. 1, bottom right). By choosing the logistic function as the input-output function, we can model these modulations by promoting the constants for displacement and slope to variables that can be dynamically controlled by the inhibitory populations in the model. This modification results in a function of three variables, F(x, θ, α). Variable x represents the input or the driver of the unit, variable θ represents the displacement of the sigmoidal curve along the x axis, and variable α represents the slope of the curve. Thus, the input-output function F(driver, subtractive modulator, divisive modulator) is given by:
(1)
where the minimum displacement θj and the maximum slope αj are constants, representing the default case when no modulatory inhibition is delivered to the unit. Specifically, these constants differ for different populations: j = {e, s, d} for the excitatory, subtractive, and divisive inhibitory populations, respectively. Note also that the last term in the expression depends only on the slope variable α and serves to decrease the maximum value of the output along with the decrease in slope (see Fig. 1 for a schematic).
FIG. 1.
(Color online) Schematic of the ESD model. The model incorporates an excitatory population (E), a subtractive (S), and a divisive (D) inhibitory population. The lower panels show how the two inhibitory mechanisms modulate the input-output function of their target population. Note the nonreciprocal inhibitory connection between S and D. The external inputs to the populations are omitted in this schematic.
Using the input-output function F, we now have a straightforward way to express subtractive and divisive modulation through the function arguments.
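For illustration, the following MATLAB sketch plots a logistic input-output function whose displacement and slope are promoted to variables, as described above. The specific parameterization of F used here is only an assumption chosen to reproduce the qualitative effects of the bottom panels of Fig. 1; the exact expression used throughout this work is the one defined in Eq. (1).

```matlab
% Illustrative sketch only: an assumed logistic with a subtractive (displacement)
% and a divisive (slope and maximum) modulation, not the paper's exact Eq. (1).
theta_j = 4;      % minimum displacement (default, no inhibition delivered)
alpha_j = 1.3;    % maximum slope (default, no inhibition delivered)

% Arguments: x = driver, th = subtractive modulation, al = divisive modulation.
F = @(x, th, al) 1 ./ (1 + exp(-(alpha_j ./ (1 + al)) .* (x - theta_j - th))) ...
               - 1 ./ (1 + exp( (alpha_j ./ (1 + al)) .* theta_j));

x = linspace(0, 20, 400);
plot(x, F(x, 0, 0), x, F(x, 3, 0), x, F(x, 0, 2));
legend('no inhibition', 'subtractive shift', 'divisive gain reduction');
xlabel('input x'); ylabel('output');
```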
ESD model
The model incorporating both subtractive and divisive inhibition (the ESD model, also see Fig. 1) is given by the following system of ordinary differential equations (ODEs):
(2)
where
(3)
with j = {e, s, d}. The variables of the system, E, S, and D, express the activity levels of the excitatory, subtractive inhibitory, and divisive inhibitory populations, respectively. The functions Fj are the sigmoidal input-output functions presented above. The parameters Pj give the external inputs to the units and are taken to be independent of time in this study. The refractory period (see Ref. [43]) is assumed to be the same for all populations and equal to 1 (omitted here). The connectivity parameter wji ≥ 0 represents the weight of the connection from unit i to unit j. The absence of the inhibitory connection wsd is justified by anatomical findings [5]. Note that the divisive inhibitory population is called divisive only because it delivers divisive inhibition to the excitatory population; its self-inhibition connection wdd remains subtractive, since, to our knowledge, divisive modulation has not been evidenced in neuronal populations other than pyramidal cells. All three populations are assumed to operate on the same time scale, so all time constants are omitted in this model. A schematic of the model can be found in Fig. 1.
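To make the structure of the ESD model concrete, the following MATLAB sketch integrates a right-hand side with the connectivity described above: excitation and external input act as drivers, S delivers subtractive modulation, D delivers divisive modulation to E only, and there is no connection from D to S. It reuses the illustrative function F from the previous sketch together with parameter set 1 of Table I, so it sketches the model structure under those assumptions rather than reproducing Eq. (2) verbatim.

```matlab
% Parameter set 1 of Table I, with the varying weight wee fixed at 20 for illustration.
w = struct('ee',20, 'es',12, 'ed',28, 'se',14, 'ss',2, 'de',20, 'ds',21, 'dd',0);
P = struct('e',1.1, 's',0, 'd',0);

% Assumed right-hand side with x = [E; S; D].  The factor (1 - x) implements the
% refractory period of 1 and all time constants are unity.  The argument order of
% F is (driver, subtractive modulator, divisive modulator), as described in the text.
rhs = @(t, x, w) [ -x(1) + (1 - x(1)) * F(w.ee*x(1) + P.e, w.es*x(2), w.ed*x(3));
                   -x(2) + (1 - x(2)) * F(w.se*x(1) + P.s, w.ss*x(2), 0);
                   -x(3) + (1 - x(3)) * F(w.de*x(1) + P.d, w.ds*x(2) + w.dd*x(3), 0) ];

[t, X] = ode45(@(t, x) rhs(t, x, w), [0 500], [0.1; 0.1; 0.1]);
plot(t, X(:, 1));   % time course of the excitatory activity E
```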
ESŚ model
An equivalent model but without the divisive inhibition (ESŚ model) is given by the following system of ODEs:
(4)
with the same input-output functions Fj, j = {e, s, ś}, as described in Eq. (1). Note that the only thing that changes compared to the ESD model is the nature of the inhibition that the secondary inhibitory population delivers to the excitatory population E: instead of being divisive (denoted by D), it is now subtractive (denoted by Ś). This second, ESŚ, model is introduced in order to provide a point of comparison with the ESD model.
Nonideal divisive inhibition
The divisive inhibition in the network can be modeled in a way that is not purely divisive modulation of the input-output function of the excitatory population but rather a combination of divisive and subtractive modulation. This can be thought of as a more biologically realistic modulation, which more closely resembles the experimental data [2, 4]. An additional constant parameter q ∈ [0, 1] is introduced in the model in order to express the fraction of divisive modulation that is delivered to the excitatory population. The remaining fraction, 1 − q, is delivered as subtractive modulation. For a schematic see Fig. 5(a). The only change in the ESD model [Eq. (2)] is the input-output function of the excitatory population:
(5)
and, consequently,
(6)
FIG. 5.
(Color online) Zero-Hopf (ZH) bifurcations can only be found when divisive inhibition is far from being purely divisive (low values of q). (a) Schematic of the model incorporating the q parameter which enables the modeling of nonideal divisive inhibition and creates a spectrum between ESŚ and ESD. (b) Counting the zero-Hopf bifurcations found for different values of q using a random search in the parameter space. (c) Intuitive schematic of how the presence or the absence of the divisive inhibition shapes the boundary between order and chaos. (d) Bifurcation diagram showing an example of the spectrum between ESŚ (for q = 0) and ESD (for q = 1). Note the presence of the saddle-node curve (cyan) near the Andronov-Hopf curve (orange) only for low values of q.
The input-output functions Fs and Fd remain the same as in Eq. (1). We shall use this nonideal divisive inhibition at a later stage in the analysis to simulate mixture models between the ESD and ESŚ models. Table I summarizes all the model parameters we used in this study.
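Under the same illustrative assumptions, the q-split can be realized by dividing the D input to the excitatory unit into a divisive share q and a subtractive share 1 − q; the exact formulation used in this work is the one of Eqs. (5) and (6).

```matlab
% One way to realize the q-split described above (cf. Eqs. (5)-(6)): a fraction q
% of the D input modulates the excitatory unit divisively and the remaining 1 - q
% subtractively.  q = 1 recovers the ESD model and q = 0 the ESŚ model.
% F, w, and P are those of the earlier sketches (illustrative assumptions).
q = 0.5;
rhs_q = @(t, x) [ -x(1) + (1 - x(1)) * F(w.ee*x(1) + P.e, ...
                                         w.es*x(2) + (1 - q)*w.ed*x(3), ... % subtractive share
                                         q*w.ed*x(3));                      % divisive share
                  -x(2) + (1 - x(2)) * F(w.se*x(1) + P.s, w.ss*x(2), 0);
                  -x(3) + (1 - x(3)) * F(w.de*x(1) + P.d, w.ds*x(2) + w.dd*x(3), 0) ];
```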
TABLE I.
Parameter sets used for the results shown. For all cases: Pe = 1.1, Ps = Pd = 0, θe = 4, θs = θd = 3.7, αe = 1.3, αs = αd = 2. Note the varying parameters denoted with var.
| Set No | wee | wes | wed or weś | wse | wss | wde or wśe | wds or wśs | wdd or wśś | Figs. |
|---|---|---|---|---|---|---|---|---|---|
| 1 | var | 12 | 28 | 14 | 2 | 20 | 21 | 0 | 2, 3(a-b), 4, 5(d) |
| 2 | 20.7 | 12 | 19 | 14 | 2 | 20 | 21 | 0 | 3(c) |
| 3 | 19.6 | 12 | 19 | 14 | 2 | 20 | 21 | 0 | 3(d) |
| 4 | 21 | 12 | 28 | 14 | 2 | 20 | 21 | 0 | 5(b) |
| 5 | 21 | 11.5 | 24 | var | 1.5 | 20 | 21.5 | 6 | 6 |
| 6 | 21 | 11.5 | var | 15 | 1.5 | 20 | var | 8 | 7(a) |
| 7 | 21 | 11.5 | 24 | var | 1.5 | 19.5 | var | 6 | 7(b) |
| 8 | 21.5 | 12 | var | 16 | 2 | 20 | 18 | 0 | 10 |
B. Feigenbaum number
The Feigenbaum number expresses the rate at which the system undergoes the period-doubling bifurcations en route to chaos. Therefore it can be used as a relative measure of the abruptness of period-doubling cascades. Considering a cascade of period-doubling bifurcations occurring at parameter values R2, R4, R8, …, the Feigenbaum number is given by:
\delta = \lim_{n \to \infty} \frac{R_{2^{n}} - R_{2^{n-1}}}{R_{2^{n+1}} - R_{2^{n}}} \qquad (7)
An approximation of this number, denoted δ̂ and based on the first four period-doubling bifurcations R2, R4, R8, and R16, was calculated using the following ratio:
\hat{\delta} = \frac{R_{8} - R_{4}}{R_{16} - R_{8}} \qquad (8)
This measure can be compared with the Feigenbaum constant δ = 4.6692 ⋯ [35], which was discovered to be a characteristic constant for all one-dimensional maps and many dissipative systems undergoing period-doubling cascades [51, 52]. Values of δ̂ near the Feigenbaum constant are taken as an indication that the system complies with Feigenbaum universality, whereas values far from this constant indicate non-Feigenbaum behavior.
Essentially, the number δ̂ estimated from a period-doubling cascade offers a way to classify the route into chaos. If δ̂ is close to the Feigenbaum constant, then the transition into chaos is well understood and can be reduced to the dynamics of a one-dimensional map. If δ̂ is far from the Feigenbaum constant, then the transition to chaos is underpinned by more complex processes. In particular, if δ̂ is much larger than the Feigenbaum constant, then the onset of chaos is considered relatively more abrupt.
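As a worked example, the following MATLAB lines compute an estimate of the period-doubling rate in the spirit of Eq. (8); the numerical values of R2, R4, R8, and R16 are hypothetical placeholders standing in for the bifurcation points returned by the numerical continuation.

```matlab
% Hypothetical bifurcation points of a cascade (placeholders for values obtained
% with MATCONT); the estimate is compared with the Feigenbaum constant 4.6692...
R2 = 19.00; R4 = 20.50; R8 = 20.82; R16 = 20.888;
delta_hat = (R8 - R4) / (R16 - R8);
fprintf('estimated period-doubling rate: %.3f (Feigenbaum constant: 4.6692)\n', delta_hat);
```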
C. Bifurcation analysis using numerical continuation
The toolbox MATCONT [53] was used for the detection of bifurcation points and the numerical continuation of the bifurcation curves presented in this work. A Runge-Kutta (4,5) method (ode45) implemented in MATLAB Release 2013b (The MathWorks, Inc., Natick, MA) was used for the numerical integration of the ODEs in all the reported results. Other built-in ODE solvers were also used and produced similar results.
III. RESULTS
In this work, the role of divisive inhibition in the transition from order to chaos through a period-doubling cascade is examined. For this purpose, the model introduced in the previous section was used. The connectivity between the two inhibitory populations is unidirectional and follows the experimental findings in neocortical networks [2, 5]. A schematic of the model with all the connectivity parameters can be found in Fig. 1. A numerical approach was used for its analysis throughout.
The introduced model can exhibit transitions from a limit cycle (order) to a chaotic attractor through a cascade of period-doubling bifurcations. Figure 2(a) shows the bifurcation diagram of such a transition. The cascade parameter used in this example is the self-excitation synaptic weight, wee. The limit cycle emanates from a supercritical Andronov-Hopf bifurcation, and the amplitude of the oscillation increases with the cascade parameter. The first four period-doubling bifurcations are labeled R2, R4, R8, and R16, denoting the beginning of the period-2, period-4, period-8, and period-16 cycles, respectively. Instances of the phase space at different stages of the transition can be seen in Fig. 2(b), including the chaotic attractor appearing at the end of the cascade. The strange attractor is topologically similar to the Rössler attractor [54], the Sprott D attractor [55], and the Genesio-Tesi attractor [56], exhibiting a stretching and folding mechanism [57]. The period-doubling cascade shown in Fig. 2(a) has an estimated rate δ̂ close to the Feigenbaum constant.
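A brute-force version of a diagram like Fig. 2(a) can be produced by sweeping the cascade parameter and collecting the local minima of E on the attractor. The sketch below reuses the illustrative rhs, F, w, and P of the Models and Methods sketches, so it traces the qualitative procedure rather than reproducing the published diagram.

```matlab
% Sweep the cascade parameter wee and plot the local minima of E on the attractor.
wee_values = linspace(19, 21.5, 150);
figure; hold on;
for wee = wee_values
    w.ee = wee;
    f = @(t, x) rhs(t, x, w);                          % bind the current parameters
    [~, Xtr] = ode45(f, [0 1500], [0.1; 0.1; 0.1]);    % discard the transient
    [~, X]   = ode45(f, 0:0.01:500, Xtr(end, :)');     % sample the attractor densely
    E = X(:, 1);
    idx = find(diff(sign(diff(E))) > 0) + 1;           % indices of local minima of E
    plot(wee * ones(size(idx)), E(idx), 'k.', 'MarkerSize', 2);
end
xlabel('w_{ee}'); ylabel('local minima of E');
```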
FIG. 2.
(Color online) Example of a period-doubling cascade in the ESD model and the different oscillatory states of the system. (a) The bifurcation diagram, where local minima of variable E are plotted against the cascade parameter wee, shows the successive period-doubling bifurcations R2, R4, R8, and R16 and occurring at a rate close to the Feigenbaum constant δ ≈ 4.67. (b) The phase space plots show the different oscillatory states of the system from a period-1 cycle up to the chaotic attractor. The value ranges for these oscillatory states are marked in (a) by corresponding Arabic numerals.
The behavior during the transition in this model, which includes divisive inhibition, is compared with that of an equivalent model which does not include divisive inhibition. All the parameters of the model are the same; the only thing that changes is the nature of the inhibition delivered by the secondary inhibitory population. The two versions of the model are labeled ESD and ESŚ and are expressed by the sets of ODEs in Eq. (2) and Eq. (4), respectively. The first four period-doubling bifurcations R2, R4, R8, and R16 were numerically calculated in both the ESD and ESŚ models using the MATCONT toolbox [53] (see Models and Methods). The same toolbox was used to produce the period-doubling bifurcation curves by varying both the cascade parameter and another connectivity parameter. In this way, it is possible to explore whether the transition from order to chaos is sensitive to changes in network connectivity. The resulting bifurcation diagrams are shown in Figs. 3(a) and 3(b).
FIG. 3.
(Color online) Comparison between the ESŚ and ESD models in terms of their Feigenbaum behavior and their first return maps. [(a)-(b)] Bifurcation diagrams for the ESŚ and ESD models showing the first four period-doubling curves. The parameter wee was used as the cascade parameter in both cases. The lower panels show the calculated δ̂ of the cascades for varying weś and wed, respectively. [(c)-(d)] First return maps of typical chaotic attractors produced by the ESŚ and ESD models, respectively. The maps are produced by taking the local minima of the variable E (red dots on the attractors in the inset plots). The parameter values used to produce the chaotic attractors are marked with a red cross in the respective bifurcation diagram in (a) or (b) (see also Table I).
As shown in the lower panels of Figs. 3(a) and 3(b), the rate of bifurcation δ̂ was calculated for each value of the varying parameter on the abscissa (see Models and Methods). This measure was plotted alongside the Feigenbaum constant δ = 4.6692 [35], represented by the dashed line in the same panels. This plot reveals whether the behavior of the model follows Feigenbaum universality. In Fig. 3, an example of the comparison between ESD and ESŚ is shown, with the self-excitation wee being the cascade parameter and wed (respectively weś for ESŚ) being the varying connectivity parameter.
From the example in Fig. 3, it is clear that the two versions of the model, ESD and ESŚ, can exhibit significantly different behavior. While the value of δ̂ is always near the Feigenbaum constant in the ESD model, the value for the ESŚ model is sensitive to changes of the parameter weś and can take a wide range of values; in fact, it increases linearly as weś is increased. The results indicate that the ESD model complies with Feigenbaum universality and always has a smooth transition from order to chaos, whereas the ESŚ model exhibits non-Feigenbaum behavior with transitions that can be much more abrupt. Similar observations can be made using other cascade parameters, like wds or wse, and other varying connectivity parameters (see also Appendix A).
This difference in behavior between the ESŚ and ESD models can also be seen in the first return maps of their chaotic attractors [see Figs. 3(c) and 3(d)]. By taking the local minima of the chaotic activity of the excitatory population E, we constructed the first return maps Emin(n + 1) vs Emin(n), where Emin(n) is the n-th local minimum. In particular, the chaotic activity was produced using the parameters marked with a red cross in Figs. 3(a) and 3(b). As shown in Figs. 3(c) and 3(d), only the return map of the ESD attractor is one-dimensional and unimodal, indicating compliance with Feigenbaum universality [52]. This type of return map is typical of attractors resulting from a stretching and folding mechanism [57]. In contrast, the ESŚ model produces a two-dimensional and bimodal return map, indicating that the attractor is not folded completely upon the first return; that is, its dynamics cannot be described by a one-dimensional map as the universality requires.
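Under the same illustrative assumptions, a first return map like those of Figs. 3(c)-3(d) can be obtained by collecting successive local minima of E from a densely sampled trajectory on the attractor (e.g., a trajectory computed as in the sweep above for a parameter value inside the chaotic region):

```matlab
% First return map of the local minima of E (cf. Figs. 3(c)-3(d)).  X is a densely
% sampled trajectory on the attractor, with the transient already discarded.
E    = X(:, 1);
idx  = find(diff(sign(diff(E))) > 0) + 1;   % indices of local minima of E
Emin = E(idx);
figure;
plot(Emin(1:end-1), Emin(2:end), '.');
xlabel('E_{min}(n)'); ylabel('E_{min}(n+1)');
```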
The question that arises is: how does the divisive inhibition in the ESD model ensure Feigenbaum behavior, preventing any abrupt transitions? Can this observation be generalized to all possible parameter sets that produce transitions to chaos through a period-doubling cascade? Studies on non-Feigenbaum behavior suggest that phenomena of codimension 2 or higher can disturb the period-doubling curves in their neighborhood, resulting in arbitrarily abrupt transitions [39]. In Fig. 4, the bifurcation diagrams also include the Andronov-Hopf bifurcation curve and the saddle-node (also known as fold) bifurcation curve along with the period-doubling bifurcation curves. The ESŚ diagram reveals a codimension-2 bifurcation called the zero-Hopf bifurcation (or fold-Hopf) at the point where the two bifurcation curves tangentially intersect [58]. As the value of δ̂ indicates, the period-doubling bifurcation curves are disturbed near the zero-Hopf bifurcation, resulting in non-Feigenbaum behavior. For higher values of wds, though, away from the zero-Hopf bifurcation point, δ̂ returns to values near the Feigenbaum constant. A branch of subcritical Neimark-Sacker bifurcations also appears in Fig. 4(a), originating from the zero-Hopf point as expected [58]. This results in the appearance of an unstable torus in phase space for parameter values between the Andronov-Hopf and Neimark-Sacker bifurcation curves (shaded area). Tori (stable and unstable) and quasiperiodic activity are features easily found in the ESŚ model, and they are always linked with the zero-Hopf and Neimark-Sacker bifurcations in this model. This and similar findings (see Appendix A) suggest that the appearance of such bifurcations (zero-Hopf and Neimark-Sacker) can be responsible for the non-Feigenbaum behavior and abrupt transitions in our model. The involvement of the Neimark-Sacker bifurcation in non-Feigenbaum behavior is also documented in another study, in which such phenomena are found in a two-dimensional model map [41]. Evidently these bifurcations can appear in the ESŚ system, but what about the ESD? Does the divisive inhibition prevent such bifurcations for all possible parameter settings?
FIG. 4.
(Color online) Zero-Hopf (ZH) and Neimark-Sacker bifurcations are implicated in the non-Feigenbaum behavior of the ESŚ model. The Andronov-Hopf bifurcation curves (orange) are plotted alongside the period-doubling bifurcation curves for both the ESŚ and ESD models. A saddle-node bifurcation curve (cyan) was also found near the Andronov-Hopf curve, but only in the case of ESŚ. These two bifurcation curves tangentially intersect and produce a zero-Hopf (ZH) bifurcation point. Near the ZH point, the value of δ̂ is increased, indicating non-Feigenbaum behavior in the ESŚ model. In contrast, the ESD model does not exhibit any interaction between Andronov-Hopf and saddle-node curves (at least in this example) and the model continues to obey Feigenbaum universality. The shaded area indicates the existence of a torus. Other plotting conventions are the same as in Figs. 3(a)-3(b).
To address this question we introduce an additional parameter in the ESD model. The parameter q ∈ [0, 1] creates a continuum between the two extremes: ESŚ at q = 0 and ESD at q = 1. For any other value within this range, the secondary inhibitory population of the model delivers a combination of subtractive and divisive inhibition [for a schematic see Fig. 5(a)]. Using this model, it is possible to incorporate the results shown in Figs. 3 and 4 in a single plot and examine the behavior of the model between the two extremes. As shown in Fig. 5(d), the behavior of the system switches gradually from non-Feigenbaum (q < 0.2) to Feigenbaum (q > 0.4). The saddle-node bifurcation curve and also the zero-Hopf bifurcation point are found for low values of q, near the ESŚ extreme. Starting from q = 0, the saddle-node bifurcation curve is very close to the Andronov-Hopf bifurcation curve. The two curves tangentially meet at the zero-Hopf point and then diverge rapidly from each other for values of q higher than 0.2. The saddle-node bifurcation curve reaches a cusp point (CP) at around q = 0.51 and returns back to lower values of q, preventing any subsequent interactions with the Andronov-Hopf bifurcation curve for high values of q (q > 0.5). Note also that at around q = 0.4 and wee = 18.3 the two bifurcation curves seem to intersect again, but actually they do not; two different fixed points bifurcate separately in this case, so no zero-Hopf point is produced there. Although the cusp point does not always appear in such bifurcation diagrams, and therefore the saddle-node bifurcation curve can sometimes reach high values of q (depending on the parameter set used), the two bifurcation curves always seem to diverge from each other for high values of q (data not shown). If this holds for all possible parameter sets, then the appearance of a zero-Hopf bifurcation is impossible for high values of q (near the ESD extreme). The presence of the cusp point (CP) also indicates that the phenomenon of hysteresis is possible for low q values. In that case the whole period-doubling cascade can be skipped and the system can transition directly from a resting state to chaos through a saddle-node bifurcation instead (see Appendix B for an example). Note that the saddle-node bifurcation is considered the most common transition into seizure dynamics [59].
Next, we examined whether it is possible to find any zero-Hopf bifurcation points in the parameter space for high values of q. For this purpose, a numerical optimization approach was used to search for such points in the parameter space for different values of q. In particular, the search starts from random points in the parameter space and tries to converge, by varying the connectivity parameters, to a fixed point that simultaneously undergoes saddle-node and Andronov-Hopf bifurcations. More precisely, the search tries to find fixed points where one of the eigenvalues of the Jacobian matrix is zero, λ1 = 0, and the other two are purely imaginary conjugates, λ2,3 = ±iω [60] (see Appendix C for details). Running the algorithm multiple times with these criteria is expected to reveal multiple occurrences of zero-Hopf bifurcations in the parameter space. The algorithm was tested and successfully recovered the particular zero-Hopf bifurcation points that were first found by MATCONT and shown in Figs. 4 and 5, demonstrating its reliability. After multiple runs (10^7 for each q value), the overall results of this search are shown in Fig. 5(b). The two curves show all the occurrences of zero-Hopf bifurcations found for different values of q. Each curve corresponds to a different subset of varying parameters (see Appendix C for details). Evidently, all the zero-Hopf points found in the system are confined to the region near the ESŚ extreme, with q < 0.5. This plot clearly shows that the random search algorithm could not find any fixed points undergoing a zero-Hopf bifurcation for values of q ≥ 0.5. This result, coupled with further analysis of the distribution of the zero-Hopf points (see Appendix D), indicates that pure or almost pure divisive inhibition can prevent phenomena like zero-Hopf and Neimark-Sacker bifurcations. Therefore it prevents non-Feigenbaum behavior and abrupt transitions into chaos.
IV. DISCUSSION
The comparison between a model of neocortical networks with divisive inhibition (ESD, q = 1) and an equivalent one without divisive inhibition (ESŚ, q = 0) suggests that gain control plays a special role in the dynamics of such networks. The present numerical study of the transition from order to chaos in this type of network shows that pure or almost pure divisive inhibition ensures that the transition always complies with Feigenbaum universality. Complying with Feigenbaum universality means that there is always a well-defined boundary (in relative terms) that separates the regions of order and chaos in the parameter space, regardless of the specific values of the connectivity weights. In contrast, when the divisive inhibition is replaced by subtractive inhibition, the transition can be abrupt or smooth depending on the exact parameter settings of the network. Without divisive inhibition, the period-doubling transition to chaos can be considered unpredictable. An intuitive representation of the difference between the two cases is depicted in the schematic of Fig. 5(c). As outlined in the Introduction, similar findings about divisive inhibition (preventing instabilities and the sensitivity to parameter changes) were reported in a recurrent network of interconnected neurons with firing rate dynamics [12]. This indicates that this effect of divisive inhibition is not limited to the abstract Wilson-Cowan type of models. Consequently, the role of divisive inhibition is also worth investigating in spiking neural networks that feature gain modulation (e.g., Ref. [61]).
The non-Feigenbaum behavior, which is only found when the inhibition is far from being divisive (low q values), is linked with the appearance of zero-Hopf and Neimark-Sacker bifurcations. In general, when the model lacks divisive inhibition, it seems to have a more diverse dynamic repertoire with the possibility of exhibiting phenomena of codimension-2 or higher. This increased effective dimension of the dynamics might be the underlying explanation for the non-Feigenbaum behavior found in the model, as suggested in Ref. [39]. The gain control mechanism prevents these phenomena and therefore prevents non-Feigenbaum critical behavior. Additionally, the simple generic structure of this system suggests that gain control might have a similar effect in structurally equivalent systems.
The model is capable of incorporating mixed effects from divisive and subtractive inhibition (using the parameter q). This is because different factors have been suggested to enable and modulate (possibly in combination) divisive inhibition. Such factors include synaptic noise [62, 63] and target positions on the principal cells of the inhibitory input [4]. As both the synaptic noise variance and the target position can vary in a continuous fashion, it is reasonable to include the degree of divisive inhibition as a continuous parameter. We shall elaborate on the implications of linking synaptic noise to our parameter q.
Experimental studies investigated how synaptic noise influences the gain control mechanism of shunting inhibition, which is provided primarily by soma-targeting interneurons [2]. By simulating the background synaptic noise with a dynamic clamping technique, it was shown that highly variable synaptic input is required for the modulation of neuronal gain [62]. A similar in vitro approach was taken in Ref. [63], where the magnitude and the variance of the excitatory and inhibitory conductances were controlled independently. In agreement with the previous study, the input-output relationship of pyramidal neurons was divisively modulated in proportion to the variance of the injected conductances. A computational study of this mechanism also demonstrated the importance of highly variable synaptic noise in gain modulation with a biophysically detailed neuron model [64] (for a review, see Ref. [3]). In our model, high values of the q parameter can indicate that the network is functioning under high synaptic noise conditions, that is, high variance of the synaptic currents. By linking the parameter q with high synaptic noise, our model provides hypotheses which can be tested either experimentally with electrophysiological setups or computationally with detailed networks of spiking neuron models. For instance, we predict that when the synaptic noise is increased, the possible dynamic phenomena in the network are more limited (no codimension-2 bifurcations), structures such as tori are less likely to arise, the transition to chaos through period-doubling bifurcations follows Feigenbaum behavior, and the chaotic attractor can be described by a one-dimensional first return map. In experimental setups, the prediction about tori might prove most interesting, as they would correspond to quasiperiodic oscillations or amplitude-modulated oscillations (also observed in experiments [65]).
Our findings might also have implications for the understanding of pathological dynamics in the brain and the role that gain control may play in their onset. The dynamics in pathologies like epilepsy and Parkinson's disease are characterized by hypersynchronous activity in local networks [16, 66] and, consequently, reduced entropy [67]. This activity can be either regular or irregular, but the common feature is the excessive synchronous activity that is detected in electroencephalogram (EEG) or local field potential (LFP) recordings. Such abnormally synchronous oscillations are also characterized as paroxysmal events, that is, featuring rapid fluctuations between extremely high and low values in the mean-field potential. The transition into chaos in our model can be considered such a transition to a pathological state precisely because of the paroxysmal oscillations that emerge (e.g., see Fig. 8 in Appendix B). Figure 2(a) shows how the local minima become more and more extreme as we progress through the cascade and into the chaotic regime. A similar plot can be produced for the local maxima of the system (not shown), indicating that the range of activity increases while the oscillation becomes more complex. These observations are typical of period-doubling cascades and the resulting chaotic attractors, as dictated by the α constant of Feigenbaum universality [35]. Through these cascades, the activity is pushed to its limits and, consequently, becomes increasingly paroxysmal. Many computational studies modeled pathological dynamics, and in particular seizure-like activity, with chaotic attractors that produce complex paroxysmal oscillations resembling the spike-wave or polyspike-wave complexes usually seen in the EEG of epilepsy patients [24, 25, 31, 42]. In addition, some of these models feature period-doubling cascades at the onset or offset of the paroxysmal activity [24, 31, 42]. Period-doubling cascades were also detected in the analysis of EEG taken from patients with temporal lobe epilepsy [34]. Hence we suggest that the low-dimensional chaotic attractor as introduced here could be identified as a paroxysmal state, with gain control serving as a way to prevent relatively rapid transitions into such a state.
In more general terms, chaotic dynamics in local networks imply a failure of the stable periodic activity that is necessary for long-range synchronization at specific frequencies. Hence, any brain function that relies on stable periodic activity of local neocortical networks would be impaired by the onset of chaos. It is well established that long-range synchronization at α and θ frequencies is crucial for memory and other cognitive functions [68]. Furthermore, failure of long-range synchronization at β and γ frequencies is associated with pathologies like schizophrenia [16]. The results presented here suggest that divisive inhibition is responsible for maintaining stable periodic behavior and thus enabling synchronization. Indeed, preliminary simulation results in a paradigm of long-range synchronization between two local networks suggest that the inclusion of divisive inhibition prevents chaotic activity and enhances synchronization. Hence, by preventing abrupt transitions into chaos, divisive inhibition could act to prevent the onset of pathological neural dynamics.
ACKNOWLEDGMENTS
C.A.P. was supported by the Wellcome Trust (099755/Z/12/A). M.K. and Y.W. were supported by the Human Green Brain project (http://www.greenbrainproject.org) funded through EPSRC (EP/K026992/1) and the CANDO project (http://www.cando.ac.uk/) funded through the Wellcome Trust (102037) and EPSRC (NS/A000026/1). A.J.T was supported by MRC (MR/J013250/1). This work made use of the facilities of N8 HPC Centre of Excellence, provided and funded by the N8 consortium and EPSRC (Grant No. EP/K000225/1). The Centre is co-ordinated by the Universities of Leeds and Manchester. We thank Gerold Baier, Fred Wolf, and Bulcsú Sándor for helpful discussions.
Appendix A. Abrupt period-doubling cascade and non-Feigenbaum behavior in the ESŚ model
Figure 6 shows an example of an abrupt period-doubling cascade leading to chaos in the ESŚ model. The specific parameters used are given in Table I, Parameter Set 5. Compared to the typical, fractal in nature, Feigenbaum cascade in the ESD model (see Fig. 2), this cascade represents a much more abrupt transition into chaos, with a value of δ̂ much higher than the Feigenbaum constant [35].
FIG. 6.
Example of an abrupt period-doubling cascade in the ESŚ model. The fourth period-doubling bifurcation (R16) occurs much more abruptly relative to the previous one (R8).
Figures 7(a) and 7(b) show two more examples of non-Feigenbaum behavior produced by the ESŚ model. The specific parameters used in both cases are given in Table I. The period-doubling bifurcation curves, in both of these examples, are found to be distorted or twisted, resulting in values of δ̂ that can vary widely. The presence of the saddle-node bifurcation curves near these cascades is hypothesized to interfere locally with the limit cycles in phase space, preventing the neighboring limit cycles from bifurcating in a typical Feigenbaum manner. Saddle-node and Andronov-Hopf bifurcation curves also interact, producing the phenomena of zero-Hopf (ZH) and Neimark-Sacker (torus) bifurcations. Such phenomena were implicated in non-Feigenbaum behavior in previous works [39, 41].
FIG. 7.
(Color online) Zero-Hopf bifurcations found near the non-Feigenbaum critical behavior in the ESŚ model. The parameter sets for these examples can be found in Table I. The legend applies to both (a) and (b). Other plotting conventions are the same as in Fig. 4.
Appendix B. Transition to chaos through a saddle-node bifurcation (hysteresis phenomenon)
As shown in Fig. 5(d), the system can enter the chaotic region through a saddle-node bifurcation at low values of q. This is possible as a result of a hysteresis phenomenon. Consider the following scenario, with q kept constant at 0.2. As the parameter wee is increased from 17 to 19, the activity starts from a resting state (i.e., converges to a stable fixed point) and remains in the resting state despite the appearance of a limit cycle at the Andronov-Hopf curve (at wee ≈ 18.7). The system is bistable at this point. Then, as wee is increased further, the limit cycle undergoes the period-doubling cascade while the activity remains at rest. At wee ≈ 21 the stable fixed point collides with the saddle point and vanishes (saddle-node bifurcation), leaving the system monostable again, with the chaotic attractor as the only attractor in phase space. At this point the system transitions directly from a resting state to a chaotic state. A trace of such a transition can be seen in Fig. 8. Until time point 50, the activity is at rest. The saddle-node bifurcation occurs at time point 50 and the activity is chaotic after that. At that time point, the value of the parameter wee reaches 21. See also Table I for the rest of the parameters [same as Fig. 5(d)]. Note that this type of bifurcation, the bistable nature of the behavior, and the direct current shift that is present resemble the experimental signature of seizure-like event onset [59].
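This scenario can be illustrated, again under the assumptions of the earlier sketches, by ramping wee slowly in time so that it reaches 21 at time point 50 while q = 0.2; the resulting trace is then qualitatively comparable to Fig. 8, although the exact waveform depends on the true input-output function of Eq. (1).

```matlab
% Slow ramp of wee reaching 21 at t = 50, with q = 0.2 (nonideal divisive inhibition).
% F, w, and P are the illustrative definitions from the Models and Methods sketches.
q = 0.2;
wee_t = @(t) 17 + (21 - 17) * min(t, 50) / 50;        % wee(t), held at 21 after t = 50
rhs_ramp = @(t, x) [ -x(1) + (1 - x(1)) * F(wee_t(t)*x(1) + P.e, ...
                                            w.es*x(2) + (1 - q)*w.ed*x(3), q*w.ed*x(3));
                     -x(2) + (1 - x(2)) * F(w.se*x(1) + P.s, w.ss*x(2), 0);
                     -x(3) + (1 - x(3)) * F(w.de*x(1) + P.d, w.ds*x(2) + w.dd*x(3), 0) ];
[t, X] = ode45(rhs_ramp, [0 100], [0.05; 0.05; 0.05]);
plot(t, X(:, 1));   % at rest before the saddle-node bifurcation, chaotic afterwards
```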
FIG. 8.
Example of the activity entering chaos directly from the resting state. At time point 50, a saddle-node bifurcation occurs and the node on which the activity was resting vanishes. After that point the trajectory converges to the chaotic attractor.
Appendix C. Random search in parameter space for zero-Hopf bifurcations
A nonlinear optimization method was used to find zero-Hopf bifurcations starting from random points in the parameter space. In particular, the fminsearch function of MATLAB Release 2013b was used for the reported results. The search tries to find fixed points where one of the eigenvalues of the Jacobian matrix is zero, λ1 = 0, and the other two are purely imaginary conjugates, λ2,3 = ±iω [60]. The algorithm starts from a random point p0 in the parameter space and tries to solve the optimization problem min_p z(p), where the function z(p) is given by:
z(p) = \sum_{i=1}^{3} g_i^{2}(p) + \prod_{i=1}^{3} |\lambda_i|^{2} + \left( \sum_{i=1}^{3} \operatorname{Re}\lambda_i \right)^{2} + l \qquad (C1)
The set p is a set of parameters including the three variables E, S, and D and four connectivity weights w. The functions gi, i = {1, 2, 3}, are the right-hand sides of the ODEs in Eq. (2) in combination with the input-output function in Eq. (5). Minimizing these functions to 0 is equivalent to solving the system of the three nullcline equations and therefore to finding a fixed point in the phase space. The eigenvalues λi, i = {1, 2, 3}, are the eigenvalues of the Jacobian matrix of the system. Minimizing the product Πi |λi|² ensures that at least one of the eigenvalues is 0, which is the criterion for the saddle-node bifurcation. Given that one of the eigenvalues is 0, minimizing the sum of the real parts of the eigenvalues ensures that the other two eigenvalues are complex conjugates with zero real parts. This is the criterion for the Andronov-Hopf bifurcation. The penalty term l is a positive number only when the search algorithm diverges outside the valid parameter space to which the search is restricted. This number is proportional to the divergence from the valid parameter space; in all other cases l = 0.
The valid parameter space is enclosed in the range [0, 0.5] for each of the three variables E, S, and D and the range [0, 50] for each of the varying connectivity parameters w. The results shown in Fig. 5(b) are produced for two different cases, for either p = {E, S, D, wee, wed, wse, wds} or p = {E, S, D, wes, wss, wde, wdd}. Note that both combinations of the connectivity parameters involve all three nullclines of the system. Given that the parameter space is seven dimensional, for each q value the algorithm was run 10^7 times in order to achieve a reasonably comprehensive search.
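A sketch of this optimization is given below in MATLAB. The right-hand side is the illustrative ESD structure used in the earlier sketches, the Jacobian is obtained by finite differences, and the searched subset of weights is {wee, wed, wse, wds}; the cost mirrors the terms of Eq. (C1) as described above.

```matlab
function cost = zh_objective(p, w, P, F)
% Cost in the spirit of Eq. (C1): p = [E; S; D; wee; wed; wse; wds].
    x = p(1:3);
    w.ee = p(4); w.ed = p(5); w.se = p(6); w.ds = p(7);
    g = esd_rhs(x, w, P, F);                     % the three nullcline equations
    J = zeros(3); h = 1e-6;                      % numerical Jacobian at the candidate point
    for k = 1:3
        dx = zeros(3, 1); dx(k) = h;
        J(:, k) = (esd_rhs(x + dx, w, P, F) - g) / h;
    end
    lam = eig(J);
    % penalty proportional to any violation of the valid region
    % (variables in [0, 0.5], weights in [0, 50])
    l = sum(max(0, [x - 0.5; -x; p(4:7) - 50; -p(4:7)]));
    cost = sum(g.^2) + prod(abs(lam).^2) + sum(real(lam))^2 + l;
end

function dx = esd_rhs(x, w, P, F)
% Standalone version of the illustrative ESD right-hand side used in the earlier sketches.
    dx = [ -x(1) + (1 - x(1)) * F(w.ee*x(1) + P.e, w.es*x(2), w.ed*x(3));
           -x(2) + (1 - x(2)) * F(w.se*x(1) + P.s, w.ss*x(2), 0);
           -x(3) + (1 - x(3)) * F(w.de*x(1) + P.d, w.ds*x(2) + w.dd*x(3), 0) ];
end
```

A single run then amounts to drawing a random starting point, e.g., p0 = [0.5*rand(3,1); 50*rand(4,1)], calling fminsearch(@(p) zh_objective(p, w, P, F), p0), and accepting the result only if the final cost is numerically zero; repeating this from many random p0 approximates the search described in the text.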
Appendix D. Distribution of the zero-Hopf phenomena in the parameter space
This section reports some supporting results about the random search which was performed to find zero-Hopf bifurcation points for different values of the parameter q. As shown in Fig. 5(b), such points were only found for low values of q. When varying the parameters wes, wss, wde, and wdd (red dashed line), all the zero-Hopf points were confined to q ≤ 0.45. Figure 9(a) shows the distribution of these points across the four connectivity parameters. It is evident that all zero-Hopf points are found in a limited range of values below 50 for each parameter. The search was in fact limited to values below 50, but even if this limit were higher, it would not return more points. This is true assuming that all zero-Hopf points lie on a single continuous hypersurface in the parameter space.
FIG. 9.
Box plots showing the distribution of ZH points found in the parameter space for the four varying parameters (a) {wes, wss, wde, wdd} and (b) {wee, wed, wse, wds}. The horizontal line indicates the median value, the box edges indicate the 25th and 75th percentiles, and the whiskers extend to include the whole range of values. (a) As q increases, the ZH points are found in an increasingly limited range of values below 50. (b) In contrast, the values of the parameter wed increase as q increases and reach the upper limit of 50.
Figure 9(b) shows the distribution of the same points across the varying parameters wee, wed, wse, and wds [black solid line in Fig. 5(b)]. In this case it is apparent that, as the q value increases, the zero-Hopf points are found at increasingly higher values of wed. At q = 0.25, the ZH points are actually found near the arbitrarily chosen upper limit of 50 for wed. This suggests that our results might be biased. The other three parameters (wee, wse, and wds) are clearly limited to values lower than 50, so they do not raise any concerns. In order to check whether more ZH points can be found for higher values of q by varying these parameters, the valid ranges for the parameter values were expanded to [0, 200] for wed and [0, 100] for the other three parameters. The random search was run again and indeed a few more ZH points were found for q = 0.3 and q = 0.35. For these additional ZH points, wed takes very high values (in the range [150, 200]), whereas the other three parameters remain limited to values below 50 (data not shown). Apparently, the hypersurface that accommodates the ZH points collapses onto just one parameter, namely wed. This result suggests that it might be possible to produce zero-Hopf phenomena even for high values of q simply by increasing wed disproportionately while the other parameters remain at low values. It is also evident from the previously reported results [e.g., see Figs. 3 and 7(a)] that the chaotic region can indeed extend to high values of wed. But, as Fig. 10 shows, the chaotic attractor quickly flattens out as wed increases. The max-min amplitude along the dimension D decreases much faster than the max-min amplitude along the other two dimensions, making the attractor almost a two-dimensional object in phase space. This flattening of the attractor defeats its modeling purpose. Assuming that all the parameters have the same order of magnitude, ZH points cannot be found for high values of q.
FIG. 10.
Increasing wed and keeping everything else constant quickly flattens out the chaotic attractor. Ampl(X) denotes the max-min amplitude in dimension X. Note that q = 1 in this case. See also Table I for the rest of the parameter values.
In summary, based on these results, and assuming that all zero-Hopf points lie on a single continuous hypersurface and that all connectivity parameter values have the same order of magnitude, no zero-Hopf bifurcations can be found in a model with strong divisive inhibition (high q value), and therefore any abrupt period-doubling transition to chaos is prevented.
Footnotes
PACS numbers: 87.19.lj, 87.19.xm, 02.30.Oz, 05.45.-a
References
- [1].Mitchell SJ, Silver RA. Shunting inhibition modulates neuronal gain during synaptic excitation. Neuron. 2003;38:433–445. doi: 10.1016/s0896-6273(03)00200-9. [DOI] [PubMed] [Google Scholar]
- [2].Wilson NR, Runyan CA, Wang FL, Sur M. Division and subtraction by distinct cortical inhibitory networks in vivo. Nature. 2012;488:343–348. doi: 10.1038/nature11347. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [3].Silver RA. Neuronal arithmetic. Nature Reviews Neuroscience. 2010;11:474–489. doi: 10.1038/nrn2864. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [4].Pouille F, Watkinson O, Scanziani M, Trevelyan AJ. The contribution of synaptic location to inhibitory gain control in pyramidal cells. Physiol Rep. 2013;1:e00067. doi: 10.1002/phy2.67. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [5].Pfeffer CK, Xue M, He M, Huang ZJ, Scanziani M. Inhibition of inhibition in visual cortex: The logic of connections between molecularly distinct interneurons. Nature Neuroscience. 2013;16:1068–1076. doi: 10.1038/nn.3446. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [6].Wilson HR, Kim J. Dynamics of a divisive gain control in human vision. Vision Research. 1998;38:2735–2741. doi: 10.1016/s0042-6989(97)00328-3. [DOI] [PubMed] [Google Scholar]
- [7].Schwartz O, Simoncelli EP. Natural signal statistics and sensory gain control. Nature Neuroscience. 2001;4:819–825. doi: 10.1038/90526. [DOI] [PubMed] [Google Scholar]
- [8].Malmierca MS, Hernndez O, Rees A. Intercollicular commissural projections modulate neuronal responses in the inferior colliculus. European Journal of Neuroscience. 2005;21:2701–2710. doi: 10.1111/j.1460-9568.2005.04103.x. [DOI] [PubMed] [Google Scholar]
- [9].Trotter Y, Celebrini S. Gaze direction controls response gain in primary visual-cortex neurons. Nature. 1999;398:239–242. doi: 10.1038/18444. [DOI] [PubMed] [Google Scholar]
- [10].Hillyard SA, Vogel EK, Luck SJ. Sensory gain control (amplification) as a mechanism of selective attention: Electrophysiological and neuroimaging evidence. Philosophical Transactions of the Royal Society B: Biological Sciences. 1998;353:1257–1270. doi: 10.1098/rstb.1998.0281. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [11].Treue S, Trujillo J.C. Martinez. Feature-based attention influences motion processing gain in macaque visual cortex. Nature. 1999;399:575–579. doi: 10.1038/21176. [DOI] [PubMed] [Google Scholar]
- [12].Chance FS, Abbott LF. Divisive inhibition in recurrent networks. Network: Computation in Neural Systems. 2000;11:119–129. [PubMed] [Google Scholar]
- [13].Battaglia FP, Treves A. Stable and rapid recurrent processing in realistic autoassociative memories. Neural Computation. 1998;10:431–450. doi: 10.1162/089976698300017827. [DOI] [PubMed] [Google Scholar]
- [14] Tabak J, O’Donovan MJ, Rinzel J. Differential control of active and silent phases in relaxation models of neuronal rhythms. Journal of Computational Neuroscience. 2006;21:307–328. doi: 10.1007/s10827-006-8862-7.
- [15] Tabak J, Rinzel J, Bertram R. Quantifying the relative contributions of divisive and subtractive feedback to rhythm generation. PLoS Computational Biology. 2011;7. doi: 10.1371/journal.pcbi.1001124.
- [16] Uhlhaas PJ, Singer W. Neural synchrony in brain disorders: Relevance for cognitive dysfunctions and pathophysiology. Neuron. 2006;52:155–168. doi: 10.1016/j.neuron.2006.09.020.
- [17] Steriade M, Contreras D. Spike-wave complexes and fast components of cortically generated seizures. I. Role of neocortex and thalamus. Journal of Neurophysiology. 1998;80:1439–1455. doi: 10.1152/jn.1998.80.3.1439.
- [18] Bazhenov M, Timofeev I, Steriade M, Sejnowski TJ. Potassium model for slow (2-3 Hz) in vivo neocortical paroxysmal oscillations. Journal of Neurophysiology. 2004;92:1116–1132. doi: 10.1152/jn.00529.2003.
- [19] Timofeev I, Bazhenov M, Sejnowski T, Steriade M. Cortical hyperpolarization-activated depolarizing current takes part in the generation of focal paroxysmal activities. Proceedings of the National Academy of Sciences of the United States of America. 2002;99:9533–9537. doi: 10.1073/pnas.132259899.
- [20] Destexhe A, Contreras D, Steriade M. LTS cells in cerebral cortex and their role in generating spike-and-wave oscillations. Neurocomputing. 2001;38-40:555–563.
- [21] Wendling F, Bartolomei F, Bellanger JJ, Chauvel P. Epileptic fast activity can be explained by a model of impaired GABAergic dendritic inhibition. European Journal of Neuroscience. 2002;15:1499–1508. doi: 10.1046/j.1460-9568.2002.01985.x.
- [22] Lopes da Silva F, Blanes W, Kalitzin SN, Parra J, Suffczynski P, Velis DN. Epilepsies as dynamical diseases of brain systems: Basic models of the transition between normal and epileptic activity. Epilepsia. 2003;44:72–83. doi: 10.1111/j.0013-9580.2003.12005.x.
- [23] Suffczynski P, Kalitzin S, Lopes da Silva FH. Dynamics of non-convulsive epileptic phenomena modeled by a bistable neuronal network. Neuroscience. 2004;126:467–484. doi: 10.1016/j.neuroscience.2004.03.014.
- [24] Huang G, Zhang D, Meng J, Zhu X. Interactions between two neural populations: A mechanism of chaos and oscillation in neural mass model. Neurocomputing. 2011;74:1026–1034.
- [25] Freeman WJ. Simulation of chaotic EEG patterns with a dynamic model of the olfactory system. Biological Cybernetics. 1987;56:139–150. doi: 10.1007/BF00317988.
- [26] Babloyantz A, Destexhe A. Low-dimensional chaos in an instance of epilepsy. Proceedings of the National Academy of Sciences of the United States of America. 1986;83:3513–3517. doi: 10.1073/pnas.83.10.3513.
- [27] Deco G, Jirsa VK, Robinson PA, Breakspear M, Friston K. The dynamic brain: From spiking neurons to neural masses and cortical fields. PLoS Computational Biology. 2008;4. doi: 10.1371/journal.pcbi.1000092.
- [28] Montbrió E, Pazó D, Roxin A. Macroscopic description for networks of spiking neurons. Physical Review X. 2015;5. doi: 10.1103/PhysRevX.5.021028.
- [29] Tresser C, Coullet P, de Faria E. Period doubling. Scholarpedia. 2014;9:3958.
- [30] Ermentrout GB. Period doublings and possible chaos in neural models. SIAM Journal on Applied Mathematics. 1984;44:80–95.
- [31] Breakspear M, Roberts JA, Terry JR, Rodrigues S, Mahant N, Robinson PA. A unifying explanation of primary generalized seizures through nonlinear brain modeling and bifurcation analysis. Cerebral Cortex. 2006;16:1296–1313. doi: 10.1093/cercor/bhj072.
- [32] Ishizuka S, Hayashi H. Chaotic and phase-locked responses of the somatosensory cortex to a periodic medial lemniscus stimulation in the anesthetized rat. Brain Research. 1996;723:46–60. doi: 10.1016/0006-8993(96)00214-4.
- [33] Crevier DW, Meister M. Synchronous period-doubling in flicker vision of salamander and man. Journal of Neurophysiology. 1998;79:1869–1878. doi: 10.1152/jn.1998.79.4.1869.
- [34] Perez Velazquez JL, Cortez MA, Snead OC III, Wennberg R. Dynamical regimes underlying epileptiform events: Role of instabilities and bifurcations in brain activity. Physica D: Nonlinear Phenomena. 2003;186:205–220.
- [35] Feigenbaum MJ. Quantitative universality for a class of nonlinear transformations. Journal of Statistical Physics. 1978;19:25–52.
- [36] Libchaber A, Laroche C, Fauve S. Period doubling cascade in mercury, a quantitative measurement. Journal de Physique Lettres. 1982;43:211–216.
- [37] Testa J, Pérez J, Jeffries C. Evidence for universal chaotic behavior of a driven nonlinear oscillator. Physical Review Letters. 1982;48:714–717.
- [38] Makin VS, Makin RS, Vorobyev AYa, Guo C. Dissipative nanostructures and Feigenbaum’s universality in the “metal-high-power ultrashort-pulsed polarized radiation” nonequilibrium nonlinear dynamical system. Technical Physics Letters. 2008;34:387–390.
- [39] Kuznetsov AP, Kuznetsov SP, Sataev IR. A variety of period-doubling universality classes in multi-parameter analysis of transition to chaos. Physica D: Nonlinear Phenomena. 1997;109:91–112.
- [40] Kim S-Y, Lim W, Kim Y. Universal bicritical behavior of period doublings in unidirectionally coupled oscillators. Progress of Theoretical Physics. 2001;106:17–37.
- [41] Kuznetsov SP, Kuznetsov AP, Sataev IR. Multiparameter critical situations, universality and scaling in two-dimensional period-doubling maps. Journal of Statistical Physics. 2005;121:697–748.
- [42] Rodrigues S, Barton D, Szalai R, Benjamin O, Richardson MP, Terry JR. Transitions to spike-wave oscillations and epileptic dynamics in a human cortico-thalamic mean-field model. Journal of Computational Neuroscience. 2009;27:507–526. doi: 10.1007/s10827-009-0166-2.
- [43] Wilson HR, Cowan JD. Excitatory and inhibitory interactions in localized populations of model neurons. Biophysical Journal. 1972;12:1–24. doi: 10.1016/S0006-3495(72)86068-5.
- [44] Tsodyks MV, Skaggs WE, Sejnowski TJ, McNaughton BL. Paradoxical effects of external modulation of inhibitory interneurons. Journal of Neuroscience. 1997;17:4382–4388. doi: 10.1523/JNEUROSCI.17-11-04382.1997.
- [45] Deco G, Jirsa VK, McIntosh AR, Sporns O, Kötter R. Key role of coupling, delay, and noise in resting brain fluctuations. Proceedings of the National Academy of Sciences of the United States of America. 2009;106:10302–10307. doi: 10.1073/pnas.0901831106.
- [46] Wang Y, Goodfellow M, Taylor PN, Baier G. Phase space approach for modeling of epileptic dynamics. Physical Review E - Statistical, Nonlinear, and Soft Matter Physics. 2012;85. doi: 10.1103/PhysRevE.85.061918.
- [47] Borisyuk RM, Kirillov AB. Bifurcation analysis of a neural network model. Biological Cybernetics. 1992;66:319–325. doi: 10.1007/BF00203668.
- [48] Pinto DJ, Brumberg JC, Simons DJ, Ermentrout GB. A quantitative population model of whisker barrels: Re-examining the Wilson-Cowan equations. Journal of Computational Neuroscience. 1996;3:247–264. doi: 10.1007/BF00161134.
- [49] Monteiro LHA, Bussab MA, Chaui-Berlinck JG. Analytical results on a Wilson-Cowan neuronal network modified model. Journal of Theoretical Biology. 2002;219:83–91.
- [50] Sherman SM, Guillery RW. On the actions that one nerve cell can have on another: Distinguishing “drivers” from “modulators”. Proceedings of the National Academy of Sciences of the United States of America. 1998;95:7121–7126. doi: 10.1073/pnas.95.12.7121.
- [51] Cvitanovic P. Acta Physica Polonica A. 1984;65.
- [52] Strogatz SH. Nonlinear Dynamics and Chaos. 1994.
- [53] Dhooge A, Govaerts W, Kuznetsov YuA. MATCONT: A MATLAB package for numerical bifurcation analysis of ODEs. ACM Transactions on Mathematical Software. 2003;29:141–164.
- [54] Rössler OE. An equation for continuous chaos. Physics Letters A. 1976;57:397–398.
- [55] Sprott JC. Some simple chaotic flows. Physical Review E. 1994;50:R647–R650. doi: 10.1103/physreve.50.r647.
- [56] Genesio R, Tesi A. Harmonic balance methods for the analysis of chaotic dynamics in nonlinear systems. Automatica. 1992;28:531–548.
- [57] Gilmore R, Letellier C, Lefranc M. Chaos topology. Scholarpedia. 2008;3:4592.
- [58] Guckenheimer J, Kuznetsov YA. Fold-Hopf bifurcation. Scholarpedia. 2007;2:1855.
- [59] Jirsa VK, Stacey WC, Quilichini PP, Ivanov AI, Bernard C. On the nature of seizure dynamics. Brain. 2014;137:2210–2230. doi: 10.1093/brain/awu133.
- [60] Izhikevich EM. Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. 2006.
- [61] Vogels TP, Abbott LF. Signal propagation and logic gating in networks of integrate-and-fire neurons. Journal of Neuroscience. 2005;25:10786–10795. doi: 10.1523/JNEUROSCI.3508-05.2005.
- [62] Chance FS, Abbott LF, Reyes AD. Gain modulation from background synaptic input. Neuron. 2002;35:773–782. doi: 10.1016/s0896-6273(02)00820-6.
- [63] Fellous J-M, Rudolph M, Destexhe A, Sejnowski TJ. Synaptic background noise controls the input/output characteristics of single cells in an in vitro model of in vivo activity. Neuroscience. 2003;122:811–829. doi: 10.1016/j.neuroscience.2003.08.027.
- [64] Prescott SA, De Koninck Y. Gain control of firing rate by shunting inhibition: Roles of synaptic noise and dendritic saturation. Proceedings of the National Academy of Sciences of the United States of America. 2003;100:2076–2081. doi: 10.1073/pnas.0337591100.
- [65] Díaz J, Razeto-Barry P, Letelier J-C, Caprio J, Bacigalupo J. Amplitude modulation patterns of local field potentials reveal asynchronous neuronal populations. Journal of Neuroscience. 2007;27:9238–9245. doi: 10.1523/JNEUROSCI.4512-06.2007.
- [66] Hammond C, Bergman H, Brown P. Pathological synchronization in Parkinson’s disease: networks, models and treatments. Trends in Neurosciences. 2007;30:357–364. doi: 10.1016/j.tins.2007.05.004.
- [67] Trevelyan AJ, Bruns W, Mann EO, Crepel V, Scanziani M. The information content of physiological and epileptic brain activity. Journal of Physiology. 2013;591:799–805. doi: 10.1113/jphysiol.2012.240358.
- [68] Klimesch W. EEG alpha and theta oscillations reflect cognitive and memory performance: A review and analysis. Brain Research Reviews. 1999;29:169–195. doi: 10.1016/s0165-0173(98)00056-3.