PLOS Computational Biology. 2024 Feb 20;20(2):e1011886. doi: 10.1371/journal.pcbi.1011886

Intra-ripple frequency accommodation in an inhibitory network model for hippocampal ripple oscillations

Natalie Schieferstein 1,2,*, Tilo Schwalger 2,3, Benjamin Lindner 2,4, Richard Kempter 1,2,5,*
Editor: Jonathan David Touboul
PMCID: PMC10923461  PMID: 38377147

Abstract

Hippocampal ripple oscillations have been implicated in important cognitive functions such as memory consolidation and planning. Multiple computational models have been proposed to explain the emergence of ripple oscillations, relying either on excitation or inhibition as the main pacemaker. Nevertheless, the generating mechanism of ripples remains unclear. An interesting dynamical feature of experimentally measured ripples, which may advance model selection, is intra-ripple frequency accommodation (IFA): a decay of the instantaneous ripple frequency over the course of a ripple event. So far, only a feedback-based inhibition-first model, which relies on delayed inhibitory synaptic coupling, has been shown to reproduce IFA. Here we use an analytical mean-field approach and numerical simulations of a leaky integrate-and-fire spiking network to explain the mechanism of IFA. We develop a drift-based approximation for the oscillation dynamics of the population rate and the mean membrane potential of interneurons under strong excitatory drive and strong inhibitory coupling. For IFA, the speed at which the excitatory drive changes is critical. We demonstrate that IFA arises due to a speed-dependent hysteresis effect in the dynamics of the mean membrane potential, when the interneurons receive transient, sharp wave-associated excitation. We thus predict that the IFA asymmetry vanishes in the limit of slowly changing drive, but is otherwise a robust feature of the feedback-based inhibition-first ripple model.

Author summary

The hippocampus plays a central role in the acquisition and consolidation of explicit memories. During sleep or rest, hippocampal neurons replay recently acquired memories, while the neuronal network exhibits high-frequency oscillations, so-called ripples. To study the potential function of ripples, their generation mechanism needs to be clarified. Here we analyze a model network of inhibitory interneurons as a potential ripple pacemaker. We derive an analytical approximation of the ripple dynamics that explains why the ripple frequency tends to decay over the course of an event (intra-ripple frequency accommodation, IFA). Studying dynamical phenomena such as IFA advances model selection and enables an understanding of how rhythmic brain activity contributes to cognitive functions.

Introduction

Sharp wave-ripples (SPW-R) are a prominent rhythmic signature of neuronal activity in the hippocampus. Hippocampal SPW-Rs are embedded in a larger system of thalamo-cortical rhythms [1, 2] and have been associated with the replay of behaviorally relevant neuronal activity [3–5]. SPW-Rs may thus be important for cognitive functions such as memory consolidation [6–11] and planning [4, 12–14]. To probe their potential functional role, we first need to understand the generation mechanism of SPW-Rs.

SPW-Rs are brief (50–100 ms) periods of elevated, highly synchronized neuronal activity, which can be measured in the local field potential (LFP) of the hippocampus in vivo [15–17], as well as in vitro [18–22]. In CA1 the LFP sharp wave (SPW) is most prominent in layer stratum radiatum and is thought to reflect a current sink due to elevated excitatory synaptic transmission from the CA3 Schaffer collaterals. The LFP ripple oscillation (150–250 Hz in vivo, [16]; 210 ± 16 Hz in vitro, [19]) is strongest in stratum pyramidale of CA1 and is thought to reflect inhibitory synaptic currents, and potentially rhythmic excitatory action potentials [23, 24]. It is believed that the local CA1 network can generate ripples in isolation [19, 25, 26].

Various computational models have been put forward to explain ripple generation, relying either on excitation or inhibition as the main pacemaker (see Discussion for A note on mixed models). Excitation-first models assume that SPW and ripple oscillations are generated jointly by the sparsely connected pyramidal cell network, either via axonal gap junctions and antidromic spike propagation [27, 28] or via supralinear dendritic integration [29].

Inhibition-first models posit that the interneuron network (e.g. in CA1) produces ripples in response to transient excitatory input due to a SPW event, which is generated by a separate mechanism upstream (e.g. in CA3) [30–32]. Inhibition-first models can be further subdivided into two classes, which we will call here feedback-based and perturbation-based, hinting at the mechanism by which ripples emerge: In the feedback-based models [16, 33–40] the recurrent synaptic coupling between interneurons is strong such that for sufficiently strong excitatory drive the network undergoes a bifurcation and can, in theory, produce sustained ripple oscillations as long as the drive remains strong enough. The perturbation-based model [41], on the other hand, assumes weak coupling and cannot produce sustained oscillations. Here ripples emerge as a transient ringing effect in response to a perturbation in the external drive.

Experiments remain inconclusive as to which of the proposed mechanisms is the most plausible for ripple generation [17, 20, 42]. Previous analyses have focused on the average frequency and duration of ripple events as well as their dependency on pharmacological or optogenetic manipulation [17, 21, 43]. We propose that taking into account the transient dynamical features of spontaneous ripples can advance model selection. The instantaneous ripple frequency typically decays over the course of a ripple event. This intra-ripple frequency accommodation (IFA) has been observed in vivo and in vitro and across different animals, brain states, and measurements [9, 25, 37, 43–46]. It is therefore an interesting question which of the proposed ripple models can account for IFA, and under which assumptions.

So far, only the feedback-based inhibition-first ripple model has been shown to reproduce IFA in exemplary numerical simulations [37]. Here we use a theoretical mean-field approach and extensive numerical simulations to explain the mechanism of IFA, show that IFA is robust with respect to parameter variation, and predict that IFA depends on the time course of the external, SPW-associated drive. These insights about IFA can be used to distinguish between the feedback-based and the perturbation-based inhibition-first ripple mechanisms.

Results

The feedback-based inhibitory ripple model assumes that a network of synaptically coupled CA1 interneurons, such as parvalbumin-expressing (PV+) basket cells, acts as a delayed negative feedback loop and thus creates fast ripple oscillations when stimulated with excitatory, sharp wave-associated drive from CA3. This idea was first expressed by [16, 33], and has been formally studied in computational models of varying complexity [34–39]. Simulations of a biophysically detailed version of this model by [37] revealed that it can reproduce intra-ripple frequency accommodation (IFA) in response to time-dependent, sharp wave-like drive.

To explain the mechanism of IFA analytically, we first demonstrate that IFA is preserved in a spiking network model of reduced complexity (the network used in [34] with all-to-all coupling). We then turn to a mean-field approximation of the network dynamics, which enables us to study the ripple dynamics and IFA as a function of the time course of the excitatory sharp wave-like drive.

Ripples and IFA in a spiking neural network model

We model the CA1 interneuron network as a homogeneous network of N leaky integrate-and-fire (LIF) neurons with membrane time constant τm, capacitance C, and resting potential Eleak. The membrane potential vi of a unit i is given by the following stochastic differential equation:

\tau_m \dot{v}_i = -v_i + E_\mathrm{leak} + \frac{\tau_m}{C} I_\mathrm{ext}(t) - \frac{\tau_m J}{N} \sum_{j=1}^{N} \sum_{k} \delta(t - t_j^k - \Delta) + \sqrt{2\tau_m}\, \sigma_V\, \xi_i(t)    (1)

where the derivative with respect to time t is abbreviated by a dot: \dot{v}_i = dv_i/dt. Whenever the membrane potential crosses a spike threshold Vthr, a spike is emitted and the membrane potential is reset instantaneously to a reset potential Vreset. For simplicity there is no absolute refractory period. All interneurons receive the same external, excitatory drive Iext(t) and an independent Gaussian white noise input ξi(t), with zero mean 〈ξi(t)〉 = 0 and unit noise intensity 〈ξi(t)ξj(t′)〉 = δij δ(t − t′), scaled by the noise strength parameter σV. The network is fully connected via inhibitory pulse coupling of strength J and with a synaptic delay Δ (see Methods for details and default values of parameters).
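
A minimal Euler–Maruyama simulation of Eq (1) might look as follows in Python (a sketch, not the authors' code; parameter values follow Table 2, while the network size, constant drive, and simulation duration are illustrative choices):

import numpy as np

# Parameters (cf. Table 2); network size, drive, and duration are illustrative
N, tau_m, C = 1000, 10e-3, 100e-12                 # -, s, F
E_leak, V_thr, V_reset = -65e-3, -52e-3, -65e-3    # V
J, delay, sigma_V = 65e-3, 1.2e-3, 2.62e-3         # V, s, V
I_ext = 0.55e-9                                    # A, constant external drive
dt, T_sim = 0.05e-3, 0.2                           # time step and duration, s

steps = int(T_sim / dt)
d_steps = int(round(delay / dt))
rng = np.random.default_rng(0)
v = np.full(N, E_leak)                             # membrane potentials
spike_counts = np.zeros(steps, dtype=int)          # population spikes per bin

for t in range(steps):
    # delayed, all-to-all inhibitory feedback: each spike emitted d_steps ago
    # contributes an instantaneous jump of -J/N to every membrane potential
    n_delayed = spike_counts[t - d_steps] if t >= d_steps else 0
    drift = (-(v - E_leak) + (tau_m / C) * I_ext) * dt / tau_m
    noise = np.sqrt(2 * dt / tau_m) * sigma_V * rng.standard_normal(N)
    v += drift - (J / N) * n_delayed + noise
    spiked = v >= V_thr
    spike_counts[t] = spiked.sum()
    v[spiked] = V_reset                            # fire-and-reset, no refractoriness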

The empirical population activity rN(t) in a small time interval [t, t + Δt) is defined as the number of spikes nspk(t, t + Δt) emitted by the population (Methods, Eq (15)), divided by the size of the population and the time step Δt:

r_N(t) \equiv \frac{n_\mathrm{spk}(t, t+\Delta t)}{N\, \Delta t}.    (2)

For plotting purposes, rN is smoothed with a narrow Gaussian kernel (Methods, Eq (16)). In the following, we will illustrate that the key network dynamics in response to constant drive Iext and time-dependent, sharp wave-like drive Iext(t), as presented by [37], are preserved in this model, i.e., there are fast oscillations in the ripple range and there is IFA.
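
Continuing the sketch above, the binned population rate of Eq (2) and a smoothed version for plotting could be computed as follows (the kernel width below is an illustrative stand-in for the value given in Methods, Eq (16)):

import numpy as np
from scipy.ndimage import gaussian_filter1d

def population_rate(spike_counts, N, dt, sigma_smooth=0.5e-3):
    """Empirical population rate r_N (Eq (2)) plus a Gaussian-smoothed copy.

    spike_counts : spikes emitted by the whole population per time bin
    N, dt        : population size and bin width (s)
    sigma_smooth : smoothing kernel width in s (illustrative choice)
    """
    r_N = spike_counts / (N * dt)                        # rate in Hz
    r_N_smooth = gaussian_filter1d(r_N, sigma_smooth / dt)
    return r_N, r_N_smooth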

Dynamics for constant drive

The network dynamics for constant drive Iext are illustrated in Fig 1. At low Iext, the network is in a steady-state with units firing asynchronously and irregularly at an overall low rate funit (Fig 1A, left). As Iext increases, the network activity begins to exhibit coherent oscillations (Fig 1A, middle). In the following, we will refer to the dominant frequency of this population oscillation as the network frequency fnet (Methods, black triangle markers in Fig 1B). For a biologically reasonable set of parameters (Methods) the network frequency lies within the ripple range for a large range of external drives (Fig 1B, gray band). For 0.3 nA ≲ Iext ≲ 0.9 nA, the unit activity underlying the network ripple oscillation is sparse and irregular, i.e. the average unit firing rate funit is lower than the frequency fnet of the network oscillation (saturation s = funit/fnet < 1) and the coefficient of variation (CV) of interspike intervals is around 0.5 (Fig 1B, bottom panels). We hence refer to this state as sparse synchrony [37, 47].

Fig 1. Constant-drive dynamics of the spiking network.

Fig 1

(A) Four dynamical regimes depending on the external drive: asynchronous irregular state, sparse synchrony, full synchrony, multiple spikes (left to right, N = 10,000). Top: population rate rN, middle: raster plot showing spikes of 30 example units, and histogram of membrane potentials v (normalized as density, threshold and reset marked by horizontal dashed lines), bottom: power spectral density of the population rate. (B) Top: network frequency fnet (black) and mean unit firing rate funit (blue) for a range of constant external drives Iext. Grey band marks approximate ripple frequency range (140–220 Hz). Red markers indicate the critical input level Iext^crit (Hopf bifurcation) and the associated network and unit frequency, as resulting from linear stability analysis, see section Linear Stability Analysis in Methods, Eq (70), and [34]. Linestyle indicates network size (N ∈ {10^2, 10^3, 10^4}). Middle: saturation s = funit/fnet. Bottom: coefficient of variation of interspike intervals, averaged across units.

In the mean-field limit N → ∞, the transition to this oscillatory state is sharp (a supercritical Hopf bifurcation) and the transition point can be determined by a linear stability analysis of the stationary state [34]. In a simulated spiking network of finite size, the transition to the oscillatory state is not perfectly sharp because of inevitable fluctuations. Nevertheless, at the bifurcation point, both the network frequency and the unit firing rate are well predicted by the mean-field theory if the network is large enough (Fig 1B, red markers: Iext^crit = 0.19 nA with fnet = 305 Hz, funit = 16 Hz, s = 0.05).

For sufficiently strong drive, the network reaches a state of full synchrony (s ≈ 1), with units firing regularly and at the same average frequency as the population rhythm (Fig 1A, “full synch”). If the drive increases beyond this level, units spike several times per cycle (s > 1, Fig 1A, “multiple spikes”), which increases the CV of interspike intervals. Since the firing rates in that regime are too high to be biologically plausible, we focus in the following on the dynamical regime between the onset of oscillations (bifurcation point) and the point of full synchrony (Fig 1).

We note that in this reduced model (and also in [34]) the network frequency is a decreasing function of the external drive (Fig 1B). It is generally not straightforward to infer how the network frequency changes with the external drive Iext: On the one hand, individual neurons reach the spiking threshold faster when the drive is stronger. On the other hand, the resulting increase in spiking activity leads to a stronger delayed feedback inhibition that pushes all membrane potentials further away from the threshold. The network frequency thus results from a self-consistency condition. In fact, the shape of the network frequency curve can be different (increasing or even non-monotonic) depending on the details of the model network architecture (see also Discussion and Fig A in S2 Appendix). The mechanism of IFA that we demonstrate in the following is independent of the shape of the network frequency curve under constant drive.

Dynamics for time-dependent drive

During a sharp wave (SPW) event, which is thought here to be generated in CA3 (cf. models introduced by [30–32]), pyramidal cells in CA3 transiently increase their firing rates [43, 48]. Here we model the resulting feedforward input Iext(t) to the CA1 interneuron network in a simplified form: as a symmetric, piecewise linear double ramp (Fig 2A, bottom; see also Eq (17) in Methods, and Discussion). This SPW-like drive elicits a transient ripple event in the network (Fig 2A). We measure the instantaneous frequency of the population activity either by taking a windowed Fourier transform (wavelet spectrogram) and finding the peak in power in each time step, or by taking the inverse of the distances between consecutive peaks in the population activity, which provides frequencies for a discrete set of time points (Fig 2A, top, solid line vs white dots).
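
A sketch of the second, peak-to-peak estimator is given below (the wavelet-based estimator and its power threshold are specified in Methods; the minimal peak separation and height used here are hypothetical choices):

import numpy as np
from scipy.signal import find_peaks

def instantaneous_frequency(r_N_smooth, dt, f_min=70.0, min_height=None):
    """Discrete instantaneous-frequency estimates from peak-to-peak distances
    of the smoothed population rate (cf. white dots in Fig 2A).

    f_min      : discard estimates below this cutoff (Hz), cf. fmin in Fig 2A
    min_height : hypothetical minimal peak amplitude of the population rate
    """
    # enforce a minimal peak separation corresponding to ~400 Hz (illustrative)
    peaks, _ = find_peaks(r_N_smooth, height=min_height,
                          distance=max(1, int(1.0 / (400.0 * dt))))
    t_peaks = peaks * dt
    f_inst = 1.0 / np.diff(t_peaks)                     # one estimate per cycle, Hz
    t_mid = 0.5 * (t_peaks[1:] + t_peaks[:-1])          # time point of each estimate
    keep = f_inst >= f_min
    return t_mid[keep], f_inst[keep]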

Fig 2. Transient dynamics of the spiking network and IFA.

Fig 2

(A) Example simulation showing a transient ripple with IFA. From bottom to top: SPW-like drive (Eq (17)); histogram of membrane potentials (normalized), horizontal dotted lines: threshold and reset potential; raster plot showing spike times of 10 example units; population rate exhibiting transient ripple oscillation; wavelet spectrogram indicating instantaneous power (blue-yellow colorbar) for a frequency range of 0–400 Hz. Solid curve: continuous estimate of instantaneous frequency based on wavelet spectrogram with gray scale indicating maximal instantaneous power. Dotted line: cutoff frequency for peak detection fmin = 70 Hz. Red lines in scalebars: power threshold (see Methods). White dots: discrete estimate of instantaneous frequency based on peak-to-peak distance in population rate. Network size N = 10,000. (B) Quantification of IFA. Top: Grey dots: discrete instantaneous frequency estimates from 50 repetitions of the simulation shown in (A) with different noise realizations. Grey line: linear regression line with negative slope (χIFA = −3.04 Hz/ms) indicating IFA (see Methods, Eq (18)). Black line: asymptotic frequencies (cf. Fig 1B, top). Bottom: The same SPW-like drive was applied in all 50 simulations. Network size N = 10,000. (C) Dependency of IFA slope χIFA on the slope of the external drive for different network sizes (color coded). (D) Instantaneous (dots) vs asymptotic (black lines) network frequencies (top) for piecewise linear drives (bottom) of decreasing slopes (left to right). Color/linestyle indicates network size N. Thin, colored linear regression lines illustrate decreasing strength of IFA for shallower drive (regression slopes summarized in C). Asymptotic network frequencies are derived via interpolation of the constant-drive results shown in Fig 1B, top. The asymptotic frequencies fnet for N = 103 and N = 104 are nearly identical (dash-dotted and solid lines). Note that the drive is identical in panels A, B, and in the left panel of D.

There is no established definition of IFA. The central aspect of IFA is the decrease (accommodation) of the instantaneous frequency. Over the course of an experimentally measured ripple event (20–100 ms) the instantaneous ripple frequency typically decreases by about 10–60 Hz (rough visual estimate of peak-to-trough distances seen in [9, 25, 37, 43–46]), corresponding to linear slopes of about −0.1 to −3 Hz/ms. We observe such a decreasing trend of the instantaneous frequency in the model even though the external stimulus is perfectly symmetric (Fig 2A, top, see also [37]), i.e. the network exhibits intra-ripple frequency accommodation (IFA). We quantify IFA by linear regression over the discrete instantaneous frequency estimates (Fig 2A, white dots). This quantification of IFA is applied to results from many simulations with different noise realizations and the same SPW-like drive (Fig 2B; Methods, Eq (18)). A negative regression slope (here χIFA = −3.01 Hz/ms) indicates IFA.
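
The regression step of Eq (18) could then be sketched as follows, pooling the discrete frequency estimates over repetitions of the simulation (the exact event window used in the paper is defined in Methods):

import numpy as np
from scipy.stats import linregress

def ifa_slope(times, freqs):
    """Linear-regression estimate of the IFA slope chi_IFA in Hz/ms.

    times : pooled time points of the instantaneous-frequency estimates (s)
    freqs : corresponding instantaneous frequencies (Hz)
    """
    result = linregress(np.asarray(times) * 1e3, np.asarray(freqs))  # time in ms
    return result.slope                                  # negative slope indicates IFA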

In the model, the decrease in instantaneous frequency is not strictly linear over the entire event. In the central portion of the simulated ripple event, where power in the ripple band is large, the ripple frequency decreases monotonically. At the end of the event, however, we observe a small increase of the frequency, albeit with low power (the underlying peaks in the population rate are small) (Fig 2A and 2B top). Similar non-monotonic trajectories of the ripple frequency have been observed in experimentally measured ripples and have still been classified as exhibiting IFA [44].

The transient dynamics of the instantaneous frequency in this model can be understood best by comparing it to the asymptotic frequency that the network would settle into if the drive remained constant at the instantaneous value of the drive (Fig 2B, grey dots vs black line). Naturally, this asymptotic frequency follows the same symmetry as the external drive and thus provides a useful reference frame. For our double ramp drive (Methods, Eq (17)), we observe that during the plateau phase (i.e. constant drive) the instantaneous frequency quickly approaches the asymptotic frequency. However, during the rising phase of the drive the instantaneous frequency tends to be higher than the asymptotic reference. During the falling phase it is lower, thus creating the overall IFA asymmetry.

Varying the slope of the external double ramp drive we see that the IFA asymmetry is speed-dependent (Fig 2C and 2D): If the external drive changes more slowly (smaller slope), the network frequency response becomes more symmetric; for very small slopes the instantaneous frequencies approach the symmetric, asymptotic reference frequencies, and IFA vanishes.

In what follows, we aim at understanding the mechanism behind the observations illustrated for constant drive in Fig 1 and time-dependent drive in Fig 2. Further simulations indicate that the network dynamics varies only weakly when the network size is varied by two orders of magnitude (Figs 1B, 2C and 2D). We hence hypothesize that IFA is preserved in the mean-field limit of an infinitely large network, and will use a mean-field approach to explain the generating mechanism of IFA.

Gaussian-drift approximation of ripple dynamics in the mean-field limit

To facilitate notation in the mathematical analysis, we rescale all voltages to units of the distance between threshold and rest such that the new spiking threshold is at VT = 1 and the resting potential is at EL = 0. The single unit stochastic differential equation (previously Eq (1)) then reads

\tau_m \dot{V}_i = -V_i + I_E(t) - \frac{K \tau_m}{N} \sum_{j=1}^{N} \sum_{k} \delta(t - t_j^k - \Delta) + \sqrt{2 D \tau_m}\, \xi_i(t),    (3)

with rescaled external excitatory current IE, inhibitory synaptic strength K, and noise intensity D; see also section Dimensionless equations in the Methods, Eqs (20)–(22); there we provide details on the rescaling and values of default parameters.

In the mean-field limit N → ∞, the dynamics of the density of membrane potentials p(V, t) is described by the Fokker-Planck equation (FPE)

\tau_m\, \partial_t p(V,t) = -\partial_V \left[ (I(t) - V)\, p(V,t) \right] + D\, \partial_V^2\, p(V,t)    (4)
I(t) = I_E(t) - I_I(t) = I_E(t) - K \tau_m\, r(t - \Delta)    (5)
r(t) = -\frac{D}{\tau_m}\, \partial_V p(V,t) \big|_{V = V_T^-}    (6)

(see Methods, Eq (23) for details). The population rate r(t) defined in Eq (6) represents the mean-field limit of the population activity rN(t) in Eq (2) of the spiking neural network. Compared to the classical application of the FPE [49, 50] there are two essential differences: First, the FPE is nonlocal because of the resetting rule. This rule imposes an absorbing boundary condition at the threshold (Eq (23e)) and a source of probability at the reset point; the latter source can be imposed by a jump condition for the derivative of the density at the reset point (Eq (23g)) that matches the derivative at the threshold point, see e.g. [51–55]. Secondly, the FPE (4) is nonlinear because the current I(t) depends on the probability density p(V, t) through the population rate r(t) in Eq (6).

Stable stationary solutions of this nonlocal, nonlinear FPE correspond to asynchronous irregular spiking [34, 51]. It has been shown that an oscillatory (i.e. periodic) solution r(t) emerges via a supercritical Hopf bifurcation when the external drive IE exceeds a critical value [34, 51]. This network oscillation well reproduces the coherent stochastic oscillation of the population activity rN(t) in the finite-size spiking neural network (Fig 1). The network frequency at the onset of oscillations (i.e. at the Hopf bifurcation, where the stationary solution loses stability) is well predicted analytically by a linear stability analysis [34] (see also Methods, Eq (70) and Fig 1B, red markers). However, this analytical prediction quickly breaks down further away from the bifurcation where the oscillation dynamics becomes strongly nonlinear (see power spectral densities with higher harmonics in Fig 1A). In this regime neither an exact periodic solution of the FPE nor an approximation of its frequency is known. We will thus introduce a simplified approach that allows us to approximate the network frequencies at strong drive beyond the bifurcation.

Our approach can be motivated by two observations from the spiking network simulations: (a) In the relevant regime between sparse and full synchrony, units spike at most once per cycle of the population rhythm (funit ≤ fnet, Fig 1B). This property will allow us to approximate the time course of a population spike in rN(t) using a first-passage-time ansatz, neglecting the reset mechanism. (b) In between population spikes, the population rate rN(t) is close to zero, and the strong inhibitory feedback pushes the bulk of the membrane potential distribution significantly below threshold (Fig 1A). In those periods the absorbing boundary condition at threshold does not have any significant impact on the dynamics of p(V, t). Observation (b) is illustrated in more detail in Fig 3A, which shows the spiking network simulations from Fig 1 (Iext = 0.55 nA) with rescaled voltages.

Fig 3. Illustration of the Gaussian-drift approximation.

Fig 3

Comparison of oscillation dynamics in the spiking network simulation (A) and the Gaussian-drift approximation (B) at constant drive (IE = 4.24). (Ai) Spiking network simulation. Empirical population rate rN (top), and density of membrane potentials (bottom, dimensionless voltage), exhibiting coherent stochastic oscillations with (weak) finite size fluctuations. (Aii) The average cycle of the oscillation dynamics in Ai (computed for 21 bins of 0.24 ms each). Top: population rate; middle: density of membrane potentials, orange marker: local minimum of mean membrane potential; bottom: standard deviation of membrane potential distribution, dashed line: theoretical asymptote √D in the absence of boundary conditions (Eq (26)). (Aiii) Snapshots of the membrane potential density over the course of the average cycle shown in (Aii). (Bi) Gaussian-drift approximation (numerical integration of DDE (8) with reset condition). Top: population rate, bottom: mean membrane potential μ(t) (black line) with Gaussian density of membrane potentials p(V, t) painted on in the background for better comparison with Ai. (Bii) Zoom into one oscillation cycle (black bar in Bi). Top: population rate r, middle: density of membrane potentials p(V, t) with mean μ(t), bottom: constant standard deviation √D. The mean membrane potential μ(t) (black line) starts in each cycle at μmin (orange) and rises up until μmax (cyan) at time toff, at which point the population spike ends. In a phenomenological account for the single unit reset, μ is reset instantaneously to μreset (yellow). From there μ declines back towards μmin. (Biii) Snapshots of the membrane potential density p(V, t) over the course of one cycle (Bii). Colors mark t = 0 ∼ T (orange), and t = toff (cyan/yellow). Dotted lines in all voltage panels mark spike threshold VT = 1 and reset potential VR = 0. Note that in the theoretical approximation the spike threshold VT = 1 is no longer an absorbing boundary.

These two observations motivate a considerable simplification of the mean-field dynamics: Without the boundary conditions at threshold and reset, the FPE (4) can be solved analytically. In the long-time limit its solution becomes a simple Gaussian—independent of the initial condition (Methods, Eq (24)). We thus approximate the density of membrane potentials as a Gaussian

p(V,t) \approx \frac{1}{\sqrt{2 \pi D}} \exp\left[ -\frac{(V - \mu(t))^2}{2D} \right]    (7)

(see Fig 3B), and the only time-dependent quantity is the mean membrane potential μ(t), which evolves according to

\dot{\mu}(t) = \frac{1}{\tau_m} \left( I(t) - \mu(t) \right) \overset{(5)}{=} \frac{1}{\tau_m} \left( I_E(t) - K \tau_m\, r(t - \Delta) - \mu(t) \right)    (8a)

(see Fig 3Bi and 3Bii, black lines, cf. Methods Eq (25)). Because in the considered regime spikes are mainly driven by the mean input rather than membrane potential fluctuations, the population rate can be well approximated by the drift part of the probability current across the threshold, while diffusion-mediated spiking is ignored (Methods, Eqs (28) and (29); see also [56–59]):

r(t) = [\dot{\mu}(t)]_+\, p(V_T, t) = \frac{[\dot{\mu}(t)]_+}{\sqrt{2 \pi D}} \exp\left[ -\frac{(V_T - \mu(t))^2}{2D} \right].    (8b)

Thus, in our approximate dynamics without reset, the population rate r(t) is given by the membrane potential density at threshold, scaled by the speed μ˙ at which the mean membrane potential approaches the threshold. Whenever the mean membrane potential is decreasing (μ˙<0), the drift current at the threshold is negative, and hence, the rate is clipped to 0 ([x]+ ≔ (x + |x|)/2, see Fig 3Bi and 3Bii, top).

Because the single-neuron dynamics of integrate-and-fire neurons includes a reset mechanism after the spike-threshold has been reached, we add a phenomenological account for such a reset to our population-level description: At the end of a population spike (r ∝ [μ̇]+ = 0, μ̈ < 0) the mean membrane potential is pushed down (“reset”) by an amount that is proportional to the fraction of neurons that have participated in the population spike (Fig 3Biii, t = 3.62 ms; see Methods Eq (49) for details).

The two coupled Eqs (8a) and (8b) are equivalent to a single delay differential equation (DDE; Methods, Eq (31)). This DDE (including the reset condition) governs the dynamics of the mean of the Gaussian membrane potential distribution, and the resulting drift-based population rate. In the following, we will therefore refer to the DDE dynamics (with the phenomenological reset condition) as the Gaussian-drift approximation.
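
A minimal numerical integration of this Gaussian-drift approximation, i.e. Eqs (8a) and (8b) with delayed feedback and the phenomenological reset, might look as follows (a sketch; the reset factor below is a hypothetical stand-in for Eq (49), which is given in Methods, and the constant drive is the illustrative value of Fig 3):

import numpy as np
from scipy.stats import norm

# Dimensionless parameters (cf. Table 2); drive and reset factor are illustrative
tau_m, K, D, delta = 10.0, 5.0, 0.04, 1.2      # ms, -, -, ms
V_T, I_E = 1.0, 4.24                           # threshold; constant drive (cf. Fig 3)
dt, T_sim = 0.01, 60.0                         # ms
reset_factor = 1.0                             # hypothetical stand-in for Eq (49)

steps = int(T_sim / dt)
d_steps = int(round(delta / dt))
mu, mu_dot_prev = -1.0, 0.0                    # start well below threshold
r = np.zeros(steps)                            # population rate history (1/ms)
mu_trace = np.zeros(steps)

for t in range(steps):
    r_delayed = r[t - d_steps] if t >= d_steps else 0.0
    mu_dot = (I_E - K * tau_m * r_delayed - mu) / tau_m                # Eq (8a)
    r[t] = (max(mu_dot, 0.0) / np.sqrt(2 * np.pi * D)
            * np.exp(-(V_T - mu) ** 2 / (2 * D)))                      # Eq (8b)
    if mu_dot_prev > 0.0 >= mu_dot:            # end of a population spike (mu = mu_max)
        s = norm.sf(V_T, loc=mu, scale=np.sqrt(D))  # suprathreshold fraction (saturation)
        mu -= reset_factor * s                 # phenomenological population reset
    mu += mu_dot * dt
    mu_dot_prev = mu_dot
    mu_trace[t] = mu
# mu_trace and r now contain the oscillatory Gaussian-drift dynamics (cf. Fig 3B)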

The main differences between the spiking network model (or the exact mean-field dynamics given by the FPE (4)) and the Gaussian-drift approximation are illustrated in Fig 3 for the case of constant drive. In the spiking network model, the membrane potential density p(V, t) is not strictly Gaussian (Fig 3Aiii). Because of the fire-and-reset mechanism, p(V, t) changes in shape during the oscillation cycle, becoming at times even bimodal (Fig 3Aiii vs Fig 3Biii). Still, we see that our Gaussian assumption (b) is justified: Whenever the membrane potential density is subthreshold in between population spikes, it becomes more Gaussian, and its standard deviation approaches √D (Fig 3Aii, bottom, and Fig 3Aiii, first 3 snapshots).

The advantage of our Gaussian-drift approximation is that it reduces the FPE with complex boundary conditions to a simpler DDE with oscillatory solutions that can be studied analytically. In the following, we will first consider the dynamics for constant drive, then extend the Gaussian-drift approximation to understand the transient response to time-dependent drive, and hence explain the emergence of IFA.

Analysis of oscillation dynamics for constant drive

A T-periodic solution of the Gaussian-drift model, Eq (8), must have a mean membrane potential μ(t) = μ(t + T) that oscillates between two local extrema μmin and μmax (Fig 3Bi). Whenever the mean membrane potential increases, a positive population rate r is produced; when the mean membrane potential decreases the population rate is clipped to 0 (Eq (8b)). Let us consider a single cycle of this oscillatory dynamics (Fig 3Bii): The moment when μ reaches its local maximum μmax (Fig 3Bii, cyan circle) is of special importance as it marks the end of the population spike. We will refer to this time as toff:

\mu(t_\mathrm{off}) = \mu_\mathrm{max}, \qquad r(t_\mathrm{off}) = 0.

Since the inhibitory feedback −II(t) = −Kτm r(t − Δ) is proportional to the population rate with a delay of Δ (Eq (8a)), it follows that this feedback ceases exactly Δ after the end of the population spike, i.e. II(toff + Δ) = 0. It is convenient to define this time as the end of a cycle, i.e. the beginning of the next one. The mean membrane potential at this time is close to its local minimum (see Methods, Step 2) and will be denoted as μmin ≔ μ(toff + Δ). The period T can then be split into the upstroke time toff needed for the mean membrane potential to rise from μ(t = 0) = μmin towards μ(toff) = μmax, and the downstroke time Δ during which the mean membrane potential is pushed back down to μ(T) = μmin due to the delayed inhibitory feedback (Fig 3Bii). In Methods we derive, through a series of heuristic approximations, analytical expressions for the local extrema μmax (Eq (38)) and μmin (Eq (50)) of the mean membrane potential oscillation as a function of the external drive IE. Using these expressions, we obtain an analytical formula for toff (Eq (47)) and hence for the network frequency:

f_\mathrm{net} = T^{-1} = (t_\mathrm{off} + \Delta)^{-1}.    (9)

Apart from the network frequency, the Gaussian-drift approximation also allows an intuitive understanding of the mean unit firing rate. When the population spike ends at time toff, the suprathreshold portion of the Gaussian density corresponds to the fraction of units that have spiked in the given cycle (the saturation s, Methods, Eq (48)). The mean unit firing rate can thus be inferred as:

f_\mathrm{unit} = s\, f_\mathrm{net}.    (10)
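
As a rough numerical illustration, reading toff ≈ 3.6 ms off the example cycle in Fig 3 and using the default delay Δ = 1.2 ms (the saturation value s = 0.5 below is purely illustrative):

f_\mathrm{net} \approx (3.6\,\mathrm{ms} + 1.2\,\mathrm{ms})^{-1} \approx 208\,\mathrm{Hz}, \qquad f_\mathrm{unit} = s\, f_\mathrm{net} \approx 0.5 \times 208\,\mathrm{Hz} \approx 104\,\mathrm{Hz}.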

In the following, we compare our analytical Gaussian-drift approximations for the network frequency and mean membrane potential dynamics to the spiking network simulations for a range of external drives IE.

Performance evaluation of the Gaussian-drift approximation

Our theory captures the dependence of the network frequency fnet and the mean unit firing rate funit on the external drive IE, including the transition from sparse to full synchrony for increasing external drive (Fig 4, top). Our analytically derived expression for the saturation s predicts this transition (Fig 4, middle), since s is a monotonically increasing function of μmax (Eq (48)), which monotonically increases as a function of the external drive IE (Eq (38), Fig 4, bottom). We can also estimate analytically the point of full synchrony, i.e. the amount of external drive IE^full that is required for single units to fire approximately at the frequency of the network rhythm (s(IE^full) ≈ 1; Methods, Eq (51); Fig 4, vertical dashed line). The Gaussian-drift approximation slightly overestimates the point of full synchrony, but correctly predicts its parameter dependencies: For stronger coupling and/or larger noise, stronger external drive is required to reach full synchrony (Fig B in S1 Appendix).

Fig 4. Analytical approximation of the oscillation dynamics for constant drive.

Fig 4

Comparison of dynamics in theory (full lines) and spiking network simulation (dashed lines, N = 10,000). Top: Network frequency (black triangles) and mean unit frequency (blue circles). Red markers: Hopf bifurcation. Vertical lines indicate the range [IE^min, IE^max = IE^full] for which the theory applies (see Methods, Eq (52)). Middle: Saturation s increases monotonically with the drive (Eq (48)). Bottom: characterization of the underlying mean membrane potential dynamics via local maximum μmax (cyan, Eq (38)), local minimum μmin (orange, Eq (50)) and population reset μreset (yellow, Eq (49)). Default parameters (see Methods).

The theory shows that the amplitude μmaxμmin of the oscillatory mean membrane potential grows with increasing drive (Fig 4, bottom). This is mainly due to a strong decrease of the periodic minimum μmin (Fig 4, bottom, solid orange line), which we also observe in the spiking network simulation (Fig 4, bottom, dashed orange line): stronger external drive increases the amplitude of the population spikes (Fig 1A), resulting in stronger recurrent inhibitory feedback and thus a stronger hyperpolarization of the membrane potentials. The quantities μmax and μreset are pertinent to the Gaussian-drift approximation and have no direct counterpart in the spiking neural network model.

The range of applicability of our theory is defined by our two assumptions (see Methods for details, Eq (52)): (a) units should spike at most once per cycle (i.e. IE ≤ IE^full), (b) in between population spikes the bulk of the membrane potential distribution should be subthreshold (i.e. μmin(IE) + 3√D ≤ VT). The resulting range [IE^min, IE^max] for the external current IE covers the large part of the regime of sparse synchrony up to the point of full synchrony (Fig 4, here IE^max = IE^full). We confirmed with numerical simulations that for strong enough drive the Gaussian-drift approximation works for a wide parameter regime w.r.t. noise, coupling strength, and synaptic delay (S1 Appendix).

At low drive (IE < IE^min) this theory breaks down (see also Methods, section Numerical analysis of oscillation dynamics). This is to be expected from a purely drift-based approximation. The dynamics of the spiking network close to its supercritical Hopf bifurcation is largely fluctuation-driven. Such dynamics cannot be captured by focusing only on the oscillation of the mean membrane potential, which has vanishing amplitude as the drive approaches its critical value. This limitation of the theory does not pose a problem, since (a) the fluctuation-driven dynamics around the Hopf bifurcation has already been studied in depth by [34] and (b) our main goal here is to explain the IFA dynamics, which happens in the strongly mean-driven regime IE(t) ≫ IE^min.

So far we have studied the case of constant drive describing the asymptotic dynamics of sustained ripple oscillations observed in the long-time limit, after initial conditions have been forgotten. This will be emphasized from here on by adding a superscript “∞”. The simulations in Fig 2 already suggested that IFA emerges due to a deviation of the transient dynamics from these asymptotic dynamics. We will now use the established Gaussian-drift approximation to understand why. In a first step we will maintain the assumption of constant drive and study the transient dynamics that can be induced by perturbations of the initial condition (Fig 5A and 5B). With piecewise constant drive as a simplistic model of a SPW-like drive we then demonstrate the core mechanism of IFA: a hysteresis of the mean membrane potential dynamics (Fig 5B and 5C). We then extend the same formalism to piecewise linear drive, as a more realistic approximation of SPW-like drive, and in order to demonstrate that the IFA asymmetry is preserved even when the drive is fully symmetric (Fig 6). The Gaussian-drift approximation for linear drives allows us to study the dependence of IFA on the slope of the drive. We demonstrate that the hysteresis effect causing IFA is speed-dependent (Fig 7)—a novel, testable prediction of the feedback-based inhibition-first ripple model matching the simulation results from the spiking network (Fig 2D).

Fig 5. Transient dynamics and IFA for piecewise constant external drive.

Fig 5

(Ai, Aii) Dynamics under constant drive depending on the initial mean membrane potential. Top: Population rate. Bottom: Constant external drive (green) and mean membrane potential (gray) with initial value μmin (orange marker). Dotted horizontal lines mark spike threshold VT = 1 and reset potential VR = 0. (Aiii) Direct comparison of the first oscillation cycles in Ai/Aii (gray shaded area) with the asymptotic cycle dynamics. Orange and cyan horizontal lines mark the asymptotic values for μmin^∞(IE) and μmax^∞(IE), respectively. Left: shorter cycle for μmin > μmin^∞(IE). Middle: asymptotic period for μmin = μmin^∞(IE). Right: longer cycle for μmin < μmin^∞(IE). The color of the population rate curve (left, right) expresses the difference in cycle length as a difference in instantaneous frequency (colorbar in B). (B) Difference between the instantaneous frequency of a cycle with constant drive IE and initial condition μmin, and the asymptotic frequency fnet^∞(IE) for a range of external drives IE and initial mean membrane potentials μmin. Black line: asymptotic μmin^∞(IE) (cf. Fig 4, bottom, orange line). Markers indicate example cycles shown in C. Arrows indicate convergence to the asymptotic dynamics after one cycle. If the drive changes after each cycle (dotted lines), the seven examples lead to the trajectory shown in C. Cycles 2 and 6 are also shown in Aiii (left, right), together with their common asymptotic reference dynamics. (C) IFA for piecewise constant drive with symmetric step heights. Shaded areas mark oscillation cycles. Bottom: The external drive is increased step-wise, up to the point of full synchrony IE^full ≈ 8.9 (green staircase). As in A, lines in all panels indicate the asymptotic dynamics associated to the external drive of the respective cycle. Circular markers indicate transient behavior. Cyan: μmax. Orange: μmin. Reset not marked to enhance readability. Gray line: trajectory of the mean membrane potential. Middle: Population rate. Top: the instantaneous network frequency (markers) is first above and then below the respective asymptotic network frequencies (black line). Same colorbar as B. All quantities are derived analytically from the Gaussian-drift approximation. Vertical axis labeled “voltage, drive” in panels A and C applies to membrane potential and external drive.

Fig 6. Transient dynamics and IFA for piecewise linear external drive.

Fig 6

(A) Exemplary transient dynamics during rising vs falling phase of the external drive (“up” vs “down”), given fixed ÎE = IE(toff) = 5 (green dot). Bottom: external drive (green lines), and trajectories of mean membrane potential (gray lines) depending on initial conditions (orange dots). Dotted horizontal lines mark reference drive ÎE, spike threshold VT = 1, and reset potential VR = 0. Top: Population rate. Color (colorbar in B) indicates the difference in cycle length (shaded area) compared to the asymptotic reference (middle panel). Left: shorter cycle for m > 0 and initial μmin > μmin^∞(ÎE) (orange dot vs orange line). Middle: asymptotic period for constant drive (IE ≡ ÎE, m = 0) and initial μmin = μmin^∞(ÎE). Right: longer cycle for m < 0 and initial μmin < μmin^∞(ÎE). (B) Difference between instantaneous and asymptotic frequency for a range of reference drives ÎE and initial mean membrane potentials μmin. Left: linearly increasing drive with slope m = +0.4/ms. Right: linearly decreasing drive with slope m = −0.4/ms. Black line: asymptotic μmin^∞(ÎE) for constant drive. White line: initial membrane potential μmin for which fnet^inst = fnet^∞(ÎE). Stars mark the examples shown in A for ÎE = 5. Round markers and arrows indicate the trajectory shown in C for piecewise linear drive, numbered by cycle. White space where either: no asymptotic oscillations occur (IE < IE^min, μmin > VT − 3√D), or (bottom left): no transient solution exists (see Methods, Eq (64)). (C) IFA for symmetric, piecewise linear (SPW-like) drive. Shaded areas mark oscillation cycles. Bottom: The external drive is increased up to the point of full synchrony IE^full ≈ 8.9 (green line). Colored lines indicate asymptotic dynamics. Gray line: mean membrane potential trajectory μ(t). Markers quantify transient behavior. Cyan: μmax. Orange: μmin. Reset not marked to enhance readability. Middle: Population rate. Top: the instantaneous network frequency (markers) is first above, then below the resp. asymptotic network frequencies (black line). Same colorbar as B. Dashed lines: plateau phase of variable length with IE = IE^full, during which the network settles into the asymptotic dynamics. The IFA slope χIFA was derived for an assumed plateau length of 20 ms (as in Fig 2). All quantities are derived analytically. Vertical axis labeled “voltage, drive” in panels A and C applies to membrane potential and external drive.

Fig 7. IFA is speed-dependent.

Fig 7

Transient dynamics for SPW-like drive with slope m = ±0.4/ms (A, cf. Fig 6), m = ±0.2/ms (B), or m = ±0.1/ms (C). Top panels (i): difference between instantaneous and asymptotic network frequency for the possible combinations of external reference drive ÎE and initial mean membrane potential μmin, shown separately for positive (top) and negative (bottom) slope of the drive. Trajectories shown in (ii) are overlaid. Bottom panels (ii): Example trajectories through the space shown in (i) for consecutive cycles under SPW-like drive with slope ±m. Top: instantaneous (colored markers) vs asymptotic (black solid line) network frequency as predicted by the theory. Grey dots indicate instantaneous frequencies in spiking network simulations (cf. Fig 2D, N = 10,000). Black dashed line illustrates asymptotic network frequencies from spiking network simulations (cf. Fig 1B, N = 10,000). See Table 1 for a quantitative comparison of simulation and theory. Middle: population rate as predicted by the Gaussian-drift approximation. Bottom: instantaneous vs asymptotic μmin (orange) and μmax (cyan). Gray line: trajectory of mean membrane potential μ(t). Green lines show SPW-like external drive (Eq (17)), green dots mark reference drive ÎE of each cycle. The difference between instantaneous and asymptotic network frequencies (IFA) becomes less pronounced for smaller slope (C vs A, see also Table 1).

Analysis of oscillation dynamics for piecewise constant, sharp wave-like drive

Even for constant drive, there is transient dynamics if the initial mean membrane potential deviates from the asymptotic minimum μmin^∞. Let us assume that a cycle starts with an initial mean membrane potential μmin ≠ μmin^∞(IE). We only require that μmin be sufficiently subthreshold, such that the initial population rate is close to zero. What will be the period of the first cycle and how long does it take until the asymptotic dynamics is reached?

First we note that, independent of μmin, the mean membrane potential will rise towards the asymptotic μmax^∞(IE), which can be shown to be independent of the initial condition (Methods, Eq (38)). Thus, only the duration of the first upstroke will be influenced by the initial condition, and the asymptotic dynamics is reached immediately thereafter (Fig 5Ai and 5Aii).

The duration of the first upstroke depends on the distance that the mean membrane potential has to travel, from its initial value μmin to the next peak μmax^∞(IE). For μmin = μmin^∞(IE) the upstroke has length toff^∞(IE) and hence the length of the first cycle is equal to the asymptotic period T. Correspondingly, the instantaneous frequency is equal to the asymptotic network frequency, fnet^inst = fnet^∞ = 1/T (Fig 5Aiii, middle). The upstroke takes less time, if the mean membrane potential starts at a higher value (μmin > μmin^∞(IE), Fig 5Ai and 5Aiii, left), and more time otherwise (Fig 5Aii and 5Aiii, right). Hence the period of the first cycle will be either shorter or longer, which can be rephrased as an instantaneous frequency fnet^inst that is higher or lower than the asymptotic fnet^∞(IE). Fig 5B illustrates the instantaneous frequency of the first cycle for different combinations of (constant) drive IE and initial condition μmin (red: instantaneous frequency is higher than asymptotic frequency; blue: instantaneous frequency is lower).

Once the mean membrane potential has reached its first peak, it will follow the asymptotic dynamics, settling into μmin^∞(IE) at the end of the first cycle, and all subsequent cycles will come at the asymptotic frequency fnet^∞(IE) associated to the external drive IE (Fig 5Ai and 5Aii, convergence indicated by arrows in Fig 5B). A change of initial condition can thus only introduce a transient deviation from the asymptotic dynamics in a single cycle. In particular, this implies that stimulation with a square pulse cannot induce IFA in this model (see also Discussion and Fig A in S2 Appendix, panels a5–d5).

What if we change the external drive after each cycle (green line in Fig 5C)? Then the initial mean membrane potential of each cycle i will be the asymptotic minimum associated to the drive of the previous cycle:

\mu_\mathrm{min}^{\,i} = \mu_\mathrm{min}^{\infty}(I_E^{\,i-1}),

i.e. the mean membrane potential dynamics exhibits a history dependence (or hysteresis). Now recall that the asymptotic minimum μmin^∞(IE) is a monotonically decaying function of the drive (except for very strong drive close to IE^full, Fig 5B, black line). Thus, if the external drive increases stepwise, each cycle starts with an initial mean membrane potential above the asymptotic minimum associated to that cycle’s drive, hence the instantaneous frequency is above its asymptotic value (Fig 5B, trajectory through red area: fnet^(inst,i) > fnet^∞(IE^i) for all i). Vice versa, if the external drive decreases stepwise, each cycle starts with an initial mean membrane potential below the asymptotic minimum associated to that cycle’s drive, hence the instantaneous frequency is below its asymptotic value (Fig 5B, trajectory through blue area: fnet^(inst,i) < fnet^∞(IE^i) for all i). In summary:

I_E^{\,i} > I_E^{\,i-1} \;\Rightarrow\; \mu_\mathrm{min}^{\,i} = \mu_\mathrm{min}^{\infty}(I_E^{\,i-1}) > \mu_\mathrm{min}^{\infty}(I_E^{\,i}) \;\Rightarrow\; f_\mathrm{net}^{\mathrm{inst},i} > f_\mathrm{net}^{\infty}(I_E^{\,i})
I_E^{\,i} < I_E^{\,i-1} \;\Rightarrow\; \mu_\mathrm{min}^{\,i} = \mu_\mathrm{min}^{\infty}(I_E^{\,i-1}) < \mu_\mathrm{min}^{\infty}(I_E^{\,i}) \;\Rightarrow\; f_\mathrm{net}^{\mathrm{inst},i} < f_\mathrm{net}^{\infty}(I_E^{\,i})

Thus, if we approximate the transient change in drive during a sharp wave as a simple, piecewise constant function that first increases after each cycle, and then decreases (Fig 5C, green line), we observe IFA: The asymptotic network frequency associated to the drive in each cycle describes a reference curve that follows the same symmetry as the drive (Fig 5C, top, solid black line). However, the instantaneous network frequency (Fig 5C, top, round markers) is asymmetric over time, as it is above the asymptotic network frequencies during the rising phase of the external drive, and below during the falling phase. The theory thus reproduces the relationship between instantaneous and asymptotic frequencies that was already described for the spiking network simulations in Fig 2D.
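
The cycle-by-cycle bookkeeping behind this argument can be summarized in a short sketch; here mu_min_inf and t_off are placeholders for the analytical expressions of Eqs (50) and (47), which are given in Methods and not reproduced here:

def ripple_cycles(drive_per_cycle, mu_min_inf, t_off, delta=1.2):
    """Instantaneous vs asymptotic network frequency for piecewise constant drive.

    drive_per_cycle : sequence of drive values I_E^i, one per oscillation cycle
    mu_min_inf(IE)  : placeholder for the asymptotic minimum of Eq (50)
    t_off(IE, mu0)  : placeholder for the upstroke duration of Eq (47),
                      starting from the initial mean membrane potential mu0
    delta           : synaptic delay in ms
    """
    freqs = []
    mu_min = mu_min_inf(drive_per_cycle[0])              # start in the asymptotic state
    for IE in drive_per_cycle:
        f_inst = 1e3 / (t_off(IE, mu_min) + delta)       # Hz, instantaneous, Eq (9)
        f_asym = 1e3 / (t_off(IE, mu_min_inf(IE)) + delta)   # Hz, asymptotic reference
        freqs.append((f_inst, f_asym))
        mu_min = mu_min_inf(IE)                          # hysteresis: carried to next cycle
    return freqs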

The piecewise constant shape of the drive may not be realistic, but serves to illustrate the core mechanism of IFA: a hysteresis in the oscillation amplitude of the mean membrane potential. A drawback is that this simple model for SPW-like drive is not symmetric in time, since the drive changes after each cycle, and the cycle length increases due to IFA. To show that the IFA asymmetry does not rely on an asymmetry in the drive, we adapted the Gaussian-drift approximation to incorporate time-dependent linear drive.

Analysis of oscillation dynamics for piecewise linear, sharp wave-like drive

Following the same approach as before, we approximate the transient dynamics of the mean membrane potential in a cycle i with initial value μmin^i and a drive that changes linearly around a level ÎE^i with slope m:

(\hat{I}_E^{\,i}, \mu_\mathrm{min}^{\,i}, m) \;\mapsto\; (f_\mathrm{net}^{\mathrm{inst},i}, f_\mathrm{unit}^{\,i}, t_\mathrm{off}^{\,i}, \mu_\mathrm{max}^{\,i}, \mu_\mathrm{reset}^{\,i}, \mu_\mathrm{min}^{\,i+1})    (11)

The dynamics is quantified in terms of the peak of the mean membrane potential μmax^i, its reset value μreset^i, and the value μmin^(i+1) that is reached at the end of cycle i (and may thus be the initial membrane potential of the next cycle i + 1). Most importantly, the duration of the upstroke toff^i is inferred, and from that the instantaneous network frequency fnet^(inst,i) = (toff^i + Δ)^(-1) (see Methods, Eqs (56)–(66)). In agreement with our theoretical approximation, which is anchored to the end of the population spike, the reference drive ÎE^i is chosen such that ÎE^i = IE(toff^i), i.e.

I_E(t) = \hat{I}_E^{\,i} + m\, (t - t_\mathrm{off}^{\,i}).    (12)

The analysis is now a little more complex, since each cycle depends on three parameters (ÎE^i, μmin^i, m), in contrast to only two parameters (IE^i, μmin^i) for piecewise constant drive. We will demonstrate, however, that the essential findings from the basic case of piecewise constant drive still hold, i.e. that IFA is generated by the same hysteresis in the transient dynamics of the mean membrane potential that we have uncovered before (Fig 5 vs Fig 6). To illustrate this by an example, we fix the slope of the linear drive to m = ±0.4/ms. We then compare how the transient dynamics deviates from the asymptotic dynamics, depending on whether the drive is increasing (m = +0.4/ms) or decreasing (m = −0.4/ms) (Fig 6).

The principal insight from the case of constant drive still holds for the case of time-dependent drive, except for a small range of initial values μmin^i: If the initial mean membrane potential μmin^i is (sufficiently) larger than the asymptotic reference μmin^∞(ÎE^i), the period is shorter than the asymptotic reference T, hence fnet^(inst,i) > fnet^∞(ÎE^i) (Fig 6A, left; Fig 6B, red area). In contrast, if μmin^i is smaller than the asymptotic reference μmin^∞(ÎE^i), the period is longer than the asymptotic reference T, hence fnet^(inst,i) < fnet^∞(ÎE^i) (Fig 6A, right; Fig 6B, blue area). Exceptions from this “rule” (Fig 6B, area between black and white lines) occur because for time-dependent drive an asymptotic initial value μmin^i = μmin^∞(ÎE^i) (black line) no longer implies the asymptotic period T and frequency fnet^(inst,i) = fnet^∞(ÎE^i) (white line, see Methods for details). We note, however, that these exceptions from the rule occur only in a small portion of the state space that is rarely visited in a given ripple event, as we will see in the following.

What can we say about the dynamics of consecutive cycles i, i + 1, … that occur if the drive rises or falls continuously with slope m? At the end of each cycle the mean membrane potential is close to the asymptotic reference μmin^∞(ÎE^i) (Methods, Eq (66)). Thus we observe the same hysteresis as before: if the drive increases (m > 0, Fig 6B, left), trajectories of consecutive cycles will lie in the upper right half of the parameter space, where every cycle starts with an initial mean membrane potential that is higher than its asymptotic reference (μmin^i ≈ μmin^∞(ÎE^(i−1)) > μmin^∞(ÎE^i)), and hence has an instantaneous frequency that is (mostly) higher than the asymptotic reference (red color code). Vice versa, as the drive decreases (m < 0, Fig 6B, right), trajectories will lie in the lower left half (μmin^i ≈ μmin^∞(ÎE^(i−1)) < μmin^∞(ÎE^i)) where instantaneous frequencies are mostly lower than their asymptotic reference (blue color code). Hence even a symmetric, piecewise linear double ramp drive (Methods, Eq (17)) induces the IFA asymmetry (Fig 6C): During the rising phase of the drive the instantaneous frequencies are above the asymptotic reference, and during the falling phase they lie below (Fig 6C, top: markers vs black line; note cycle 5 as the only exception from the above “rule”). The IFA asymmetry thus does not rely on asymmetry in the input. Linear regression over the (semi-)analytically estimated instantaneous frequencies yields an IFA slope of −2.6 Hz/ms, which is close to the spiking network simulation (Fig 2B).

Interestingly, the last cycle i = 7 in Fig 6C has a reference drive for which the constant-drive theory no longer applies (ÎE^7 = 1.7 < 2.85 = IE^min), hence there is no asymptotic reference for the network frequency (empty marker, Fig 6C, top). This means that at the end of the sharp wave the network can sustain one more ripple cycle at a level of drive that in the beginning would be insufficient to trigger ripples (see cycle 1 in Fig 6C). The transient ripple is thus not only asymmetric in its instantaneous frequency (IFA), but also with respect to the level of drive at which it starts and ends.

We have established that IFA occurs in response to transient, sharp wave-like drive, independent of its symmetry, due to a hysteresis effect in the amplitude of the oscillatory mean membrane potential. Varying the slope m in our double ramp model for SPW-like drive (Methods, Eq (17)) we find that this hysteresis is speed-dependent (Fig 7): If the drive changes more slowly, the transient dynamics approaches the asymptotic dynamics, and the IFA asymmetry is reduced (χIFA → 0 for |m| → 0, Fig 7A–7C; Methods, Eqs (56)–(66)). The theory thus explains the speed-dependence of IFA that we already observed in the spiking network simulations (Fig 2D). The theoretically predicted instantaneous frequencies and IFA slopes are in excellent agreement with the corresponding observations in the spiking neural network simulation if the slope is sufficiently strong (Fig 7Aii and 7Bii, colored vs grey markers, average relative error ϵ: 11–13%, Table 1). The discrepancies between theory and simulation for slow drive (Fig 7Cii, colored vs grey markers) are mainly due to the discrepancies in the estimate of the asymptotic reference frequencies for constant drive (Fig 4).

Table 1. IFA in theory and simulation.

Quantification of the IFA slope χIFA in the spiking network simulations and the theoretical approximations shown in Fig 7Aii–7Cii for different slopes m of the external SPW-like drive. The error ϵ (Eq (67)) quantifies the average relative deviation of the theoretically predicted instantaneous network frequencies (colored markers in Fig 7) from the simulation results (grey dots in Fig 7).

slope of SPW-like drive [1/ms]          m = ±0.4    m = ±0.2    m = ±0.1
IFA slope χIFA (Hz/ms, simulation)      −3.04       −0.74       −0.29
IFA slope χIFA (Hz/ms, theory)          −2.60       −1.45       −0.51
error ϵ in instantaneous frequency      11%         13%         13%

The speed-dependence of IFA is an important prediction of the feedback-based inhibitory ripple model that can be tested in experiments: Optogenetic stimulation of PV+ basket cells can trigger ripple oscillations in the LFP [21] or induce ripple-modulated spiking activity [43]. Increasing and decreasing the intensity of the light pulse could mimic the piecewise linear, double ramp drive studied here. The model predicts that the IFA asymmetry is reduced if the optogenetic light stimulus changes more slowly.

Discussion

We studied the dynamics of hippocampal ripple oscillations in a feedback-based inhibition-first model. By reducing the model complexity we derived an analytical approximation of the mean-field dynamics in the regime of strong drive and strong coupling. For constant drive, our theory (i) yields an approximation of the asymptotic network frequencies and mean unit firing rates far beyond the Hopf bifurcation, (ii) captures the transition from sparse to full synchrony, and (iii) reveals an increase of the mean membrane potential oscillation amplitude for increasing levels of external drive. For a fast-changing, sharp wave-like drive we then show that a speed-dependent hysteresis effect in the trajectory of the mean membrane potential produces an IFA-like asymmetry of the instantaneous ripple frequency relative to the asymptotic frequencies. Our analytical approach presents a substantial advancement over previous, exemplary, numerical demonstrations of IFA [37] as it allows a mechanistic understanding of the phenomenon and its parameter dependencies. Our derivation shows that IFA is an intrinsic feature of the feedback-based inhibitory ripple model and that IFA occurs for any fast-enough sharp wave-like drive, independent of other parameter choices. The speed-dependence of IFA is a new prediction that can be tested experimentally.

Simplifying assumptions

To achieve an analytical treatment of the spiking network dynamics, we have made a number of simplifying assumptions. Our resulting reduced model with full recurrent connectivity is a special case of the network analyzed in [34] with sparse recurrent connectivity. We demonstrate in Figs 1 and 2 that the reduced model exhibits ripples and IFA similar to a biologically more realistic model [37] that includes random sparse connectivity, correlations in the background noise, synaptic filtering, absolute refractoriness, local pyramidal cells, and conductance-based synapses. Let us discuss these differences in more detail:

Pyramidal cells. Inhibition-first models posit that the interneuron network generates ripple oscillations and entrains the spiking of local CA1 pyramidal cells. Thus we neglect pyramidal cells in our reduced model—we regard them as a non-essential ingredient. We have no a priori reason to believe that the dynamics of IFA would be qualitatively altered by the presence of pyramidal cells: Numerical simulations of the feedback-based model including pyramidal cells have shown that (i) the inhibitory network can entrain the local pyramidal cell network, and (ii) IFA can occur in the spiking activity of both the excitatory and inhibitory population [37]. The influence of pyramidal cells is most relevant when it comes to determining the boundaries between the parameter spaces for ripple and gamma oscillations [37], or the relation between neuronal spiking activity and LFP [24, 38].

Absolute refractoriness. If an absolute refractory period is added to the reduced model in its default parameter setting (Table 2), the membrane potential distribution becomes bimodal and the network activity oscillates with alternating clusters of active neurons [60]—a state that is beyond the scope of our theory. Unimodality of the membrane potential distribution can easily be recovered by increasing the noise intensity. In the regime of sparse synchrony the network frequency is then virtually independent of the refractory period [60]. The refractory period only leads to a slight delay of the point of full synchrony, which also occurs at a lower network frequency due to the decrease in mean unit firing rate.

Table 2. Default parameters for the spiking network.

Parameter    Value      Definition
N            10,000     Number of interneurons
τm           10 ms      Membrane time constant
C            100 pF     Membrane capacitance
Eleak        −65 mV     Resting potential
Vthr         −52 mV     Spike threshold
Vreset       −65 mV     Reset potential
J            65 mV      Inhibitory coupling strength
Δ            1.2 ms     Synaptic delay
σV           2.62 mV    Standard deviation of Gaussian white noise input
VT           1          Spike threshold (dimensionless)
VR           0          Reset potential (dimensionless)
K            5          Inhibitory coupling strength (dimensionless)
D            0.04       Variance of Gaussian white noise input (dimensionless)

Conductance-based synapses. A noteworthy qualitative difference in the network dynamics of our reduced model compared to the model in [37] comes from the use of current-based instead of conductance-based synapses. In our network with current-based synapses, an increase in external drive, and thus recurrent feedback, leads to an increase of the hyperpolarization of the membrane potentials, and the achieved membrane voltages can be quite low (< −100 mV in Fig 1A; μmin in Fig 4). In contrast, in a network with conductance-based synapses, the decrease in μmin would still be monotonic, but bounded by the inhibitory reversal potential, and thus confined to a biologically more realistic range. Moreover, inhibitory feedback in a conductance-based model compresses the membrane potential distribution around the reversal potential. To accurately capture the resulting oscillation dynamics, it would thus be necessary to capture not only the first, but also the second moment of the moving membrane potential distribution. We developed such an extension of the Gaussian-drift approximation for a toy model of pulse-coupled oscillators with linear phase-response curve, which mimics the effect of an inhibitory reversal potential [60]. We find again that IFA is caused by a hysteresis effect, but now with respect to two variables: the mean and the variance of the membrane potential distribution. However, a direct analytical treatment of the conductance-based LIF network of [37] is difficult and may be an interesting subject for future investigations.

Short term plasticity and adaptation. There are many more biological complexities that are neglected not only in our reduced model, but also in the reference simulations by [37]. There is evidence that PV+ interneurons in CA3 exhibit short-term synaptic depression [61]. In principle, short-term depression or facilitation could easily be incorporated in our Gaussian-drift approximation by making the synaptic strength K cycle-dependent. Spike frequency adaptation would be another factor relevant for ripple dynamics. However, PV+ interneurons in CA1—some of the prime candidates for a potential inhibition-first ripple generator—do not exhibit strong adaptation [62].

Performance and limitations of the Gaussian-drift approximation

Our reduced network model allows a Gaussian-drift approximation of its oscillation dynamics that can be treated analytically. We added a new phenomenological account for the reset mechanism on the population level, which improves the accuracy and range of applicability of the Gaussian-drift approximation (Fig 9A vs 9B). We would like to stress that this phenomenological reset is not necessary to capture the most important qualitative features of the network, namely the transition from sparse to full synchrony for constant drive and the hysteresis effect leading to IFA for time-dependent drive.

The differences between the simulations of spiking units and the theory have been analyzed in detail for the case of constant drive (S1 Appendix): there was a good quantitative match for a large parameter range, and a slight decrease in performance towards high noise intensities. For a slow-enough time-dependent drive, any approximation error in the asymptotic frequencies at constant drive (Fig 7Cii, top, solid vs dashed line) also contributes to an error in the instantaneous frequencies (Fig 7Cii, top, colored vs gray markers). Furthermore, the trajectory of the instantaneous frequencies in the spiking network simulations is influenced by the non-Gaussian shape of the membrane potential distribution, which cannot be captured by the theory. An example is the “wiggle” (local maximum) in the simulated instantaneous network frequencies during the falling phase of the drive as seen e.g. in Fig 7Bii and 7Cii (top, gray markers): It reflects a single cycle of reduced period that occurs, with some phase jitter, in each of the 50 simulations pooled together in the plot. Inspection of the membrane potential distribution during this cycle revealed a significant reset-induced bimodality (S1 Fig). Such details of the transient dynamics cannot be captured by the theory, but are also not crucial to the IFA phenomenon.

Since our approximation is purely drift based, it is not well behaved in the limit of low drive (called “pathological oscillation” in Fig 8). In that sense our approach is truly complementary to previous linear (or weakly nonlinear) analyses around the bifurcation, which break down at strong drive [34]. A theory that can describe the dynamics of recurrent spiking networks both in the drift- and diffusion-dominated regimes, as well as capture the transitions in between, would be desirable. In the context of escape noise models, various approaches have been put forward to approximate the combined effect of drift- and diffusion-mediated spiking [56–59, 63–65]. The rate expression in our Gaussian-drift approximation (Eq (8b)) is equivalent to the drift-based hazard function derived by [56, 58].

Fig 8. Dynamical states of the DDE system.


Numerical integration of the DDE system (without phenomenological reset) demonstrating 4 distinct dynamical regimes for increasing external drive IE (bottom axis). Blue: Stable fixed point with zero rate. Red: Pathological, fast oscillations. Yellow: Period-2 oscillations. Green: Regular period-1 oscillations. Top: Numerically integrated trajectories of population rate r and mean membrane potential μ over time. Note the changes in scale for the population rate. Bottom: Phase space showing the trajectory (μ(t), r(t)) and the fixed point (IE, 0), which is only stable in the first case (black circle) and unstable otherwise (empty circle). The horizontal colored bar shows the full range of (relevant) drives from 0 to the point of full synchrony IEfull. The boundaries between the four dynamical regimes were approximated numerically. Black triangles mark the levels for which the above example dynamics are shown. Extra ticks indicate: the critical drive IEcrit ≈ 1.48, for which the spiking network undergoes a Hopf bifurcation (see section Linear Stability Analysis); the lower bound IEmin introduced for our theory to ensure that we only consider non-pathological dynamics (see paragraph Range of applicability).

Our approach for analyzing the population rate dynamics in response to strong voltage transients may also be relevant in other contexts, such as gamma oscillations [66, 67], or propagation of synchronous neural activity in synfire-chains [29, 68]. A hysteresis effect similar to IFA was recently observed in optogenetically evoked gamma oscillations in cat visual cortex [69]. It may be an interesting subject for future research to extend the present Gaussian-drift approximation to gamma rhythms in E-I networks.

The shape of the asymptotic network frequency curve as a function of the external drive

A hallmark of the feedback-based inhibition-first ripple model is its state of sparse synchrony, where, for a wide range of external drives, the collective population activity exhibits oscillations in the ripple range, while individual neurons fire sparsely and irregularly at lower firing rates (Fig 1). In this state the asymptotic network frequency is not necessarily proportional to the external, excitatory drive, as is the case in networks at full synchrony or in clustered states [67, 70, 71].

In our reduced model, the asymptotic network frequency decreases with increasing external drive (Fig 1B). In the vicinity of the Hopf bifurcation and in the limit of small synaptic delay such a frequency decrease was shown analytically in [34]. In biologically more realistic model networks with an absolute refractory period and/or conductance-based synaptic coupling, the shape of the asymptotic network frequency curve can be quite different (see Fig A in S2 Appendix, panels a1–d1). Qualitatively, the hysteresis mechanism of IFA that we have uncovered here is preserved in all network architectures (Fig A in S2 Appendix, panels 2–4).

The shape of IFA

Our theory predicts that instantaneous ripple frequencies for a changing drive are different from the asymptotic frequencies for constant drive: If the drive increases the instantaneous frequencies are higher than the asymptotic frequencies, and if the drive decreases the instantaneous frequencies are lower. For a symmetric time course of the drive, this asymmetry immediately implies IFA with a negative regression slope χIFA approximately equal to the regression slope over the deviations between instantaneous and asymptotic frequencies (see Methods, Eq (19)). For asymmetric drive, the IFA slope χIFA may be slightly stronger or weaker, depending on the covariance of the asymptotic reference frequencies with time (see Methods, Eq (19)). In general, the shape of the instantaneous ripple frequency curve over time thus depends both on the shape of the drive, and on the shape of the asymptotic frequency as a function of the external drive (Fig A in S2 Appendix). In our reduced model (Fig A in S2 Appendix, panel d) the asymptotic frequency decreases for increasing values of the constant drive. Thus it is possible for the instantaneous frequencies in the second half of the ripple event (when the external drive decreases) to lie below the asymptotic reference while actually increasing as a function of time (Figs 2D and 7Aii–7Cii, Fig A in S2 Appendix, panels d2, d3). Interestingly, such a switch from decreasing to increasing frequency at the end of the ripple event was also found in some experimental data [44]. Others have reported an almost monotonic decrease or even a small peak in the beginning of the event [25, 37, 43, 45]. In the reduced model, IFA gets stronger (more negative χIFA) if the peak of the drive is asymmetrically shifted to the right (Fig A in S2 Appendix, panel d4). If we incorporate into the model an absolute refractory period (Fig A in S2 Appendix, panel c) or conductance-based synaptic coupling (Fig A in S2 Appendix, panels a,b) the shape of the asymptotic frequency curve changes such that the strongest IFA is achieved with a left-shifted stimulus (Fig A in S2 Appendix, panels a3, b3). It is only for rare combinations of model architecture and stimulus profile that the described asymmetry between instantaneous and asymptotic network frequencies does not imply IFA in the sense of χIFA < 0: If a network with asymptotic frequencies that increase strictly monotonically with the drive is stimulated with a strongly right-shifted stimulus (Fig A in S2 Appendix, panel b4), the resulting ripple does not exhibit IFA. Our theory still correctly predicts this response: the instantaneous frequencies are first above, then below the asymptotic reference.

Modeling sharp wave-like drive to CA1 interneurons

Square pulse. One may argue that a square pulse is the simplest possible model for the sharp wave-associated momentary increase in drive to CA1 interneurons (SPW-like drive). Our analysis, however, showed that transients under constant drive due to perturbations of initial conditions are short-lived (∼one cycle in theory, Fig 5). This implies in particular that a square pulse cannot account for IFA (Fig A in S2 Appendix).

Note that there is a subtlety when comparing the simulation of the reduced model under a square pulse (panel d5 of Fig A in S2 Appendix) with the theoretical analysis for piecewise constant drive in Fig 5: in Fig 5 we analyze the network response to stepwise changes in the drive that occur when the network is already in an oscillatory state (i.e. μmin < VT − 3√D is sufficiently subthreshold, otherwise the drift-based theory does not apply). In that case, an upwards step in the drive always corresponds to the red regime of Fig 5B, where the instantaneous frequency is higher than the asymptotic frequency. In Fig A in S2 Appendix, on the other hand, a square pulse is delivered to the network in the non-oscillatory steady-state. The stationary mean membrane potential in that state hovers just below the threshold (i.e. μmin > VT − 3√D). If we apply the Gaussian-drift approximation nevertheless, this corresponds to a starting distribution of membrane potentials with a significant portion already above threshold. In that case a square pulse triggers a first population spike starting with a high, non-zero rate, that silences itself as soon as the inhibitory feedback starts (toff = Δ). Due to the strong inhibitory feedback this cycle can end with a mean membrane potential below the asymptotic μmin associated with the amplitude of the square pulse. The next cycle thus starts in the blue regime of Fig 5B and is longer than the asymptotic period, before the system settles into the asymptotic dynamics. This explains the slight tendency for positive IFA slopes χIFA (Fig A in S2 Appendix, panel d5).

Most importantly, however, the network settles into the asymptotic dynamics very quickly (|χIFA| ∼ 0), so stimulation with a square pulse cannot account for realistic IFA in the feedback-based inhibition-first model.

In experiments, an optogenetic square pulse stimulation of CA1 pyramidal cells has been found to elicit IFA in the local field potential [9, 43]. An extension of the present model, including pyramidal cells and a proxy for the LFP, would be needed to determine the compatibility of the feedback-based inhibition-first model with these findings. It is plausible that a square pulse stimulation of pyramidal cells yields a non-square, double ramp-like drive to locally connected interneurons (see the ramp-like evolution of the evoked firing rates of pyramidal cells in Fig 2E of [9]). As our analysis emphasizes, any such fast-changing double ramp drive to CA1 interneurons is likely to induce IFA. For a direct square-pulse stimulation of interneurons, on the other hand, the present reduced model predicts no IFA in the inhibitory population activity. This prediction could be tested in future experiments.

Double ramp. For most of the manuscript we used a double ramp drive as the simplest model of SPW-like drive that can account for IFA. We chose a symmetric double ramp to highlight that the asymmetric phenomenon of IFA does not depend on asymmetry in the input. Our derivation emphasizes that the only necessary requirement for IFA is an external drive that changes sufficiently fast and first rises, then decays. Thus we predict that IFA in the feedback-based model will occur for a wide range of such SPW-like external drives, even when their time courses are weakly asymmetric (Fig A in S2 Appendix, panels 2–4).

IFA in other ripple models

In the introduction we motivated IFA as a means of dissociating the present feedback-based model from the perturbation-based, inhibition-first model. Indeed, the perturbation-based inhibitory model [41] makes different predictions with respect to IFA: In the perturbation-based model, ripples can emerge only transiently, when a sudden increase in drive synchronizes the interneurons; furthermore, there is no sparse synchrony but all neurons fire in all ripple cycles until the neurons desynchronize: The network frequency during this transient ripple event reflects the momentary unit firing rate and is thus set directly by the strength of the external drive; a symmetric “up-down” drive thus creates a symmetric instantaneous frequency response that first rises and then decays (no IFA). Since the oscillation power in this perturbation-based model decays monotonically over the course of the ripple event, the initial phase of rising frequency even dominates over the subsequent decay (“anti-IFA”). The perturbation-based model thus predicts that IFA only occurs in response to asymmetric drive (e.g. a sudden step up, followed by a ramp down). Excitatory currents measured during spontaneous SPW-Rs can exhibit some asymmetry due to synaptic filtering [26], but generally have a non-zero rise time (CA3 PV+ BC: [21, 72]; CA1 pyramidal cell: [26, 37]). Alternatively, the perturbation-based model could account for IFA under a step-current input, by assuming strong spike frequency adaptation. However, experimental evidence suggests that PV+ interneurons—likely candidates for inhibition-first ripple generation in CA1 [73–78]—do not exhibit strong spike frequency adaptation [62]. This speaks more in favor of the feedback-based inhibitory model, which can account for IFA occurring independently of the exact (a)symmetry of the SPW-associated drive (Fig A in S2 Appendix), and without assuming high interneuron firing rates with strong spike frequency adaptation.

In both excitation-first ripple models, the ripple frequency depends on the average spike propagation delay among pyramidal cells—either orthodromically via supralinear dendrites [29, 79] or antidromically via axo-axonal gap junctions [27, 28]. These models may be able to account for IFA by assuming an increase in the spike propagation delay over the course of a ripple event. Such an increase in latency might occur due to increasing somatic depolarization which has been shown to decrease the action potential amplitude [80, 81]. Future work should investigate in more depth whether and under which conditions excitation-first models can generate IFA.

A note on mixed excitatory-inhibitory models for ripple generation

Computational models for ripple generation that combine excitatory and inhibitory populations (mixed, excitatory-inhibitory ‘E-I’ network models) are sometimes listed as a third class of ripple models in the literature [36, 38, 43]. Because CA1 contains both excitatory and inhibitory neurons, any “complete” in silico ripple model would need to incorporate both. In the present work, however, we use the term model to distinguish different generation mechanisms of ripple oscillations. To the best of our knowledge, there is no evidence that an E-I network model can produce oscillations in the ripple range in the absence of recurrent inhibitory coupling if biologically realistic parameters of delays and time constants are used. On the contrary, both theoretical [35] and numerical [37, 39] analyses suggest that the presence of a disynaptic E-I loop promotes network frequencies that are well below the ripple range, which is why E-I network models are typically used in the context of gamma oscillations [66, 82, 83]. We thus posit that, mechanistically, all currently known ripple models can be categorized into either inhibition-first or excitation-first (a potential special case is the model by [84] relying on an excitatory effect of GABAergic synaptic transmission). The respective other cell type can have a modulatory influence, and such modulations have been studied (interneurons in excitation-first models: [28, 29]; pyramidal cells in inhibition-first models: [35–41, 43]): Changes in the recruitment of CA1 interneurons (feedforward via CA3, or local via CA1) or in the balance of feedforward drive to CA1 interneurons and pyramidal cells have been suggested to mediate transitions between ripple and gamma network states [37, 39]. Furthermore, there is evidence that ripple phase-locked action potentials of pyramidal cells may increase the ripple power in the local field potential [24].

Measurement and quantification of ripple oscillations: Population activity or local field potential?

Understanding ripple oscillations requires not only understanding their generating mechanism but also the origin of the signal that is used to detect them. Typically, ripple oscillations are quantified using the local field potential (LFP). The LFP is traditionally believed to reflect synaptic currents [23, 85], but modeling studies have suggested that action potentials of pyramidal cells can also contribute when occurring locked to a ripple rhythm [24, 38]. In fact, the participation of pyramidal cells in the ripple rhythm may be crucial for the detectability of ripple power in the LFP [43].

In the present work, we used the population activity of interneurons as a proxy for the LFP signal. The relation between interneuron population activity and LFP in the feedback-based inhibition-first model was partially explored by [37] using numerical simulations: They found that for bandpass-filtered (120–300 Hz) average inhibitory currents as a simple model of the CA1 LFP, IFA in the inhibitory population activity translates into IFA in the LFP. Simulations of the feedback-based inhibition-first model including pyramidal cells [36–38] have shown that the inhibitory network can pace pyramidal cell spikes to occur phase-locked to the ripple rhythm. Furthermore, IFA can occur in the spiking activity of both excitatory and inhibitory populations [37]. It may therefore be reasonable to assume that even in a more complex model of the LFP, taking into account both inhibitory synaptic currents and excitatory action potentials, IFA in the inhibitory population activity may translate into IFA in the LFP.

Note that all proposed ripple models focus on explaining ripple generation in the spiking activity of the respective pacemaker population (excitatory or inhibitory). A detailed model of the CA1 LFP and additional numerical simulations will be needed to confirm, for each of the models, whether they yield a ripple signature in the LFP that is consistent with experimental data.

Conclusion

In conclusion: The feedback-based inhibition-first ripple model can account naturally for IFA without adding further parameter constraints. A deepened understanding of the transient ripple dynamics in each of the proposed models together with extensive experimental testing of the various predictions will hopefully advance our understanding of the generating mechanism of ripple oscillations and enable us to study their potential role in memory consolidation.

Methods

The following section on the Methods of this paper is written to be self-contained. Thus, some of the equations shown in the Results will be reprinted here in the Methods for better readability.

Spiking neural network simulations

Network architecture

To model ripples in a spiking neural network, we consider a fully-connected inhibitory network of noisy leaky integrate-and-fire (LIF) neurons. Each neuron’s membrane potential vi (for i = 1, …, N) is given by the following stochastic differential equation (SDE):

\tau_m \dot{v}_i = -v_i + E_{\mathrm{leak}} + \frac{\tau_m}{C} I_{\mathrm{ext}}(t) - \frac{\tau_m J}{N} \sum_{j=1}^{N} \sum_{k} \delta(t - \Delta - t_j^k) + \sqrt{2\tau_m}\, \sigma_V\, \xi_i(t); \quad (13)

see also Results, Eq (1). Whenever the membrane potential reaches a threshold of Vthr = −52 mV, a spike is recorded and the membrane potential is reset to Vreset = −65 mV. For simplicity there is no absolute refractory period. We choose a membrane time constant τm = 10 ms, resting membrane potential Eleak = −65 mV and membrane capacitance C = 100 pF as used by [37]. All default parameter values are summarized in Table 2.

In our model, Eq (13), each unit receives a common excitatory drive Iext(t) and a common inhibitory synaptic input in the form of a sum of presynaptic Dirac-delta spikes from all neurons in the network. Spikes occur at times {t_j^k}, j = 1, …, N, k = 1, 2, …, where t_j^k denotes the k-th spike time of neuron j. They are transmitted to all neurons in the network and arrive at the postsynaptic neurons after a synaptic delay of Δ = 1.2 ms. The amplitude of the inhibitory postsynaptic potential elicited by one input spike is determined by the synaptic strength J = 65 mV. Our default network has N = 10,000 units.

To account for noisy background activity, every unit receives independent Gaussian white noise ξi(t) with 〈ξi(t)〉 = 0, 〈ξi(t)ξj(t′)〉 = δ_{i,j} δ(t − t′). The strength of the noise is determined by the parameter σV = 2.62 mV, which can be interpreted as the long-time standard deviation of the membrane potential in the absence of a threshold.
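For readers who want to reproduce this setup, the following Brian2 sketch implements Eq (13) with the default parameters of Table 2. It is an illustrative reimplementation, not the authors' original script; in particular, the constant drive value I_ext_const and all helper names are our own choices.

```python
from brian2 import *

# Illustrative sketch of the fully connected inhibitory LIF network, Eq (13).
# Parameters follow Table 2; I_ext_const is an arbitrary example value.
defaultclock.dt = 0.01*ms
N = 10000                                 # all-to-all coupling is memory-heavy; reduce N for quick tests
tau_m = 10*ms; C_m = 100*pF
E_leak = -65*mV; V_thr = -52*mV; V_reset = -65*mV
J = 65*mV; Delta = 1.2*ms; sigma_V = 2.62*mV
I_ext_const = 300*pA                      # assumed constant drive (strong-drive regime)

noise_coeff = sigma_V * (2 / tau_m)**0.5  # scales the white noise term xi in Eq (13)
eqs = '''
dv/dt = (-(v - E_leak) + (tau_m/C_m)*I_ext_const)/tau_m + noise_coeff*xi : volt
'''
G = NeuronGroup(N, eqs, threshold='v > V_thr', reset='v = V_reset', method='euler')
G.v = 'V_reset + rand()*(V_thr - V_reset)'   # uniform initial membrane potentials

J_eff = J / N                                # IPSP amplitude per presynaptic spike
S = Synapses(G, G, on_pre='v_post -= J_eff')
S.connect()                                  # all-to-all, as in the sum over j in Eq (13)
S.delay = Delta

rate_mon = PopulationRateMonitor(G)
spike_mon = SpikeMonitor(G)
run(1*second)
```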

In simulations of the spiking network with a finite temporal resolution (Δt = 0.01 ms), we define the empirical population activity as

r_N(t) = \frac{n_{\mathrm{spk}}(t, t+\Delta t)}{N\, \Delta t} \quad (14)

(see also Results, Eq (2)), where

n_{\mathrm{spk}}(t, t+\Delta t) = \int_t^{t+\Delta t} \sum_{i=1}^{N} \sum_{k} \delta(s - t_i^k)\, ds \quad (15)

denotes the total number of spikes that were emitted from the population in the time interval [t, t + Δt]. The empirical population activity has units of spikes per second and can also be interpreted as the instantaneous firing rate of single neurons in our homogeneous network [86].

For plotting purposes the empirical population activity is smoothed with a Gaussian window g of standard deviation σt = 0.3 ms:

r_N^{\mathrm{plot}}(t) = (g * r_N)(t), \qquad g(t) = \frac{1}{\sqrt{2\pi}\, \sigma_t} \exp\left[-\frac{t^2}{2\sigma_t^2}\right]. \quad (16)

To facilitate notation we omit the superscript “plot” in all Figures.
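As a concrete illustration of Eqs (14)–(16), the following Python helper bins pooled spike times and smooths the result with a Gaussian window; the function and argument names are ours, not taken from the original code.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def population_rate(spike_times, N, t_max, dt=0.01e-3, sigma_t=0.3e-3):
    """Empirical population activity r_N(t), Eq (14), from pooled spike times (in s),
    and its Gaussian-smoothed version, Eq (16). Illustrative helper."""
    edges = np.arange(0.0, t_max + dt, dt)
    n_spk, _ = np.histogram(spike_times, bins=edges)    # spike count per bin, Eq (15)
    r = n_spk / (N * dt)                                 # spikes per second per neuron
    r_plot = gaussian_filter1d(r, sigma=sigma_t / dt)    # convolution with g, Eq (16)
    return edges[:-1], r, r_plot
```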

For constant drive, we simulate 5.05 s (time step Δt = 0.01 ms). The initial membrane potentials are drawn randomly from a uniform distribution between the reset and threshold potentials. The initial 50 ms are excluded from analysis, because we are not interested in the initial transient but in the asymptotic network dynamics. The remaining 5 s are sufficient for a basic spectral analysis.

For time-dependent drive, we first simulate the network for 200 ms with a constant baseline drive of IEcrit/2 (see section Linear stability analysis), followed by the time-dependent, sharp wave-like stimulus, which we model as a piecewise linear double ramp of slope ±m with a plateau phase of arbitrary length in between (here 20 ms):

I_{\mathrm{ext}}(t) = \begin{cases} I_{\mathrm{ext}}^{\mathrm{crit}}/2, & 0 \le t \le t_1 = 200\,\mathrm{ms} \\ I_{\mathrm{ext}}^{\mathrm{crit}}/2 + m\,(t - t_1), & t_1 < t \le t_2 = t_1 + \frac{I_{\mathrm{ext}}^{\mathrm{full}} - I_{\mathrm{ext}}^{\mathrm{crit}}/2}{m} \\ I_{\mathrm{ext}}^{\mathrm{full}}, & t_2 < t \le t_3 = t_2 + 20\,\mathrm{ms} \\ I_{\mathrm{ext}}^{\mathrm{full}} - m\,(t - t_3), & t_3 < t \le t_4 = t_3 + \frac{I_{\mathrm{ext}}^{\mathrm{full}} - I_{\mathrm{ext}}^{\mathrm{crit}}/2}{m} \end{cases} \quad (17)

During the plateau phase the drive is at the approximate point of full synchrony Iextfull for the respective spiking network, as determined by a range of constant-drive simulations.
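A minimal implementation of the double-ramp stimulus, Eq (17), could look as follows (times in ms; the behaviour after t4, where Eq (17) ends, is an assumption and simply returns to baseline):

```python
import numpy as np

def double_ramp_drive(t, m, I_crit, I_full, t1=200.0, plateau=20.0):
    """Piecewise linear SPW-like drive I_ext(t), Eq (17). t in ms, slope m per ms.
    Illustrative helper; after t4 the drive is held at baseline (an assumption)."""
    baseline = I_crit / 2.0
    ramp = (I_full - baseline) / m                # duration of each ramp
    t2, t3 = t1 + ramp, t1 + ramp + plateau
    t4 = t3 + ramp
    t = np.asarray(t, dtype=float)
    conds = [t <= t1,
             (t > t1) & (t <= t2),
             (t > t2) & (t <= t3),
             (t > t3) & (t <= t4)]
    funcs = [baseline,
             lambda t: baseline + m*(t - t1),
             I_full,
             lambda t: I_full - m*(t - t3),
             baseline]                            # default for t > t4
    return np.piecewise(t, conds, funcs)
```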

Simulations of the LIF network model have been performed using the Brian2 simulator [87]. Data storage and parallelization of simulations for large parameter explorations were done using the Python toolkit pypet [88].

Frequency analysis

The network frequency at constant drive is defined as the location of the dominant peak in the power spectral density of the population activity rN(t). The saturation (average fraction of neurons firing in one cycle of the population rhythm) is computed by dividing the mean unit firing rate by the network frequency.

To define the instantaneous network frequency in response to time-dependent drive, we use frequency estimates both in continuous time and for a discrete set of time points. The continuous estimate is derived from the wavelet spectrogram (windowed Fourier transform) of the population activity, which indicates instantaneous power in the frequency band from 0 to 350 Hz over time; the instantaneous frequency at each point in time is defined as the frequency above 70 Hz, that has maximal instantaneous power. The lower limit is introduced to exclude the low-frequency contribution due to the sharp wave. The instantaneous frequency is considered significant, whenever the corresponding instantaneous power exceeds a power threshold. The power threshold is chosen as the average instantaneous power at 0 Hz during the initial 200 ms baseline window, plus 4 standard deviations.

An alternative estimate of the instantaneous frequency is given by the inverse of the peak-to-peak distances in the (smoothed) population rate. We consider only peaks that are more than 4 standard deviations above the average rate during baseline stimulation. This procedure delivers a discrete set of frequency estimates associated with a discrete set of time points. In this paper we mostly rely on this “discrete” estimate of instantaneous frequency since its parameter-dependencies (minimal height of oscillation peaks) are more transparent than the ones of the continuous-time estimate (size of time window for windowed Fourier transform, power threshold). Furthermore it is better suited for comparison with the theory which also describes instantaneous frequency as a discrete measure per cycle (see Eq (65)).

To quantify the network’s instantaneous frequency response to a SPW-like drive (Eq (17)), we perform 50 independent simulations of the network model with the same drive but different noise realizations. Linear regression over the discrete instantaneous frequency estimates (t̂_i, f̂_i), i = 1, 2, …, from discrete oscillation cycles, pooled together from all 50 simulations, yields the average change of the instantaneous frequency over time:

f_{\mathrm{net}}^{\mathrm{inst}}(t) \approx \chi_{\mathrm{IFA}} \cdot t + \mathrm{const.}, \qquad \chi_{\mathrm{IFA}} = \frac{\mathrm{Cov}(\hat{f}, \hat{t})}{\mathrm{Var}(\hat{t})} \quad (18)

A negative slope χIFA < 0 indicates IFA.
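The discrete frequency estimate and the regression of Eq (18) can be sketched in a few lines of Python; in the paper the (t̂, f̂) pairs are pooled over 50 noise realizations before the fit, which this illustrative helper omits for brevity.

```python
import numpy as np
from scipy.signal import find_peaks

def ifa_slope(rate, dt, baseline_rate, n_std=4):
    """Peak-to-peak instantaneous frequencies of the smoothed population rate and the
    IFA regression slope chi_IFA, Eq (18). Sketch with simplified thresholding."""
    thr = baseline_rate.mean() + n_std * baseline_rate.std()
    peaks, _ = find_peaks(rate, height=thr)            # oscillation peaks above threshold
    t_peak = peaks * dt                                # peak times in s
    f_inst = 1.0 / np.diff(t_peak)                     # one frequency estimate per cycle (Hz)
    t_hat = 0.5 * (t_peak[1:] + t_peak[:-1])           # assign estimates to cycle midpoints
    chi_ifa = np.polyfit(t_hat, f_inst, 1)[0]          # regression slope, Eq (18)
    return t_hat, f_inst, chi_ifa
```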

Note that for our symmetric model of SPW-like drive (Eq (17)), the regression slope χIFA is approximately equal to the slope resulting from linear regression over the deviations of the instantaneous network frequencies f^i from the symmetric, asymptotic reference frequencies fnet(Iext(t^i)) (except for small effects of asymmetric sampling over time):

\chi_{\mathrm{IFA}} = \frac{\mathrm{Cov}\left[\hat{f} - f_{\mathrm{net}}(I_{\mathrm{ext}}(\hat{t})),\, \hat{t}\right]}{\mathrm{Var}(\hat{t})} + \underbrace{\frac{\mathrm{Cov}\left[f_{\mathrm{net}}(I_{\mathrm{ext}}(\hat{t})),\, \hat{t}\right]}{\mathrm{Var}(\hat{t})}}_{\approx\, 0} \approx \frac{\mathrm{Cov}\left[\hat{f} - f_{\mathrm{net}}(I_{\mathrm{ext}}(\hat{t})),\, \hat{t}\right]}{\mathrm{Var}(\hat{t})} \quad (19)

For asymmetric drive, the covariance Cov[fnet(Iext(t^)),t^] needs to be taken into account, when inferring the IFA slope χIFA from our analysis of the asymmetry between instantaneous and asymptotic frequencies.

The point of full synchrony

We estimate the point of full synchrony in the spiking network by interpolating the simulated saturation curve and estimating the level of drive for which it becomes 1.

Extracting the average oscillation cycle

For constant drive, we split the spiking network simulation into individual cycles based on the Hilbert transform of the mean membrane potential. We take a sufficient number of equally spaced samples from each cycle (here 21) and average them across cycles to derive the average trajectory of the population rate and the membrane potential histogram over the course of a ripple cycle. For each of the 21 sample times we can calculate the average membrane potential, which will be used for comparison with the theory.
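The cycle-splitting procedure can be sketched as follows; this is an illustrative helper (mu_t is the simulated mean membrane potential trace), not the original analysis code.

```python
import numpy as np
from scipy.signal import hilbert

def average_cycle(mu_t, n_samples=21):
    """Split a constant-drive trace into oscillation cycles via the Hilbert phase of the
    mean membrane potential and average n_samples equally spaced points per cycle."""
    phase = np.angle(hilbert(mu_t - mu_t.mean()))     # instantaneous phase in (-pi, pi]
    starts = np.where(np.diff(phase) < -np.pi)[0] + 1 # phase wraps mark cycle boundaries
    cycles = [mu_t[np.linspace(a, b - 1, n_samples).astype(int)]
              for a, b in zip(starts[:-1], starts[1:])]
    return np.mean(cycles, axis=0)                    # average trajectory over one cycle
```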

Mean-field approximation

Dimensionless equations

To facilitate notation in the theoretical part of this paper, we shift and rescale all voltages, such that the spiking threshold becomes VT = 1 and the resting potential is EL = 0. This corresponds to measuring voltage in units of the distance from the resting potential to the spike threshold. The single unit SDE then reads

\tau_m \dot{V}_i = -V_i + I(t) + \sqrt{2 D \tau_m}\, \xi_i(t) \quad (20)
I(t) = I_E(t) - I_I(t) \quad (21)

(see also Results Eq (3)), where now

V_i = \frac{v_i - E_{\mathrm{leak}}}{V_{\mathrm{thr}} - E_{\mathrm{leak}}}, \quad I_E(t) = \frac{\tau_m}{C\,(V_{\mathrm{thr}} - E_{\mathrm{leak}})}\, I_{\mathrm{ext}}(t), \quad I_I(t) = K \tau_m\, \frac{1}{N} \sum_{j=1}^{N} \sum_{k=1}^{n_j} \delta(t - \Delta - t_j^k), \quad K = \frac{J}{V_{\mathrm{thr}} - E_{\mathrm{leak}}}, \quad D = \left(\frac{\sigma_V}{V_{\mathrm{thr}} - E_{\mathrm{leak}}}\right)^2, \quad V_R = \frac{V_{\mathrm{reset}} - E_{\mathrm{leak}}}{V_{\mathrm{thr}} - E_{\mathrm{leak}}} \quad (22)

are all dimensionless quantities.

Fokker-Planck equation

In the mean-field limit of an infinitely large interneuron population (N → ∞) the evolution of the membrane potential density p(V, t) is described by the following Fokker-Planck equation (FPE) (see e.g. [34, 51]):

\partial_t p(V,t) = -\partial_V \left( \frac{1}{\tau_m} \left(I(t) - V\right) p(V,t) \right) + \frac{D}{\tau_m}\, \partial_V^2\, p(V,t) \quad (23a)
I(t) = I_E(t) - I_I(t) = I_E(t) - K \tau_m\, r(t - \Delta); \quad (23b)

see also Results Eqs (4)–(5). The FPE can also be written as a continuity equation

\partial_t p(V,t) = -\partial_V J(V,t) \quad (23c)

with a probability current

J(V,t) = \frac{1}{\tau_m} \left(I(t) - V\right) p(V,t) - \frac{D}{\tau_m}\, \partial_V p(V,t). \quad (23d)

Since units are reset instantaneously as soon as they reach the spiking threshold, the FPE has an absorbing boundary condition at threshold:

p(V_T, t) = 0. \quad (23e)

The population rate r is given by the probability current through the threshold:

r(t) = J(V_T, t) \overset{(23e)}{=} -\frac{D}{\tau_m}\, \partial_V p(V,t)\Big|_{V=V_T}; \quad (23f)

see also Results, Eq (6). The fire-and-reset mechanism introduces a derivative discontinuity at the reset potential VR:

\lim_{\epsilon \to 0} \left[ \partial_V p(V_R + \epsilon, t) - \partial_V p(V_R - \epsilon, t) \right] = -\frac{\tau_m}{D}\, r(t), \quad (23g)

see also [51–55]. The membrane potential density must decay to zero fast enough in the limit of V → −∞:

\lim_{V \to -\infty} p(V,t) = \lim_{V \to -\infty} \partial_V p(V,t) = 0. \quad (23h)

As a probability density, p(V, t) obeys the normalization condition:

\int_{-\infty}^{V_T} p(V,t)\, dV = 1 \quad \forall\, t. \quad (23i)

Derivation of the Gaussian-drift approximation

Without the absorbing boundary at threshold and the source term due to the fire-and-reset rule, the FPE has only natural boundary conditions and can be solved analytically. Its solution pδ for an initial Dirac delta distribution pδ(V, 0) = δ(V − μ0) is a Gaussian density with time-dependent mean μ(t) and variance σ²(t) [89]:

p_\delta(V,t) = \frac{1}{\sqrt{2\pi}\, \sigma(t)} \exp\left[-\frac{(V - \mu(t))^2}{2\sigma^2(t)}\right]. \quad (24)

The mean membrane potential μ evolves according to the single unit ODE (Eq (20)) without the noise term:

\dot{\mu}(t) = \frac{1}{\tau_m} \left(I(t) - \mu(t)\right), \qquad \mu(0) = \mu_0; \quad (25)

see also Results (8a). The variance of the membrane potential distribution, σ(t)2, approaches D with a time constant of τm/2:

\sigma^2(t) = D \left(1 - e^{-2t/\tau_m}\right) \xrightarrow{\; t \to \infty \;} D. \quad (26)

The solution for an arbitrary initial condition can be found by convolution of the initial condition with pδ. Hence in the long time limit (t → ∞) all solutions of the FPE with natural boundary conditions tend towards a Gaussian with variance D—independent of the initial condition (which has to satisfy the boundary conditions Eq (23)).

In our ripple-generating network, with strong drive and strong coupling, the bulk of the membrane potential distribution is strongly subthreshold in between population spikes (i.e. unaffected by the non-natural boundary conditions), and thus tends towards a Gaussian density. When the next population spike begins, we can hence approximate the membrane potential density as a Gaussian with fixed variance D:

p(V,t) \approx \frac{1}{\sqrt{2\pi D}} \exp\left[-\frac{(V - \mu(t))^2}{2D}\right]; \quad (27)

see also Results, Eq (7). This Gaussian approximation allows the derivation of a simple expression for the population rate under strong drive [56–58]:

r(t) = J(V_T, t) \overset{(23h)}{=} \int_{-\infty}^{V_T} \partial_V J(V,t)\, dV \overset{(23c)}{=} -\partial_t \int_{-\infty}^{V_T} p(V,t)\, dV \overset{(27)}{=} \dot{\mu}(t)\, p(V_T, t). \quad (28)

The rate is given by the value of the Gaussian density at threshold, scaled by the speed at which the mean membrane potential approaches the threshold. Since only upwards-threshold-crossings should contribute to the rate, we add a sign-dependence:

r(t) = \left[\dot{\mu}(t)\right]_+ p(V_T, t); \quad (29)

see also Results Eq (8b). The rate is clipped to 0 whenever the mean membrane potential decays: [μ̇(t)]+ ≡ max(0, μ̇(t)) [56–58].

Numerical analysis of oscillation dynamics

In its derivation above, we formulated the Gaussian-drift approximation in several equations, describing the membrane potential density p (Eq (27)), mean membrane potential μ (Eq (25)) and population rate r (Eq (29)) separately. Note however that the mean membrane potential μ is the only independent variable and thus the Gaussian-drift approximation can be rephrased as a single delay differential equation (DDE):

\dot{\mu}(t) \overset{(25)}{=} \frac{1}{\tau_m} \left(I(t) - \mu(t)\right) \overset{(23b)}{=} \frac{1}{\tau_m} \left(I_E - \tau_m K\, r(t - \Delta) - \mu(t)\right) \quad (30)
\overset{(27),(29)}{=} \frac{1}{\tau_m} \left(I_E - \tau_m K \left[\dot{\mu}(t - \Delta)\right]_+ \frac{1}{\sqrt{2\pi D}} \exp\left[-\frac{(V_T - \mu(t - \Delta))^2}{2D}\right] - \mu(t)\right) \quad (31)

We will nevertheless illustrate the solutions as two-dimensional trajectories of both μ(t) and r(t) (Fig 8), since the population rate is our main variable of interest. The DDE can be integrated numerically using a simple forward Euler method. We numerically explored the dynamics for initial conditions μ(t ≤ 0) ≪ VT. Effectively, we assume that the Gaussian membrane potential distribution is so far subthreshold that no spiking has occurred before time 0 (r(t) ∼ p(VT, t) ≈ 0 ∀ t ≤ 0). In that case, the initialization of μ̇(t ≤ 0), which only appears as a second factor in the rate expression Eq (29), is irrelevant as long as it is finite. For constant drive IE, we then find a range of potential dynamics (see Fig 8).
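A forward-Euler integration of Eq (31) for constant drive can be written in a few lines (dimensionless units, defaults from Table 2; mu0 is an assumed, strongly subthreshold initial condition). This is a sketch of the scheme described above, not the authors' original code.

```python
import numpy as np

def integrate_dde(I_E, T_sim=100.0, dt=0.01, tau_m=10.0, Delta=1.2,
                  K=5.0, D=0.04, V_T=1.0, mu0=-2.0):
    """Forward-Euler integration of the Gaussian-drift DDE, Eq (31), for constant drive.
    Times in ms; assumes mu(t<=0) = mu0 << V_T, i.e. no spiking before t = 0."""
    n, d = int(T_sim / dt), int(round(Delta / dt))
    mu, dmu, r = np.full(n, mu0), np.zeros(n), np.zeros(n)
    gauss = lambda mu_: np.exp(-(V_T - mu_)**2 / (2*D)) / np.sqrt(2*np.pi*D)  # p(V_T,t), Eq (27)
    for i in range(1, n):
        # delayed population rate r(t - Delta), Eqs (27) and (29); zero before t = 0
        r_del = max(dmu[i-1-d], 0.0) * gauss(mu[i-1-d]) if i > d else 0.0
        dmu[i-1] = (I_E - tau_m*K*r_del - mu[i-1]) / tau_m                    # Eq (30)
        mu[i] = mu[i-1] + dt * dmu[i-1]
        r[i-1] = max(dmu[i-1], 0.0) * gauss(mu[i-1])                          # rate in 1/ms
    return np.arange(n)*dt, mu, r

# Example: period-1 oscillations in the strong-drive regime (cf. Fig 8, green)
t, mu, r = integrate_dde(I_E=3.6)
```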

There is a large regime of sufficiently strong drive in which the solution μ exhibits persistent period-1 oscillations (Fig 8, green). This is the regime that we are interested in and the dynamics of which we will approximate in the following.

At lower levels of drive, there are three additional dynamical regimes that we will exclude from analysis: At very low drive, the system has a stable fixed point (μ˙=0) in (μ(t), r(t)) ≡ (IE, 0) (Eq (30), Fig 8, blue). The bifurcation at which the fixed point loses stability can be determined numerically (see Eq (38) for an analytical approximation). Immediately after the bifurcation, the DDE solution exhibits very fast oscillations at (2Δ)−1 ∼ 417 Hz (Fig 8, red). The oscillation amplitude of the mean membrane potential is very small, and a large portion of the Gaussian potential density is suprathreshold at all times. We refer to this oscillation as “pathological” since it is a direct result of the artificial clipping of the rate to 0 whenever the mean membrane potential decays (Eq (29)). Increasing the drive further brings the system into a state of period-2 oscillations where the Gaussian density gets pushed below threshold only every other cycle (Fig 8, yellow).

These regimes of pathological high-frequency or period-2 oscillations exist due to our simplifying assumptions capturing only the mean-driven aspects of the network dynamics. Both regimes occur either shortly before or after the level of drive IEcrit at which the original spiking network undergoes a supercritical Hopf bifurcation (see tick mark in Fig 8). In the vicinity of that bifurcation the spiking network dynamics are fluctuation-driven with either no (IE<IEcrit) or only small-amplitude (IE>IEcrit) oscillations in the mean membrane potential and population rate. These cannot be captured without taking into account the absorbing boundary at threshold and the non-Gaussian shape of the density of membrane potentials. We will exclude these pathological dynamics from analysis by introducing a lower bound IEmin for the theoretical approximations that we develop in the following (see tick mark in Fig 8).

Analytical approximation of oscillation dynamics for constant drive

For strong enough drive the mean membrane potential μ under the Gaussian-drift approximation oscillates periodically between two local extrema μmin and μmax (Fig 8, green regime, see also Fig 9A). The population rate r(t) oscillates at the same frequency and is positive when the mean membrane potential increases, i.e. μ˙(t)0, and 0 otherwise, c.f. Eq (29). The time when the mean membrane potential reaches its local maximum μmax marks the end of the population spike and will be denoted as toff. The inhibitory feedback (Eq (23b)) thus ends at time toff + Δ and we define μminμ(toff + Δ) as the end of a cycle. This allows us to approximate the overall period as T = toff(μmin, μmax) + Δ. Here, we wrote toff(μmin, μmax) to emphasize that toff is the rise time of the mean membrane potential from μmin to μmax (upstroke). Furthermore, Δ is the duration of the subsequent downstroke back to μmin. In the following, we will approximate μmax (Step 1) and μmin (Step 2) and thus derive the network frequency as the inverse of the period:

f_{\mathrm{net}} = T^{-1} = \left(t_{\mathrm{off}}(\mu_{\min}, \mu_{\max}) + \Delta\right)^{-1} \quad (32)

(see also Results, Eq (9)).

Fig 9. Illustration of analytical approximation of DDE dynamics.


(A) One cycle of the oscillatory solution of the DDE for IE = 3.6. Dotted lines for rate (top) and mean membrane potential (bottom) are the result of a numerical integration of the DDE (Eq (31)). All other lines illustrate our analytical considerations. Top: population rate r(t) (black). Bottom: mean membrane potential μ(t) (full grey line: Eq (A2) during upstroke, Eq (39) during downstroke, gray area: μ(t) ± 3√D indicating the width of the Gaussian density p(V, t)). Constant external drive IE (green line), total input I(t) = IE − II(t) (dashed line, Eq (42)). Local extrema of the mean membrane potential occur at the intersections of μ and I: Cyan: local maximum μmax (Eq (38)). Orange: approximate local minimum μmin (Eq (43)). Vertical dotted lines mark end of population spike toff as well as intervals toff + kΔ, k ∈ {−2, −1, 1}. Arrows illustrate simplifying assumption (A1). The beginning of the cycle (ton = 0) is determined by μmin. Horizontal gray bars mark the length of one cycle (here T = toff + Δ = 3.44 ms (Eq (47)), corresponding to a network frequency of fnet = 290.7 Hz, (Eq (32))). Inset: magnification highlighting the differences between numerical solution (dotted) and analytical approximation (full line). Due to assumption (A2) μmax is slightly overestimated. Note that the second intersection of μ and I occurs shortly before time toff + Δ. Hence μmin is slightly larger than the true local minimum. (B) Same as A, but with an account for the reset on the population level. At the end of the population spike, μ is reset instantaneously from μmax to μreset (Eq (49)) (yellow marker). This leads to a lower μmin (Eq (50)) and hence a slightly longer period (T = 4.24 ms), i.e. lower network frequency (fnet = 235.8 Hz). (C) Illustration of phenomenological reset. Cyan: density of membrane potentials p(V, toff) at the end of the population spike, centered at μmax (before reset). Cyan hatched area: fraction of active units (saturation, Eq (48)). Grey hatched area: resetting the active portion of p. Yellow: p(V, toff) after reset, centered at μreset. The reset value μreset is calculated as the average of the density that results from summing the grey-dashed (active units) and cyan-non-hatched (silent units) density portions (Eq (49)). Default parameters (see Table 2).

Step 1

To find the local maximum μmax = μ(toff) we have to solve

0 = \dot{\mu}(t_{\mathrm{off}}). \quad (33)

Since the dynamics of the mean membrane potential are given by a delay differential equation, the term on the right-hand side is recurrent:

\dot{\mu}(t_{\mathrm{off}}) \overset{(25)}{=} \frac{1}{\tau_m} \left(I(t_{\mathrm{off}}) - \mu(t_{\mathrm{off}})\right)

From the yet unknown time toff we have to look back in time (in windows with length of the delay Δ) at the history of the population rate r(t):

\overset{(23b)}{=} \frac{1}{\tau_m} \left(I_E - \tau_m K\, r(t_{\mathrm{off}} - \Delta) - \mu(t_{\mathrm{off}})\right)

which in turn depends on μ(t) and μ˙(t):

\overset{(29)}{=} \frac{1}{\tau_m} \left(I_E - \tau_m K \left[\dot{\mu}(t_{\mathrm{off}} - \Delta)\right]_+ p(V_T, t_{\mathrm{off}} - \Delta) - \mu(t_{\mathrm{off}})\right) \overset{(25),(23b)}{=} \frac{1}{\tau_m} \left(I_E - K \left[I_E - \tau_m K\, r(t_{\mathrm{off}} - 2\Delta) - \mu(t_{\mathrm{off}} - \Delta)\right]_+ p(V_T, t_{\mathrm{off}} - \Delta) - \mu(t_{\mathrm{off}})\right) = \dots \quad (34)

We can resolve the recurrence since there is only a finite number of past time windows, during which the population rate, and thus the delayed feedback inhibition II(t) = Kτm r(t − Δ), is significantly above 0. In the first time window [toff − Δ, toff], right before the end of the upstroke, we have to take inhibition into account, since this is what stops the upstroke. In the second time window [toff − 2Δ, toff − Δ], further into the past, we will assume that the inhibitory feedback is negligible, i.e.

I(t) \approx I_E \quad \forall\; 0 \le t \le t_{\mathrm{off}} - \Delta.

This assumption implies that the population rate was negligible in the previous time window, i.e.

r(t) \approx 0 \quad \forall\; 0 \le t \le t_{\mathrm{off}} - 2\Delta \quad (\mathrm{A1})

(Eq (23b), Fig 9A). Note that t here refers to time since the beginning of the cycle (t = 0). Since the population spike ends at time toff, (A1) is equivalent to the assumption that the population spike lasts at most 2Δ. Adding the subsequent downstroke time of Δ this amounts to an upper bound for the oscillation period of around 3Δ, plus any additional upstroke time with r ≈ 0, which is a reasonable assumption for a feedback loop with delay Δ as argued already by [34].

Under this assumption, we set r(toff − 2Δ) = 0 in Eq (34), and only a finite number of terms remains:

0 = \dot{\mu}(t_{\mathrm{off}}) \overset{(\mathrm{A1})}{\approx} \frac{1}{\tau_m} \left[I_E - \mu(t_{\mathrm{off}}) - K \left[I_E - \mu(t_{\mathrm{off}} - \Delta)\right] p(V_T, t_{\mathrm{off}} - \Delta)\right]. \quad (35)

Note that the rectification [⋅]+ has been dropped in this equation because the mean membrane potential can never be larger than the external drive. Furthermore, we assume that we can approximate the past μ(toff − Δ) during the upstroke by only considering the excitatory drive, I(t) ≈ IE. Under this assumption, Eq (25) yields the exponential relaxation

\mu(t) \approx I_E - (I_E - \mu_{\min})\, e^{-t/\tau_m}, \qquad 0 \le t \le t_{\mathrm{off}} \quad (36)

Looking backwards from time toff, we can reformulate this assumption as:

\mu(t_{\mathrm{off}} - x) \approx I_E - (I_E - \mu_{\max})\, e^{x/\tau_m}, \qquad x \in [0, t_{\mathrm{off}}] \quad (\mathrm{A2})

The resulting error is small since the exponential relaxation of the mean membrane potential towards the total drive I(t) is governed by the membrane time constant (τm = 10 ms). The population spike on the other hand is quite synchronized (we assumed that it lasts less than 2Δ ≪ τm, (A1)). The time window right before toff, during which the units receive inhibitory feedback and the total drive deviates from IE, is thus small compared to the membrane time constant and alters the trajectory of μ only slightly (Fig 9A, inset).

Eq (35) thus simplifies to

0 \approx \frac{1}{\tau_m} \left(I_E - \mu_{\max} - K (I_E - \mu_{\max})\, e^{\Delta/\tau_m}\, p(V_T, t_{\mathrm{off}} - \Delta)\right) \;\Leftrightarrow\; 0 \overset{(27)}{\approx} 1 - K e^{\Delta/\tau_m}\, \frac{1}{\sqrt{2\pi D}} \exp\left[-\frac{\left(V_T - I_E - (\mu_{\max} - I_E)\, e^{\Delta/\tau_m}\right)^2}{2D}\right] \quad (37)

The simplified equation can be readily solved for μmax:

\mu_{\max} \approx I_E - e^{-\Delta/\tau_m} \left[I_E - V_T + \sqrt{2D \ln\left[\frac{K}{\sqrt{2\pi D}}\, e^{\Delta/\tau_m}\right]}\right]. \quad (38)

To ensure that the argument of the logarithm in Eq (38) is larger than 1, the coupling must be sufficiently strong: K ≥ √(2πD) e^{−Δ/τm}. For weaker coupling the inhibitory feedback is never strong enough to counteract the external drive, hence no local maximum μmax exists. The existence of a lower bound on synaptic strength for the emergence of oscillations is in line with previous analyses [34]. Furthermore, there is also a lower bound for the drive IE: if the drive is too weak, i.e. IE < VT − √(2D ln[K e^{Δ/τm}/√(2πD)]), the term inside the square brackets in Eq (38) is negative and we find μmax > IE. But then μmax cannot be reached; instead, the mean membrane potential simply settles into its fixed point IE (Fig 8, blue dynamics). Eq (38) thus yields an analytical approximation of the bifurcation point where the DDE fixed point loses stability and oscillations emerge (numerical, Fig 8, blue to red: IE ∼ 0.61; analytical, Eq (38): IE ∼ 0.56, for default parameters, Table 2). For strong enough coupling and a strong enough external drive a well-defined local maximum μmax < IE exists. Note that μmax does not depend on the initial voltage μmin. This makes sense intuitively for any μmin that is sufficiently smaller than μmax, since the “turning point” of the mean membrane potential depends only on the feedback inhibition resulting from the immediate history shortly before μmax is reached (t ∈ [toff − 2Δ, toff]).

Step 2

We will now approximate the trajectory of the mean membrane potential during its inhibition-induced downstroke and infer μmin = μ(toff + Δ), which corresponds to both the end and the starting value of each cycle of a periodic solution.

Note that while μmin is close to the periodic local minimum of the mean membrane potential, it is a slight overestimation: The local extrema of the mean membrane potential μ occur at its intersections with the total drive I(t) (see Eq (33), Fig 9A). At time toff the mean membrane potential reaches its local maximum and becomes larger than the total drive I(t) (Fig 9A). Since the population spike ends at time toff, the delayed inhibitory feedback II(t) will stop at time toff + Δ: The total drive at this point will equal the external drive (I(toff + Δ) = IE); note that the mean membrane potential μ can never be larger than the external drive IE. Hence, if μ becomes larger than I at time toff and is smaller than I at time toff + Δ, μ must intersect with I(t) and reach its local minimum slightly before time toff + Δ (see Fig 9A, inset). What we define as the initial/final membrane potential of a cycle (μminμ(toff + Δ)) is thus close to but slightly larger than the periodic local minimum. This does not affect our estimate of the period.

The definition of a fixed downstroke duration Δ allows us to find μmin directly by integrating the mean membrane potential ODE Eq (25) up to time toff + Δ, starting from the initial value μ(toff) = μmax:

\mu(t) = \mu_{\max}\, e^{-(t - t_{\mathrm{off}})/\tau_m} + \frac{1}{\tau_m} \int_{t_{\mathrm{off}}}^{t} I(s)\, e^{-(t - s)/\tau_m}\, ds \qquad \text{for } t \ge t_{\mathrm{off}}. \quad (39)

It follows for μmin = μ(toff + Δ) with initial condition μ(toff) = μmax:

\mu_{\min} = \mu_{\max}\, e^{-\Delta/\tau_m} + \frac{1}{\tau_m} \int_0^{\Delta} I(t_{\mathrm{off}} + \tau)\, e^{-(\Delta - \tau)/\tau_m}\, d\tau \quad (40)

Again, the total current I(toff + τ) with τ ∈ [0, Δ], has a recurrent dependency on the past dynamics, which can be seen by repeated application of Eqs (23b), (29), (25):

I(t_{\mathrm{off}} + \tau) = I_E - K \Big[ I_E - K \big[ I(t_{\mathrm{off}} - (2\Delta - \tau)) - \mu(t_{\mathrm{off}} - (2\Delta - \tau)) \big]\, p(V_T, t_{\mathrm{off}} - (2\Delta - \tau)) - \mu(t_{\mathrm{off}} - (\Delta - \tau)) \Big]\, p(V_T, t_{\mathrm{off}} - (\Delta - \tau))

As before we assume that the current I at time toff − (2Δ − τ) ≤ toff − Δ is given exclusively by the excitatory drive (A1), which truncates the infinite recurrent expression above to:

I(t_{\mathrm{off}} + \tau) \overset{(\mathrm{A1})}{\approx} I_E - K \Big[ I_E - K \big[ I_E - \mu(t_{\mathrm{off}} - (2\Delta - \tau)) \big]\, p(V_T, t_{\mathrm{off}} - (2\Delta - \tau)) - \mu(t_{\mathrm{off}} - (\Delta - \tau)) \Big]\, p(V_T, t_{\mathrm{off}} - (\Delta - \tau)) \quad (41)

We are left with a finite number of terms that depend on the trajectory of μ(t) during the upstroke. Again, we approximate μ(toffx), x ≥ 0 assuming exponential relaxation towards only the external drive, i.e. ignoring inhibition (A2). This abolishes all dependencies on the yet unknown time toff of the end of the population spike:

I(t_{\mathrm{off}} + \tau) \overset{(\mathrm{A2})}{\approx} I_E - K (I_E - \mu_{\max})\, e^{(\Delta - \tau)/\tau_m}\, p(V_T, t_{\mathrm{off}} - (\Delta - \tau)) + K^2 (I_E - \mu_{\max})\, e^{(2\Delta - \tau)/\tau_m}\, p(V_T, t_{\mathrm{off}} - (2\Delta - \tau))\, p(V_T, t_{\mathrm{off}} - (\Delta - \tau)) \quad (42)

Inserting this approximation (Eq (42)) into the expression for μmin (Eq (40)) yields:

\mu_{\min} \approx \mu_{\max}\, e^{-\Delta/\tau_m} + \frac{1}{\tau_m} e^{-\Delta/\tau_m} I_E \int_0^{\Delta} e^{\tau/\tau_m}\, d\tau - \frac{1}{\tau_m} K (I_E - \mu_{\max}) \int_0^{\Delta} p(V_T, t_{\mathrm{off}} - (\Delta - \tau))\, d\tau + \frac{1}{\tau_m} K^2 (I_E - \mu_{\max})\, e^{\Delta/\tau_m} \int_0^{\Delta} p(V_T, t_{\mathrm{off}} - (2\Delta - \tau))\, p(V_T, t_{\mathrm{off}} - (\Delta - \tau))\, d\tau

The integrals can be solved analytically, if we approximate the past trajectory of μ linearly within the Gaussian expressions p:

\mu(t_{\mathrm{off}} - x) \overset{(\mathrm{A2})}{\approx} \mu_{\max} - (I_E - \mu_{\max}) \frac{x}{\tau_m}, \qquad 0 \le x \le 2\Delta \ll \tau_m \quad (\mathrm{A3})

Hence,

p(V_T, t_{\mathrm{off}} - (k\Delta - \tau)) \approx \frac{1}{\sqrt{2\pi D}} \exp\left[-\frac{\left(V_T - \mu_{\max} + (I_E - \mu_{\max}) \frac{k\Delta - \tau}{\tau_m}\right)^2}{2D}\right].

Using this expression, we obtain

\mu_{\min} \approx \underbrace{\mu_{\max}\, e^{-\Delta/\tau_m}}_{\text{initial condition}} + \underbrace{I_E \left(1 - e^{-\Delta/\tau_m}\right)}_{\text{excitatory drive}} - \underbrace{\frac{K}{2} \left[ \left[\operatorname{erf}(\phi(0)) - \operatorname{erf}(\phi(\Delta))\right] - \frac{K}{\sqrt{2\pi D}}\, e^{-\frac{c}{2D}}\, \frac{e^{\Delta/\tau_m}}{\sqrt{e^{2\Delta/\tau_m} + 1}} \left[\operatorname{erf}(\psi(\Delta)) - \operatorname{erf}(\psi(0))\right] \right]}_{\text{inhibitory feedback}}, \quad (43)

where

\phi(t) = \frac{V_T - \mu_{\max} + (I_E - \mu_{\max}) \frac{\Delta - t}{\tau_m}}{\sqrt{2D}} \quad (44)
\psi(t) = \frac{-(I_E - \mu_{\max}) \left(e^{2\Delta/\tau_m} + 1\right) (\tau_m + \Delta - t) + \tau_m (I_E - V_T) \left(e^{\Delta/\tau_m} + 1\right)}{\sqrt{2D \left(e^{2\Delta/\tau_m} + 1\right)}\; \tau_m} \quad (45)
c = \frac{(V_T - I_E)^2 \left(1 - e^{\Delta/\tau_m}\right)^2}{e^{2\Delta/\tau_m} + 1}. \quad (46)

Although lengthy, this expression can be easily evaluated numerically.

The local “minimum” μmin is not only the final but also the initial mean membrane potential of each cycle. The length of the upstroke can thus be approximated as the time it takes for the mean membrane potential to rise from μmin to μmax, based on exponential relaxation towards only the excitatory drive IE (A2):

t_{\mathrm{off}} \overset{(\mathrm{A2})}{\approx} \tau_m \ln\left(\frac{I_E - \mu_{\min}}{I_E - \mu_{\max}}\right) \quad (47)

Together with the assumed downstroke duration of Δ we arrive at an analytical estimate for the oscillation period T and hence the network frequency fnet (see Eq (32)). Overall we have derived analytical expressions for μmax, μmin, toff, fnet as functions of the external drive IE, which characterize the oscillatory dynamics. To evaluate the accuracy of our analytical approximation we integrate the DDE numerically (Eq (31), Fig 8, green regime of period-1 oscillations) and determine μmax, μmin and fnet. We find that the errors introduced by our simplifying assumptions (A1)–(A3) are small (Fig 10, dashed lines vs square markers).
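The full chain μmax → μmin → toff → fnet can be evaluated in a few lines of Python. The sketch below transcribes Eqs (38) and (43)–(47) with the default dimensionless parameters of Table 2 and assumes that IE lies inside the oscillatory regime; it is an illustration, not the authors' original code.

```python
import numpy as np
from scipy.special import erf

def analytic_cycle(I_E, tau_m=10.0, Delta=1.2, K=5.0, D=0.04, V_T=1.0):
    """Analytical approximation without the phenomenological reset: mu_max (Eq (38)),
    mu_min (Eqs (43)-(46)), t_off (Eq (47)) and f_net (Eq (32)). Times in ms."""
    eD = np.exp(Delta / tau_m)
    mu_max = I_E - (I_E - V_T + np.sqrt(2*D*np.log(K*eD/np.sqrt(2*np.pi*D)))) / eD   # Eq (38)

    b = I_E - mu_max
    phi = lambda t: (V_T - mu_max + b*(Delta - t)/tau_m) / np.sqrt(2*D)              # Eq (44)
    psi = lambda t: (-b*(eD**2 + 1)*(tau_m + Delta - t)
                     + tau_m*(I_E - V_T)*(eD + 1)) / (np.sqrt(2*D*(eD**2 + 1))*tau_m)  # Eq (45)
    c = (V_T - I_E)**2 * (1 - eD)**2 / (eD**2 + 1)                                   # Eq (46)

    inh = 0.5*K*((erf(phi(0)) - erf(phi(Delta)))
                 - K/np.sqrt(2*np.pi*D) * np.exp(-c/(2*D)) * eD/np.sqrt(eD**2 + 1)
                   * (erf(psi(Delta)) - erf(psi(0))))
    mu_min = mu_max/eD + I_E*(1 - 1/eD) - inh                                        # Eq (43)

    t_off = tau_m * np.log((I_E - mu_min) / (I_E - mu_max))                          # Eq (47)
    f_net = 1e3 / (t_off + Delta)                                                    # Eq (32), Hz
    return mu_max, mu_min, t_off, f_net
```

For IE = 3.6 this sketch should reproduce, up to rounding, the cycle quoted in Fig 9A (T ≈ 3.44 ms, fnet ≈ 290 Hz).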

Fig 10. Analytical vs numerical evaluation of oscillatory solutions in the Gaussian-drift approximation.


Network frequency (top) and dynamics of the mean membrane potential (bottom) quantified in terms of its periodic local minimum μmin (orange) and local maximum μmax (cyan) for different levels of external drive IE. The analytical approximations (solid lines: with reset, dashed lines: without reset) are very close to the results of numerical integration of the DDE Eq (31) (round markers: with reset, square markers: without reset). Including the reset does not affect μmax but leads to a decrease in μmin (Eqs (50) vs (43)) and thus a decrease in network frequency. Results are shown in the relevant range of external drives [IEmin,IEfull] (vertical dotted lines). For parameters see Table 2.

Accounting for the reset mechanism

The quantitative accuracy of our Gaussian-drift approximation with respect to the original spiking network can be increased by adding a phenomenological account for the reset on the population level. During the population spike (t < toff) the single unit reset has little influence on the population rate dynamics, since units spike at most once per cycle. At time toff the population spike ends and the integral over the suprathreshold portion of the Gaussian density of membrane potentials corresponds to the fraction of units that have spiked (the saturation):

s \equiv \int_{V_T}^{\infty} p(v, t_{\mathrm{off}})\, dv \overset{(27)}{=} \frac{1}{2} \left(1 - \operatorname{erf}\left(\frac{V_T - \mu_{\max}}{\sqrt{2D}}\right)\right) \quad (48)

(see Fig 9C, cyan hatched area). Taking into account the reset mechanism at this point would mean shifting the suprathreshold portion of p(V, toff) downwards by an amount VTVR (Fig 9C, gray hatched area), essentially splitting the voltage distribution into two pieces, corresponding to silent units (Fig 9C, non-hatched area under cyan Gauss) and units that have spiked and been reset (Fig 9C, gray hatched area). To preserve our simplified framework of a unimodal, Gaussian voltage distribution, we will instead assume that the Gaussian voltage distribution is reset as a whole, to a new mean membrane potential μreset given by the average of the two distribution pieces (“silent” and “spike+reset”):

\mu_{\mathrm{reset}} \equiv \int_{-\infty}^{V_T} v\, p(v, t_{\mathrm{off}})\, dv + \int_{V_T}^{\infty} \left[v - (V_T - V_R)\right] p(v, t_{\mathrm{off}})\, dv = \int_{-\infty}^{\infty} v\, p(v, t_{\mathrm{off}})\, dv - (V_T - V_R) \int_{V_T}^{\infty} p(v, t_{\mathrm{off}})\, dv = \mu_{\max} - (V_T - V_R)\, s \quad (49)

This phenomenological account for the reset contains the implicit assumption that in between population spikes the membrane potential distribution “spends enough time” subthreshold that the bimodality created by the reset mechanism vanishes due to diffusion and the distribution becomes roughly Gaussian again. This assumption is satisfied for a relatively large portion of the parameter space spanned by noise intensity, coupling strength, and reset potential (see S1 Appendix). The introduction of the population reset requires an adjustment of the definition of μmin (Eq (40)): Instead of using μmax as the initial value when integrating the feedback inhibition during the downstroke, we will now use the reset potential: μ(toff⁺) = μreset. Therefore, the membrane potential μmin = μ(toff + Δ) at the end of the cycle is given by

\mu_{\min} = \underbrace{\mu_{\mathrm{reset}}\, e^{-\Delta/\tau_m}}_{\text{initial condition}} + \underbrace{I_E \left(1 - e^{-\Delta/\tau_m}\right)}_{\text{excitatory drive}} - \underbrace{\frac{K}{2} \left[ \left[\operatorname{erf}(\phi(0)) - \operatorname{erf}(\phi(\Delta))\right] - \frac{K}{\sqrt{2\pi D}}\, e^{-\frac{c}{2D}}\, \frac{e^{\Delta/\tau_m}}{\sqrt{e^{2\Delta/\tau_m} + 1}} \left[\operatorname{erf}(\psi(\Delta)) - \operatorname{erf}(\psi(0))\right] \right]}_{\text{inhibitory feedback}}. \quad (50)

Except for the initial condition term, all other terms remain unchanged (cf. Eq (43)). Since μreset < μmax (Eq (49)), the introduction of the reset decreases our estimate of the local minimum μmin. This leads to an increase of the upstroke time toff required for the mean membrane potential to rise from μmin to μmax (Eq (47)), and hence to a decrease in the network frequency (Eq (32), see Fig 10).

Note that when the reset is incorporated in the numerical integration of the DDE (31) (e.g. Fig 9B, dotted lines), the population rate after time toff needs an additional artificial clipping to zero, since the slope μ̇(t) will briefly become positive again after the reset (μ̇(toff⁺) ∝ I(toff) − μreset > I(toff) − μmax = 0).
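Building directly on the analytic_cycle sketch above, the population-level reset only changes the initial-condition term of Eq (43), so it can be added in a few lines (Eqs (48)–(50); again dimensionless defaults from Table 2, illustrative helper names).

```python
import numpy as np
from scipy.special import erf

def analytic_cycle_with_reset(I_E, V_R=0.0, tau_m=10.0, Delta=1.2,
                              K=5.0, D=0.04, V_T=1.0):
    """Analytical cycle with the phenomenological population-level reset:
    saturation s (Eq (48)), mu_reset (Eq (49)), adjusted mu_min (Eq (50))."""
    mu_max, mu_min_nr, _, _ = analytic_cycle(I_E, tau_m, Delta, K, D, V_T)
    s = 0.5 * (1 - erf((V_T - mu_max) / np.sqrt(2*D)))          # Eq (48)
    mu_reset = mu_max - (V_T - V_R) * s                         # Eq (49)
    # Eq (50): only the initial-condition term mu_max -> mu_reset changes w.r.t. Eq (43)
    mu_min = mu_min_nr + (mu_reset - mu_max) * np.exp(-Delta/tau_m)
    t_off = tau_m * np.log((I_E - mu_min) / (I_E - mu_max))     # Eq (47)
    return mu_max, mu_reset, mu_min, 1e3/(t_off + Delta)        # f_net in Hz, Eq (32)
```

For IE = 3.6 this should reproduce the values of Fig 9B (T ≈ 4.24 ms, fnet ≈ 236 Hz).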

The point of full synchrony

Our analytical ansatz allows for a straightforward prediction of the point of full synchrony and its parameter dependencies. As mentioned before, the integral over the suprathreshold-portion of the membrane potential density at the end of the population spike corresponds to the fraction s of active units (saturation, Eq (48)). In the strict sense of

$$s = \int_{V_T}^{\infty} p(v, t_{\mathrm{off}})\,\mathrm{d}v = 1$$

full synchrony can never be reached, since the Gaussian probability density p approaches zero only in the limit v → ±∞. We can, however, define approximate full synchrony as the state where only 0.13% of the distribution remains subthreshold:

$$s = \int_{V_T}^{\infty} p(v, t_{\mathrm{off}})\,\mathrm{d}v = 0.9987 \;\Leftrightarrow\; \mu_{\max} - 3\sqrt{D} \geq V_T$$

Using our mapping from external drive to μmax (Eq (38)), we can derive a closed-form expression for the external drive that is required to achieve full synchrony:

$$I_E^{\mathrm{full}} = V_T + \frac{\sqrt{D}\left(3 + e^{-\Delta/\tau_m}\sqrt{2\ln\!\left[\frac{K}{\sqrt{2\pi D}}\, e^{\Delta/\tau_m}\right]}\right)}{1 - e^{-\Delta/\tau_m}} \qquad (51)$$
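For reference, Eq (51) can be evaluated directly. The following sketch is only illustrative (the function name is ours, and all voltages, the noise intensity D, the coupling K, the delay Δ, and the membrane time constant τ_m are assumed to be given in mutually consistent units):

```python
import numpy as np

def drive_at_full_synchrony(V_T, D, K, Delta, tau_m):
    """External drive required for approximate full synchrony (Eq 51)."""
    log_term = np.log(K / np.sqrt(2.0 * np.pi * D)) + Delta / tau_m  # ln[K e^{Delta/tau_m} / sqrt(2 pi D)]
    return V_T + np.sqrt(D) * (3.0 + np.exp(-Delta / tau_m) * np.sqrt(2.0 * log_term)) \
               / (1.0 - np.exp(-Delta / tau_m))
```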

Range of applicability of the Gaussian-drift approximation

There are two main constraints on the applicability of our theory:

(a) Since we assume that units spike at most once per cycle, the theory is only valid up to the point of full synchrony IEfull where network frequency and mean unit firing rate coincide.

(b) The assumption of a unimodal distribution of membrane potentials is only valid if, in between population spikes, the bulk of the membrane potential distribution is pushed sufficiently below threshold such that it can diffuse back to approximately Gaussian shape. We will thus require that at its lowest point the Gaussian density is almost entirely subthreshold:

$$\int_{-\infty}^{V_T} p(v, t_{\mathrm{on}})\,\mathrm{d}v = 0.9987 \;\Leftrightarrow\; \mu_{\min}(I_E) + 3\sqrt{D} \overset{!}{\leq} V_T \qquad (52)$$

Criteria (a) and (b) yield a finite range [I_E^min, I_E^max] of external drives for which the theory applies. Since for most parameter settings μ_min is an almost monotonically decaying function of the drive (see Fig 10 and Results), constraint (b) usually determines only the lower boundary I_E^min, while the upper boundary is set by constraint (a): I_E^max = I_E^full (Fig 10). For extreme combinations of high noise and weak coupling, however, μ_min rises again at high drive, so constraint (b) also sets the upper boundary, I_E^max < I_E^full (see Fig Ad in S1 Appendix, bottom right panel, corresponding to the three crossed-out parameter settings in Fig B in S1 Appendix, for which no valid theoretical estimate of the network frequency at full synchrony exists, since I_E^max < I_E^full).
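In practice, the range [I_E^min, I_E^max] can be found by scanning a grid of drives and checking constraints (a) and (b). A minimal sketch, assuming μ_min(I_E) is available as a callable (e.g., an implementation of Eq (50)) and that the grid is sufficiently fine:

```python
import numpy as np

def applicable_drive_range(mu_min_of_IE, V_T, D, IE_full, IE_grid):
    """Drives satisfying constraint (b), mu_min(I_E) + 3*sqrt(D) <= V_T,
    capped above by the point of full synchrony (constraint (a))."""
    IE_grid = np.asarray(IE_grid)
    ok = np.array([mu_min_of_IE(IE) + 3.0 * np.sqrt(D) <= V_T for IE in IE_grid])
    if not ok.any():
        return None                              # theory does not apply anywhere on this grid
    IE_valid = IE_grid[ok]
    return IE_valid.min(), min(IE_valid.max(), IE_full)
```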

Quantifying performance

We quantify the performance of our Gaussian-drift approximation across the range of drives for which

  • (a) the theory applies (I_E ∈ [I_E^min, I_E^max]), and

  • (b) the spiking network has not crossed the point of full synchrony (I_E ≤ I_E^full,sim).

The size of this regime varies for different parameter settings. To ensure comparability we interpolate the results of all spiking network simulations to the same fine resolution:

$$I_E^i = I_E^{\min} + 0.1\, i, \qquad I_E^i \in \left[I_E^{\min},\, \min\!\left(I_E^{\max}, I_E^{\mathrm{full,sim}}\right)\right].$$

We then compute the average relative error of the estimated network frequencies for each parameter setting:

$$\chi_{\mathrm{err}} := \frac{1}{n}\sum_i \frac{\left|f_{\mathrm{net}}^{\mathrm{sim}}(I_E^i) - f_{\mathrm{net}}^{\mathrm{theory}}(I_E^i)\right|}{f_{\mathrm{net}}^{\mathrm{sim}}(I_E^i)} \in [0, 1] \qquad (53)$$

We introduce a second score to quantify what portion of the relevant range of spiking network dynamics (from the Hopf bifurcation, IEcrit,sim, to the point of full synchrony, IEfull,sim) is covered by the theory:

$$\chi_{\mathrm{appl}} := \frac{\min\!\left(I_E^{\max}, I_E^{\mathrm{full,sim}}\right) - I_E^{\min}}{I_E^{\mathrm{full,sim}} - I_E^{\mathrm{crit,sim}}} \in [0, 1] \qquad (54)$$

We define an overall performance index as

$$\chi_p := \chi_{\mathrm{appl}}\left(1 - \chi_{\mathrm{err}}\right) \in [0, 1] \qquad (55)$$

The larger the performance index χp the better the Gaussian-drift approximation captures the spiking network dynamics. A systematic evaluation of the performance of the Gaussian-drift approximation for different network parameters and levels of drive can be found in S1 Appendix.
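The three scores are straightforward to compute once simulated and theoretical frequencies have been interpolated to the common drive grid. A minimal sketch (function and variable names are ours, not those of the published code):

```python
import numpy as np

def performance_scores(f_net_sim, f_net_theory, IE_min, IE_max, IE_full_sim, IE_crit_sim):
    """Relative error (Eq 53), applicability (Eq 54), and performance index (Eq 55).
    f_net_sim and f_net_theory are arrays evaluated on the common grid of drives I_E^i."""
    f_net_sim, f_net_theory = np.asarray(f_net_sim), np.asarray(f_net_theory)
    chi_err = np.mean(np.abs(f_net_sim - f_net_theory) / f_net_sim)                  # Eq (53)
    chi_appl = (min(IE_max, IE_full_sim) - IE_min) / (IE_full_sim - IE_crit_sim)     # Eq (54)
    chi_p = chi_appl * (1.0 - chi_err)                                               # Eq (55)
    return chi_err, chi_appl, chi_p
```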

Analytical approximation of oscillation dynamics for linear drive

We want to characterize the transient dynamics of any cycle i with initial mean membrane potential μ_min^i and linear drive I_E(t) = Î_E^i + m(t − t_off^i) by deriving the mapping

$$\left(\hat I_E^i,\, \mu_{\min}^i,\, m\right) \;\mapsto\; \left(f_{\mathrm{net}}^{\mathrm{inst},i},\, f_{\mathrm{unit}}^i,\, t_{\mathrm{off}}^i,\, \mu_{\max}^i,\, \mu_{\mathrm{reset}}^i,\, \mu_{\min}^{i+1}\right). \qquad (56)$$

Here μmini+1 refers to the mean membrane potential reached at the end of cycle i, which potentially serves as the initial condition for the next cycle.

Step 1

The local maximum of the mean membrane potential μmaxi can be found with the same ansatz that was used for constant drive:

$$0 = \dot\mu(t_{\mathrm{off}}^i) = \frac{1}{\tau_m}\left(\hat I_E^i - K\tau_m\, r(t_{\mathrm{off}}^i - \Delta) - \mu_{\max}^i\right) \overset{(29)}{=} \frac{1}{\tau_m}\left[\hat I_E^i - \mu_{\max}^i - K\tau_m\left[\dot\mu(t_{\mathrm{off}}^i - \Delta)\right]_+ p(V_T, t_{\mathrm{off}}^i - \Delta)\right] \overset{(25)}{=} \frac{1}{\tau_m}\Bigl[\hat I_E^i - \mu_{\max}^i - K\bigl[I_E(t_{\mathrm{off}}^i - \Delta) - \underbrace{K\tau_m\, r(t_{\mathrm{off}}^i - 2\Delta)}_{\approx\, 0,\ (A1)} - \mu(t_{\mathrm{off}}^i - \Delta)\bigr]_+\, p(V_T, t_{\mathrm{off}}^i - \Delta)\Bigr]$$

Again we truncate the recurrent expression for the total current two Δ-time windows before the end of the population spike (A1):

$$\overset{(A1)}{\approx} \frac{1}{\tau_m}\left[\hat I_E^i - \mu_{\max}^i - K\left[\hat I_E^i - m\Delta - \mu(t_{\mathrm{off}}^i - \Delta)\right] p(V_T, t_{\mathrm{off}}^i - \Delta)\right]. \qquad (57)$$

Again we approximate the trajectory of μ during the upstroke based on relaxation towards only the excitatory drive, which is now a linear function of time (cf. (A2)). Using the beginning of cycle i as the time origin t = 0, we obtain

$$\mu(t) \overset{(25)}{\approx} \mu_{\min}^i\, e^{-t/\tau_m} + \frac{1}{\tau_m}\int_0^t e^{-(t - \bar t)/\tau_m}\, I_E(\bar t)\,\mathrm{d}\bar t = I_E(t) - m\tau_m + \left(m\tau_m + \mu_{\min}^i - I_E(0)\right) e^{-t/\tau_m} \qquad (58)$$

Under this approximation μmaxi can be written as

$$\mu_{\max}^i = \mu(t_{\mathrm{off}}^i) \overset{(58)}{\approx} \hat I_E^i - m\tau_m + \left(m\tau_m + \mu_{\min}^i - I_E(0)\right) e^{-t_{\mathrm{off}}^i/\tau_m} \qquad (59)$$

and the trajectory before time toffi (x ≥ 0) can be approximated as:

$$\mu(t_{\mathrm{off}}^i - x) \overset{(58)}{=} I_E(t_{\mathrm{off}}^i - x) - m\tau_m + \left(m\tau_m + \mu_{\min}^i - I_E(0)\right) e^{-(t_{\mathrm{off}}^i - x)/\tau_m} \overset{(59)}{=} \left(\mu_{\max}^i - \hat I_E^i + m\tau_m\right) e^{x/\tau_m} + \underbrace{I_E(t_{\mathrm{off}}^i - x)}_{=\, \hat I_E^i - mx} - m\tau_m = \hat I_E^i + \left(\mu_{\max}^i - \hat I_E^i\right) e^{x/\tau_m} + m\left(\tau_m e^{x/\tau_m} - (x + \tau_m)\right) \qquad (60)$$

Inserting the above expression in the local maximum condition (Eq (57)) yields:

$$0 \approx \hat I_E^i - \mu_{\max}^i - \frac{K\left[\left(\hat I_E^i - \mu_{\max}^i\right) e^{\Delta/\tau_m} - m\tau_m\left(e^{\Delta/\tau_m} - 1\right)\right]}{\sqrt{2\pi D}} \cdot \exp\!\left[-\frac{\left(V_T - \hat I_E^i - \left(\mu_{\max}^i - \hat I_E^i\right) e^{\Delta/\tau_m} - m\left(\tau_m e^{\Delta/\tau_m} - (\Delta + \tau_m)\right)\right)^2}{2D}\right] \qquad (61)$$

For constant drive (m = 0, I_E(t) ≡ Î_E^i) we recover Eq (37), and thus the asymptotic solution μ_max. For m ≠ 0, we can solve Eq (61) for small m. To this end, we insert the perturbation series μ_max^i = μ_max(Î_E^i) + μ̂·m + O(m²) and keep only the terms linear in m. Using Eq (37), we obtain to first order in m

$$\mu_{\max}^i \approx \mu_{\max}(\hat I_E^i) + m\,\hat\mu$$

with

$$\hat\mu := \frac{\tau_m\left(1 - e^{-\Delta/\tau_m}\right)}{\left(\hat I_E^i - V_T\right)\sqrt{\frac{2}{D}\ln\!\left[\frac{K}{\sqrt{2\pi D}}\, e^{\Delta/\tau_m}\right]} + 2\ln\!\left[\frac{K}{\sqrt{2\pi D}}\, e^{\Delta/\tau_m}\right]} - \underbrace{\left[\tau_m - (\Delta + \tau_m)\, e^{-\Delta/\tau_m}\right]}_{>\, 0} \qquad (62)$$

Since the drive is strongly suprathreshold (Î_E^i ≫ V_T), the first-order deviation m·μ̂ of μ_max^i from its asymptotic value μ_max(Î_E^i) is generally very small. For biologically plausible parameters (e.g., a synaptic delay Δ that is not too small), the second term of Eq (62) is slightly larger than the first. Thus sgn(m·μ̂) = −sgn(m), i.e., μ_max^i is slightly smaller than its asymptotic value for linearly increasing drive, and slightly larger otherwise (see Fig 6C, cyan dots vs line).

Since even the zeroth-order approximation μ_max(Î_E^i) is close to the numerical solution μ_max^i of Eq (61), and the reset mechanism remains unchanged, μ_reset^i is also close to μ_reset(Î_E^i).

The duration of the upstroke toffi can be obtained from Eq (59) taking into account that IE(0)=I^Ei-mtoffi:

$$t_{\mathrm{off}}^i = -\tau_m\, W\!\left(\frac{\hat I_E^i - m\tau_m - \mu_{\max}^i}{m\tau_m}\, \exp\!\left(-1 + \frac{\hat I_E^i - \mu_{\min}^i}{m\tau_m}\right)\right) + \frac{\hat I_E^i - m\tau_m - \mu_{\min}^i}{m} \qquad (63)$$

where W denotes the Lambert W function, whose real (principal) branch is defined for arguments greater than −exp(−1). For positive slope m > 0 and low drive (Î_E^i − μ_max^i < mτ_m), this introduces a constraint on the initial value:

$$\mu_{\min}^i > \hat I_E^i - m\tau_m \log\!\left(\frac{-m\tau_m}{\hat I_E^i - m\tau_m - \mu_{\max}^i}\right) \qquad (64)$$

(cf. Figs 6B and 7Ai).

The instantaneous frequency of the cycle follows as

$$f_{\mathrm{net}}^{\mathrm{inst},i} = \left(t_{\mathrm{off}}^i + \Delta\right)^{-1}. \qquad (65)$$

Note that for linearly changing drive, the asymptotic initial value μ_min(Î_E^i) no longer implies the asymptotic frequency f_net^{inst,i} = f_net(Î_E^i). If the drive increases linearly with a positive slope m > 0, the correct initial value μ_min^i leading to f_net(Î_E^i) (Fig 6B, left, white line) is larger than μ_min(Î_E^i) (Fig 6B, left, black line). Conversely, if the drive decreases linearly (m < 0), the correct initial value μ_min^i leading to f_net(Î_E^i) is smaller than μ_min(Î_E^i) (Fig 6B, right, white vs black line). The correction of the initial condition can be understood as follows: if the slope is positive, m > 0, the driving current I_E(t) is smaller than Î_E^i during the upstroke of μ(t), which lasts until time t_off^i (cf. Eq (12) and Fig 6A, left). A smaller driving current results in a slower increase of μ(t) (cf. Eq (8a)) and hence a longer rise time from μ_min^i to μ_max^i. In order to match the asymptotic frequency, the initial value must therefore be chosen larger than μ_min(Î_E^i), so as to compensate for the slower increase of μ(t) during the rising phase. A similar argument holds when the driving current is decreasing (m < 0).
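Equations (63) and (65) can be evaluated with a standard implementation of the Lambert W function. The sketch below is illustrative only (names are ours); it assumes times in ms, so the returned frequency comes out in kHz, and consistent units for the voltages and the slope m:

```python
import numpy as np
from scipy.special import lambertw

def upstroke_and_frequency(I_hat, mu_min, mu_max, m, tau_m, Delta):
    """Upstroke duration t_off (Eq 63) and instantaneous network frequency (Eq 65)
    for linearly changing drive with slope m != 0 (principal branch of W)."""
    arg = (I_hat - m * tau_m - mu_max) / (m * tau_m) \
          * np.exp(-1.0 + (I_hat - mu_min) / (m * tau_m))
    t_off = -tau_m * np.real(lambertw(arg)) + (I_hat - m * tau_m - mu_min) / m
    f_inst = 1.0 / (t_off + Delta)
    return t_off, f_inst
```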

Step 2

We will now compute the mean membrane potential μmini+1 that is reached at the end of a cycle i. Note that this step is technically not necessary to understand the instantaneous frequency fnetinst,i of an isolated cycle i, which we already derived above as a function of the reference drive I^Ei, the slope m, and an arbitrary initial condition μmini (Eq (65)). It is however of interest to demonstrate that the mean membrane potential μmini+1 at the end of the cycle is close to the asymptotic reference μmin(I^Ei), since it serves as the initial condition of the next cycle. It is this property, together with the dependence of fnetinst,i on the initial condition μmini, that implies the emergence of IFA under a piecewise linear drive that first increases, and then decreases over multiple consecutive ripple cycles (cf. Results, Fig 6C).

We find μmini+1 by integrating the total current during the downstroke:

$$\mu_{\min}^{i+1} = \mu_{\mathrm{reset}}^i\, e^{-\Delta/\tau_m} + \frac{1}{\tau_m}\int_0^{\Delta} I(s + t_{\mathrm{off}}^i)\, e^{-(\Delta - s)/\tau_m}\,\mathrm{d}s \qquad (66)$$

The total current I can be split into two parts: I_stat, which is approximately equal to the feedback current for constant drive I_E ≡ Î_E^i (Eq (41), except for slight deviations in μ_max^i), and an additive new term I_m caused by the linear change in the external drive:

$$I(t + t_{\mathrm{off}}^i) = I_{\mathrm{stat}}(t + t_{\mathrm{off}}^i) + I_m(t + t_{\mathrm{off}}^i)$$
$$I_{\mathrm{stat}}(t + t_{\mathrm{off}}^i) = \hat I_E^i - K\, p(V_T, t_{\mathrm{off}}^i - (\Delta - t))\left(\hat I_E^i - \mu_{\max}^i\right) e^{(\Delta - t)/\tau_m} + K^2\, p(V_T, t_{\mathrm{off}}^i - (\Delta - t))\, p(V_T, t_{\mathrm{off}}^i - (2\Delta - t))\left(\hat I_E^i - \mu_{\max}^i\right) e^{(2\Delta - t)/\tau_m}$$
$$I_m(t + t_{\mathrm{off}}^i) = m\left(t + K\, p(V_T, t_{\mathrm{off}}^i - (\Delta - t))\,(\Delta - t) - K^2\, p(V_T, t_{\mathrm{off}}^i - (\Delta - t))\, p(V_T, t_{\mathrm{off}}^i - (2\Delta - t))\,(2\Delta - t)\right)$$

For constant drive (m = 0) we recover the asymptotic solution μ_min(Î_E^i) (Eq (50)). For m ≠ 0 we integrate Eq (66) numerically and find that μ_min^{i+1} is indeed close to the asymptotic solution μ_min(Î_E^i). One can infer intuitively that the sign of the (small) deviation of μ_min^{i+1} from μ_min(Î_E^i) equals the sign of the slope m: If the drive increases linearly, the drive during the upstroke of the mean membrane potential (0 ≤ t ≤ t_off^i) is below the reference drive Î_E^i = I_E(t_off^i) (Eq (12)). Thus μ rises towards a slightly smaller maximum μ_max^i (Eq (62)) and with slightly reduced speed, which decreases the resulting population rate and thus the inhibitory feedback (Eqs (29) and (23b)). Furthermore, the excitatory drive during the feedback-induced downstroke of μ (t ∈ [t_off^i, t_off^i + Δ]) is stronger than the reference drive (Eq (12)). It follows that the downstroke of the mean membrane potential is reduced, and thus μ_min^{i+1} ≳ μ_min(Î_E^i). The opposite argument can be made for linearly decreasing drive.

At this point it is clear that a piecewise linear drive that first increases and then decreases over multiple ripple cycles (Eq (17)) will inevitably induce IFA. It is also clear that this IFA asymmetry will vanish in the limit of infinitely slow drive (|m| → 0).

Showing a concrete example of IFA under linear drive (Fig 6C) requires a forward integration of the network response over multiple cycles. Note that our analytical ansatz centered around the reference point Î_E^i = I_E(t_off^i) (Eq (56)) focuses on isolated cycles. For any individual cycle i, characterized by the drive Î_E^i, the slope m, and the initial condition μ_min^i, we can (semi-)analytically calculate the frequency difference Δf^i = f_net^{inst,i} − f_net(Î_E^i) (Eqs (65) and (32)). The color plots in Fig 6B show Δf^i at a fine resolution on two hyperplanes in this three-dimensional parameter space (drive with slope m = ±0.4/ms). Retrospectively, the drive at the beginning and end of any individual cycle can easily be inferred as I_E(t_0^i) = Î_E^i − m·t_off^i and I_E(T^i) = Î_E^i + mΔ. Thus, to approximate the ripple dynamics over multiple consecutive cycles under linear drive (Fig 6B and 6C, circular markers), we search the previously explored parameter space and match cycles self-consistently such that the drive at the end of one cycle equals the drive at the beginning of the next: Î_E^i + mΔ = Î_E^{i+1} − m·t_off^{i+1}. For the double-ramp examples shown in Figs 6 and 7 we further restricted the search for trajectories during the upward ramp to trajectories where the drive in the last cycle ends at the plateau level I_E^full.
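One way to sketch this cycle matching numerically is a simple fixed-point iteration on the reference drive of the next cycle, rather than the grid search used here; the helper below is hypothetical and assumes a zeroth-order map mu_max_of(Î_E) (e.g., Eq (37)) and the Lambert-W expression of Eq (63):

```python
import numpy as np
from scipy.special import lambertw

def t_off(I_hat, mu_min, mu_max, m, tau_m):
    """Upstroke duration, Eq (63), principal branch of the Lambert W function."""
    arg = (I_hat - m * tau_m - mu_max) / (m * tau_m) \
          * np.exp(-1.0 + (I_hat - mu_min) / (m * tau_m))
    return -tau_m * np.real(lambertw(arg)) + (I_hat - m * tau_m - mu_min) / m

def next_reference_drive(I_hat_i, mu_min_next, mu_max_of, m, tau_m, Delta, n_iter=50):
    """Match cycle i+1 to cycle i: I_hat_i + m*Delta = I_hat_next - m*t_off_next.
    Simple fixed-point iteration; mu_max_of maps a reference drive to mu_max."""
    I_next = I_hat_i + m * Delta          # initial guess: neglect the upstroke duration
    for _ in range(n_iter):
        I_next = I_hat_i + m * (Delta + t_off(I_next, mu_min_next, mu_max_of(I_next), m, tau_m))
    return I_next
```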

Comparing theory and simulation for piecewise linear drive

Since the theory provides a discrete, cycle-wise estimate of the instantaneous network frequency, we compare the result to the discrete estimate of instantaneous frequency in the simulations, based on the inverse of the distances between consecutive peaks in the oscillatory population rate. In the spiking network, SPW-like drive is modeled as a piecewise linear double ramp with an intermediate plateau phase of 20 ms (Eq (17)). The Gaussian-drift approximation is used to estimate the instantaneous network frequencies separately for the rising and falling phases of the drive (linear increase or decrease with slope ±m). The plateau phase is ignored, since the network frequencies rapidly converge to the asymptotic frequency associated with the drive during the plateau phase. In both simulation and theory, IFA is quantified by computing a linear regression slope over the instantaneous frequencies. The theoretically estimated instantaneous frequencies are shifted in time to account for a hypothetical plateau phase of 20 ms between the up- and down-ramp, which allows a full comparison with the simulation results.

For every theoretical instantaneous frequency estimate (t_i, f_theory^inst(t_i)), an error is calculated relative to the average instantaneous frequency observed in the spiking simulation around the same time point (t_i ± 1.5 ms):

$$f_{\mathrm{sim}}^{\mathrm{inst}}(t_i) := \left\langle \left\{ f_{\mathrm{sim}}^{\mathrm{inst}}(t) \right\} \right\rangle_{t\,\in\,[t_i - 1.5,\ t_i + 1.5]}$$

The average relative error of the theoretical estimate, compared to the simulations, is then computed as

$$\epsilon := \frac{1}{n}\sum_i \frac{\left|f_{\mathrm{theory}}^{\mathrm{inst}}(t_i) - f_{\mathrm{sim}}^{\mathrm{inst}}(t_i)\right|}{f_{\mathrm{sim}}^{\mathrm{inst}}(t_i)} \in [0, 1], \qquad (67)$$

see Table 1.
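A minimal sketch of the comparison in Eq (67), assuming discrete arrays of cycle times and instantaneous frequencies from theory and simulation (names and the handling of empty time windows are ours):

```python
import numpy as np

def mean_relative_error(t_theory, f_theory, t_sim, f_sim, half_window=1.5):
    """Average relative error of the theoretical instantaneous frequencies (Eq 67),
    comparing each estimate to simulated cycles within +/- half_window ms."""
    t_sim, f_sim = np.asarray(t_sim), np.asarray(f_sim)
    errors = []
    for t_i, f_i in zip(t_theory, f_theory):
        nearby = f_sim[np.abs(t_sim - t_i) <= half_window]
        if nearby.size > 0:
            errors.append(abs(f_i - nearby.mean()) / nearby.mean())
    return float(np.mean(errors))
```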

Stationary solution and linear stability analysis of the Fokker-Planck Equation

In what follows, we briefly summarize how we approximated the spiking network's bifurcation point in the mean-field limit, as well as the network frequency and the mean unit firing rate at the bifurcation (red triangle and circular marker in Fig 1B). This involves finding the stationary solution of the Fokker-Planck equation (Eq (23)) and analyzing its linear stability with respect to the external drive. These are standard procedures described in the literature [34, 51, 53, 90]. Here, for easier reference, we reproduce these results using the mathematical symbols used throughout the manuscript.

Stationary solution

The stationary solution p0(V) of the FPE has been derived by [34] (see also [51]). The constant population rate r0 and resulting total drive I0 in the stationary state can be inferred self-consistently by solving:

$$I_0 = I_E - K\tau_m r_0, \qquad r_0 = f_{\mathrm{LIF}}(I_0)$$

where

$$f_{\mathrm{LIF}}(I) = \left(\tau_m \sqrt{\pi} \int_{(I - V_T)/\sqrt{2D}}^{(I - V_R)/\sqrt{2D}} e^{x^2} \operatorname{erfc}(x)\,\mathrm{d}x + \tau_{\mathrm{ref}}\right)^{-1}$$

is the firing rate of an uncoupled LIF neuron receiving constant drive I and Gaussian white noise of intensity D (f-I curve, [91]).
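The self-consistency problem can be solved with standard numerical root finding. The sketch below is illustrative only (names are ours; time constants in ms so that rates come out in kHz, and the bracket passed to the root finder may need adjusting). It uses erfcx(x) = exp(x²)·erfc(x) to evaluate the integrand without overflow:

```python
import numpy as np
from scipy.special import erfcx          # erfcx(x) = exp(x**2) * erfc(x)
from scipy.integrate import quad
from scipy.optimize import brentq

def f_LIF(I, V_T, V_R, D, tau_m, tau_ref):
    """Stationary firing rate of an uncoupled LIF neuron with white noise (f-I curve)."""
    integral, _ = quad(erfcx, (I - V_T) / np.sqrt(2.0 * D), (I - V_R) / np.sqrt(2.0 * D))
    return 1.0 / (tau_m * np.sqrt(np.pi) * integral + tau_ref)

def stationary_state(I_E, K, V_T, V_R, D, tau_m, tau_ref, r_max=1.0):
    """Self-consistent stationary rate r0 and total drive I0 = I_E - K*tau_m*r0."""
    residual = lambda r0: r0 - f_LIF(I_E - K * tau_m * r0, V_T, V_R, D, tau_m, tau_ref)
    r0 = brentq(residual, 0.0, r_max)     # bracket [0, r_max] assumed to contain the root
    return r0, I_E - K * tau_m * r0
```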

Linear stability analysis

For a given external drive IE one assumes a weak, periodic perturbation of the population rate around its stationary value:

$$r(t) = r_0(I_E) + \epsilon\, r_1(t) = r_0(I_E) + \epsilon\, e^{i\omega t + \lambda t} \qquad (68)$$

The bifurcation where the stationary state loses stability corresponds to λ = 0. In the recurrently coupled network this perturbation of the rate translates into a perturbation of the input current:

$$I(t) = \underbrace{I_E - K\tau_m r_0(I_E)}_{\equiv\, I_0} \;\underbrace{-\;\epsilon K\tau_m r_1(t - \Delta)}_{\equiv\, \epsilon I_1(t)} \qquad (69)$$

The linear response of the LIF units to this weakly modulated drive I(t) under Gaussian white noise is given by convolution with the linear response function G:

$$r(t) = r_0(I_E) + \epsilon \int_0^{\infty} G(s, I_E)\, I_1(t - s)\,\mathrm{d}s$$

This output rate must match the weakly periodically modulated rate r(t) that we assumed in the beginning (Eq (68)):

$$r_1(t) = \int_0^{\infty} G(s, I_E)\, I_1(t - s)\,\mathrm{d}s \overset{(69)}{=} -K\tau_m \int_0^{\infty} G(s, I_E)\, r_1(t - \Delta - s)\,\mathrm{d}s$$

At the bifurcation (r_1(t) = e^{iωt}, Eq (68)) this self-consistent condition is equivalent to

$$1 = -K\tau_m\, \tilde G(\omega, I_E)\, e^{-i\omega\Delta} \;\Leftrightarrow\; \begin{cases} 1 = K\tau_m\, |\tilde G|, & \text{amplitude condition}\\[2pt] 0 = \pi + \arg(\tilde G) - \omega\Delta, & \text{phase condition} \end{cases} \qquad (70)$$

where G̃ denotes the Fourier transform of the linear response function (susceptibility). We use the exact expression for the susceptibility of an LIF unit under Gaussian white noise (more specifically, the complex conjugate of the expression derived by [53]):

$$\tilde G(\omega) = \frac{r_0}{\sqrt{D}}\, \frac{i\omega}{i\omega + 1}\; \frac{\mathcal{D}_{-i\omega - 1}\!\left(\frac{I_0 - V_T}{\sqrt{D}}\right) - e^{\delta}\, \mathcal{D}_{-i\omega - 1}\!\left(\frac{I_0 - V_R}{\sqrt{D}}\right)}{\mathcal{D}_{-i\omega}\!\left(\frac{I_0 - V_T}{\sqrt{D}}\right) - e^{\delta}\, e^{-i\omega\tau_{\mathrm{ref}}}\, \mathcal{D}_{-i\omega}\!\left(\frac{I_0 - V_R}{\sqrt{D}}\right)}.$$

where 𝒟_ν denotes the parabolic cylinder function and δ = (V_R² − V_T² + 2I_0(V_T − V_R)) / (4D) (for an alternative expression in terms of confluent hypergeometric functions, see [92]). We solve the amplitude and phase conditions (Eq (70)) numerically to find the critical drive I_E ≡ I_E^crit (at which the stationary state loses stability) and the corresponding frequency ω of the emerging oscillation. The network frequency and mean unit firing rate at the bifurcation are thus given by f_net^crit = ω/2π and f_unit^crit = r_0(I_E^crit). This approach is equivalent to the derivation by [34] via linear expansion of the FPE solution (see also [71]).
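For completeness, the susceptibility and the two bifurcation conditions can be evaluated numerically, e.g., with mpmath's parabolic cylinder function pcfd(ν, z) = 𝒟_ν(z). The sketch below is illustrative only (names are ours); it assumes that ω, τ_ref, and Δ are expressed in units consistent with the nondimensionalization of [53], and that r_0 and I_0 have been obtained from the stationary solution above:

```python
import numpy as np
import mpmath as mp

def susceptibility(omega, r0, I0, V_T, V_R, D, tau_ref):
    """LIF susceptibility G~(omega) as printed above; mp.pcfd(nu, z) evaluates the
    parabolic cylinder function D_nu(z)."""
    iw = mp.mpc(0, omega)
    delta = (V_R**2 - V_T**2 + 2.0 * I0 * (V_T - V_R)) / (4.0 * D)
    xT = (I0 - V_T) / mp.sqrt(D)
    xR = (I0 - V_R) / mp.sqrt(D)
    num = mp.pcfd(-iw - 1, xT) - mp.exp(delta) * mp.pcfd(-iw - 1, xR)
    den = mp.pcfd(-iw, xT) - mp.exp(delta - iw * tau_ref) * mp.pcfd(-iw, xR)
    return complex(r0 / mp.sqrt(D) * iw / (iw + 1) * num / den)

def bifurcation_residuals(omega, K, tau_m, Delta, r0, I0, V_T, V_R, D, tau_ref):
    """Residuals of the amplitude and phase conditions (Eq 70); both vanish at
    the bifurcation (I_E^crit, omega_crit)."""
    G = susceptibility(omega, r0, I0, V_T, V_R, D, tau_ref)
    return (K * tau_m * abs(G) - 1.0,             # amplitude condition
            np.pi + np.angle(G) - omega * Delta)  # phase condition (may need wrapping by 2*pi)
```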

Supporting information

S1 Appendix. Performance evaluation of the Gaussian-drift approximation for constant drive.

The Gaussian-drift approximation under constant drive is compared to spiking network simulations for a range of noise intensities D and inhibitory coupling strengths K (Fig A). The parameter dependencies of the point of full synchrony are captured well by the theory (Fig B).

(PDF)

pcbi.1011886.s001.pdf (306.6KB, pdf)
S2 Appendix. The influence of network architecture and the shape of the external drive on the asymptotic and instantaneous ripple oscillation dynamics.

A covariation of network architecture and stimulus profile demonstrates that IFA is modulated by, but occurs largely independent of, the shape of the asymptotic network frequency as a function of the external drive (Fig A). Furthermore, we illustrate that a simple square pulse cannot elicit IFA in the feedback-based inhibition-first model.

(PDF)

pcbi.1011886.s002.pdf (212KB, pdf)
S1 Fig. Transient bimodality in the membrane potential distribution can affect instantaneous ripple frequency dynamics in the spiking network.

(A) Same layout as in Fig 2A: Spiking network response to an isolated downwards ramp stimulus with the same slope as in Fig 2D, middle, after time t > 10 ms (N = 10, 000). Note that units that participate in the third population spike tend not to spike in the fourth population spike (red lines in raster plot), which is an indication of a residual bimodality in the membrane potential distribution from one cycle to the next (only faintly visible in voltage plot, see red square). (B) Same layout as Fig 2B: instantaneous network frequencies pooled together from 50 such ramp-down-only simulations. What appears as a continuous non-monotonic “wiggle” in the instantaneous frequencies of Fig 2D, middle (gray dots) is now clearly identifiable as a single outlier cycle (marked in red).

(TIF)

pcbi.1011886.s003.tif (488KB, tif)

Acknowledgments

We thank José R. Donoso, Nikolaus Maier, Naomi Auer, and Gaspar Cano for valuable discussions and comments on the manuscript.

Data Availability

All code written in support of this publication is publicly available at https://github.com/NatalieSchieferstein/interneuron_ripples_with_ifa.git (https://doi.org/10.5281/zenodo.10602552). An exemplary simulation result file is available from https://zenodo.org/doi/10.5281/zenodo.10018496.

Funding Statement

This work was supported by the German Research Foundation (Deutsche Forschungsgemeinschaft [DFG], SFB 1315, project-ID 327654276 to BL and RK, and GRK 1589/2 to RK and NS; https://www.dfg.de/de). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

  • 1. Todorova R, Zugaro M. Hippocampal ripples as a mode of communication with cortical and subcortical areas. Hippocampus. 2020;30(1):39–49. doi: 10.1002/hipo.22997 [DOI] [PubMed] [Google Scholar]
  • 2. Oyanedel CN, Durán E, Niethard N, Inostroza M, Born J. Temporal associations between sleep slow oscillations, spindles and ripples. Eur J Neurosci. 2020;52(12):4762–4778. doi: 10.1111/ejn.14906 [DOI] [PubMed] [Google Scholar]
  • 3. Lee AK, Wilson MA. Memory of sequential experience in the hippocampus during slow wave sleep. Neuron. 2002;36(6):1183–1194. doi: 10.1016/S0896-6273(02)01096-6 [DOI] [PubMed] [Google Scholar]
  • 4. Diba K, Buzsáki G. Forward and reverse hippocampal place-cell sequences during ripples. Nat Neurosci. 2007;10(10):1241–1242. doi: 10.1038/nn1961 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5. Vaz AP, Wittig JH, Inati SK, Zaghloul KA. Replay of cortical spiking sequences during human memory retrieval. Science. 2020;367(6482):1131–1134. doi: 10.1126/science.aba0672 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6. Buzsáki G. Two-stage model of memory trace formation: A role for “noisy” brain states. Neuroscience. 1989;31(3):551–570. doi: 10.1016/0306-4522(89)90423-5 [DOI] [PubMed] [Google Scholar]
  • 7. Girardeau G, Benchenane K, Wiener SI, Buzsáki G, Zugaro MB. Selective suppression of hippocampal ripples impairs spatial memory. Nat Neurosci. 2009;12(10):1222–1223. doi: 10.1038/nn.2384 [DOI] [PubMed] [Google Scholar]
  • 8. Ego-Stengel V, Wilson MA. Disruption of ripple-associated hippocampal activity during rest impairs spatial learning in the rat. Hippocampus. 2010;20(1):1–10. doi: 10.1002/hipo.20707 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9. Fernández-Ruiz A, Oliva A, de Oliveira EF, Rocha-Almeida F, Tingley D, Buzsáki G. Long-duration hippocampal sharp wave ripples improve memory. Science. 2019;364(6445):1082–1086. doi: 10.1126/science.aax0758 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10. Klinzing JG, Niethard N, Born J. Mechanisms of systems memory consolidation during sleep. Nat Neurosci. 2019;22:1598–1610. doi: 10.1038/s41593-019-0467-3 [DOI] [PubMed] [Google Scholar]
  • 11. Staresina BP, Niediek J, Borger V, Surges R, Mormann F. How coupled slow oscillations, spindles and ripples control neuronal processing and communication during human sleep. Nat Neurosci. 2023;26(8):1429–1437. doi: 10.1038/s41593-023-01381-w [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12. Carr MF, Jadhav SP, Frank LM. Hippocampal replay in the awake state: A potential substrate for memory consolidation and retrieval. Nat Neurosci. 2011;14(2):147–153. doi: 10.1038/nn.2732 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13. Jadhav SP, Kemere C, German PW, Frank LM. Awake hippocampal sharp-wave ripples support spatial memory. Science. 2012;336(6087):1454–1458. doi: 10.1126/science.1217230 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14. Pfeiffer BE, Foster DJ. Hippocampal place-cell sequences depict future paths to remembered goals. Nature. 2013;497(7447):74–79. doi: 10.1038/nature12112 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15. O’Keefe J. Place units in the hippocampus of the freely moving rat. Exp Neurol. 1976;51(1):78–109. doi: 10.1016/0014-4886(76)90055-8 [DOI] [PubMed] [Google Scholar]
  • 16. Buzsáki G, Horvath Z, Urioste R, Hetke J, Wise K. High-frequency network oscillation in the hippocampus. Science. 1992;256(5059):1025–1027. doi: 10.1126/science.1589772 [DOI] [PubMed] [Google Scholar]
  • 17. Buzsáki G. Hippocampal sharp wave-ripple: A cognitive biomarker for episodic memory and planning. Hippocampus. 2015;25(10):1073–1188. doi: 10.1002/hipo.22488 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18. Wu C, Shen H, Luk WP, Zhang L. A fundamental oscillatory state of isolated rodent hippocampus. J Physiol. 2002;540(2):509–527. doi: 10.1113/jphysiol.2001.013441 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19. Maier N, Nimmrich V, Draguhn A. Cellular and network mechanisms underlying spontaneous sharp wave–ripple complexes in mouse hippocampal slices. J Physiol. 2003;550(3):873–887. doi: 10.1113/jphysiol.2003.044602 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20. Maier N, Kempter R. Hippocampal Sharp Wave/Ripple Complexes—Physiology and Mechanisms. In: Axmacher N, Rasch B, editors. Cogn. Neurosci. Mem. Consol. Springer International Publishing; Switzerland; 2017. p. 227–249. [Google Scholar]
  • 21. Schlingloff D, Kali S, Freund TF, Hajos N, Gulyas AI. Mechanisms of sharp wave initiation and ripple generation. J Neurosci. 2014;34(34):11385–11398. doi: 10.1523/JNEUROSCI.0867-14.2014 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22. Bazelot M, Teleńczuk MT, Miles R. Single CA3 pyramidal cells trigger sharp waves in vitro by exciting interneurones. J Physiol. 2016;594(10):2565–2577. doi: 10.1113/JP271644 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23. Mitzdorf U. Current source-density method and application in cat cerebral cortex: Investigation of evoked potentials and EEG phenomena. Physiol Rev. 1985;65(1):37–100. doi: 10.1152/physrev.1985.65.1.37 [DOI] [PubMed] [Google Scholar]
  • 24. Schomburg EW, Anastassiou CA, Buzsáki G, Koch C. The spiking component of oscillatory extracellular potentials in the rat hippocampus. J Neurosci. 2012;32(34):11798–11811. doi: 10.1523/JNEUROSCI.0656-12.2012 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25. Sullivan D, Csicsvari J, Mizuseki K, Montgomery S, Diba K, Buzsáki G. Relationships between hippocampal sharp waves, ripples, and fast gamma oscillation: Influence of dentate and entorhinal cortical activity. J Neurosci. 2011;31(23):8605–8616. doi: 10.1523/JNEUROSCI.0294-11.2011 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26. Maier N, Tejero-Cantero Á, Dorrn AL, Winterer J, Beed PS, Morris G, et al. Coherent phasic excitation during hippocampal ripples. Neuron. 2011;72(1):137–152. doi: 10.1016/j.neuron.2011.08.016 [DOI] [PubMed] [Google Scholar]
  • 27. Traub RD, Schmitz D, Jefferys JGR, Draguhn A. High-frequency population oscillations are predicted to occur in hippocampal pyramidal neuronal networks interconnected by axoaxonal gap junctions. Neuroscience. 1999;92(2):407–426. doi: 10.1016/S0306-4522(98)00755-6 [DOI] [PubMed] [Google Scholar]
  • 28. Traub RD, Bibbig A. A model of high-frequency ripples in the hippocampus based on synaptic coupling plus axon-axon gap junctions between pyramidal neurons. J Neurosci. 2000;20(6):2086–2093. doi: 10.1523/JNEUROSCI.20-06-02086.2000 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29. Memmesheimer RM. Quantitative prediction of intermittent high-frequency oscillations in neural networks with supralinear dendritic interactions. Proc Natl Acad Sci USA. 2010;107(24):11092–11097. doi: 10.1073/pnas.0909615107 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30. Evangelista R, Cano G, Cooper C, Schmitz D, Maier N, Kempter R. Generation of sharp wave-ripple events by disinhibition. J Neurosci. 2020;40(41):7811–7836. doi: 10.1523/JNEUROSCI.2174-19.2020 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31. Levenstein D, Buzsáki G, Rinzel J. NREM sleep in the rodent neocortex and hippocampus reflects excitable dynamics. Nat Commun. 2019;10(1):1–12. doi: 10.1038/s41467-019-10327-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32. Ecker A, Bagi B, Vértes E, Steinbach-Németh O, Karlócai MR, Papp OI, et al. Hippocampal sharp wave-ripples and the associated sequence replay emerge from structured synaptic interactions in a network model of area CA3. Elife. 2022;11:e71850. doi: 10.7554/eLife.71850 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33. Ylinen A, Bragin A, Nádasdy Z, Jandó G, Szabó I, Sik A, et al. Sharp wave-associated high-frequency oscillation (200 Hz) in the intact hippocampus: network and intracellular mechanisms. J Neurosci. 1995;15(1):30–46. doi: 10.1523/JNEUROSCI.15-01-00030.1995 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34. Brunel N, Hakim V. Fast Global Oscillations in Networks of Integrate-and-Fire Neurons with Low Firing Rates. Neural Comput. 1999;11(7):1621–1671. doi: 10.1162/089976699300016179 [DOI] [PubMed] [Google Scholar]
  • 35. Brunel N, Wang XJ. What determines the frequency of fast network oscillations with irregular neural discharges? I. Synaptic dynamics and excitation-inhibition balance. J Neurophysiol. 2003;90(1):415–430. doi: 10.1152/jn.01095.2002 [DOI] [PubMed] [Google Scholar]
  • 36. Taxidis J, Coombes S, Mason R, Owen MR. Modeling sharp wave-ripple complexes through a CA3-CA1 network model with chemical synapses. Hippocampus. 2012;22:995–1017. doi: 10.1002/hipo.20930 [DOI] [PubMed] [Google Scholar]
  • 37. Donoso JR, Schmitz D, Maier N, Kempter R. Hippocampal ripple oscillations and inhibition-first network models: Frequency dynamics and response to GABA modulators. J Neurosci. 2018;38(12):3124–3146. doi: 10.1523/JNEUROSCI.0188-17.2018 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38. Ramirez-Villegas JF, Willeke KF, Logothetis NK, Besserve M. Dissecting the synapse- and frequency-dependent network mechanisms of in vivo hippocampal sharp wave-ripples. Neuron. 2018;100(5):1224–1240.e13. doi: 10.1016/j.neuron.2018.09.041 [DOI] [PubMed] [Google Scholar]
  • 39. Melonakos ED, White JA, Fernandez FR. A model of cholinergic suppression of hippocampal ripples through disruption of balanced excitation/inhibition. Hippocampus. 2018;29(9):1–14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40. Braun W, Memmesheimer RM. High-frequency oscillations and sequence generation in two-population models of hippocampal region CA1. PLoS Comput Biol. 2022;18(2):e1009891. doi: 10.1371/journal.pcbi.1009891 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41. Malerba P, Krishnan GP, Fellous JM, Bazhenov M. Hippocampal CA1 ripples as inhibitory transients. PLOS Comput Biol. 2016;12(4):e1004880. doi: 10.1371/journal.pcbi.1004880 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42. Liu AA, Henin S, Abbaspoor S, Bragin A, Buffalo EA, Farrell JS, et al. A consensus statement on detection of hippocampal sharp wave ripples and differentiation from other fast oscillations. Nat Commun. 2022;13 (6000). doi: 10.1038/s41467-022-33536-x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43. Stark E, Roux L, Eichler R, Senzai Y, Royer S, Buzsáki G. Pyramidal cell-interneuron interactions underlie hippocampal ripple oscillations. Neuron. 2014;83(2):467–480. doi: 10.1016/j.neuron.2014.06.023 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44. Ponomarenko AA, Korotkova TM, Sergeeva OA, Haas HL. Multiple GABAA receptor subtypes regulate hippocampal ripple oscillations. Eur J Neurosci. 2004;20(8):2141–2148. doi: 10.1111/j.1460-9568.2004.03685.x [DOI] [PubMed] [Google Scholar]
  • 45. Nguyen DP, Kloosterman F, Barbieri R, Brown EN, Wilson MA. Characterizing the dynamic frequency structure of fast oscillations in the rodent hippocampus. Front Integr Neurosci. 2009;3. doi: 10.3389/neuro.07.011.2009 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46. Hulse BK, Moreaux LC, Lubenov EV, Siapas AG. Membrane potential dynamics of CA1 pyramidal neurons during hippocampal ripples in awake mice. Neuron. 2016;89(4):800–813. doi: 10.1016/j.neuron.2016.01.014 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 47. Brunel N, Hakim V. Sparsely synchronized neuronal oscillations. Chaos. 2008;18(1):015113. doi: 10.1063/1.2779858 [DOI] [PubMed] [Google Scholar]
  • 48. Csicsvari J, Hirase H, Czurkó A, Mamiya A, Buzsáki G. Oscillatory coupling of hippocampal pyramidal cells and interneurons in the behaving rat. J Neurosci. 1999;19(1):274–287. doi: 10.1523/JNEUROSCI.19-01-00274.1999 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49. Risken H. The Fokker-Planck Equation: Methods of solution and applications. 2nd ed. Springer; 1989. [Google Scholar]
  • 50. Gardiner CW. Handbook of Stochastic Methods for Physics, Chemistry and the Natural Sciences. 2nd ed. Springer; 1985. [Google Scholar]
  • 51. Abbott LF, van Vreeswijk C. Asynchronous states in networks of pulse-coupled oscillators. Phys Rev E. 1993;48(2):1483–1490. doi: 10.1103/PhysRevE.48.1483 [DOI] [PubMed] [Google Scholar]
  • 52. Brunel N. Dynamics of sparsely connected networks of excitatory and inhibitory neurons. J Comput Neurosci. 2000;8:183–208. doi: 10.1023/A:1008925309027 [DOI] [PubMed] [Google Scholar]
  • 53. Lindner B, Schimansky-Geier L. Transmission of noise coded versus additive signals through a neuronal ensemble. Phys Rev Lett. 2001;86(14):2934–2937. doi: 10.1103/PhysRevLett.86.2934 [DOI] [PubMed] [Google Scholar]
  • 54. Fourcaud N, Brunel N. Dynamics of the Firing Probability of Noisy Integrate-and-Fire Neurons. Neural Comput. 2002;14(9):2057–2110. doi: 10.1162/089976602320264015 [DOI] [PubMed] [Google Scholar]
  • 55. Brunel N, Hakim V, Richardson MJE. Firing-rate resonance in a generalized integrate-and-fire neuron with subthreshold resonance. Phys Rev E. 2003;67(5):051916. doi: 10.1103/PhysRevE.67.051916 [DOI] [PubMed] [Google Scholar]
  • 56. Goedeke S, Diesmann M. The mechanism of synchronization in feed-forward neuronal networks. New J Phys. 2008;10(1):015007. doi: 10.1088/1367-2630/10/1/015007 [DOI] [Google Scholar]
  • 57. Plesser HE, Gerstner W. Noise in Integrate-and-Fire Neurons: From Stochastic Input to Escape Rates. Neural Comput. 2000;12(2):367–384. doi: 10.1162/089976600300015835 [DOI] [PubMed] [Google Scholar]
  • 58. Chizhov AV, Graham LJ. Population model of hippocampal pyramidal neurons, linking a refractory density approach to conductance-based neurons. Phys Rev E. 2007;75(1):011924. doi: 10.1103/PhysRevE.75.011924 [DOI] [PubMed] [Google Scholar]
  • 59. Schwalger T. Mapping input noise to escape noise in integrate-and-fire neurons: a level-crossing approach. Biol Cybern. 2021;115(5):539–562. doi: 10.1007/s00422-021-00899-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60. Schieferstein N. Hippocampal ripple oscillations in inhibitory network models [PhD Thesis]. Humboldt University Berlin; 2023.
  • 61. Kohus Z, Káli S, Rovira-Esteban L, Schlingloff D, Papp O, Freund TF, et al. Properties and dynamics of inhibitory synaptic communication within the CA3 microcircuits of pyramidal cells and interneurons expressing parvalbumin or cholecystokinin. J Physiol. 2016;594(13):3745–3774. doi: 10.1113/JP272231 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62. Pawelzik H, Hughes DI, Thomson AM. Physiological and morphological diversity of immunocytochemically defined parvalbumin- and cholecystokinin-positive interneurones in CA1 of the adult rat hippocampus. J Comp Neurol. 2002;443(4):346–367. doi: 10.1002/cne.10118 [DOI] [PubMed] [Google Scholar]
  • 63. Chizhov AV, Graham LJ. Efficient evaluation of neuron populations receiving colored-noise current based on a refractory density method. Phys Rev E. 2008;77(1):011910. doi: 10.1103/PhysRevE.77.011910 [DOI] [PubMed] [Google Scholar]
  • 64. Tchumatchenko T, Malyshev A, Geisel T, Volgushev M, Wolf F. Correlations and synchrony in threshold neuron models. Phys Rev Lett. 2010;104(5):058102. doi: 10.1103/PhysRevLett.104.058102 [DOI] [PubMed] [Google Scholar]
  • 65. Badel L. Firing statistics and correlations in spiking neurons: A level-crossing approach. Phys Rev E. 2011;84(4):041919. doi: 10.1103/PhysRevE.84.041919 [DOI] [PubMed] [Google Scholar]
  • 66. Börgers C. The PING Model of Gamma Rhythms. In: An Introd. to Model. Neuronal Dyn. Springer; 2017. p. 249–261. [Google Scholar]
  • 67. Wang XJ, Buzsáki G. Gamma oscillation by synaptic inhibition in a hippocampal interneuronal network model. J Neurosci. 1996;16(20):6402–6413. doi: 10.1523/JNEUROSCI.16-20-06402.1996 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68. Diesmann M, Gewaltig MO, Aertsen A. Stable propagation of synchronous spiking in cortical neural networks. Nature. 1999;402(December):529–533. doi: 10.1038/990101 [DOI] [PubMed] [Google Scholar]
  • 69. Lewis CM, Ni J, Wunderle T, Jendritza P, Lazar A, Diester I, et al. Cortical gamma-band resonance preferentially transmits coherent input. Cell Rep. 2021;35(5):109083. doi: 10.1016/j.celrep.2021.109083 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70. Bartos M, Vida I, Jonas P. Synaptic mechanisms of synchronized gamma oscillations in inhibitory interneuron networks. Nat Rev Neurosci. 2007;8(1):45–56. doi: 10.1038/nrn2044 [DOI] [PubMed] [Google Scholar]
  • 71. Brunel N, Hansel D. How Noise Affects the Synchronization Properties of Recurrent Networks of Inhibitory Neurons. Neural Comput. 2006;18(5):1066–1110. doi: 10.1162/neco.2006.18.5.1066 [DOI] [PubMed] [Google Scholar]
  • 72. Hajos N, Karlocai MR, Nemeth B, Ulbert I, Monyer H, Szabo G, et al. Input-output features of anatomically identified CA3 neurons during hippocampal sharp wave/ripple oscillation in vitro. J Neurosci. 2013;33(28):11677–11691. doi: 10.1523/JNEUROSCI.5729-12.2013 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73. Csicsvari J, Hirase H, Czurkó A, Mamiya A, Buzsáki G. Fast network oscillations in the hippocampal CA1 region of the behaving rat. J Neurosci. 1999;19(16):RC20. doi: 10.1523/JNEUROSCI.19-16-j0001.1999 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 74. Klausberger T, Magill PJ, Márton LF, Roberts JDB, Cobden PM, Buzsáki G, et al. Brain-state- and cell-type- specific firing of hippocampal interneurons in vivo. Nature. 2003;421:844–848. doi: 10.1038/nature01374 [DOI] [PubMed] [Google Scholar]
  • 75. Klausberger T, Somogyi P. Neuronal diversity and temporal dynamics: The unity of hippocampal circuit operations. Science. 2008;321(5885):53–57. doi: 10.1126/science.1149381 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76. Varga C, Golshani P, Soltesz I. Frequency-invariant temporal ordering of interneuronal discharges during hippocampal oscillations in awake mice. Proc Natl Acad Sci USA. 2012;109(40):E2726–E2734. doi: 10.1073/pnas.1210929109 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77. Lapray D, Lasztoczi B, Lagler M, Viney TJ, Katona L, Valenti O, et al. Behavior-dependent specialization of identified hippocampal interneurons. Nat Neurosci. 2012;15(9):1265–1271. doi: 10.1038/nn.3176 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 78. Katona L, Lapray D, Viney TJ, Oulhaj A, Borhegyi Z, Micklem BR, et al. Sleep and movement differentiates actions of two types of somatostatin-expressing GABAergic interneuron in rat hippocampus. Neuron. 2014;82(4):872–886. doi: 10.1016/j.neuron.2014.04.007 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79. Jahnke S, Timme M, Memmesheimer RM. A unified dynamic model for learning, replay, and sharp-wave/ripples. J Neurosci. 2015;35(49):16236–16258. doi: 10.1523/JNEUROSCI.3977-14.2015 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 80. Shu Y, Hasenstaub A, Duque A, Yu Y, McCormick DA. Modulation of intracortical synaptic potentials by presynaptic somatic membrane potential. Nature. 2006;441(7094):761–765. doi: 10.1038/nature04720 [DOI] [PubMed] [Google Scholar]
  • 81. Shu Y, Yu G, Yang J, McCormick DA. Selective control of cortical axonal spikes by a slowly inactivating K+ current. Proc Natl Acad Sci USA. 2007;104(27):11453–11458. doi: 10.1073/pnas.0702041104 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 82. Traub RD, Whittington MA, Buhl EH, Jefferys JGR, Faulkner HJ. On the mechanism of the γβ frequency shift in neuronal oscillations induced in rat hippocampal slices by tetanic stimulation. J Neurosci. 1999;19(3):1088–1105. doi: 10.1523/JNEUROSCI.19-03-01088.1999 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 83. Buzsáki G, Wang XJ. Mechanisms of gamma oscillations. Annu Rev Neurosci. 2012;35:203–225. doi: 10.1146/annurev-neuro-062111-150444 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 84. Perumal MB, Latimer B, Xu L, Stratton P, Nair S, Sah P. Microcircuit mechanisms for the generation of sharp-wave ripples in the basolateral amygdala: A role for chandelier interneurons. Cell Rep. 2021;35(6):109106. doi: 10.1016/j.celrep.2021.109106 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 85. Buzsáki G, Anastassiou CA, Koch C. The origin of extracellular fields and currents—EEG, ECoG, LFP and spikes. Nat Rev Neurosci. 2016;13(6):407–420. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 86. Gerstner W, Kistler WM. Spiking Neuron Models. Cambridge University Press; 2002. [Google Scholar]
  • 87. Goodman DFM, Brette R. The brian simulator. Front Neurosci. 2009;3(2):192–197. doi: 10.3389/neuro.01.026.2009 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 88. Meyer R, Obermayer K. Pypet: A python toolkit for data management of parameter explorations. Front Neuroinform. 2016;10:38. doi: 10.3389/fninf.2016.00038 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89. Uhlenbeck GE, Ornstein LS. On the Theory of the Brownian Motion. Phys Rev. 1930;36(5):823–841. doi: 10.1103/PhysRev.36.823 [DOI] [Google Scholar]
  • 90. Gerstner W, Kistler WM, Naud R, Paninski L. Neuronal Dynamics: From single neurons to networks and models of cognition and beyond. Cambridge University Press; 2014. [Google Scholar]
  • 91. Holden AV. Models of the Stochastic Activity of Neurones. Berlin: Springer-Verlag; 1976. [Google Scholar]
  • 92. Brunel N, Chance FS, Fourcaud N, Abbott LF. Effects of synaptic noise and filtering on the frequency response of spiking neurons. Phys Rev Lett. 2001;86(10):2186–2189. doi: 10.1103/PhysRevLett.86.2186 [DOI] [PubMed] [Google Scholar]
PLoS Comput Biol. doi: 10.1371/journal.pcbi.1011886.r001

Decision Letter 0

Lyle J Graham, Jonathan David Touboul

18 Jun 2023

Dear Ms Schieferstein,

Thank you very much for submitting your manuscript "Intra-ripple frequency accommodation in an inhibitory network model for hippocampal ripple oscillations" for consideration at PLOS Computational Biology.

As with all papers reviewed by the journal, your manuscript was reviewed by members of the editorial board and by several independent reviewers. In light of the reviews (below this email), we would like to invite the resubmission of a significantly-revised version that takes into account the reviewers' comments.

We cannot make any decision about publication until we have seen the revised manuscript and your response to the reviewers' comments. Your revised manuscript is also likely to be sent to reviewers for further evaluation.

When you are ready to resubmit, please upload the following:

[1] A letter containing a detailed list of your responses to the review comments and a description of the changes you have made in the manuscript. Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

[2] Two versions of the revised manuscript: one with either highlights or tracked changes denoting where the text has been changed; the other a clean version (uploaded as the manuscript file).

Important additional instructions are given below your reviewer comments.

Please prepare and submit your revised manuscript within 60 days. If you anticipate any delay, please let us know the expected resubmission date by replying to this email. Please note that revised manuscripts received after the 60-day due date may require evaluation and peer review similar to newly submitted manuscripts.

Thank you again for your submission. We hope that our editorial process has been constructive so far, and we welcome your feedback at any time. Please don't hesitate to contact us if you have any questions or comments.

Sincerely,

Jonathan David Touboul

Academic Editor

PLOS Computational Biology

Lyle Graham

Section Editor

PLOS Computational Biology

***********************

Reviewer's Responses to Questions

Comments to the Authors:

Please note here if the review is uploaded as an attachment.

Reviewer #1: In this paper, Schieferstein et al., look at the theoretical mechanisms of intra-frequency accommodation (IFA) of hippocampal ripple oscillations (>150 Hz) in a fully connected network of N inhibitory cells. Using the integrate-and-fire passive model, they develop an analytical mean-field approach for the large N condition to derive a drift-based approximation. This approximation is used to explain how IFA continuously emerges in response to symmetrical fast input ramps accompanying sharp-waves. They elegantly show that under these conditions, a hysteresis emerges in the trajectory of membrane potential from the minimal level to the spike threshold that controls cycle duration in response to fast symmetric changing drive.

The paper is well written, and in spite of the strong theoretical background, it is easy to read. Results are clearly presented and contribute to a better understanding of the behavior of inhibition-first models of ripple oscillations developed by the senior author’s previous work. There is value in adopting an analytical solution of the model operating in a regular oscillatory regime. Apart from some specific comments, my only concern is with providing a more balanced view of the implications of results when it comes to the real biological phenomenon.

Below comments and suggestions that I hope can be useful to improve the paper.

1- The motivation of using symmetrical inputs is justified, since the authors look to explain an asymmetric phenomenon, IFA, from the intrinsic properties of the bifurcation-based model. However, IFA emerges directly along the fast rising phase of an input after the initial transitory response, not apparently requiring the descending phase (fig.5 and 6, eq 50 for toff, and 35 for fnet). As long as the input is fast enough to drive the instantaneous cycle above the expected asymptotic frequency, IFA will emerge. While the non-trivial hysteresis effect to a symmetric input emphasizes IFA below fnet, in real case scenarios sharp-waves are not necessarily symmetric. Thus, the emphasis on the hysteresis drags the focus out to the real speed-dependent mechanism which may actually operate. The paper would benefit by better considering these aspects (the authors partially touch this point only at the level of discussion, lines 490-495).

2- At introduction, there are references to various computational models of ripple generation, basically, inhibition-first and excitation-first models. The authors then refer to the experimental evidence and debate, and propose that looking at transient properties such as IFA can advance model selection. Arguing that only inhibition-first models reproduces IFA, it is implicitly assumed that they are best posed to explain ripple generation. However, all this requires some corrections. First, other examples of excitation-first models, including excitatory-inhibitory networks, are proposed to explain a range of frequency behavior during ripples (e.g. PMID: 21452258, PMID: 30482688, PMID: 2112357, PMID: 21452258). By bringing the focus to inhibition-first models (based only on recurrently connected inhibitory cells) the introduction reads unbalanced. Actually, some of the cited models consider a network of excitatory and inhibitory cells to generate transient ripple oscillations (e.g., ref 32). This is important, since ripples generation requires excitatory-inhibitory interactions as experiments suggest: optogenetic activation of PV cells does not elicit ripples, while their activation by pyramidal cells does (Stark et al., 2014). Second, some of the cited references are not about computational models of inhibition-first ripple oscillations; rather they are either about up-down slow transitions, stationary gamma oscillations and/or they reflect experimental data. Finally, some of these problems drag into results and discussion (for example: line 46, line 512).

3- Fig.4 and text: it could be useful to specify that simulation data refers to the N=10000 model, where the lim to inf applies. The limit of applicability of the mean-field solution seems to draw two different scenarios above and below simulations.

4- Line 440: The experimental results of optogenetic stimulation of PV cells generating ripples is shown at CA3 in vitro, which represents a reduced experimental model (the biological complexities explaining why that experiment worked are far from the focus of the paper). The basic pyr-inh microcircuit motif was demonstrated experimentally by Bazelot & Miles in CA3 (PMID: 26728572) and confirmed in vivo by Stark in CA1 (ref 40). Importantly here, the statement “The speed-dependence of IFA is an important prediction of the bifurcation based model….” may be confusing. The whole paper is about showing how the mean-field solution for the IFA can be applied far from the bifurcation regime (IE >> IE min; fig.4) and so this makes the intra-ripple accommodation phenomena not necessarily linked to the bifurcation point (which would be rather obvious) nor to inhibition-first model.

5- Line 487: The statement that decreases/increases of ripple frequency are observed experimentally refers to non-physiological data, such as in ref 41 in response to treatment with GABAa receptor agonists and antagonists. This should be clarified to avoid the reader making a wrong impression.

6- Discussion will benefit from inserting a short paragraph on the limitations of the study/model. Apart from the applicability range of the mean-field approximation, other aspects to be considered are: a) Real LFP signals are contributed by spatially aligned currents from pyramidal cells. This means that inhibitory potential dynamics are strongly influenced by the integration properties in post-synaptic pyramidal cells (in contrast to statement in line 525 regarding frequency structure of IPSPs), which are not considered by the model. b) only passive leaky currents are considered in eq 1, but GABAergic cells exhibit different degree of firing rate accommodation. c) short-term plasticity of inhibitory potentials are not included. Both facilitation and depression are characteristic of heterogeneous populations of basket cells. Finally, the mean-field approximation applies to any other regular oscillatory regime with different fnet (such as for gamma and ref 34 &35) and so results may not uniquely applied to the ripple phenomenon.

7- For the pathological fast ripple regime, other models have been proposed based on excitation-first and/or excitatory-inhibitory interactions. The paper misses the opportunity to comment this further (e.g. lines 523, 727), although not strictly necessary. Importantly, note that experimental evidence demonstrates that physiological ripples always span the 100-250Hz frequency range and anything >250 Hz is considered pathological. Given that ripple frequency in inhibitory-first models is determined by interneuron high-frequency firing rates, the mean ripple frequency in these models is not informative. Rather, it is the dynamics of the intra-ripple adaptation what really matter. This comment is just a side note for the authors to consider if appropriate.

Suggestions on the paper structure and readability

8- Many equations are numbered indistinctly in the results and method sections. For example, eq. 1 and 13 are the same, eq. 2 and 14, etc. (though some terms are written differently). This is possibly inevitable but the reader can get lost easily and it requires attention moving all along the results and method section.

9- The method section is rather long, especially for the mean-field approximation sections. On the one hand this is a major part of the paper, on the other it reads very dense and to get the main point requires digging into the underlying math. A suggestion would be to provide a short summary (also considering the previous point) which integrates the figures, and moving a large part of the mathematical derivation to supplementary material (or annex).

Reviewer #2: Reviewer’s report for PLoS CB on

“Intra-ripple frequency accommodation in an inhibitory network model for hippocampal

ripple oscillations”

by Natalie Schieferstein, Tilo Schwalger, Benjamin Lindner, Richard Kempter

Also attached as a pdf.

Overview.

This paper reports on the modeling and analysis of IFA. It provides a follow-up to an earlier paper from the Kempter lab: Donoso, Schmitz, Maier, Kempter (2018) J Neurosci

The spiking model involves all-to-all coupled LIF units; coupling is by an inhibitory current pulse with propagation delay of 1.2 ms. The application is SWRs in CA1 driven by excitatory drive IE from CA3; IE is represented by a linear ramp-up then plateau then linear ramp-down with total duration of 10s ms. The goal is to provide and to analyze a MF-like approximation for the experimental and simulated observations: generally, the intra-SWR cycle frequency shows ‘accommodation’, decreasing throughout the stimulus but with more or less increasing frequency during the late down-ramp, depending on the speed of the down-ramp.

The treatment proceeds in a well-organized hierarchy to describe the response to increasingly detailed features of the transient input. The authors begin with strictly steady input and approximating the steady response in terms of network frequency and cell frequency vs IE (Fig 4). This analytic part alone is quite nice and of value. Then, they provide math analysis of the ramping up and ramping down in stepwise fashion, with an IE incremental step occurring at the end of each response cycle; finally they mathematically treat the linear ramp case.

The response during ramping deviates from close tracking of the asymptotic frequency, depending on the speed and direction of the changing IE. There is asymmetry (hysteresis) between increasing and decreasing IE. A substantial portion of the paper involves accounting for the deviations with simulation and math analysis; an attempt with various approximations to describe/dissect the response to transient stimuli.

The math analysis is insightful with clever approximations/assumptions, based on a Fokker-Planck description of the noisy response of the LIF network. The authors provide a valuable and novel analytic treatment for responses to a transient input. But some readers may prefer a more direct demonstration of the gaussian approximation without the detailed analysis (see our suggestion below, just before Major Concerns).

The paper is well-organized and clearly written, with generally understandable figures and captions. It’s long -- with Methods being longer than the main text (20 pages compared to 19 pages) and 4 pages of Supp Info.

Curiously, the authors’ I-only model shows decreasing asymptotic network frequency with increasing IE. This behavior contrasts with the network frequency vs IE relation seen in many cell-based models for an I-only network, including that in Donoso et al. (2018), Fig 1D, and, for example, in models for ING-type gamma oscillations the frequency typically increases with IE (Fig 12A in Wang & Buzsáki 1996; Fig 4C in Bartos, Vida & Jonas 2007).

We offer the following suggestion before listing our major/minor concerns.

The approximations, simplifying assumptions, and analysis are impressive for arriving at an analytical solution. A key factor is the gaussian profile assumption and by-passing of the cellular-threshold resetting. The DDE by itself is a valuable reduction and can be integrated numerically for demonstrating key features of IFA. Figures 3 and 4 are quite good for the case of steady input drive. Eventually, after going through much analysis in the paper, I realized that simulations of IFA for the direct numerical integration of the DDE (as in Fig 6) could be shown earlier, say, just after Fig 4. I suggest a comparison of the DDE simulations with the LIF network simulations to directly illustrate IFA (by simulation for a piecewise linear IE(t) ) with time courses like those in, say, Figs 2A, 3A, 3B. Many readers may be satisfied with these demonstrations and the text could offer them the option to by-pass the results and details of the analytical approach (pages 12-16 and in Methods, pages 27-33). You could include the time course of the mean of V across cells from the spiking network, likewise for Fig 1. The other figures could remain pretty much as they are.

Major Concerns:

1. In this LIF I-only network model the synaptic current Isyn is a delta-pulse, decreasing the post-synaptic V(t) by J/N=65 mV/10000. There is not a finite time course or decay time constant for Isyn. This representation leads to stronger post-synaptic hyperpolarization (decreased µmin) with increased IE. The hyperpolarization can be 2-3 x 13 mV below Vrest (see Fig 2A), say to -100 mV or more negative. I urge you to make this point very explicitly in the text. Please provide intuitive explanation for this decrease of µmin with increasing IE. Is it that, larger IE tends to speed the passage through threshold, enabling more spikes to occur in a firing episode (greater saturation, larger value of s) and thereby more inhibitory current pulses, decreased µmin? Is this the basis for decreased asymptotic frequency with IE? Please give the reader an intuitive explanation for the decrease of µmin with IE.

2. With conductance-based inhibitory synaptic current there could be saturation of the synaptic activation variable, w(t), according to Isyn = Gsyn w(t) (V-Vsyn) (with modestly hyperpolarizing Vsyn and 0 ≤ w ≤ 1) and recovery with time scale, τsyn. For GABAA inhibition this time scale can be 3-10 ms. Would a Gsyn-based Isyn allow for the dynamic effects and brief intra-SWR events that you find for pulse-coupling in the asymptotic frequency vs IE. How would the dynamics of IFA be altered? Reference to the earlier paper (Donoso et al., 2018) and synaptic filtering is barely adequate. I would like to see also a simulation of the authors’ LIF network with gsyn(t) = Gsyn w(t) for your typical ramping IE(t) with, say, two illustrative values for τsyn and Gsyn. It could be in Supp Info. It’s surprising to us why decreasing Gsyn (GABA blockers) in Donoso et al. (2018) did not affect the network frequency.

Our simulations of your model, but with a conductance-based synaptic current, suggested that there is a longer transient after a firing episode before the distribution of membrane potentials becomes Gaussian-like. With large noise and small external excitatory drive, bimodal or trimodal distributions can persist for more than 500 ms.
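
For concreteness, the conductance-based variant we have in mind can be sketched for a single postsynaptic cell as follows. This is again illustrative only, not the authors' model; the toy presynaptic spike train and all parameter values are our own assumptions.

import numpy as np

# Sketch of a conductance-based inhibitory synapse: presynaptic spikes
# increment an activation variable w (bounded between 0 and 1) that decays
# with time constant tau_syn, and the current scales with the driving
# force (V - V_syn), so the hyperpolarization saturates near V_syn.
tau_m, tau_syn = 10.0, 5.0                     # ms; tau_syn in the 3-10 ms GABA_A range
V_rest, V_thr, V_reset = -65.0, -52.0, -67.0   # mV (assumed values)
V_syn, G_syn = -75.0, 2.0                      # reversal potential (mV) and gain (assumed)
I_E, dt = 20.0, 0.05                           # drive in mV and time step in ms (assumed)

V, w = V_rest, 0.0
for i in range(int(100.0 / dt)):
    pre_spike = (i % 140 == 0)                 # toy presynaptic spike every 7 ms
    w += -dt / tau_syn * w + (1.0 - w) * pre_spike   # saturating activation, 0 <= w <= 1
    I_syn = G_syn * w * (V - V_syn)            # inhibitory current vanishes as V -> V_syn
    V += dt / tau_m * (-(V - V_rest) + I_E - I_syn)
    if V >= V_thr:
        V = V_reset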

3. The idea of no more than one spike per neuron per cycle is clever for avoiding the resetting. Can you provide references to other studies that exploit this feature for network firing events? Is the methodology of the FPE and the approximations for ‘disregarding reset’ generalizable to other network cases: E-only or E-I networks, or other architectures (say, not all-to-all)?

4. I see that you concluded from the analytic formulae in line 794 (according to Eq 41) that K needs to be sufficiently large. That is fine, but I still wonder whether further direct interpretations can be gleaned from the formulae themselves, as they are quite complex.

For example, can you see directly from the analysis what limits there are on the synaptic strength? Maybe you can point to why there is a lower bound for the synaptic strength instead of an upper bound? I will accept that for µmax to be large enough, the recurrent inhibition needs to be weak.

Minor Concerns:

5. I did not find a listing of the individual contributions of the co-authors to the project. I suppose you have provided this during the submission process.

6. I appreciate that you address, early on, for the naive reader, the lack of a precise definition of IFA (lines 115-124). Can you perhaps say explicitly what features are not part of IFA? For example, some “nots” might be “network frequency decreases with IE in the asymptotic case” (true for your network, but IFA may not depend on this), and another “not” is “network frequency changes slowly (decreases/increases) with time during steady input” (there is no explicit slow time scale or adaptation in your formulation).

My impression is that IFA refers to the dynamic effect of deviations from steady (adiabatic) tracking of the asymptotic frequency for the instantaneous value of a dynamically changing IE; IFA decreases with slower dynamic changes in IE. IFA is not ‘adaptation of network frequency’ for steady input. “Accommodation” in IFA is specifically a term applied to responses to dynamic (e.g., ramping) stimuli, as used historically to describe behavior at the individual-neuron level.
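
In symbols (our notation, not the authors’), with f_∞(IE) denoting the asymptotic network frequency under constant drive, this reading could be written as

\[
  f_{\mathrm{adiabatic}}(t) \;=\; f_{\infty}\!\bigl(I_E(t)\bigr),
  \qquad
  \Delta f_{\mathrm{IFA}}(t) \;=\; f_{\mathrm{inst}}(t) \;-\; f_{\infty}\!\bigl(I_E(t)\bigr),
\]

so that IFA is the systematic, speed-dependent deviation Δf_IFA(t) ≠ 0 for a dynamically changing IE(t), which vanishes in the limit of slowly changing drive.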

7. Is it feasible under steady or slowly changing inputs (e.g. optogenetics) to obtain experimental data to support the modeling result that CV decreases with increased IE? Is this mostly due to decreased standard deviation or increased period?

Also from experimental data, what is the variability of local period/frequency during a SPW? Is there correlation between durations of successive SPWs?

8. I suggest that the caption of Fig 3 explicitly describe the shaded representation of the Gaussian distribution as, say, “painted on”, and state explicitly that the black curves in Fig 3B are time courses from the numerical integration of the DDE, citing the equation numbers (8a, b).

9. Can you briefly describe the phenomenological ‘resetting’ for the DDE early on, around equations (9-10), intuitively without the math details?

10. Include horizontal lines at Vthr, Vreset for all time course plots (e.g., Fig 1A, Fig 2A, Fig 3Ai, Bi).

11. Fig 6C, bottom: I think the y-axis label should refer to both V and IE.

12. On page 3 (line 62) it’s mentioned that a refractory period is ignored for simplicity. What effects are we overlooking?

13. In the middle panel of Fig 2D there is evidence of a small wiggle (gray dots) during the decreasing phase of the ramp. What is that about?

14. Re: the stability analysis that leads to the Hopf bifurcation (HB), here and in Brunel & Hakim (1999). The authors state at a few points in the text that the HB is supercritical. It appears to be supercritical, but does the analysis really distinguish whether the HB is super- or subcritical? My impression is that determining the “direction” would require looking at higher-order terms, beyond linear. Please confirm or restate.

15. Re: caption of Fig 6: “Thin grey line: mean membrane potential trajectory μ(t) in response to SPW-like drive IE(t) (numerical integration of DDE Eq (8)).” Please state explicitly whether this integration of the DDE also includes the resetting.

16. The classification of inhibition-first models into categories named perturbation-based and bifurcation-based is not meaningful enough for a general reader. Since your operating range for the DDE is strong coupling and strong drive, far from the HB regime of weakly modulated spiking of Brunel-Hakim, why call it bifurcation-based? You are describing distinctly large-amplitude episodic behavior. Also, what is meant by perturbation-based for the other category?

Actually, I’m wondering why you go through the HB derivation if it already is in the literature?

I suppose, in your view, an essential feature of a ‘bifurcation-based’ model is that the onset of oscillations occurs with weakly modulated (sparse) spiking. Increasing IE leads to increased saturation “s” and therefore a more negative µmin, which in turn overwhelms the increasing drive to produce a slower network frequency. But for the Donoso et al. model (also bifurcation-based) the asymptotic frequency changes little while ‘s’ increases to 0.8 (almost its maximum) with increasing input drive, and then both ‘s’ and the frequency increase with IE. Which is the more important effect of increasing drive in your model: that the frequency decreases or that ‘s’ increases from near zero?

17. What is the initial condition for the numerical integration of the DDE, i.e., µ(t) for -Δ < t < 0, since the DDE is infinite-dimensional?
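
To illustrate the point, a constant history can be supplied as follows when integrating a generic delayed-inhibition DDE by forward Euler. This is a schematic stand-in, not the authors’ Eq (8); the rate nonlinearity r and all parameter values are our own assumptions.

import numpy as np

# Schematic Euler integration of a generic delayed-inhibition DDE,
#   tau * dmu/dt = -(mu - V_rest) + I_E - J * r(mu(t - Delta)),
# illustrating that, because the DDE is infinite-dimensional, mu must be
# prescribed on the whole history interval -Delta <= t <= 0 (here: constant).
tau, Delta, J, V_rest = 10.0, 1.5, 30.0, -65.0   # ms, ms, mV, mV (assumed)
I_E, dt, T = 20.0, 0.01, 200.0                   # assumed values

def r(mu):
    # toy sigmoidal rate nonlinearity (assumed, not the paper's)
    return 1.0 / (1.0 + np.exp(-(mu + 55.0)))

d_steps, steps = int(Delta / dt), int(T / dt)
mu = np.empty(steps + d_steps + 1)
mu[:d_steps + 1] = V_rest                        # constant history on [-Delta, 0]
for i in range(d_steps, steps + d_steps):
    dmu = (-(mu[i] - V_rest) + I_E - J * r(mu[i - d_steps])) / tau
    mu[i + 1] = mu[i] + dt * dmu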

18. Along the lines of the previous query, what is the initial condition for the LIF network? We found in our simulations that if the membrane potentials are initialized uniformly on [Vreset, Vthr], there is a longer initial transient with multiple clusters in the distribution of membrane potentials. Will these clusters eventually merge into one Gaussian-like cluster, or is it possible that for some parameter values more than one cluster persists (e.g., two stable clusters) at large times? The raster may then look like ABABAB, with two different ISIs. This may also happen when the noise level D is large.

19. Re: line 496. “wrt” should be spelled out.

20. Can you mention/summarize, say in the Discussion, the features of your analytic treatment that could generalize to other applications? And highlight the advantages of the analytic approximations, beyond having the DDE derivation, over, say, just integrating the DDE.

Reviewer #3: Overview

In the present MS, the authors suggest that the intra-ripple frequency accommodation observed in experiments arises in an inhibitory interneuron-based network model. In the inhibition-based ripple model, the instantaneous frequency is inversely related to the magnitude of the instantaneous excitatory input. The basic idea of the present MS is that the instantaneous frequency depends not only on the instantaneous excitatory input but also on the membrane potential at the end of the previous ripple cycle, endowing the system with hysteresis. Thus, when the input magnitude increases, the actual instantaneous frequency decreases more rapidly than expected, whereas when the input magnitude decreases, the instantaneous frequency increases more slowly than expected. When the input is a sufficiently short symmetric ramp, this hysteresis automatically translates into a largely monotonically decreasing instantaneous frequency.

General

I find that the MS is by and large clearly written and that the analytical and computational methods are rigorous and well explained. My main problem with the work is that the starting point for the development of the analytical model, the integrate-and-fire network model, yields asymptotic behavior that is inconsistent with previous theoretical and experimental findings. If the basis is inconsistent, then the entire work may be an interesting mathematical analysis, but its relevance for the neurophysiology of ripples would remain unclear.

If the authors can modify the starting point for consistency with previous findings, then I think the MS warrants further consideration.

Major comments

(1) A main inconsistency is between the present integrate-and-fire model and previous spiking and firing-rate models. In the present implementation, the steady-state instantaneous frequency decreases with increased input magnitude (Fig. 1B, top; Fig. 4), whereas in the noise-driven implementation the correlation is clearly positive (see Fig. 6 of Brunel and Hakim, 1999) or by and large flat (Fig. 2 of Brunel and Wang, 2003). It is unclear whether the opposite dependencies of frequency on input magnitude are due to (1) structural differences in the model, (2) differences in parameters, (3) different working domains, or (4) the nature of the input. This inconsistency appears for both the analytical and the spiking models. Please explain the exact differences between the present integrate-and-fire simulation and the Brunel and Hakim simulation, and between the present firing-rate model and the 1999 one.

(2) During spontaneous ripples, the mean ripple frequency exhibits a positive correlation with the SPW amplitude, which is consistent with the Brunel and Hakim model. Moreover, when the excitatory input is controlled experimentally using optogenetics, ripple frequency increases (see Fig. 1 of Stark et al., 2014). These inconsistencies (also see Comment #1 above) cast doubt on the relevance of the starting point of the MS (Figs. 1-4).

(3) In experiments, frequency accommodation is clearly apparent also when the input is a constant optogenetic pulse (for long and short pulses, in CA1, DG, and neocortex). This critical feature of in vivo ripples already appears to test the predictions of the present MS, but it is neither accounted for nor discussed. The authors state that the “simplest form” of a SPW model is a symmetric piecewise-linear ramp, but a pulse is certainly simpler from a theoretical point of view. Can the authors replicate the IFA phenomenon with a pulse?

(4) The authors arrive at the hysteresis insight via an elaborate large-signal approximation to the firing-rate model of Brunel and Hakim (1999), but it is not clear to me that the effort is actually warranted. The basic logic is that the latency to a spike depends on the initial membrane potential, and this can already be seen in a single-neuron spiking model. Can the authors demonstrate the hysteresis effect on the membrane potential of integrate-and-fire neurons?
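
To make the single-cell version of this logic explicit (a standard noiseless-LIF result, in our notation): for τm dV/dt = -(V - Vrest) + IE, with the drive expressed in mV and V∞ = Vrest + IE > Vthr, the latency to threshold from an initial potential V0 is

\[
  T(V_0) \;=\; \tau_m \,\ln\!\frac{V_\infty - V_0}{V_\infty - V_{\mathrm{thr}}},
\]

so a more hyperpolarized V0 inherited from the previous cycle lengthens the next interval even when the instantaneous drive is identical; this is the kind of membrane-potential hysteresis we have in mind.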

Minor comments

(5) Please extend Fig. 1 to the regime where s > 1: give examples (in Fig. 1A) and extend the summary (Fig. 1B) to include the second bifurcation point.

(6) Please add raster plots to Fig. 1A for every regime (e.g., 100 random spike trains out of the 10,000).

(7) Please add spectra of the membrane potentials to Fig. 1A. Do the Vm spectra also exhibit harmonics at full synchronization?

(8) Please report the exact value of s at the first bifurcation, around 0.2 nA.

(9) Please add estimates of the dispersion of the frequency estimation to Fig. 2A, i.e., add error bars to the white dots.

(10) Please explain why the asymptotic steady-state frequency dips at the end of the first ramp and the beginning of the second ramp in Fig. 2B.

(11) In Fig. 2B, the grey dots at the plateau of the input appear to decrease in frequency. Certainly the first cluster (around -10 ms) is above the line. The resolution is not perfect, but the second cluster (around -5 ms) appears to be mostly above the line, whereas the third cluster (around 5 ms) appears to be on the line. Thus, it appears that there is some accommodation even when the input is constant – can the authors comment on this? See also Comment #3 above. If what I wrote is correct, then it seems that a main conclusion of the MS, namely that a symmetric ramp is the “minimal input” that is required to replicate the IFA phenomenon in the inhibitory network model, should be reassessed.

(12) Please write all partial derivatives explicitly in all equations.

(13) What is the source of the “higher harmonics” in Fig. 1A? Are these harmonics due to the fitting of integer multiple sine waves to the sequence of “delta functions”, or is there rhythmicity at 400 Hz (as in Fig. 2A)?

(14) In Fig. 5A, the legend (and Fig. 5B) refers to seven examples, but only three are shown.

(15) Several of the equations are repeated in both the Results and Methods sections. Please refrain from the repetition, or state explicitly next to every repeated equation which other equation it replicates (in the Methods).

**********

Have the authors made all data and (if applicable) computational code underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data and code underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data and code should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data or code —e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No: They argue data will be available upon publication

Reviewer #2: No: The reviewer did not find a statement in the paper about uploading their simulation code(s) to a public repository.

Reviewer #3: No: The authors state that the DOI will be provided after acceptance

**********

PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: No

Figure Files:

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org.

Data Requirements:

Please note that, as a condition of publication, PLOS' data policy requires that you make available all data used to draw the conclusions outlined in your manuscript. Data must be deposited in an appropriate repository, included within the body of the manuscript, or uploaded as supporting information. This includes all numerical values that were used to generate graphs, histograms etc.. For an example in PLOS Biology see here: http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.1001908#s5.

Reproducibility:

To enhance the reproducibility of your results, we recommend that you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. Additionally, PLOS ONE offers an option to publish peer-reviewed clinical study protocols. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols

Attachment

Submitted filename: Review_fnl.pdf

pcbi.1011886.s004.pdf (109.1KB, pdf)
PLoS Comput Biol. doi: 10.1371/journal.pcbi.1011886.r003

Decision Letter 1

Lyle J Graham, Jonathan David Touboul

31 Dec 2023

Dear Ms Schieferstein,

Thank you very much for submitting your manuscript "Intra-ripple frequency accommodation in an inhibitory network model for hippocampal ripple oscillations" for consideration at PLOS Computational Biology. As with all papers reviewed by the journal, your manuscript was reviewed by members of the editorial board and by several independent reviewers. The reviewers appreciated the attention to an important topic. Based on the reviews, we are likely to accept this manuscript for publication, providing that you modify the manuscript according to the review recommendations.

All Reviewers and Editors handling the manuscript are happy with the revision and we are ready to accept your paper. Reviewer #3 made further suggestions for you. After discussion between the Editors, we decided to give you an opportunity to consider these suggestions and include any changes relative to these two points (or leave your revision unchanged), so the manuscript management system considers this as a minor revision.

Please prepare and submit your revised manuscript within 30 days. If you anticipate any delay, please let us know the expected resubmission date by replying to this email.

When you are ready to resubmit, please upload the following:

[1] A letter containing a detailed list of your responses to all review comments, and a description of the changes you have made in the manuscript. Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out

[2] Two versions of the revised manuscript: one with either highlights or tracked changes denoting where the text has been changed; the other a clean version (uploaded as the manuscript file).

Important additional instructions are given below your reviewer comments.

Thank you again for your submission to our journal. We hope that our editorial process has been constructive so far, and we welcome your feedback at any time. Please don't hesitate to contact us if you have any questions or comments.

Sincerely,

Jonathan David Touboul

Academic Editor

PLOS Computational Biology

Lyle Graham

Section Editor

PLOS Computational Biology

***********************

A link appears below if there are any accompanying review attachments. If you believe any reviews to be missing, please contact ploscompbiol@plos.org immediately:

All Reviewers and Editors handling the manuscript are happy with the revision and we are ready to accept your paper. Reviewer #3 made further suggestions for you. After discussion between the Editors, we decided to give you an opportunity to consider these suggestions and include any changes relative to these two points (or leave your revision unchanged), so the manuscript management system considers this as a minor revision.

Reviewer's Responses to Questions

Comments to the Authors:

Please note here if the review is uploaded as an attachment.

Reviewer #1: The authors have addressed extensively my comments and I found the new version is improved. So, I endorse publication.

Reviewer #2: Your detailed responses to reviewers' concerns were appreciated, as were many of the modifications in your revised paper.

Reviewer #3: The paper has been revised and improved. Two key points remain open; both were raised in my previous assessment as well as by other reviewers in the previous round. The concerns did receive detailed replies in the response letter, but insufficient clarification in the paper itself. I believe that once clarified, the paper could make an interesting contribution to the literature.

First, the physiological observation that the work aims to explain/replicate remains insufficiently clear. There are within-ripple changes in frequency, to be sure, but it is unclear whether they are an outcome of the system, of the input to the system, etc. Furthermore, it is unclear whether there is only a decrease (“adaptation”) of frequency or also an increase at the beginning of ripples (see countless wideband traces in the literature, e.g., Fig. 1 of Buzsaki et al., 1992; Fig. 2A of Csicsvari et al., 1999). One way to address this experimentally is to take control over the input. While I agree that the light power applied to pyramidal cells does not map 1:1 to the current input to the interneurons, the two are likely to be positively correlated, in particular for weak inputs and even for recurrent networks. Looking again in detail at the spectrograms in the experimental papers, it seems clear that when the input was constant, there was prominent adaptation of the LFP frequency (e.g., Fig. 2C, Fig. 4B, and Fig. S3C of Stark et al., 2014; Fig. S4A of Fernández-Ruiz et al., 2019). It remains unclear whether the present model replicates this observation.

Second, even after the addition of Supplementary Figure C, it is still unclear whether the model does or does not predict frequency accommodation for a pulse input to the interneurons. In Fig. 2B and in their reply to my previous Comment 11, the authors do report frequency accommodation during fixed input. However, when addressing the previous Comment 3, the opposite is stated, while the importance of the one-cycle caveat is downplayed. This does not receive consistent and direct treatment in the manuscript either. Because the issue is key, it should not be confined to discussions during the peer-review process; rather, the behavior with respect to a pulse input should be shown clearly in a main figure and discussed head-on in the paper.

**********

Have the authors made all data and (if applicable) computational code underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data and code underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data and code should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data or code —e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: No

Figure Files:

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org.

Data Requirements:

Please note that, as a condition of publication, PLOS' data policy requires that you make available all data used to draw the conclusions outlined in your manuscript. Data must be deposited in an appropriate repository, included within the body of the manuscript, or uploaded as supporting information. This includes all numerical values that were used to generate graphs, histograms etc.. For an example in PLOS Biology see here: http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.1001908#s5.

Reproducibility:

To enhance the reproducibility of your results, we recommend that you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. Additionally, PLOS ONE offers an option to publish peer-reviewed clinical study protocols. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols

References:

Review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript.

If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

PLoS Comput Biol. doi: 10.1371/journal.pcbi.1011886.r005

Decision Letter 2

Lyle J Graham, Jonathan David Touboul

1 Feb 2024

Dear Ms Schieferstein,

We are pleased to inform you that your manuscript 'Intra-ripple frequency accommodation in an inhibitory network model for hippocampal ripple oscillations' has been provisionally accepted for publication in PLOS Computational Biology.

Before your manuscript can be formally accepted you will need to complete some formatting changes, which you will receive in a follow up email. A member of our team will be in touch with a set of requests.

Please note that your manuscript will not be scheduled for publication until you have made the required changes, so a swift response is appreciated.

IMPORTANT: The editorial review process is now complete. PLOS will only permit corrections to spelling, formatting or significant scientific errors from this point onwards. Requests for major changes, or any which affect the scientific understanding of your work, will cause delays to the publication date of your manuscript.

Should you, your institution's press office or the journal office choose to press release your paper, you will automatically be opted out of early publication. We ask that you notify us now if you or your institution is planning to press release the article. All press must be co-ordinated with PLOS.

Thank you again for supporting Open Access publishing; we are looking forward to publishing your work in PLOS Computational Biology. 

Best regards,

Jonathan David Touboul

Academic Editor

PLOS Computational Biology

Lyle Graham

Section Editor

PLOS Computational Biology

***********************************************************

PLoS Comput Biol. doi: 10.1371/journal.pcbi.1011886.r006

Acceptance letter

Lyle J Graham, Jonathan David Touboul

14 Feb 2024

PCOMPBIOL-D-23-00433R2

Intra-ripple frequency accommodation in an inhibitory network model for hippocampal ripple oscillations

Dear Dr Schieferstein,

I am pleased to inform you that your manuscript has been formally accepted for publication in PLOS Computational Biology. Your manuscript is now with our production department and you will be notified of the publication date in due course.

The corresponding author will soon be receiving a typeset proof for review, to ensure errors have not been introduced during production. Please review the PDF proof of your manuscript carefully, as this is the last chance to correct any errors. Please note that major changes, or those which affect the scientific understanding of the work, will likely cause delays to the publication date of your manuscript.

Soon after your final files are uploaded, unless you have opted out, the early version of your manuscript will be published online. The date of the early version will be your article's publication date. The final article will be published to the same URL, and all versions of the paper will be accessible to readers.

Thank you again for supporting PLOS Computational Biology and open-access publishing. We are looking forward to publishing your work!

With kind regards,

Anita Estes

PLOS Computational Biology | Carlyle House, Carlyle Road, Cambridge CB4 3DN | United Kingdom ploscompbiol@plos.org | Phone +44 (0) 1223-442824 | ploscompbiol.org | @PLOSCompBiol

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Appendix. Performance evaluation of the Gaussian-drift approximation for constant drive.

    The Gaussian-drift approximation under constant drive is compared to spiking network simulations for a range of noise intensities D and inhibitory coupling strengths K (Fig A). The parameter dependencies of the point of full synchrony are captured well by the theory (Fig B).

    (PDF)

    pcbi.1011886.s001.pdf (306.6KB, pdf)
    S2 Appendix. The influence of network architecture and the shape of the external drive on the asymptotic and instantaneous ripple oscillation dynamics.

    A covariation of network architecture and stimulus profile demonstrates that IFA is modulated by, but occurs largely independent of, the shape of the asymptotic network frequency as a function of the external drive (Fig A). Furthermore, we illustrate that a simple square pulse cannot elicit IFA in the feedback-based inhibition-first model.

    (PDF)

    pcbi.1011886.s002.pdf (212KB, pdf)
    S1 Fig. Transient bimodality in the membrane potential distribution can affect instantaneous ripple frequency dynamics in the spiking network.

    (A) Same layout as in Fig 2A: Spiking network response to an isolated downwards ramp stimulus with the same slope as in Fig 2D, middle, after time t > 10 ms (N = 10,000). Note that units that participate in the third population spike tend not to spike in the fourth population spike (red lines in raster plot), which is an indication of a residual bimodality in the membrane potential distribution from one cycle to the next (only faintly visible in voltage plot, see red square). (B) Same layout as Fig 2B: instantaneous network frequencies pooled together from 50 such ramp-down-only simulations. What appears as a continuous non-monotonic “wiggle” in the instantaneous frequencies of Fig 2D, middle (gray dots) is now clearly identifiable as a single outlier cycle (marked in red).

    (TIF)

    pcbi.1011886.s003.tif (488KB, tif)
    Attachment

    Submitted filename: Review_fnl.pdf

    pcbi.1011886.s004.pdf (109.1KB, pdf)
    Attachment

    Submitted filename: response_to_reviewer_reports.pdf

    pcbi.1011886.s005.pdf (7.5MB, pdf)
    Attachment

    Submitted filename: response_to_reviewers.pdf

    pcbi.1011886.s006.pdf (140.1KB, pdf)

    Data Availability Statement

    All code written in support of this publication is publicly available at https://github.com/NatalieSchieferstein/interneuron_ripples_with_ifa.git (https://doi.org/10.5281/zenodo.10602552). An exemplary simulation result file is available from https://zenodo.org/doi/10.5281/zenodo.10018496.

