
Figure 1 | Improving inference of network state from 2p imaging.


(a) Calcium imaging can monitor the activity of many neurons simultaneously, in 3-D, often with cell types and layers of interest identified. In contrast, electrophysiology sparsely samples the neurons in the vicinity of a recording electrode and may be biased toward neurons with high firing rates. (b) Calcium fluorescence transients are a low-passed, lossy transformation of the underlying spiking activity. Spike inference methods may provide a reasonable estimate of neurons’ activity on coarse timescales (left) but yield poor estimates on fine timescales (right; data from ref. 7). (c) RADICaL uses a recurrent neural network-based generative model to infer network state (i.e., denoised event rates for the population of neurons) and assumes a time-varying zero-inflated gamma (ZIG) observation model. For any given trial, the time-varying network state is captured by three pieces of information: the initial state (i.e., “initial condition”) of the dynamical system (trial-specific), the dynamical rules that govern state evolution (shared across trials), and any time-varying external inputs (i.e., “inferred inputs”) that may affect the dynamics (trial-specific). (d) Top: in 2p imaging, the laser’s serial scanning results in different neurons being sampled at different times within each frame. Bottom: individual neurons’ sampling times are known with sub-frame precision (colors) but are typically analyzed with whole-frame precision (gray). (e) Sub-frame binning precisely captures individual neurons’ sampling times but leaves some neuron-time points without data. The numbers in the table indicate the deconvolved events in each frame. (f) SBTT (selective backpropagation through time) is a novel network training method for sparsely sampled data that prevents unsampled neuron-time points from affecting the gradient computation.
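
Panel (c) summarizes the generative model as three ingredients: a trial-specific initial condition, dynamical rules shared across trials, and trial-specific time-varying inferred inputs. The sketch below is a minimal illustration of how these ingredients compose in an RNN rollout; it is not the RADICaL architecture itself, and the GRU cell, class name, and dimension names are assumptions chosen for brevity.

```python
# A minimal sketch (not the RADICaL architecture) of panel c's three
# ingredients: trial-specific initial condition, shared dynamics, and
# trial-specific inferred inputs driving an RNN.
import torch


class LatentDynamics(torch.nn.Module):  # hypothetical class name
    def __init__(self, state_dim: int, input_dim: int):
        super().__init__()
        # Dynamical rules that govern state evolution (shared across trials)
        self.cell = torch.nn.GRUCell(input_dim, state_dim)

    def forward(self, initial_state: torch.Tensor,
                inferred_inputs: torch.Tensor) -> torch.Tensor:
        """Roll out the network state for a batch of trials.

        initial_state:   (batch, state_dim), trial-specific initial condition
        inferred_inputs: (batch, time, input_dim), trial-specific inputs
        """
        state, states = initial_state, []
        for t in range(inferred_inputs.shape[1]):
            # The same dynamical rules are applied at every time step
            state = self.cell(inferred_inputs[:, t], state)
            states.append(state)
        return torch.stack(states, dim=1)  # (batch, time, state_dim)
```

In a full model along these lines, a readout of the rolled-out state would supply the parameters of the ZIG observation model for each neuron and time point.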
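Panels (d) and (e) describe sub-frame binning: each neuron’s within-frame sampling time, known from the laser’s scan pattern, assigns its deconvolved events to a finer time bin, leaving the remaining neuron-time bins empty. Below is a minimal sketch of that idea; the function name and inputs (per-neuron sampling offsets in [0, 1) and frame-resolution events) are hypothetical.

```python
# A minimal sketch of sub-frame binning (panel e), assuming each neuron's
# within-frame sampling offset is known from the scan pattern.
import numpy as np


def subframe_bin(events: np.ndarray, sample_offsets: np.ndarray,
                 bins_per_frame: int):
    """Place each neuron's per-frame events into its sub-frame time bin.

    events:         (frames, neurons) deconvolved events at frame resolution
    sample_offsets: (neurons,) within-frame sampling time in [0, 1)
    bins_per_frame: temporal upsampling factor

    Returns (binned, mask): binned has shape (frames * bins_per_frame,
    neurons); mask marks which neuron-time bins actually contain data.
    """
    n_frames, n_neurons = events.shape
    binned = np.zeros((n_frames * bins_per_frame, n_neurons))
    mask = np.zeros_like(binned, dtype=bool)
    # Each neuron's sub-frame bin index within any given frame
    bin_in_frame = np.floor(sample_offsets * bins_per_frame).astype(int)
    for f in range(n_frames):
        rows = f * bins_per_frame + bin_in_frame
        binned[rows, np.arange(n_neurons)] = events[f]
        mask[rows, np.arange(n_neurons)] = True
    return binned, mask
```

The returned mask records which neuron-time bins were actually sampled; this is exactly the information a method like SBTT needs in order to exclude the empty bins (see the next sketch).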
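The key idea in panel (f) is that unsampled neuron-time points are masked out of the loss, so they contribute nothing to the gradient during backpropagation. The sketch below illustrates that masking; it uses a simple Poisson likelihood for brevity, whereas the model described above assumes a ZIG likelihood, and the function name and tensor shapes are assumptions.

```python
# A minimal sketch of the masking idea behind SBTT (panel f): only
# sampled neuron-time points enter the loss, so unsampled points cannot
# affect the gradient computation. Poisson NLL stands in for the ZIG
# likelihood used by the actual model.
import torch


def masked_poisson_nll(rates: torch.Tensor, events: torch.Tensor,
                       sample_mask: torch.Tensor) -> torch.Tensor:
    """Average Poisson NLL (up to a constant) over sampled points only.

    rates:       (trials, time, neurons) inferred event rates, > 0
    events:      (trials, time, neurons) deconvolved events; values at
                 unsampled points are ignored entirely
    sample_mask: (trials, time, neurons) bool; True where the neuron was
                 actually sampled in that sub-frame time bin
    """
    nll = rates - events * torch.log(rates)
    # Zero out unsampled points: the zeros are constants, so autograd
    # propagates no gradient from them back into the network.
    nll = torch.where(sample_mask, nll, torch.zeros_like(nll))
    # Normalize by the number of sampled points, not the tensor size
    return nll.sum() / sample_mask.sum().clamp(min=1)
```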