Author manuscript; available in PMC: 2019 Sep 14.
Published in final edited form as: Nat Methods. 2019 Jan;16(1):14–15. doi: 10.1038/s41592-018-0276-8

Imaging whole nervous systems: insights into behavior from brains inside bodies

John A Calarco 1, Aravinthan DT Samuel 2
PMCID: PMC6744950  NIHMSID: NIHMS1049646  PMID: 30573822

Abstract

The development of systems combining rapid volumetric imaging and three-dimensional tracking has enabled the measurement of brainwide dynamics in freely behaving animals. These advances provide an exciting opportunity to understand the organization of neural circuits in the context of voluntary and natural behaviors. In this commentary, we highlight recent progress in this burgeoning area of research.


Ethology aims to understand the natural behaviors of freely acting animals. Marrying ethology with neuroscience requires access to the underlying neurons during voluntary behavior. That access demands experimental intervention and, with it, compromise. Inserting recording devices into the brain, whether electrical or optical, is disruptive to the animal and laborious for the experimenter. And when a behavior is studied in the laboratory, an animal’s movements or the richness of its sensory environment are often limited.

Full access to the brain in an unmanipulated animal as it freely performs natural behaviors in a realistic setting would be ideal. This might be realized using small transparent transgenic animals. Powerful microscopes and genetically encoded optical indicators have allowed direct visualization of neural activity inside behaving C. elegans1,2 and larval zebrafish3–5. Recent technological advances are extending these recordings to more neurons, more behaviors, and even more animal models. Brain and behavior can be studied right out of the box in a growing menagerie of vertebrates and invertebrates, using animals moved straight from their rearing chambers into microscope arenas.

Advanced volumetric imaging – methods like light-sheet, light-field, and multifocus microscopy using optical reporters of neural activity like the GCaMP family of calcium indicators6 – led to the first imaging of many or most neurons in the nervous systems of restrained larval zebrafish and C. elegans with the cellular and temporal resolution needed to observe circuit processing7–11. Being able to record system-wide activity patterns in immobilized animals has already led to several surprising discoveries about the neural organization of behavioral states and sensorimotor pathways. For example, in zebrafish, correlating sensory input and brainwide dynamics with the fictive behaviors of immobilized larvae has revealed a multiplicity of pathways that build the optomotor response10,12,13. In C. elegans, brainwide dynamics in immobilized worms during fictive forward and backward movements have revealed attractor dynamics that synchronize large and distinct portions of the nervous system during different motor states8.

When studying neural circuits during the fictive behaviors of restrained animals, the hope is that activity patterns resemble, even if they do not exactly mirror, those of freely behaving animals. But studying immobilized animals has limits. Freely behaving animals use sensory and proprioceptive feedback loops to gauge their own movements and movement decisions with respect to their environments. The full activity of such feedback loops would be difficult to replicate without the most elaborate virtual reality setups. Moreover, the most complex but ethologically important behaviors, such as social behaviors like mating and courtship or predatory behaviors like prey capture, lie outside plausible virtual reality altogether.

Extending brainwide imaging to moving animals required accelerating and adapting the original volumetric imaging platforms, and coupling them to tracking systems that keep the neural circuits inside an animal within the field of view of high-resolution, high-magnification optics. For example, brainwide imaging in crawling C. elegans was achieved by simply speeding up spinning-disk confocal microscopy and combining it with a motion-compensating motorized stage. This re-engineering achieved xyz tracking of brainwide activity with cellular resolution via volumetric imaging of panneuronally expressed genetically encoded calcium indicators1,2.
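The shared logic of these tracking systems can be sketched as a closed feedback loop: each frame, measure the animal's offset from the image center and command the stage to cancel it. The sketch below is a toy one-dimensional proportional controller; the gain, speeds, and units are invented for illustration, and the published systems add motion prediction and run far faster.

```python
def track(target_path, gain=0.6):
    """Toy stage tracking: each frame, move the stage a fraction (`gain`)
    of the way toward the target's current position."""
    stage = 0.0
    errors = []
    for target in target_path:
        error = target - stage      # offset seen in the camera frame
        stage += gain * error       # stage command to re-center the target
        errors.append(abs(target - stage))
    return errors

# A "worm" drifting steadily in one dimension (arbitrary units per frame).
path = [0.1 * t for t in range(50)]
errors = track(path)
print(f"tracking error settles near {errors[-1]:.3f}")
```

With a constant-velocity target, a pure proportional controller settles to a small steady-state lag, which is one reason real systems add predictive terms such as the GPU-based motion prediction described below.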

Similarly, tracking the much faster movements and larger brains of swimming zebrafish larvae was achieved by adapting existing imaging setups. Symvoulidis et al. developed a widefield microscope with a scanning high-magnification field of view that could be rapidly aimed at a swimming larva based on a separate recording of its location5. Cong et al. merged light-field and multifocus microscopy, entrained on the swimming larva using a motion-compensating motorized stage, to capture large image volumes with high spatial and temporal resolution3. Kim et al. developed a variant of HiLo microscopy, a structured-illumination technique, with GPU-based motion prediction to keep up with the larva and achieve brainwide imaging despite the most vigorous movements and accelerations4.

In Drosophila larvae, brainwide imaging has been done with light-sheet microscopy of an immobilized sample, which delivered the isotropic spatial resolution needed for their densely packed neurons14. Adapting light-sheet microscopy for freely moving Drosophila required the technological breakthrough of SCAPE microscopy, a light-sheet method that uses a single microscope objective both to deliver the planar excitation light and to perform volumetric image scans15. Accelerated point-scanning two-photon microscopy has also been developed for freely crawling Drosophila larvae, with a setup that locks onto individual neuronal targets and tracks them rapidly in xyz despite the animal’s considerable peristaltic movements and deformations16.

For all the obvious advantages of studying brainwide activity in intact, freely moving animals, new and unique challenges arise. Extracting individual activity traces from long, high-temporal-resolution recordings of hundreds or thousands of uniformly labeled neurons is challenging in any setup. The challenge lies in the post-acquisition identification and registration of all individual neurons through space and time. When neurons stay nearly fixed within an image volume, it is relatively straightforward for software to track the same neuron from frame to frame based on the neuron’s position, or to use more sophisticated computational approaches like nonnegative matrix factorization to separate and cluster the activities of all distinct neurons17,18. But these methods fail when neurons move dramatically from volume to volume, and when animal movements cause nonlinear deformations between neighboring cells. In this case, analysis of brainwide imaging in behaving animals has required laborious manual annotation or semi-automated tracking with substantial manual proofreading. These workarounds introduce a severe bottleneck in the experimental pipeline: microscopes can now generate volumetric image series many orders of magnitude faster than they can be analyzed.
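The demixing idea can be made concrete with a toy sketch: factor a synthetic pixels-by-time movie into nonnegative spatial footprints and activity traces using plain multiplicative-update NMF. The data, rank, and update rule here are illustrative stand-ins, not the actual pipelines of refs. 17 and 18.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "recording": 3 neurons with sparse spatial footprints over
# 50 pixels and nonnegative activity traces over 200 time points.
n_pixels, n_frames, n_neurons = 50, 200, 3
footprints = rng.random((n_pixels, n_neurons)) * (rng.random((n_pixels, n_neurons)) < 0.2)
traces = np.abs(rng.normal(size=(n_neurons, n_frames)))
movie = footprints @ traces + 0.01 * rng.random((n_pixels, n_frames))

def nmf(V, rank, iters=500, eps=1e-9):
    """Multiplicative-update NMF: factor V into W @ H with W, H >= 0."""
    W = rng.random((V.shape[0], rank)) + eps
    H = rng.random((rank, V.shape[1])) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# W recovers spatial components; H recovers the demixed activity traces.
W, H = nmf(movie, rank=n_neurons)
err = np.linalg.norm(movie - W @ H) / np.linalg.norm(movie)
print(f"relative reconstruction error: {err:.3f}")
```

This works precisely because the footprints stay put from frame to frame; once the animal moves and deforms, the fixed spatial factorization no longer applies, which is the failure mode described above.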

Better and faster analysis engines are needed. One approach used in C. elegans is to use machine learning to teach a computer to recognize neurons based on the local constellation of neighboring neurons19. Even when the brain undergoes large overall distortions, neighboring neurons generally maintain their local relative positions, and the human proofreader only needs to check the errors made by the machine-learning algorithm. Another approach might be to simplify neuron localization and discrimination by adding optical labels. It is difficult to separate and disambiguate a large cluster of neurons that all carry identical panneuronal fluorescent labels, such as cytosolic GCaMP to reveal calcium dynamics paired with a nuclear-localized fluorescent protein to denote cell position. The spacing that comes with nuclear localization is useful for drawing clear boundaries between neighboring neurons, but identifying neurons based on nuclear markers alone remains challenging. Even in a relatively invariant nervous system like that of C. elegans, where axons and dendrites are stitched together by stereotyped synaptic connections, cell bodies can move and rearrange within ganglia, making their relative positions variable from measurement to measurement. However, a rich palette of fluorescent proteins is now available that can be imaged together with GCaMP owing to separable excitation and/or emission spectra20. Advances in labeling neighboring cells with unique combinations of fluorescent proteins, such as the “Brainbow” approach, could enable large numbers of neurons to be uniquely identified and tracked longitudinally21,22. In C. elegans, Drosophila, and zebrafish, the ability to express different combinations of fluorescent proteins in specific neurons can already be achieved by leveraging the wealth of known neuron subtype-specific promoters and drivers23–25.
A combination of machine-learning algorithms and advanced cell labeling (along with the necessary multicolor adaptations to the microscopes) is likely to deliver the much needed acceleration of the analysis pipeline.
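The constellation idea can be sketched in a few lines: describe each neuron by the sorted distances to its nearest neighbors, a signature that tolerates smooth global deformation, then re-identify neurons across frames by matching signatures. The point count, warp, and descriptor below are invented for illustration and are far simpler than the actual learned method of ref. 19.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy brain: 30 "neurons" in 3-D. Frame B is frame A after a smooth
# nonlinear warp plus jitter, with the neuron order shuffled.
n = 30
frame_a = rng.random((n, 3))
perm = rng.permutation(n)
frame_b = (frame_a + 0.04 * np.sin(2 * frame_a)
           + 0.003 * rng.normal(size=frame_a.shape))[perm]

def constellation(points, k=5):
    """Describe each neuron by sorted distances to its k nearest neighbors,
    a local 'constellation' that survives smooth global deformation."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    d.sort(axis=1)
    return d[:, 1:k + 1]            # drop the zero self-distance

desc_a, desc_b = constellation(frame_a), constellation(frame_b)

# Match each frame-B neuron to the frame-A neuron with the most similar
# constellation; `perm` holds the ground-truth correspondence.
cost = np.linalg.norm(desc_b[:, None] - desc_a[None, :], axis=-1)
matches = cost.argmin(axis=1)
accuracy = (matches == perm).mean()
print(f"correctly re-identified: {accuracy:.0%}")
```

In practice such local descriptors are combined with position priors and trained classifiers, with a human proofreading only the residual errors.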

Even with fully extracted neural dynamics, understanding voluntary behaviors in freely moving animals poses a unique challenge. Each experiment, with different animals spontaneously generating their own behaviors, can be expected to yield a unique trajectory of locomotor and brain activity. Computer vision can reliably quantify, segment, and label the movement repertoires of behaving animals26. But even in animals with relatively simple body plans like C. elegans, stochastic transitions between motor patterns and behavioral states create inherent variability from dataset to dataset. This creates the significant challenge of aligning and comparing datasets across experiments to extract common mechanisms. Moreover, outwardly identical motor states can differ in internal variables that alter how the animal makes decisions or responds to sensory inputs. Large-scale, high-resolution quantitative analysis of the movements of freely moving Drosophila and C. elegans has shown that Markov models can describe their outward behavior patterns as time series of transitions between distinct states27,28. These analyses have revealed a surprisingly large number of behavioral states diversified by hidden variables.

If the experimental pipeline for acquiring and analyzing data can be sped up, a brute-force approach can be used to deal with the intrinsic variability of datasets: simply acquire so much data that many similar epochs across time series can be collected, clustered, and compared. An alternative, holistic approach is to turn to modeling and machine learning (Scott Linderman, personal communication). For example, one might build a computational model of a behaving animal that is constrained to a Markov process. The Markov model would start with the observed behavioral states of freely moving animals as well as hidden states, and then adaptively learn the transition probabilities between states, whether spontaneous or stimulus-evoked, to match the statistics of experimental data. Computational models might be iterated and compared to experimental datasets, progressively refined until they approximate the behavioral rules of real animals. At this point, the computational Markov model that resembles the real animal could become a framework for interpreting and understanding neural dynamics.
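As a minimal sketch of the Markov-model idea, the snippet below simulates a state sequence from a hypothetical four-state behavioral chain and then recovers the transition matrix by counting observed transitions, which is the maximum-likelihood estimate. The state names and probabilities are made up for illustration; the cited analyses fit far richer models with hidden states.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical behavioral states and transition probabilities (each row
# sums to 1); these numbers are invented, not measured.
states = ["forward", "reverse", "turn", "pause"]
true_T = np.array([
    [0.90, 0.04, 0.04, 0.02],   # forward runs mostly persist
    [0.30, 0.60, 0.08, 0.02],
    [0.50, 0.05, 0.40, 0.05],
    [0.25, 0.05, 0.05, 0.65],
])

def simulate(T, n_steps, start=0):
    """Draw a state sequence from the Markov chain with transition matrix T."""
    seq = [start]
    for _ in range(n_steps - 1):
        seq.append(rng.choice(len(T), p=T[seq[-1]]))
    return np.array(seq)

seq = simulate(true_T, 20_000)

# Maximum-likelihood fit: count observed transitions, normalize each row.
counts = np.zeros_like(true_T)
np.add.at(counts, (seq[:-1], seq[1:]), 1)
est_T = counts / counts.sum(axis=1, keepdims=True)

for name, row in zip(states, np.round(est_T, 2)):
    print(f"{name:8s} {row}")
```

Fitting hidden states on top of this, as in a hidden Markov model, requires expectation-maximization rather than simple counting, but the principle of matching transition statistics to data is the same.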

Although much remains to be developed in the various technologies for studying brain and behavior in freely moving animals, studies in C. elegans, zebrafish larvae, and Drosophila larvae are underway. Many of these studies pursue behaviors that simply cannot be studied in semirestrained or immobilized animals, such as prey capture in zebrafish larvae3 and the mating behavior of C. elegans (V. Susoy, V. Venkatachalam, and A. Leifer, personal communication). Another exciting avenue is to develop new model organisms that appear to be “made for microscopy”, such as the cnidarian Hydra vulgaris29 and the marine annelid Platynereis dumerilii30. Genome-editing approaches such as the CRISPR/Cas system31 will undoubtedly enable the transgenesis and study of additional metazoans, facilitating comparative studies of neural circuits and their evolution. Systems neuroscience in these small and powerful model animals is enjoying rapid growth.

References:

1. Nguyen JP et al. Whole-brain calcium imaging with cellular resolution in freely behaving Caenorhabditis elegans. Proc Natl Acad Sci U S A 113, E1074–1081, doi: 10.1073/pnas.1507110112 (2016).
2. Venkatachalam V et al. Pan-neuronal imaging in roaming Caenorhabditis elegans. Proc Natl Acad Sci U S A 113, E1082–1088, doi: 10.1073/pnas.1507109113 (2016).
3. Cong L et al. Rapid whole brain imaging of neural activity in freely behaving larval zebrafish (Danio rerio). Elife 6, doi: 10.7554/eLife.28158 (2017).
4. Kim DH et al. Pan-neuronal calcium imaging with cellular resolution in freely swimming zebrafish. Nat Methods 14, 1107–1114, doi: 10.1038/nmeth.4429 (2017).
5. Symvoulidis P et al. NeuBtracker-imaging neurobehavioral dynamics in freely behaving fish. Nat Methods 14, 1079–1082, doi: 10.1038/nmeth.4459 (2017).
6. Fosque BF et al. Labeling of active neural circuits in vivo with designed calcium integrators. Science 347, 755–760, doi: 10.1126/science.1260922 (2015).
7. Ahrens MB et al. Brain-wide neuronal dynamics during motor adaptation in zebrafish. Nature 485, 471–477, doi: 10.1038/nature11057 (2012).
8. Kato S et al. Global brain dynamics embed the motor command sequence of Caenorhabditis elegans. Cell 163, 656–669, doi: 10.1016/j.cell.2015.09.034 (2015).
9. Schrodel T, Prevedel R, Aumayr K, Zimmer M & Vaziri A. Brain-wide 3D imaging of neuronal activity in Caenorhabditis elegans with sculpted light. Nat Methods 10, 1013–1020, doi: 10.1038/nmeth.2637 (2013).
10. Naumann EA et al. From Whole-Brain Data to Functional Circuit Models: The Zebrafish Optomotor Response. Cell 167, 947–960.e20, doi: 10.1016/j.cell.2016.10.019 (2016).
11. Abrahamsson S et al. Multifocus microscopy with precise color multi-phase diffractive optics applied in functional neuronal imaging. Biomed Opt Express 7, 855–869, doi: 10.1364/BOE.7.000855 (2016).
12. Dunn TW et al. Neural Circuits Underlying Visually Evoked Escapes in Larval Zebrafish. Neuron 89, 613–628, doi: 10.1016/j.neuron.2015.12.021 (2016).
13. Kawashima T, Zwart MF, Yang CT, Mensh BD & Ahrens MB. The Serotonergic System Tracks the Outcomes of Actions to Mediate Short-Term Motor Learning. Cell 167, 933–946.e20, doi: 10.1016/j.cell.2016.09.055 (2016).
14. Chhetri RK et al. Whole-animal functional and developmental imaging with isotropic spatial resolution. Nat Methods 12, 1171–1178, doi: 10.1038/nmeth.3632 (2015).
15. Bouchard MB et al. Swept confocally-aligned planar excitation (SCAPE) microscopy for high speed volumetric imaging of behaving organisms. Nat Photonics 9, 113–119, doi: 10.1038/nphoton.2014.323 (2015).
16. Karagyozov D, Mihovilovic Skanata M, Lesar A & Gershow M. Recording Neural Activity in Unrestrained Animals with Three-Dimensional Tracking Two-Photon Microscopy. Cell Rep 25, 1371–1383.e10, doi: 10.1016/j.celrep.2018.10.013 (2018).
17. Friedrich J et al. Multi-scale approaches for high-speed imaging and analysis of large neural populations. PLoS Comput Biol 13, e1005685, doi: 10.1371/journal.pcbi.1005685 (2017).
18. Friedrich J et al. In Conf Neural Inf Process Syst 1–5 (Montreal, Canada, 2015).
19. Nguyen JP, Linder AN, Plummer GS, Shaevitz JW & Leifer AM. Automatically tracking neurons in a moving and deforming brain. PLoS Comput Biol 13, e1005517, doi: 10.1371/journal.pcbi.1005517 (2017).
20. Day RN & Davidson MW. The fluorescent protein palette: tools for cellular imaging. Chem Soc Rev 38, 2887–2921, doi: 10.1039/b901966a (2009).
21. Livet J et al. Transgenic strategies for combinatorial expression of fluorescent proteins in the nervous system. Nature 450, 56–62, doi: 10.1038/nature06293 (2007).
22. Weissman TA & Pan YA. Brainbow: new resources and emerging biological applications for multicolor genetic labeling and analysis. Genetics 199, 293–306, doi: 10.1534/genetics.114.172510 (2015).
23. Hobert O, Glenwinkel L & White J. Revisiting Neuronal Cell Type Classification in Caenorhabditis elegans. Curr Biol 26, R1197–R1203, doi: 10.1016/j.cub.2016.10.027 (2016).
24. Jenett A et al. A GAL4-driver line resource for Drosophila neurobiology. Cell Rep 2, 991–1001, doi: 10.1016/j.celrep.2012.09.011 (2012).
25. Marquart GD et al. A 3D Searchable Database of Transgenic Zebrafish Gal4 and Cre Lines for Functional Neuroanatomy Studies. Front Neural Circuits 9, 78, doi: 10.3389/fncir.2015.00078 (2015).
26. Kabra M, Robie AA, Rivera-Alba M, Branson S & Branson K. JAABA: interactive machine learning for automatic annotation of animal behavior. Nat Methods 10, 64–67, doi: 10.1038/nmeth.2281 (2013).
27. Berman GJ, Bialek W & Shaevitz JW. Predictability and hierarchy in Drosophila behavior. Proc Natl Acad Sci U S A 113, 11943–11948, doi: 10.1073/pnas.1607601113 (2016).
28. Liu M, Sharma AK, Shaevitz JW & Leifer AM. Temporal processing and context dependency in Caenorhabditis elegans response to mechanosensation. Elife 7, doi: 10.7554/eLife.36419 (2018).
29. Dupre C & Yuste R. Non-overlapping Neural Networks in Hydra vulgaris. Curr Biol 27, 1085–1097, doi: 10.1016/j.cub.2017.02.049 (2017).
30. Chartier TF, Deschamps J, Durichen W, Jekely G & Arendt D. Whole-head recording of chemosensory activity in the marine annelid Platynereis dumerilii. Open Biol 8, doi: 10.1098/rsob.180139 (2018).
31. Knott GJ & Doudna JA. CRISPR-Cas guides the future of genetic engineering. Science 361, 866–869, doi: 10.1126/science.aat5011 (2018).
