eLife. 2020 Apr 14;9:e53350. doi: 10.7554/eLife.53350

The natverse, a versatile toolbox for combining and analysing neuroanatomical data

Alexander Shakeel Bates 1, James D Manton 1, Sridhar R Jagannathan 2, Marta Costa 1,2, Philipp Schlegel 1,2, Torsten Rohlfing 3, Gregory SXE Jefferis 1,2
Editor: K VijayRaghavan
PMCID: PMC7242028  PMID: 32286229

Abstract

To analyse neuron data at scale, neuroscientists expend substantial effort reading documentation, installing dependencies and moving between analysis and visualisation environments. To facilitate this, we have developed a suite of interoperable open-source R packages called the natverse. The natverse allows users to read local and remote data and perform popular analyses including visualisation, clustering and graph-theoretic analysis of neuronal branching. Unlike most tools, the natverse enables comparison of morphology and connectivity across many neurons after imaging or co-registration within a common template space. The natverse also enables transformations between different template spaces and imaging modalities. We demonstrate tools that integrate the vast majority of Drosophila neuroanatomical light microscopy and electron microscopy connectomic datasets. The natverse is an easy-to-use environment for neuroscientists to solve complex, large-scale analysis challenges as well as an open platform to create new code and packages to share with the community.

Research organism: D. melanogaster, Mouse, Zebrafish

Introduction

Neuroanatomy has become a large-scale, digital and quantitative discipline. Improvements in sample preparation and imaging increasingly enable the collection of large 3D image volumes containing complete neuronal morphologies in the context of whole brains or brain regions. Neuroscientists, therefore, need to tackle large amounts of morphological data, often writing custom code to enable repeated analysis using their specific requirements. They also need to analyse neuronal morphology and connectivity in the context of whole nervous systems or sub-regions. However, it is desirable not to rewrite basic functionalities such as reading various types of data file, representing neurons in different data structures, implementing spatial transforms between samples, integrating popular datasets or performing popular analyses from scratch. Scaling up or developing custom analysis strategies is simpler and more feasible for researchers if they can reuse existing infrastructure. This has been amply demonstrated by flexible but open source platforms such as ImageJ/Fiji for image analysis (Schindelin et al., 2012) or Bioconductor for bioinformatics (Huber et al., 2015). One important consequence of these free and open-source tools is that they aid collaboration and reproducibility, and reduce the overhead when switching between different types of analysis. Together, these considerations have motivated us to create the NeuroAnatomy Toolbox (nat) and its extensions, which we detail in this paper.

A number of software tools are already available to analyse neuronal data (Billeci et al., 2013; Brown et al., 2005; Cuntz et al., 2010; Feng et al., 2015; Gensel et al., 2010; Glaser and Glaser, 1990; Ho et al., 2011; Katz and Plaza, 2019; Kim et al., 2015; Meijering et al., 2004; Myatt et al., 2012; Narro et al., 2007; Peng et al., 2014; Pool et al., 2008; Saalfeld et al., 2009; Schmitz et al., 2011; Wearne et al., 2005). However, most focus on image processing, and the morphological analysis options available are fairly basic, such as examining arbour lengths or performing Sholl analyses (Sholl, 1953). Of these, the trees toolbox (Cuntz et al., 2010) has particularly strong support for morphological analysis of neurons but focuses on individual neurons in isolation rather than neurons within the volume of the brain as a whole.

Recent technological advances have made acquiring large amounts of neuronal morphology data in their whole-brain contexts feasible across phyla (Chiang et al., 2011; Cook et al., 2019; Economo et al., 2016; Jenett et al., 2012; Kunst et al., 2019; Li et al., 2019; Oh et al., 2014; Ohyama et al., 2015; Ryan et al., 2016; Winnubst et al., 2019; Zheng et al., 2018). Image data are typically registered against a template space, allowing one to compare data from many brains directly and quantitatively. This significantly aids the classification of neuronal cell types because it allows type classification relative to the arbours of other neuronal types (Sümbül et al., 2014) and anatomical subvolumes. However, while this enables the comparison of data within a given study, template spaces are often different across studies or laboratories, hindering data integration.

This paper describes the Neuroanatomy Toolbox (nat), a general purpose open source R-based package for quantitative neuroanatomy, and a suite of extension R packages that together we call the natverse. A distinctive feature of the natverse, as compared with other available tools, is to analyse neurons within and across template spaces and to simplify access to a wide range of data sources. Neurons can be read from local files or from online repositories (Ascoli et al., 2007; Chiang et al., 2011; Economo et al., 2016; Jenett et al., 2012; Kunst et al., 2019; Winnubst et al., 2019) and web-based reconstruction environments (Katz and Plaza, 2019; Saalfeld et al., 2009; Schneider-Mizell et al., 2016). The natverse can be installed in two lines of code as described on the project website (https://natverse.org). Every function is documented with a large number of examples based on bundled or publicly available data. Example pipeline code, and code to generate the figures in this manuscript is available through https://github.com/natverse/nat.examples. We provide online community support through our nat-user mailing list: https://groups.google.com/forum/#!forum/nat-user.
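The two-line installation mentioned above might look like the following. This is a sketch based on the instructions published at https://natverse.org; the exact commands may change between releases, so the project website remains the authoritative source:

```r
# Sketch of the two-line natverse install described at natverse.org.
# natmanager is a small CRAN helper package that manages the GitHub
# installation of the full natverse suite and its dependencies.
install.packages("natmanager")
natmanager::install("natverse")
```

After installation, `library(natverse)` attaches all constituent packages at once.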

The natverse has recently been employed for large-scale analysis of zebrafish data (Kunst et al., 2019), and we provide examples across a range of invertebrate and vertebrate species. We then give more specific examples focussing on cell type identification across Drosophila datasets. Using the natverse, we have created bridging registrations that transform data from one template to another along with mirroring registrations (e.g. left-to-right hemisphere) and made these easily deployable. This unifies all publicly available Drosophila neuroanatomical datasets, including image data for genetic resources and whole-brain connectomics.

We now give an overview of the natverse and showcase a number of common applications. These applications include quantifying the anatomical features of neurons, clustering neurons by morphology, analysing neuroanatomical data relative to subvolumes, in silico intersections of genetic driver lines, matching light-level and EM-level neuronal reconstructions and registering and bridging neuroanatomical data to and between template spaces.

Results

Software packages for neuroanatomy

We have opted to develop our software in R, a leading platform for bioinformatics and general data analysis. R is free and open source, and is supported by high-quality integrated development environments (e.g. Rstudio). It features a well-defined system for creating and distributing extension packages that bundle code and documentation. These can easily be installed from high-quality curated repositories (CRAN, Bioconductor) as well as via GitHub. R supports a range of reproducible research strategies including reports and notebooks and integrates with the leading cross-platform tools in this area (jupyter, binder).

The core package of the natverse is the Neuroanatomy Toolbox, nat. It supports 3D visualisation and analysis of neuroanatomical data (Figure 1a), especially tracings of single neurons (Figure 1b). nat allows a user to read neuronal data from a variety of popular data formats produced by neuron reconstruction tools (Figure 1a). Typical image analysis pipelines include imaging neurons with confocal microscopy, reconstructing them using Fiji Simple Neurite Tracer (Longair et al., 2011) then saving them as SWC files (Cannon et al., 1998); nat can read a collection of such files with a single command. In addition, a user can, for example, mark the boutons on each neuron using Fiji’s point tool and export that as a CSV, load this into nat and then analyse the placement of these synaptic boutons with respect to the originally traced neuron (Figure 1—figure supplement 1).
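The SWC-reading pipeline described above can be sketched in a few lines. The directory and CSV names here are hypothetical placeholders for a user's own Simple Neurite Tracer exports:

```r
library(nat)

# Read every SWC file in a directory into a single neuronlist
# ("traced/" is a hypothetical folder of Simple Neurite Tracer exports)
neurons <- read.neurons("traced/", pattern = "\\.swc$")
summary(neurons)   # per-neuron counts and total cable length
plot3d(neurons)    # interactive 3D view via rgl

# Load bouton positions exported from Fiji's point tool
# ("boutons.csv" is hypothetical, with X, Y and Z columns)
boutons <- read.csv("boutons.csv")
rgl::points3d(xyzmatrix(boutons), col = "red")
```

`xyzmatrix` is the generic natverse accessor for 3D coordinates and works on data.frames, neurons and surfaces alike.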

Figure 1. The natverse.

(a) R packages that constitute the natverse. Packages are coloured by whether they are general purpose, or cater specifically for Mus musculus, Danio rerio or Drosophila melanogaster datasets. Coarse division into packages for fetching remote data, implementing registrations and analysing data are shown. Data, as outputted by most reconstruction pipelines, can also be read by nat. (b) The natverse is designed to work best in the RStudio environment (RStudio Team, 2015), by far the most popular environment in which to execute and write R code. 3D visualisation is based on the R package rgl (Murdoch, 2001), which is based on OpenGL and uses the XQuartz windowing system. It runs on Mac, Linux and Windows platforms. (c) R functions can be called by other popular scientific programming languages, some example packages/libraries are shown. In particular, there is support for bidirectional interaction with Python, with interactive use supported by Jupyter or R Markdown notebooks. One of us (P. Schlegel) has developed Python code inspired by the natverse for analysing neuron morphology, NAVIS (https://github.com/schlegelp/navis), and talking to the CATMAID API, PyMaid (https://github.com/schlegelp/pyMaid). These python libraries transform natverse-specific R objects into Python data types and can call natverse R functions like nblast.


Figure 1—figure supplement 1. A basic analysis pipeline.


A simple pipeline for neuron analysis with nat. Bouton placement for a layer 5 pyramidal neuron from the mouse primary somatosensory cortex is examined relative to the neuron’s cable length and position within the layered structure of the cortex. Coloured names indicate layer transitions. Data courtesy of A. Vourvoukelis, A. von Klemperer and C.J. Akerman.

We have extended nat by building the natverse as an ecosystem of interrelated R packages, each with a discrete purpose (Figure 1a). The natverse is developed using modern software best practices including revision control, code review, unit testing, continuous integration, and comprehensive code coverage. Developing sets of functions in separate packages helps compartmentalise development, eases troubleshooting and divides the natverse into documented units that users can search to find the more specific code examples or functions that they need. To the casual user, these divisions may initially be of little consequence. We therefore provide a single wrapper package, natverse; installing this results in the installation of all packages and their dependencies, immediately giving the user all the capabilities described in this paper (Figure 1a). Natverse packages have already been used in recent publications from our lab (Cachero et al., 2010; Costa et al., 2016; Dolan et al., 2019; Dolan et al., 2018a; Dolan et al., 2018b; Frechter et al., 2019; Grosjean et al., 2011; Huoviala et al., 2018; Jefferis et al., 2007) and others (Clemens et al., 2018; Clemens et al., 2015; Eichler et al., 2017; Felsenberg et al., 2018; Jeanne et al., 2018; Kunst et al., 2019; Saumweber et al., 2018; Zheng et al., 2018), with the nat.nblast package described in Costa et al., 2016. Confirmed stable versions of nat, nat.templatebrains, nat.nblast, nat.utils and nabor can be downloaded from the centralised R package repository, CRAN, with developmental versions available from our GitHub page (https://github.com/natverse/).

In brief, natverse packages can be thought of as belonging to four main groups (Figure 1a). The first two support obtaining data, either by a) interacting with repositories and software primarily used for neuron reconstructions from electron micrograph (EM) data, including CATMAID, NeuPrint and DVID (Clements et al., 2020; Katz and Plaza, 2019; Saalfeld et al., 2009; Schneider-Mizell et al., 2016) or b) interacting with repositories for light-level data, including MouseLight, FlyCircuit, Virtual Fly Brain, NeuroMorpho, the InsectBrainDB and the FishAtlas projects. Additional R packages help with c) manipulating and deploying registrations to move data between brainspaces, and d) data analysis and visualisation (see Materials and methods for additional details). In order to see how one can use the natverse in RStudio to visualise and analyse neurons, please see Videos 1–5.

Video 1. Short tutorial videos.


Short tutorial on how to use basic natverse functionality in RStudio, for example, loading and installing the natverse, plotting neurons and volumes, bridging between template brains, using NBLAST and comparing EM and LM data.

Video 2. Installing the natverse.


Video 3. Bridging neuron data.


Video 4. Morphological clustering.


Video 5. Comparing different datasets.


Manipulating neuroanatomical data

Neuron skeleton data

Raw 3D images enable true to life visualisation but simplified representations are usually required for data analysis. For example, neurons can be traced to generate a compact 3D skeleton consisting of 3D vertices joined by edges. A more accurate representation would be a detailed mesh describing a 3D neuron, but it is often easier and quicker to work with skeleton representations.

The natverse provides functions for morphological and graph-theoretic analyses of neurons, collections of neurons, neurons as vector clouds and neurons as tree graphs (Figure 2a). The natverse mainly operates with skeleton data, but the geometry of neuron mesh data can be analysed using the more general R packages Rvcg and Morpho (Schlager, 2017). The natverse represents skeletonised neurons as neuron objects, with the neuron’s geometry in the standard SWC format, in which each node in the skeleton has its own integer identifier. Additional data fields record the treenode IDs of branch points and leaf nodes, the locations of the neuron’s synapses in 3D space and their polarity, the source file, and the series of IDs that belong to continuous non-branching segments of the neuron (Figure 2—figure supplement 2).

Figure 2. Neurons in nat.

(a) Data classes defined by nat. A D. melanogaster DA1 olfactory projection neuron (Costa et al., 2016) is shown as part of four different data types, useful for different sorts of analyses: as a neuron object (left), as part of a larger neuronlist of multiple olfactory projection neurons (middle), as a vector-cloud dotprops object (right, upper) and as an ngraph object (right, lower). In grey, the FCWB template brain, a hxsurf object from the package nat.flybrains, is shown. Generic code for visualising these data types is shown. (b) Visualisation, with generic sample code, of connectomic data from a dense reconstruction of the inner plexiform layer of the mouse retina, coloured by the cell class and then the cell type annotations given by their reconstructors (Helmstaedter et al., 2013). Because this dataset contains many neuron fragments that have been severely transected, we only consider skeletons with a total cable length greater than 500 μm, using the functions summary and subset. Somata are shown as spheres. (c) A synaptic-resolution neuron reconstruction for a D. melanogaster lateral horn neuron (Dolan et al., 2018a) has been read from a live CATMAID project hosted by Virtual Fly Brain (https://fafb.catmaid.virtualflybrain.org/) using catmaid, and plotted as a neuron object. It is rooted at the soma, consistent with convention. (d) Boxed, Strahler order is a measure of branching complexity in which high Strahler order branches are more central to a neuron’s tree structure and lower order ones more peripheral, such that branches with leaf nodes are all Strahler order 1. Main, the same neuron after its lower Strahler order branches (see inset) have been progressively pruned away. (e) We can extract the longest path through a neuron, its ‘spine’ (purple), a method that could help define the tracts that a neuron might traverse. (f) Boxed, in insect neurons, the main structure of the neuron is supported by a microtubular backbone.
As it branches, its more tortuous, smaller-calibre neurites lose the microtubule and make more postsynapses (Schneider-Mizell et al., 2016). Main, in CATMAID users can tag the tree nodes that mark the position where a neurite loses its microtubular backbone, so an R user can use the prune family of functions to remove, add or differentially colour the microtubular backbone versus twigs. (g) Both presynapses and postsynapses can be manually annotated in CATMAID and visualised in R. Because neurons from CATMAID have synaptic data, they are treated as a slightly different class by the natverse, called catmaidneuron. A neuronlist can also comprise many catmaidneuron objects. (h) Right, using synaptic information, it is possible to use a graph theoretic approach that divides the neuron at the point of maximum ‘flow’ - the region of the neuron with the most parallel paths - having ‘drawn’ a path between each input synapse and each output synapse that passes through every node on the skeleton (Schneider-Mizell et al., 2016). This helps divide a neuron into its dendrites, axon, intervening cable (maximum flow, the primary dendrite) and its cell body fiber (no flow). In insects, the cell body lies outside the neuropil and is connected to its arbour by a single fiber. Main, axon-dendrite split shown for an exemplary neuron using seesplit3d.


Figure 2—figure supplement 1. Neurogeometry and skeleton annotations with nat.


(a) We used neuromorphr to obtain information about 40,599 mammalian neurons from NeuroMorpho.org, in the brain region ‘neocortex’, from 193 different laboratories. However, it is quite difficult to compare morphologies acquired by different laboratories that have used different imaging pipelines, neuron reconstruction tools and staining protocols (Farhoodi et al., 2019). We can use neuromorphr to access neuron metadata on NeuroMorpho.org and choose only those neurons from the same laboratory (Bob Jacobs’) that have been obtained using a Golgi stain and the reconstruction software NeuroLucida, and that are classed as principal cells (Anderson et al., 2010; Anderson et al., 2009; Jacobs et al., 2018; Jacobs et al., 2016; Jacobs et al., 2015; Jacobs et al., 2011; Jacobs et al., 2001; Jacobs et al., 1997; Reyes et al., 2016; Travis et al., 2005), giving us 3174 neurons. Branch points per micron of cable can then be plotted for the cortical principal neurons of 24 different species using ggplot2. (b) Inset, a Sholl analysis (Sholl, 1953) counts the number of intersections a neuron’s morphology makes with radii of increasing size centred at its root. Main, the mean intersections for a subset of the species’ cortical principal cells are shown. (c) The distribution of pre- and postsynapses along three neurons’ arbors, split by axon and dendrite, then further by Strahler order, then further by presence (backbone) or absence (twigs) of microtubule. Olfactory-related neurons (gold), associative-memory-related neurons (purple) and unknown inputs (grey). In this case, neuron#1 and neuron#2 are of the same cell type and look similar (PD2a1), but neuron#3 differs; indeed, it belongs to a different cell type (PD2b1) (Dolan et al., 2018a).
Figure 2—figure supplement 2. Neuron data structure.


Schematic representation of the data structure behind neuron objects and neuronlist objects. Objects of class neuronlist are essentially lists of neuron objects, representing one or more neurons, with some attached metadata.

Neurons have tree-like structures that are critical to their function (Cuntz et al., 2010). ngraph data objects represent a neuron as a tree graph originating at a root (usually the soma) with directed edges linking each of the neuron’s tree nodes (Figure 2a). This representation provides a bridge to the rich and efficient graph theory analysis provided by the igraph package (Csardi and Nepusz, 2006).
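The ngraph bridge to igraph can be sketched with nat's bundled example data. `Cell07PNs` is a neuronlist of traced olfactory projection neurons that ships with nat:

```r
library(nat)      # provides the bundled Cell07PNs neuronlist
library(igraph)

n <- Cell07PNs[[1]]    # a single traced projection neuron
g <- as.ngraph(n)      # directed tree graph; ngraph inherits from igraph

# Standard igraph analyses now apply directly, for example:
table(igraph::degree(g))   # leaf nodes have degree 1, branch points >= 3
igraph::diameter(g)        # longest geodesic path through the tree, in edges
```

Because ngraph objects inherit from igraph's graph class, any igraph function can be used without conversion.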

Objects of class neuron are lists of data objects, like data.frames, describing properties such as the 3D position and interconnectivity of points in a neuron. Objects of class neuronlist are lists of neuron objects, representing one or more neurons, with some attached metadata. This attached metadata can give information such as a neuron’s name, a unique identifier, its cell type, etc. (Figure 2—figure supplement 2). An ngraph, neuron or neuronlist can be passed to many functions in the natverse, and also to other functions familiar to R users for which we have written specific methods. For example, users can call subset on a neuronlist to select only those neurons with a certain entry in their corresponding metadata, for example all amacrine cells. Methods passed to plot3d enable a neuronlist to be coloured by its metadata entries when it is plotted (Figure 2b); here, connectomic data from the inner plexiform layer of the mouse retina are shown (Helmstaedter et al., 2013). Many functions are built to work with neuron objects but will also have a method that allows them to be applied to every neuron in a given neuronlist via the nat function nlapply. R users will be familiar with this logic from using the base R function lapply.
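The subset/plot3d/nlapply pattern described above can be illustrated with nat's bundled `Cell07PNs` neuronlist, whose metadata includes a Glomerulus column:

```r
library(nat)

# Cell07PNs: a neuronlist of olfactory projection neurons bundled with nat
head(Cell07PNs)                               # attached metadata data.frame

# Subset on a metadata column: keep only DA1 glomerulus neurons
da1 <- subset(Cell07PNs, Glomerulus == "DA1")

# Colour the 3D plot by a metadata column
plot3d(Cell07PNs, col = Glomerulus)

# Apply a function to every neuron in the list (cf. base R's lapply)
summaries <- nlapply(Cell07PNs, summary)
```

Metadata columns are evaluated inside subset and plot3d calls, so they can be referenced by bare name, much like base R's data.frame subsetting.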

Basic analysis

A useful function with methods for neuron objects and neuronlist objects is summary. This gives the user counts for tree nodes, branch points and leaf nodes, and the total combined cable length of a neuron (Figure 2—figure supplement 1a). We can further use the natverse to identify points on a neuron that have particular properties based on the neuron’s skeleton structure (Figure 2c–e) or because we have some other data that identifies the position of some biological feature (Figure 2f–g), or both (Figure 2h). Branching can be assessed by branching density, for example a Sholl analysis (sholl_analysis) (Figure 2—figure supplement 1b), or decomposed by branching complexity, for example Strahler order (Figure 2—figure supplement 1c). Geodesic distances, that is within-skeleton distances, can be calculated between any pair of tree nodes in the graph (Figure 2—figure supplement 1c) with the help of functions from the R package igraph (Csardi and Nepusz, 2006), and Euclidean distances can be calculated using our R package nabor.
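For example, summary and sholl_analysis can be run on a bundled neuron. The column names of the Sholl result (`radii`, `intersections`) are assumptions here; consult `?sholl_analysis` in your installed nat version:

```r
library(nat)

n <- Cell07PNs[[1]]
summary(n)   # tree node, branch point and endpoint counts, total cable length

# Sholl analysis: intersections with spheres of increasing radius
# (result column names assumed; check ?sholl_analysis)
sh <- sholl_analysis(n, radius.step = 1)
plot(sh$radii, sh$intersections, type = "l",
     xlab = "radius (µm)", ylab = "intersections")
```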

Some reconstruction environments, for example CATMAID, allow tree nodes to be tagged with extra information. This can include neurite diameter, microtubules (Figure 2f) and pre- and postsynapses (Figure 2g). This information is fetched when the catmaid package reads a neuron. It can be used by a graph theoretic algorithm (Schneider-Mizell et al., 2016; Figure 2h, inset) to divide a neuron into its dendrites, axon and intervening cable (Figure 2h). We put this information together in the example in Figure 2—figure supplement 1c, which shows the geodesic distribution of pre- and postsynapses along three neurons’ arbors, split by axon and dendrite, then further by Strahler order, then further by presence or absence of microtubule. Here, for our three exemplar neurons, presynapses exist only on microtubular backbones and are present in high numbers except at the highest Strahler orders, while postsynapses lie mainly on twigs at Strahler orders 1–2. We can also identify connected neurons using catmaid functions, and see that the dendrites of these cells only receive particular inputs.

Neuroanatomical volumes

The natverse also helps users to analyse neuronal skeletons with respect to volume objects that might represent neuroanatomical structures on a scale from whole neural tissues to neuropil subvolumes. 3D objects from diverse sources can be visualised and analysed with nat, and we can calculate their volumes (Figure 3a). By using the nat function make_model, a user can interactively create their own 3D objects from, for example, 3D points from a neuron’s cable or its synapses (Figure 3b); points can easily be retrieved by giving the function a labelled data.frame, matrix, neuron, neuronlist, hxsurf or mesh3d object (Figure 2—figure supplement 2). The resulting volume could be, for example, the envelope around a dendrite, which may correlate with other features of a neuron (Figure 3b). Using the nat function prune_in_volume, a skeleton can be cut to include or exclude the cable within a given volume, while the function pointsinside can tell a user which synapses lie within a standard neuropil segmentation (Figure 3c).
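The pointsinside function can be demonstrated with two example datasets bundled with nat: `kcs20`, dotprops for 20 Kenyon cells, and `MBL.surf`, a surface model of the mushroom body lobes:

```r
library(nat)

# kcs20 (20 Kenyon cell dotprops) and MBL.surf (mushroom body lobes
# surface) are example datasets bundled with nat
pts <- xyzmatrix(kcs20)                # all 3D points across the neuronlist
inside <- pointsinside(pts, MBL.surf)  # logical vector, one entry per point
table(inside)

# Visualise: translucent neuropil surface plus the enclosed points
plot3d(MBL.surf, alpha = 0.2)
rgl::points3d(pts[inside, ], col = "red")
```

The same logical vector could feed into subset or prune operations, for example to restrict synapse counts to a single neuropil as in Figure 3c.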

Figure 3. Neuroanatomical models with nat.

(a) We accessed InsectBrainDB.org via insectbrainr to obtain template brains for different species of insect (Brandt et al., 2005; de Vries et al., 2017; El Jundi et al., 2018; Heinze and Reppert, 2012; Kurylas et al., 2008; Løfaldli et al., 2010; Stone et al., 2017; Zhao et al., 2014). The package insectbrainr converts retrieved OBJ files into hxsurf objects, which contain one set of 3D points for each whole brain, and then different sets of edges between these points to form 3D neuropil subvolumes. These subvolumes were already defined by expert annotators. Their volumes are compared across insect brains, normalised by total brain size. Insect template brain data curated by: S. Heinze, M. Younger, J. Rybak, G. Pfuhl, B. Berg, B. el Jundi, J. Groothuis and U. Homberg. (b) We can create our own subvolumes by pulling synaptic neuron reconstructions (Berck et al., 2016) from a first-instar larva EM dataset (Ohyama et al., 2015) (a public CATMAID instance hosted by Virtual Fly Brain), extracting dendritic postsynapses from olfactory projection neurons, and using the synapse clouds from neurons of the same cell type to define glomerular volumes by creating a bounding volume, i.e. an α-shape or convex hull. Their volumes can then be calculated and correlated with the number of presynapses the same neurons make in two higher-order brain regions, the lateral horn and the mushroom body calyx. (c) Volumes can be used to analyse skeleton data. In (c) we look again at olfactory projection neurons, this time from an adult fly EM dataset (Zheng et al., 2018), and use the nat function pointsinside with standard neuropil volumes (Ito et al., 2014) to find the numbers of presynapses that GABAergic and cholinergic olfactory projection neurons from the antennal lobe make in different neuropils. These neuropils exist as a hxsurf object in our R package nat.flybrains.


Figure 3—figure supplement 1. Supervoxel analysis with nat.


(a) The basic connectivity scheme of the lateral horn, a second-order olfactory centre in insects. Olfactory projection neurons (in orange) connect to lateral horn neurons (in cyan) (Frechter et al., 2019). (b) To create anatomically meaningful continuous voxels for the lateral horn, rather than random contiguous partitions of our standard neuropil space (Ito et al., 2014), we first removed the highest Strahler order branch (assign_strahler) from projection neuron axons so that their sub-branches could be clustered into 25 separate groups (nblast). For each cluster, a 3D weighted kernel density estimate was generated based on 3D points (xyzmatrix) extracted from clustered sub-branches, using the R package ks (Duong, 2007). Points were spaced on neurites at 1 μm intervals (resample) and weighted as 1/total number of points in the cluster, so that supervoxels could be directly compared. (c) An 'inclusion' score for each neuron was calculated for each supervoxel by summing the density estimate for each point in the chosen arbor, again sampled at 1 μm intervals, and normalised by the total number of points in each arbor. An atlas of the lateral horn is shown, colouring supervoxels by the modality/valence of their strongest input neurons. (d) A ‘projection’ was calculated between each lateral horn voxel and each lateral horn output voxel based on the number of neuronal cell types that have processes in both and the density of this arborisation. An atlas of the lateral horn output regions can then be made, colouring supervoxels by the modality/valence of their strongest input lateral horn supervoxels.

Advanced analysis

Because the natverse is a flexible platform that allows users to easily write their own R code to support intricate procedures, very specific analyses can be performed. For example, we might be interested in using skeletons to define anatomical subvolumes and analysing the projections between such subvolumes. For Figure 3—figure supplement 1, we developed custom code on top of natverse functionality to examine light-level D. melanogaster olfactory projections to, and target neurons with dendrites in, a subregion of the brain called the lateral horn (Chiang et al., 2011; Frechter et al., 2019; Grosjean et al., 2011). We voxelised the lateral horn as well as its target regions into overlapping kernel density estimates based on agglomerating similarly shaped sub-branches of projection neuron axons. This analysis reveals substructure in a neuropil, and the 3D locations that are likely to receive input from these new subregions (Figure 3—figure supplement 1d). The natverse contains other functions to infer connectivity from light-level data, including potential_synapses, an implementation of a synapse prediction algorithm that makes use of spatial proximity and approach angle (Stepanyants and Chklovskii, 2005), and overlap, a simpler algorithm that measures the putative overlap in Euclidean space between neuron pairs (Frechter et al., 2019).
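A minimal call to potential_synapses on two bundled neurons might look like this. The value of the distance parameter `s` is an illustrative choice, not a recommendation; it should be set in the spatial units of the data (here µm):

```r
library(nat)

# Predicted connectivity between two bundled projection neurons using the
# proximity/approach-angle method of Stepanyants and Chklovskii (2005).
# s = 2 is an arbitrary example distance threshold in µm.
potential_synapses(Cell07PNs[[1]], Cell07PNs[[2]], s = 2)
```

Applied across a neuronlist, such scores yield a putative connectivity matrix that can be compared against synaptic-resolution EM data.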

Cell typing neurons

Neuronal cell type is a useful classification in neuroscience (Bates et al., 2019). Neuronal cell typing can be done by expert examination (Helmstaedter et al., 2013), purely by morphological clustering (Jeanne and Wilson, 2015), or a combination of both (Frechter et al., 2019). Many neurogeometric algorithms for assessing similarity exist. Some are invariant to the 3D embedding space (Li et al., 2017; Sholl, 1953; Wan et al., 2015), but those that are dependent on neurons’ relative positioning in a template space have typically met with greater success (Li et al., 2017; Zhao and Plaza, 2014). NBLAST (Costa et al., 2016) is a recent morphological similarity algorithm (Frechter et al., 2019; Jeanne et al., 2018; Kohl et al., 2013; Kunst et al., 2019; Masse et al., 2012; Strutz et al., 2014; Zheng et al., 2018). NBLAST is included in the natverse in our nat.nblast package (Costa et al., 2016).

In many parts of mammalian nervous systems, morphologically similar neurons are repeated in space, and so aligning neurons to one another, without a specified template space, is sufficient for quantitative comparison (Figure 4a). NBLAST scores can be hierarchically clustered in R, plotted as a dendrogram, and used to visualise morphological groups at a defined group number or cut height (Figure 4a). Often, this forms a good starting point for cell typing, but it might not be in exact agreement with manually defined cell types (Figure 4b). This can be due to neuron reconstructions being differently severed by the field of view or size of the tissue sample collected (Helmstaedter et al., 2013), or due to registration offsets between registered neuronal skeletons (Chiang et al., 2011; Kunst et al., 2019). The natverse includes interactive functions, such as nlscan, that allow users to visually scan neurons and identify mis-assignments (Figure 4c), or find.neuron and find.soma, that allow users to select a neuron of interest from a 3D window (Figure 4c).
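The NBLAST-then-cluster workflow can be sketched with nat.nblast and the bundled `kcs20` dotprops, following the pattern in the package documentation:

```r
library(nat.nblast)   # also loads nat, which bundles the kcs20 dotprops

# All-by-all NBLAST similarity scores for 20 Kenyon cells
scores <- nblast_allbyall(kcs20)

# Hierarchical (Ward) clustering of the score matrix
hckcs <- nhclust(scoremat = scores)
plot(hckcs)                        # dendrogram of morphological groups

# Plot the neurons in 3D, coloured by cluster membership at k = 3 groups
plot3d(hckcs, k = 3, db = kcs20)
```

Cutting the dendrogram at different heights or group numbers gives candidate cell types that can then be checked interactively with nlscan.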

Figure 4. Cell typing with nat.

(a) Neurons from a dense reconstruction from EM data of the mouse retina inner plexiform layer (Helmstaedter et al., 2013) can either be NBLAST-ed in situ (upper) or after alignment by their principal axes in 3D space (lower) in order to make a first pass at defining, finding or discovering morphological neuronal cell types. (b) A tSNE plot visualising the results of an aligned NBLAST of neurons in A, coloured by the manually annotated cell types seen in Figure 2c, with shapes indicating the cell class. (c) Manual sorting using the interactive nat functions nlscan and find.soma or find.neuron can help identify misclassifications and make assignments.

Figure 4.

Figure 4—figure supplement 1. Cell typing zebrafish neurons with nat.

Figure 4—figure supplement 1.

(a) Light-level neurons registered to a standard brain for the larval zebrafish (Kunst et al., 2019) can be read into R using fishatlas, and all transformed onto the right hemisphere using the mirroring registration generated by Kunst et al. (b) They can then be NBLAST-ed to discover new cell types. 25 NBLAST clusters are shown; generic exemplary code is shown in Monaco font. (c) Upper: this process for a small group of neurons from the telencephalon. Mirroring neurons to the same side produces cohesive clusters. Lower: neurons can be cut up (pruned) relative to user-defined coordinates to, in this case, divide them by telencephalon region. NBLAST can be run separately on these two groups, and the results compared using a tanglegram to see whether neurons that cluster together in the posterior telencephalon also cluster together in the ventral telencephalon.
Figure 4—figure supplement 2. Cell typing vinegar fly neurons with nat.

Figure 4—figure supplement 2.

The same basic process can be done for FlyCircuit neurons (Chiang et al., 2011), here subsetted (in_volume) to those neurons that have arbor in both the antennal lobe and lateral horn (i.e. olfactory projection neurons).

In smaller brains, like insect central brains or larval fish central brains, the overlap of both axons and dendrites in 3D space is an excellent starting point for defining a neuronal type, since neurite apposition is suggestive of synaptic connectivity (Rees et al., 2017) and neurites are highly stereotyped (Jenett et al., 2012; Pascual et al., 2004). If neurons have been registered to whole brain templates (Chiang et al., 2011; Costa et al., 2016; Kunst et al., 2019), it is desirable to choose a canonical hemisphere and map all neurons onto that side: this approximately doubles the number of neurons available for clustering and allows the same cell types to be assigned on both hemispheres (Figure 4—figure supplement 1, Figure 4—figure supplement 2).

Comparing disparate datasets

Template brains in D. melanogaster

It is highly desirable to compare neurons between datasets within a single template space. Considering just the case of D. melanogaster, separate template brains ‘contain’ many large and useful but disparate datasets (Table 1): ~23,000 single light-level neuronal morphologies, hundreds of neuronal tracings from dye fills, a collection of ~11,000 genetic driver lines, ~100 neuroblast clones, and connectomic data, including a brain-wide draft connectome on the horizon (Scheffer and Meinertzhagen, 2019; Zheng et al., 2018). Because of the wealth of data available for D. melanogaster, we focus on its brain for our registration examples.

Table 1. Neuron morphology resources currently available for the adult D. melanogaster brain.
Dataset Type Count Citations
FlyCircuit Single neuron morphologies stochastically labeled from dense transmitter-related lines ~23,000 neurons (Chiang et al., 2011; Shih et al., 2015)
FlyLight GMR collection Collection of genetic driver lines, driven by orthogonal transcription factors GAL4 (Brand and Perrimon, 1993) or LexA (Lai and Lee, 2006) ~3500 GAL4 lines, ~1500 LexA lines (Jenett et al., 2012; Pfeiffer et al., 2008)
Vienna Tiles collection Collection of genetic driver lines, driven by orthogonal transcription factors GAL4 or LexA ~8000 GAL4 lines, ~3000 LexA lines (Kvon et al., 2014; Tirian and Dickson, 2017)
FlyLight split-GAL4 collection Genetic driver lines labelling small constellations of neurons using the split-GAL4 system ~400 sparse lines covering the mushroom body, lobula plate and columns, visual projection neurons, ellipsoid body, descending neurons, central complex, olfactory projection neurons (Y. Aso, personal communication, 2019) and lateral horn. (Aso et al., 2014; Aso and Rubin, 2016; Dolan et al., 2019; Klapoetke et al., 2017; Namiki et al., 2018; Robie et al., 2017; Wolff and Rubin, 2018; Wu et al., 2016)
K. Ito, T. Lee and V. Hartenstein Neuroblast clones for the central brain larval-born neurons, generated using the MARCM method (Lee and Luo, 2001) ~100 neuroblast clones (Ito et al., 2013; Wong et al., 2013; Yu et al., 2013)
FlyEM and Harvard Medical School Volume-restricted connectomes Hundreds of neurons from the mushroom body alpha lobe, two antennal lobe glomeruli and several columns of the optic medulla (Horne et al., 2018; Takemura et al., 2015, Takemura et al., 2013, Takemura et al., 2017; Tobin et al., 2017)
FAFB project Serial section transmission electron microscopy data for a single, whole adult female fly brain (Zheng et al., 2018), with a partial automatic segmentation available (Li et al., 2019) Raw image data for ~150,000 neurons, of which several hundred have been partially reconstructed in recent publications and ~7000 more remain unpublished; an estimated ~5% of neurons have some level of reconstruction. (Dolan et al., 2019; Dolan et al., 2018b; Felsenberg et al., 2018; Frechter et al., 2019; Huoviala et al., 2018; Sayin et al., 2019; Zheng et al., 2018)
Various laboratories Single neuron morphologies extracted from dye-filling (e.g. with biocytin) neurons Hundreds across a range of studies, some cited here (Frechter et al., 2019; Grosjean et al., 2011; Jeanne et al., 2018; Jefferis et al., 2007)

Two approaches have been taken to specifying template spaces: a) choosing a single brain avoids any potential artifacts generated by an averaging procedure, whereas b) an average brain can reduce the impact of biological variation across individuals and of deformations introduced during sample preparation, thus increasing the likelihood of successful and accurate registration (Bogovic et al., 2018). Quantitative neuroanatomical work requires images to be spatially calibrated (i.e. with an accurate voxel size), but such calibrations are not present in all template brains.

Table 2 lists the template brains for D. melanogaster considered in this work and details the resources available for each; some are shown in Figure 6. Initially, only raw unregistered data were publicly available for FlyCircuit (Chiang et al., 2011); data registered to one of two template brains (one for each sex) were released subsequently. The FlyLight project provides only raw image data (Jenett et al., 2012).

Table 2. Exemplar Drosophila template brains.
Template Brain Description Resources DOI Citation
Wuerzburg Single nc82-stained female brain - - (Rein et al., 2002)
TEFOR Averaged brain generated from Rein et al. dataset (22, 22) - - (Arganda-Carreras et al., 2018)
JRC2018F A symmetrised high-quality template using brp-SNAP - 10.6084/m9.figshare.6825923 (Bogovic et al., 2018)
Cell07 Partial intersex nc82-stained averaged brain (14, 2) ~240 lateral horn projection neuron tracings 10.5281/zenodo.10570 (Jefferis et al., 2007)
T1 Intersex nc82-stained averaged brain The Vienna Tiles collection 10.5281/zenodo.10590 (Yu et al., 2010)
IS2 Intersex nc82-stained averaged brain 1018 3D confocal images of fruitless neurons 10.5281/zenodo.10595 (Cachero et al., 2010)
FCWB Intersex Dlg-stained averaged brain (17, 9) Good for FlyCircuit data, ~16,000 neurons re-registered 10.5281/zenodo.10568 (Costa et al., 2016)
JFRC Single nc82-stained female brain The FlyLight collection - (Jenett et al., 2012)
JFRC2 Spatially calibrated copy of JFRC The FlyLight collection 10.5281/zenodo.10567 This study
IBN Tri-labelled half brain, with n-syb-GFP Neuropil and tract segmentations (half-brain) - (Ito et al., 2014)
IBNWB Synthetic whole-brain version of IBN Neuropil and tract segmentations (whole-brain) 10.5281/zenodo.10569 This study
FAFBV14 An aligned volume for a single whole female fly brain from EM data Thousands of single neuron partial manual reconstructions and fragmented automatic segmentation (Li et al., 2019) - (Zheng et al., 2018)

Template brains and registered data are publicly available for the Vienna Tiles GAL4 libraries (Tirian and Dickson, 2017) but are not distributed in bulk form. We created an intersex reference brain for the FlyCircuit dataset and added spatial calibrations and re-registered data to our new template brains as necessary (see Materials and methods) before constructing bridging registrations. We have deposited all template brain images, in NRRD format (http://teem.sourceforge.net/nrrd/), at http://zenodo.org to ensure long-term availability. Two spatial transforms are most useful when considering template brains: a) mirroring data left-right, so that neurons reconstructed or registered to either hemisphere may be compared, and b) bridging between these templates, to cross-compare data.

Mirroring data in D. melanogaster

Whilst the Drosophila brain is highly symmetric, it is not perfectly so, and the physical handling of brains during sample preparation introduces further non-biological asymmetries. A simple 180° flip about the medio-lateral axis is therefore insufficient (Figure 5—figure supplement 1a). To counter this, we have constructed non-rigid warping registrations for a number of template spaces that introduce the small displacements required to fix the mapping from one hemisphere to the other (Figure 5—figure supplement 1, see Materials and methods).

Our mirroring registrations can be deployed using the function mirror_brain, and allow non-biological asymmetries to be countered so that relevant similarities and differences in morphology between the two sides of the brain can be investigated (Figure 5—figure supplement 1a). Our mirroring procedure (see Materials and methods) does not introduce any systematic errors into neuron morphology.
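As a minimal sketch, using the kcs20 sample neurons (which are registered to the FCWB template provided by nat.flybrains; the non-rigid step requires a CMTK installation):

```r
library(nat.flybrains)  # provides FCWB and its mirroring registration
# map FCWB-registered neurons onto the opposite hemisphere; after the
# 180-degree flip, a non-rigid warp corrects residual asymmetries
kcs20_flipped <- mirror_brain(kcs20, brain = FCWB)
plot3d(kcs20, col = "black")
plot3d(kcs20_flipped, col = "grey")
```

Co-plotting the original and mirrored neuronlists makes left-right differences in innervation immediately visible.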

We used NBLAST to calculate morphologically determined similarity scores between DL2d projection neurons taken from the same side of the brain, and compared them with scores calculated between DL2d projection neurons taken from alternate sides of the brain (Figure 5b). We do not find the distributions of scores (Figure 5c) to be significantly different (D = 0.025, p=0.094, two-sample Kolmogorov-Smirnov test). Extending this, we have used these scores to classify neurons based on their bilateral symmetry. Figure 5d shows 12 example neurons, taken from the bilateral subset of the FlyCircuit dataset, spanning the range of similarity scores from most asymmetric (A) to most bilaterally symmetric (L). Interestingly, the distribution of scores suggests that most bilateral neurons are reasonably symmetric.

Figure 5. Sample applications of mirroring registrations.

(a) Three FlyCircuit neurons along with mirrored versions; a visual projection neuron, OA-VUMa2 (Busch et al., 2009) and the CSD interneuron (Dacks et al., 2006). Co-visualisation facilitates the detection of differences in innervation, such as the higher density of innervation for the CSD interneuron in the lateral horn on the contralateral side compared to the ipsilateral lateral horn. (b) Neurons from the same side of the brain and alternate side of brain are compared and a similarity score generated. (c) Distributions of similarity scores for comparisons within the same brain hemisphere and across brain hemispheres. These scores are similar, because the mirroring registration is good. (d) Sequence of 12 example neurons (black) with mirrored counterparts (grey), having equally spaced similarity scores. Below, full distribution of scores for all neurons in FlyCircuit dataset. (e) Segment-by-segment measures of neuron similarity. Redder shades correspond to lower degrees of symmetry, bluer shades to higher. Flipped version of the neuron in grey.

Figure 5.

Figure 5—figure supplement 1. Mirroring procedure.

Figure 5—figure supplement 1.

(a) The full process undergone by an image during a mirroring registration. (1) Original image. (2) Flipped 180° around the medio-lateral axis. (3) Affinely transformed. (4) Non-rigidly warped. (b) Heatmaps of deformation magnitude fields for mirroring FlyCircuit and FlyLight reference brains. (c) Distribution of deformation displacements for both brains, in a single mirroring operation and a full round-trip. Illustrations show the transformations that points undergo.

It is also possible to use our mirroring registrations to test the degree of symmetry for sections of neurons. We take segments of a neuron and use our similarity metric to compute a score between the segment and the corresponding segment in the mirrored version of the neuron. This allows differences in innervation and axonal path between the two hemispheres to be clearly seen (Figure 5e).

Bridging template spaces in D. melanogaster

Simply rescaling a sample image to match a reference brain usually fails due to differences in position and rotation (Figure 6—figure supplement 1a). An affine transformation can account for these differences, but not for differences in shape that may be of biological or experimental origin. To correct for these, we use a full non-rigid warping deformation, as described previously (Jefferis et al., 2007; Rohlfing and Maurer, 2003; Rueckert et al., 1999), see our Materials and methods. Briefly, a regular lattice of control points is created in the reference brain and corresponding control points in the sample brain are moved around to specify the deformations required to take the sample data into the reference space (Figure 6c–g). Deformations between control points are interpolated using B-splines, which define a smooth deformation of sample to reference (Figure 6f). The use of a mutual information metric based on image intensity avoids the requirement for landmarks to be added to each image – a time-consuming task that can often introduce significant inaccuracies. Our approach allows for the unsupervised registration of images and the independent nature of each registration allows the process to be parallelised across CPU cores. By utilizing a high-performance computational cluster, we re-registered, with high accuracy, the entire FlyCircuit dataset within a day.

Figure 6. Bridging registrations for brain templates.

(a) A small sample of Drosophila template brains used around the world is shown. (b) A partial neuron tracing (purple) made using Simple Neurite Tracer (Longair et al., 2011) being transformed (xform_brain) from the IS2 standard brainspace to the FCWB, where it can be co-visualised with a more complete, matching morphology from FlyCircuit, using our R package flycircuit. (c) Outermost neuropil boundaries for FlyCircuit (red) and FlyLight (cyan) template brains. Primed space names indicate original spaces for data. Unprimed space names indicate current space of image. (d) Neuropil segmentation from JFRC2 space alongside FCWB reformatted version. (e) CSD interneuron from FlyCircuit (red) and FlyLight, GMR GAL4 expression pattern (cyan). (f) Neuropil segmentation from JFRC2 (Ito et al., 2014) space that has been bridged into FCWB space, so it can be seen along with selected neurons from FlyCircuit. (g) A traced neuron in FCWB space alongside morphologically similar neuron from FlyCircuit. (h) Expression pattern of Vienna Tiles line superimposed on expression pattern of FlyLight line. (Since we made our bridging publicly available in April 2014, Otsuna et al., 2018 have also, separately, bridged these two datasets.)

Figure 6.

Figure 6—figure supplement 1. Bridging procedure.

Figure 6—figure supplement 1.

(a) Increasing levels of registration complexity give increasingly good registration results. (b) Four composite transformations of a 12-degree-of-freedom affine transformation. (c) Regularly spaced grid of control points (red dots) in the reference brain. (d) Control points (cyan dots) in the sample image. (e) Sample brain control points affinely transformed into image space of reference brain, along with original control points. (f) Deformation field interpolated using B-splines. (g) Reformatted sample image.
Figure 6—figure supplement 2. Bridging examples.

Figure 6—figure supplement 2.

(a) A sub-section of the bridging registrations available through nat.flybrains, via which a neuroanatomical entity in any brainspace can eventually be placed into any other brainspace, by chaining bridging registrations. Arrows indicate the existence and direction of a bridging registration. Registrations are numerically invertible and can be concatenated, meaning that the graph can be traversed in all directions. Both brains (Arganda-Carreras et al., 2018; Bogovic et al., 2018; Cachero et al., 2010; Costa et al., 2016; Ito et al., 2014; Jefferis et al., 2007; Jenett et al., 2012; Rein et al., 2002; Tirian and Dickson, 2017; Zheng et al., 2018) and ventral nervous systems (Cachero et al., 2010; Yu et al., 2010) shown. (b) Fru+ neuroblast clones (orange) transformed from IS2 space into JFRC2 space of elav neuroblast clones (green). (c) Sexually dimorphic Fru+ neuroblast clone (male on left, female on right) along with traced neurons from FlyCircuit.
Figure 6—figure supplement 3. Warping registration without point-point correspondence.

Figure 6—figure supplement 3.

(a) A left-right registration for the lop-sided nascent L1 larval connectome (Ohyama et al., 2015). Neuroanatomical models created from CATMAID data (upper) and 644 manually left-right paired neurons can be used as a basis to generate a single deformation of L1 space using Deformetrica (via deformetricar), that describes a left-right mapping in a single deformation of ambient space. (b) Symmetrisation (purple) of the lop-sided cortex and neuropil in the L1 EM dataset. (c) Registration of segmented light-level volumes onto the EM dataset. Immunohistochemical staining of dissected L1 central nervous system was needed to make 3D models for coarse neuroanatomical volumes, which could be matched to those in the L1 EM; namely, the pattern of longitudinal ventral nervous system axon tracts and the mushroom bodies, visible in a dFasciculin-II-GFP protein trap line, the laddered ventral nervous system pattern and olfactory tracts visible in a GH146-lexA line, and a Discs large-1/dN-Cadherin stain to visualize the neuropil and cortex of the brain.

Our bridging registrations can be deployed on any 3D natverse-compatible data using the function xform_brain. A successful and accurate bridging registration will result in the neuropil stains of two template spaces being well co-localised (Figure 6). After visually inspecting co-localised template spaces to check for any obvious defects, we find it helpful to map a standard neuropil segmentation (Ito et al., 2014) into the space of the new brain to check for more subtle defects (Figure 6—figure supplement 2b). If the registration passes these checks it can then be used to combine data from multiple datasets.
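Deployment is a one-line call; the sketch below assumes the FCWB-registered kcs20 sample data from nat and a working CMTK installation:

```r
library(nat.flybrains)
# bridge FCWB-registered neurons into the JFRC2 template space;
# xform_brain finds a path through the registration graph automatically,
# chaining and inverting registrations as needed
kcs20_jfrc2 <- xform_brain(kcs20, sample = FCWB, reference = JFRC2)
```

The same call works for points, surfaces and images, since xform_brain operates on anything with recoverable 3D coordinates.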

The creation of a bridge between a GAL4 expression library, such as the GMR collection (Jenett et al., 2012), and images of single neurons, such as those of FlyCircuit (Chiang et al., 2011), facilitates the decomposition of an expression pattern into its constituent neurons, allowing the correct assessment of innervation density on, for example, ipsilateral and contralateral sides (Figure 6—figure supplement 2c). Similarly, correspondences between neuroblast clones can be identified with co-visualisation. We bridge Fru+ clones (Cachero et al., 2010) from IS2 space into the JFRC2 space of elav clones (Ito et al., 2013) and hence determine subset relations (Figure 6—figure supplement 2b). Furthermore, we can bridge the single neuron FlyCircuit data (Chiang et al., 2011) from the FCWB space into the IS2 space of the Fru+ clones and use the known sexual dimorphisms of Fru clones to predict which neurons may be sexually dimorphic (Figure 6—figure supplement 2c).

The ability to bridge segmentations from one space to another is useful for checking innervation across datasets. While FlyCircuit single neurons (Chiang et al., 2011) were provided along with information on innervation density based on their own neuropil segmentation, this segmentation is not the same as the canonical one (Ito et al., 2014). We have bridged the latter segmentation into FCWB space and recalculated innervation for all the FlyCircuit neurons, providing a more standardised measure (Figure 6—figure supplement 2g). Further, we can compare neurons from FlyCircuit with those for which we have electrophysiological data (Frechter et al., 2019; Kohl et al., 2013), enabling us to suggest a functional role for unrecorded neurons based on their morphological similarity to recorded neurons (Figure 6—figure supplement 2h).

Both the FlyLight (Jenett et al., 2012) and Vienna Tiles libraries (Tirian and Dickson, 2017) contain a wealth of GAL4 lines amenable to intersectional strategies (Luan et al., 2006). However, as the two libraries are registered to different template spaces, it is difficult to predict which combinations of a FlyLight GMR line with a Vienna Tiles line would produce a good intersection (split-GAL4, targeting one cell type present in both parent lines) from the raw images provided by both. Bridging one library into the space of another (Figure 6—figure supplement 2i) enables direct co-visualisation (see also Otsuna et al. (2018) for an independent bridging output). This could be used manually or computationally to identify combinations that could potentially yield useful intersectional expression patterns (Venken et al., 2011).

It is also possible to warp 3D neuropils and neuron skeletons onto some target without using landmark pairs. For this, Deformetrica (Bône et al., 2018; Durrleman et al., 2014) can be used to compute many pairwise registrations at once for different kinds of 3D objects, producing a single deformation of ambient 3D space that describes a registration (Figure 6—figure supplement 3). This is a generic method that does not require landmark correspondences to be manually assigned. We give a simple example in Figure 6—figure supplement 3a, symmetrising a distorted brain and making an LM-EM bridge for the first-instar larva, for which there is a nascent connectome (Berck et al., 2016; Eichler et al., 2017; Ohyama et al., 2015; Schneider-Mizell et al., 2016). With such a method it should be possible to bridge EM or LM data between developmental stages of a nervous system to make comparisons or identify neurons.

EM to LM and back again

Finding neurons of the same cell type between a high-resolution EM dataset and light-level images of neurons (Figure 7a) is an essential step in identifying neurons and their genetic resources. Doing so links connectivity and detailed morphology information acquired at nanometre resolution to other forms of data. This can most easily be done by finding corresponding landmarks in the EM data and an LM template space to build a registration (Figure 6—figure supplement 1).

Figure 7. Finding specific neurons in EM and LM data.

Figure 7.

(a) Pipeline for acquiring EM neuron data. Serial section transmission EM at high speed with a TEM camera array (Bock et al., 2011) produced several micrographs per section at 4 × 4 nm resolution, ~40 nm thick. These were, per section, stitched into mosaics which were, across sections, registered to create the female adult fly brain v.14 template space (FAFB14, grey) (Zheng et al., 2018). Corresponding landmarks between FAFB14 and JFRC2 can be found and used to build a bridge. (b) The R package elmr can be used to select an anatomical locus, here the PD2 primary neurite tract (Frechter et al., 2019), from 3D plotted light-level neurons, taken from FlyCircuit, and generate a URL that specifies its correct coordinates in a FAFB14 CATMAID instance. Candidates (185) may then be coarsely traced out until they deviate from the expected light-level morphologies (178 pink dotted profiles, often a few minutes to an hour of manual reconstruction time to rule out neurons of dissimilar cell types sharing a given tract, similar cell types are more subtly different and might need to be near completely reconstructed). Those that remain largely consistent were fully reconstructed (green profiles, ~7–12 person-hours per neuron) (Li et al., 2019). (c) Close matches reveal likely morphology of non-reconstructed branches (orange arrow) but also contain off-target expression (yellow arrow). Identification of multiple candidate lines enables split-GAL4 line generation aimed at retaining common neurons in two GAL4 patterns. MultiColor FlpOut (MCFO) (Nern et al., 2015) of resultant splits can be compared with the EM morphology. Here, a candidate GAL4 line is found for AL-lALT-PN3 (Frechter et al., 2019; Tanaka et al., 2012) using NBLAST and a MIP search (Otsuna et al., 2018). (d) A recent dense, but volume-restricted reconstruction of the mushroom body α-lobe discovered a ‘new’ mushroom body output neuron type (MBON-α2sp) (Takemura et al., 2017). 
By bridging from the correct mushroom body compartment, using a mushroom body mesh (Ito et al., 2014) visualised in RStudio, to the equivalent space in the FAFB14 EM data in CATMAID using the R package elmr, an experienced tracer can easily identify dendrites and find MBON-α2sp. By doing so, we found its previously unreported axon morphology. We then imported the skeleton into R and bridged MBON-α2sp into the JFRC2 template space, where it could be NBLAST-ed against GMR GAL4 lines to identify candidate lines containing the MBON.

In Figure 7 and Figure 8, we give the general pipeline we used in recent publications (Dolan et al., 2019; Dolan et al., 2018a; Frechter et al., 2019; Li et al., 2019) to connect neurons sparsely labeled in a split-GAL4 line (registered to the template space JFRC2) to sparsely reconstructed neurons from an EM dataset (FAFB14). Neurons can be manually reconstructed (Schneider-Mizell et al., 2016) or, more recently, partially reconstructed by machine learning methods (Januszewski et al., 2018) as segments that can be manually concatenated (Li et al., 2019). A thin plate spline bridging registration between JFRC2 and FAFB14 was built by finding ~100 corresponding landmarks between the two brainspaces, for example the location of turns in significant tracts, the boundaries of neuropils, the location of easily identifiable single neurons (Zheng et al., 2018). This registration can be deployed using xform_brain and our elmr package.
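In outline, a reconstruction can be pulled from a CATMAID server and bridged to light level in two calls; this is a sketch, with the skeleton id purely illustrative and CATMAID credentials assumed to be already configured:

```r
library(elmr)     # loads nat.flybrains and registers the FAFB14 bridge
library(catmaid)
# fetch a manually traced skeleton from a FAFB14 CATMAID instance
# (the skeleton id 16 is hypothetical)
n <- read.neuron.catmaid(16)
# bridge it from EM space into the JFRC2 light-level template
n_jfrc2 <- xform_brain(n, sample = FAFB14, reference = JFRC2)
```

Once in JFRC2, the neuron can be NBLAST-ed directly against registered light-level datasets.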

Figure 8. Bridging EM and LM data.

(a) Sparse EM reconstruction providing a database of non-comprehensive, partial morphologies that can be searched using NBLAST. Candidate neurons from the EM brainspace can be NBLAST-ed against MCFO (Nern et al., 2015) data and other light-level datasets in order to connect them to cell-type-specific information, such as odour responses and functional connectivity (Chiang et al., 2011; Dolan et al., 2019; Frechter et al., 2019; Jeanne et al., 2018), by bridging these datasets into the same brainspace. (b) An all-by-all NBLAST of all neurons in the PD2 primary neurite cluster (Frechter et al., 2019) in multiple datasets can be shown as a tSNE plot. EM cell type matches can easily be found, as well as other correspondences between the light level datasets.

Figure 8.

Figure 8—figure supplement 1. Using partial automatic segmentation of EM data.

Figure 8—figure supplement 1.

(a) A NeuroGlancer window open on a web browser, showing an example of an automatic reconstruction. (b) Automatically reconstructed segments can be mapped onto extant manual tracing in FAFB14 using our R package fafbseg, enabling easy volumetric reconstruction of neurons. (c) Upper, automatically traced segments are mapped onto 55 manually reconstructed neurons from the lateral horn of D. melanogaster in the FAFB14 dataset (Bates et al., 2020), broken down by Strahler order, to simulate different levels of ‘completeness’. A further 10 neurons have had microtubular cable annotated. Lower, proportion of cable at different levels of pruning, that are covered by the 10 largest auto-segmented fragments for each neuron. The traced to identification category comprises neurons reconstructed by expert annotators sufficiently for them to be identifiable in light level data, using the pipeline shown in Figure 8. (d) Correlation between cable length and volume for axons and dendrites (Schneider-Mizell et al., 2016) for a selection of central brain neurons.

By bridging multiple other light-level datasets into JFRC2 (Figure 6), candidate neurons from the EM brainspace can be co-visualised (Figure 8c) and NBLAST-ed against light-level datasets in order to confirm their cell type identity and consider results from different studies (Chiang et al., 2011; Dolan et al., 2019; Frechter et al., 2019; Jeanne et al., 2018; Figure 8d). However, FAFB14 contains unannotated image data for ~150,000 neurons (Bates et al., 2019), each requiring hours of manual reconstruction time, and person-power is limited. To find specific neurons in this volume, we can use the R package elmr to select a distinctive anatomical locus, for example the cell body fiber tract (Frechter et al., 2019) from 3D plotted neurons, and jump to its approximate coordinates in FAFB14 in a supported CATMAID instance using the generated URL (Figure 7b). Reconstruction efforts can then be focused at this location, being aware that the jump is not always completely accurate despite a good bridging registration as some light-level datasets can be ill-registered (Figure 7b). In the absence of an extant light-level reconstruction, candidate neurons can be found by identifying distinctive anatomical loci in the EM volume that correspond to the anatomy of the cell type in question (Figure 7d).
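The jump itself can be scripted; a sketch, assuming a neuronlist already transformed into FAFB14 space (the object name pns_fafb is hypothetical):

```r
library(elmr)
# plot light-level neurons that have been bridged into FAFB14 space,
# then interactively pick a point on the tract of interest...
plot3d(pns_fafb)  # hypothetical neuronlist in FAFB14 space
xyz <- select_points(xyzmatrix(pns_fafb))
# ...and generate (and open) a CATMAID URL centred on that location
open_fafb(xyz)
```

Tracing effort can then be focused at the location the URL points to, bearing in mind the caveats about registration accuracy above.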

A user may also want to work the opposite way and connect an interesting EM reconstruction to light-level data, for example to identify a genetic resource that targets that neuron. In this situation, a similar pipeline can be used. For D. melanogaster, a reconstruction can be bridged into JFRC2 and NBLAST-ed against GAL4 lines (Jenett et al., 2012; Tirian and Dickson, 2017) read from image data and represented as vector clouds (Costa et al., 2016). Alternatively, image matching tools can be used, such as the recent colour depth MIP mask search (Otsuna et al., 2018), which operates as an ImageJ plug-in (Figure 7c).

Further, because close light-level matches for in-progress EM reconstructions reveal the likely morphology of non-reconstructed branches (Figure 7c), this process can help human annotators reconstruct neurons accurately and in a targeted manner, which may be desirable given how time-intensive the task is. To further reduce this burden, we combined the natverse with a recent automatic segmentation of neurites in FAFB14 using a flood-filling approach (Li et al., 2019), which produces volumetric fragments of neurites, where segments may be fairly large, ~100 μm in cable length.

Our fafbseg package includes functions to implement improved up-/downstream sampling of neurons based on these segments, which we have recently discussed elsewhere (Li et al., 2019). We can also generate volumetric reconstructions of manually traced neurons by mapping them onto volumetric data (Figure 8—figure supplement 1b), hosted by a brainmaps server and visible through a Neuroglancer instance (Figure 8—figure supplement 1a). Currently, ~500 such segments map onto one accurately manually traced neuron, but only ~20 segments may constitute the highest Strahler order branches, meaning that manual concatenation of these fragments speeds up discovery of coarse morphologies by ~10x (Li et al., 2019). These fragments can be used to identify the neuron in question by NBLAST-ing against light-level data. Twigs and small-calibre, lower Strahler order branches are more difficult to segment automatically (Figure 8—figure supplement 1d). Nevertheless, matching tracings to segmentations allows us to estimate the volume of neurons that we have previously reconstructed manually (Dolan et al., 2019; Dolan et al., 2018a) by only tracing the neurites’ midline (i.e. skeletonisation). We can therefore observe that superior brain neurons’ axons are slightly thicker than their dendrites and that their total cable length correlates strongly with neurite volume (Figure 8—figure supplement 1e).

A densely reconstructed connectome, in which ~35% of synapses are connected for just under half of the central fly brain, has recently been made available by the FlyEM team at Janelia Research Campus (Scheffer and Meinertzhagen, 2019; Shan Xu et al., 2020). Neurons from this ‘hemibrain’ volume can be transformed to the JRC2018F light-level template brain via a bridging registration constructed using the strategy described by Bogovic et al. (2018). We have already wrapped this bridging registration within the natverse framework, thereby connecting it to the full network of fly template brains, datasets and analysis tools described in this paper. We will release these tools when the hemibrain project makes its transforms publicly available.

Discussion

The shape of a neuron is of major functional significance. Morphology is driven by and constrains connectivity. It is also the primary means by which neuroscientists have historically identified neuron classes. There have been three main drivers behind the recent emphasis on quantitative neuroanatomy: a) the ever-increasing scale of new approaches for acquiring image data and reconstructing neurons, b) a drive to formalise descriptions of the spatial properties of neurons and networks at various scales, and c) a desire to intuit the organisational principles behind different nervous tissues and correlate these findings with dynamic data on neuron activity.

With the natverse, a suite of R packages for neuroanatomy with well-documented code and detailed installation instructions and tutorials available online, we aim to expedite analysis of these data in a flexible programming environment. The natverse allows a user to read data from local or remote sources into R, and leverage both natverse functions and the >10,000 R packages on CRAN (and more on Bioconductor, Neuroconductor, GitHub, etc.) to aid their data analysis. Users may also call natverse R functions from other languages such as Python, Julia and MATLAB. We have provided detailed examples to analyse skeleton and volume data from various sources and have made both R and Python code available at https://github.com/natverse/nat.examples. These examples demonstrate how to obtain skeleton and volume data, calculate basic metrics for neurons, examine synapses and other tagged biological features like microtubules, analyse morphology as a graph or through Strahler order and NBLAST searches, prune neurons, assign neurons to cell types semi-manually, spatially transform neurons and create subvolumes using neurons. We have also given an example of building a more complex analysis, based on natverse tools but making use of other available R packages.

We hope that the natverse becomes a collaborative platform to which users can contribute, extending existing R packages or linking their own. We note that the natverse is an actively developing project and anticipate a) an increasing interest in dealing with neurons as volumes as automatic segmentation of datasets becomes commonplace, b) expanding our bridging tools to support a wider range of species, and to map between similar species and developmental stages, c) writing libraries to facilitate the use of the natverse in other programming languages and toolboxes besides Python, and d) expanding the range of neurogeometric analysis algorithms readily available in the natverse.

In addition to general purpose natverse tools, we have generated some specific R packages to support ongoing projects in the D. melanogaster brain. We have constructed high-quality registrations for the bridging of data from one template space to another, along with registrations for mirroring data across brain hemispheres. In two of the largest cases, only raw unregistered data were available, so we began by registration to an appropriate template space. This has allowed us to deposit ~20,000 co-registered images from different sources in the virtualflybrain.org project. Averaged intersex template spaces can form high-quality registration templates for both sexes and we recommend the use of averaged brains to reduce the effects of sample-to-sample variation. We propose using a small number of template spaces, particularly those that are already associated with the most data (JFRC2) or of highest quality (Bogovic et al., 2018), as a hub. High-quality bridging registrations would be created between new template spaces and brains in the hub, ensuring that any template could be bridged to any other via appropriate concatenations and inversions of these registrations.
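The hub-and-spoke arrangement described above can be sketched with toy transforms. In reality the bridging registrations are non-rigid CMTK warps; here 4×4 affine matrices stand in for them, and the template names are used purely as labels:

```python
import numpy as np

# Sketch of the hub idea: store bridging registrations (toy affine
# stand-ins for real warps) between each template and a hub, then reach
# any template from any other by concatenating one registration with
# the inverse of another. Matrices below are invented for illustration.

bridges = {                       # forward transforms, source -> hub
    ("FCWB", "JFRC2"): np.diag([1.1, 1.0, 0.9, 1.0]),
    ("IS2", "JFRC2"): np.diag([0.8, 1.2, 1.0, 1.0]),
}

def xform(points, src, dst, hub="JFRC2"):
    pts = np.c_[np.asarray(points, float), np.ones(len(points))]  # homogeneous
    a = bridges[(src, hub)]                 # src -> hub
    b = np.linalg.inv(bridges[(dst, hub)])  # hub -> dst, by inversion
    return (pts @ a.T @ b.T)[:, :3]

pts = [[10.0, 20.0, 30.0]]
out = xform(pts, "FCWB", "IS2")   # FCWB -> JFRC2 -> IS2
```

With n templates, only n registrations to the hub are needed to connect every pair, rather than n(n-1) direct bridges.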

Using these resources, it is now possible to co-visualise and analyse more than 23,000 single neuron images (Chiang et al., 2011), expression patterns of >9500 GAL4 lines (Jenett et al., 2012; Kvon et al., 2014; Tirian and Dickson, 2017) and a near complete set of ~100 adult neuroblast clone lineage data (Ito et al., 2013; Yu et al., 2013) and easily combine these data with the standard insect brain name nomenclature system (Ito et al., 2014). For example, we have calculated the neuropil overlap between single neurons in the FlyCircuit data, which we have deposited with virtualflybrain.org so they can be queried online. It will soon be possible to identify split-GAL4 lines, a synaptic EM reconstruction and the developmental clone of origin for any given neuron or neuronal cell type for D. melanogaster. We anticipate that such mappings will become publicly available and easy to use via resources such as https://v2.virtualflybrain.org/. Significantly, if an experimenter is able to register their functional imaging data to a template brain space (Mann et al., 2017; Pacheco et al., 2019), or alternatively identify neuroanatomical features in that data that can be used to build a landmark-based affine or thin-plate spline registration (e.g. using Morpho Schlager, 2017), they may be able to directly link it to cell types discovered in other datasets, including EM datasets.

The near future will see the generation of EM data for multiple whole adult Dipteran brains and larval zebrafish, possibly from different sexes and species, as well as high-quality automatic segmentations of their neurites (Funke et al., 2019; Januszewski et al., 2018) and synapses (Heinrich et al., 2018), even from anisotropic serial-section transmission EM data (Li et al., 2019). Interpreting high-resolution EM connectomic data will be accelerated and enriched by making links to light-level data (Schlegel et al., 2017). Furthermore, it is possible that connectomes and transcriptomes may be linked on a cell-type basis, using neuron morphology as a bridge (Bates et al., 2019). The natverse provides extensible functionality for easily combining and analysing all these data.

Materials and methods

R packages for neuroanatomy

The R programming language (R Development Core Team, 2011) is perhaps the premier environment for statistical data analysis, is well supported by the integrated development environment RStudio and is a strong choice for data visualisation (Wickham, 2016). It already hosts a wealth of packages for general morphometric and graph theoretic analysis (Csardi and Nepusz, 2006; Duong, 2007; Lafarge et al., 2014; Schlager, 2017). An R package is a bundle of functions, documentation, data, tests and example code (Wickham, 2015). R packages are discrete, standardised and highly shareable units of code. They are primarily installed either from the Comprehensive R Archive Network (CRAN, >14,000 packages, curated), Bioconductor (>1700 packages, curated) or GitHub (larger, uncurated), using just one or two function calls and an Internet connection. Confirmed stable versions of nat, nat.templatebrains, nat.nblast, nat.utils and nabor can be downloaded from the centralised R package repository, CRAN. The natmanager package provides a streamlined installation procedure and will advise the user if a GitHub account is required for the full natverse install (see http://natverse.org/install).

install.packages('natmanager')
# install core packages to try out the core natverse
natmanager::install('core')
# Full 'batteries included' installation with all packages
# You need a GitHub account and personal access token (PAT) for this
natmanager::install('natverse')

The R packages behind the natverse can be divided into four groups (Figure 1A):

Working with synaptic resolution data in nat

Group a) obtains synaptic-level data required for connectomes and includes catmaid, neuprintr, drvid and fafbseg. The package catmaid provides application programming interface (API) access to the CATMAID web image annotation tool (Saalfeld et al., 2009; Schneider-Mizell et al., 2016). CATMAID is a common choice for communities using terabyte-scale EM data to manually reconstruct neuron morphologies and annotate synaptic locations (Berck et al., 2016; Dolan et al., 2018a; Eichler et al., 2017; Frechter et al., 2019; Ohyama et al., 2015; Zheng et al., 2018). Users can use catmaid to read CATMAID neurons into R, including the locations and associations of their synapses and other tags that might identify biological entities such as somata, microtubules or gap junctions. Users can also leverage CATMAID’s infrastructure of flexible hierarchical semantic annotations to query for neurons, for example those in a brain region of interest. Further, catmaid can edit CATMAID databases directly, for example by adding annotations and uploading neurons, synapses and meshes. Some CATMAID instances are kept private by a community before data publication. In this case, catmaid can enable a user to send authenticated requests to a CATMAID server, that is, data can be kept private but still be read into R over an Internet connection. The packages neuprintr and drvid are very similar, except that they interact with API endpoints for different distributed annotation tools, the NeuPrint connectome analysis service (Clements et al., 2020; https://github.com/connectome-neuprint/neuPrint) and DVID (Katz and Plaza, 2019), and can retrieve neurons as volumes as well as skeletons. The package fafbseg aims to make use of the results of automatic segmentation attempts for large, dense brain volumes. It includes support for working with Google's BrainMaps and Neuroglancer (https://github.com/google/neuroglancer).
Automatic segmentation of EM data is a rapidly developing field and this package is currently in active development; at present it only supports the auto-segmentation (Li et al., 2019) of a single female adult fly brain (FAFB) dataset (Zheng et al., 2018).

Working with light-resolution data projects in nat

Group b) is targeted at light microscopy and cellular-resolution atlases, or mesoscale projectomes. Its packages, neuromorphr, flycircuit, vfbr, mouselight, insectbrainr and fishatlas, can read from large repositories of neuron morphology data, many of which are co-registered in a standard brain space. neuromorphr provides an R client for the NeuroMorpho.org API (Ascoli et al., 2007; Halavi et al., 2008; Nanda et al., 2015), a curated inventory of reconstructed neurons (n = 107,395 from 60 different species) that is updated as new reconstructions are collected and published. Since its neurons derive from many different systems and species, there is no 'standard' orientation; instead, they are oriented by placing the soma at the origin and aligning neurons by their principal components in Euclidean space. insectbrainr can retrieve neurons and brain region surface models from InsectBrainDB.org (n = 139 neurons, 14 species). Similarly, flycircuit interacts with the flycircuit.tw project (Chiang et al., 2011; Shih et al., 2015), which contains >23,000 registered and skeletonised D. melanogaster neurons. The vfbr package can pull image data from VirtualFlyBrain.org, which hosts registered stacks of central nervous system image data for D. melanogaster, including image stacks for the major GAL4 genetic driver line collections (Jenett et al., 2012), neuroblast clones (Ito et al., 2013; Yu et al., 2013) and FlyCircuit’s stochastically labelled neurons (Chiang et al., 2011). This non-skeleton data can be read into R as point clouds. The fishatlas package interacts with FishAtlas.neuro.mpg.de, which contains 1709 registered neurons from the larval Danio rerio (Kunst et al., 2019), while mouselight does the same for the MouseLight project at Janelia Research Campus (Economo et al., 2016), which has generated >1000 morphologies.
In both cases, investigators have acquired sub-micron single neuron reconstructions from datasets of whole brains using confocal (Kunst et al., 2019) or two-photon microscopy (Economo et al., 2016), modified tissue clearing techniques (Treweek et al., 2015), and generated a template brain with defined subvolumes.
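The canonical orientation used for NeuroMorpho.org data (soma at the origin, principal components aligned with the axes) can be sketched as follows. This is a minimal illustration, assuming simple point-cloud input; NeuroMorpho's actual processing pipeline may differ in detail:

```python
import numpy as np

# Sketch: orient a neuron by translating the soma to the origin, then
# rotating so the principal axes of the point cloud align with x/y/z.
# The toy points and soma position below are invented.

def orient_by_pca(points, soma):
    pts = np.asarray(points, float) - np.asarray(soma, float)
    # principal axes = eigenvectors of the covariance matrix
    cov = np.cov(pts.T)
    evals, evecs = np.linalg.eigh(cov)
    # sort axes by decreasing variance so x carries the longest extent
    evecs = evecs[:, np.argsort(evals)[::-1]]
    return pts @ evecs

pts = [[1, 0, 0], [2, 0, 0], [3, 0, 0], [2, 1, 0]]
aligned = orient_by_pca(pts, soma=[1, 0, 0])
```

After this step, neurons from any species or brain region share a common frame, which is what makes cross-dataset comparison of unregistered morphologies possible at all.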

Working with registrations in nat

Group c) helps users make use of registration and bridging tools. The package nat.ants wraps the R package ANTsRCore (Kandel et al., 2019) with a small number of functions to enable nat functions to use Advanced Normalisation Tools (ANTs) registrations (Avants et al., 2009). The R package deformetricar does the same for the non-image (e.g. mesh or line data) based registration software Deformetrica (Bône et al., 2018; Durrleman et al., 2014) without the need for landmark correspondences. The nat package already contains functions to support CMTK registrations (Rohlfing and Maurer, 2003). The nat.templatebrains package extends nat to explicitly include the notion of each neuron belonging to a certain template space, as well as functions to deploy bridging and mirroring registrations. Additionally, nat.flybrains contains mesh data describing commonly used template spaces for D. melanogaster as well as CMTK bridging and mirror deformations discussed in the latter half of the results section.

Analysing data in nat

Group d) contains functions that help users easily analyse neuron data as both skeletons and volumes. Most of this functionality lives in nat itself. nat.nblast allows users to deploy the NBLAST neuron similarity algorithm (Costa et al., 2016) by pairwise comparison of vector clouds describing these neurons in R. Our nabor package is a wrapper for libnabo (Elseberg et al., 2012), a k-nearest-neighbour library that is optimised for low-dimensional (e.g. 3D) spaces. The package elmr is another fly-focused package born out of a specific use case. Currently, ~22 laboratories and ~100 active users worldwide are engaged in reconstructing D. melanogaster neurons from EM data (Zheng et al., 2018) using CATMAID (Saalfeld et al., 2009; Schneider-Mizell et al., 2016) in order to build a draft, sparse connectome. The package elmr allows users to read neurons from this environment, transform them into a template space where they can be compared with light-level neurons for which the community may have some other information (e.g. gene expression, functional characterisation, presence in genetic driver lines, etc.), then visualised and/or NBLAST-ed, all with only a few lines of code. This process enables CATMAID users to perform interim analyses as they reconstruct neurons, helping them to choose interesting targets for reconstruction and identify manually traced or automatically reconstructed neuron fragments (Dolan et al., 2019) or anatomical landmarks such as fibre tracts (Frechter et al., 2019), and so improve the efficiency of their targeted circuit reconstructions (Dolan et al., 2018a; Felsenberg et al., 2018; Huoviala et al., 2018).
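The core idea behind NBLAST, comparing neurons as clouds of points with tangent vectors, can be illustrated with a much-simplified scoring function. The real algorithm (Costa et al., 2016) uses an empirically trained scoring matrix and nearest-neighbour search via libnabo, so the numbers below are not comparable to real NBLAST scores; this is only a hand-rolled sketch of the distance-plus-alignment principle:

```python
import numpy as np

# Simplified NBLAST-style score: for each query point, find the nearest
# target point (brute force here), then reward small distance and
# well-aligned tangent vectors. sigma and the scoring function are
# invented stand-ins for the trained scoring matrix.

def nblast_like(query_pts, query_vecs, target_pts, target_vecs, sigma=3.0):
    q, t = np.asarray(query_pts, float), np.asarray(target_pts, float)
    qv, tv = np.asarray(query_vecs, float), np.asarray(target_vecs, float)
    score = 0.0
    for p, v in zip(q, qv):
        d = np.linalg.norm(t - p, axis=1)   # distances to all target points
        i = int(np.argmin(d))               # nearest neighbour
        dot = abs(np.dot(v, tv[i]))         # tangent alignment in [0, 1]
        score += np.exp(-d[i] ** 2 / (2 * sigma ** 2)) * dot
    return score / len(q)                   # mean over query points

pts = np.array([[0., 0, 0], [1, 0, 0], [2, 0, 0]])
vecs = np.tile([1., 0, 0], (3, 1))
self_score = nblast_like(pts, vecs, pts, vecs)  # identical neurons -> 1.0
```

As with real NBLAST, the score is asymmetric (query vs target), which is why forward and reverse scores are often averaged in practice.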

Building mirroring registrations

A simple 180° flip about the medio-lateral axis is insufficient to generate a left-right mirror for most neuroanatomical volumes; after flipping, the brain will not be perfectly centered in the image. It is first necessary to apply an affine registration to roughly match the flipped brain to the location of the original. This results in a flipped brain with the correct gross structure (i.e. large structures such as neuropils align) but with mismatched fine details (e.g. bilaterally symmetric neurons may appear to innervate slightly different regions on either side; Figure 5a). For example, for the JFRC2 template space we found that points are, on average, displaced by 4.8 μm from their correct position, equivalent to 7–8 voxels of the original confocal image. The largest displacements, of the order of 10–15 μm, are found around the esophageal region (Figure 5—figure supplement 1b) and are likely due to specimen handling when the gut is removed during dissection. An ideal mirroring registration would result in zero total displacement after two applications of the mirroring procedure, that is, a point would be mapped back to exactly the same location in the original brain hemisphere. Our constructed mirroring registrations have, on average, a round-trip displacement of less than a quarter of a micron, that is, about the diffraction-limited resolution of an optical microscope and less than half the sample spacing of the original confocal image (Figure 5—figure supplement 1c).
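The round-trip check described above is easy to state in code: apply the mirroring transform twice and measure how far each point lands from its start. In this sketch, `mirror` is a toy stand-in (an exact reflection about an illustrative midline), not a real CMTK registration, and the midline value is invented:

```python
import numpy as np

# Sketch: per-point round-trip displacement of a mirroring transform.
# A perfect mirroring registration returns every point exactly, so the
# displacement measures registration error, in the same units as the
# points (microns here).

def roundtrip_displacement(points, mirror):
    once = mirror(points)
    twice = mirror(once)
    return np.linalg.norm(twice - points, axis=1)  # per-point error

midline = 313.5  # illustrative x midline, not a real JFRC2 value

def mirror(pts):
    out = np.asarray(pts, float).copy()
    out[:, 0] = 2 * midline - out[:, 0]  # reflect about the midline
    return out

pts = np.array([[100.0, 50, 30], [200, 80, 40]])
err = roundtrip_displacement(pts, mirror)  # exact flip -> zero error
```

Averaging `err` over a grid of points spanning the brain gives the summary statistic quoted above (a quarter of a micron for the real registrations).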

Building bridging registrations

Given a bridging registration A → B, an attempt to produce the registration B → A can be made via numerical inversion of the original registration. This is a computationally intensive process, but we find it to be useful for neuroanatomical work as the inaccuracies are set by numerical error, which is much smaller than registration error. As the registration A → B may be injective (i.e. points within brain A may map to a subset of the points within brain B), there may be some points in B, particularly near the boundaries of the brain, that this inversion will not map into A. To counter this we have, for some brains, constructed a new registration B → A by explicitly registering B onto A, rather than relying on numerical inversion. Full details of the construction of bridging registrations and their directions are shown in Figure 6—figure supplement 1. Here, the arrows indicate the direction of the forward transformation but, due to the ability to numerically invert the transformations, it is possible to travel ‘backwards’ along an arrow to transform in the opposite direction. While the inversion takes an appreciable time to calculate, the resulting errors are extremely small, far below the resolution of the original images, and only exist due to the finite precision with which floating-point numbers are manipulated. By inverting and concatenating bridging registrations as appropriate, it is possible to transform data registered to any of the template spaces to any of the other template spaces.
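Numerical inversion of a smooth transform can be sketched with a simple fixed-point iteration: to find x such that f(x) = y, repeatedly correct x by the residual y − f(x). This is only an illustration of the principle under the assumption that f is close to the identity; production tools like CMTK use more robust schemes:

```python
import numpy as np

# Sketch: invert a forward transform f: A -> B numerically. The toy
# deformation f below is invented; convergence holds because it is a
# small, smooth perturbation of the identity.

def invert_numerically(f, y, n_iter=50):
    x = np.array(y, float)   # initial guess: the target point itself
    for _ in range(n_iter):
        x += y - f(x)        # fixed-point update, contracts for small warps
    return x

def f(x):
    return x + 0.1 * np.sin(x)  # identity plus a small smooth deformation

y = np.array([1.0, 2.0, 3.0])
x = invert_numerically(f, y)
residual = np.linalg.norm(f(x) - y)  # limited by numerics, not registration
```

As in the text, the residual here is set purely by floating-point precision and iteration count, far below the physical registration error.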

Creating accurate registrations

Full, non-rigid warping registrations were computed using the Computational Morphometry Toolkit (CMTK), as described previously (Jefferis et al., 2007). An initial affine registration with twelve degrees of freedom (translation, rotation, scaling and shearing along each axis) was followed by a non-rigid registration that allows different brain regions to move somewhat independently, subject to a smoothness penalty (Rueckert et al., 1999). In the non-rigid step, deformations between the independently moving control points are interpolated using B-splines, with image similarity computed using a normalised mutual information metric (Studholme et al., 1999). The task of finding an accurate registration is treated as an optimisation problem over the mutual information metric that, due to its complex nature, has many local optima in which the algorithm can become stuck. To help avoid this, a constraint is imposed to ensure that the deformation field is spatially smooth across the brain, as is biologically reasonable. Full details of the parameters passed to the CMTK tools are provided in the 'settings' file that accompanies each registration. To create mirroring registrations, images were first flipped horizontally in Fiji before being registered to the original template spaces using CMTK. For convenience, we also encoded the horizontal flip as a CMTK-compatible affine transformation, meaning that the entire process of mirroring a sample image can be carried out in a single step with CMTK.
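The normalised mutual information (NMI) metric of Studholme et al. (1999) can be computed from a joint intensity histogram. The sketch below uses NMI = (H(A) + H(B)) / H(A, B); the bin count and toy images are illustrative, and CMTK's internal implementation will differ in detail:

```python
import numpy as np

# Sketch: normalised mutual information between two images from their
# joint intensity histogram. NMI = 2 when the images are identical and
# tends towards 1 as they become independent.

def nmi(a, b, bins=32):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return (entropy(px) + entropy(py)) / entropy(pxy.ravel())

rng = np.random.default_rng(0)
img = rng.random((64, 64))
self_nmi = nmi(img, img)  # identical images -> 2.0
```

Registration then amounts to maximising this quantity over the transform parameters, which is why the many local optima mentioned above are a practical concern.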

Construction of new template spaces

The template space provided by the FlyLight project (JFRC) is not spatially calibrated and so we added spatial calibration to a copy named JFRC2. Similarly, FlyCircuit images are registered to male and female template spaces and so we created an intersex template space from 17 female and 9 male brains to bring all FlyCircuit neurons into a common space, irrespective of sex. The IS2, Cell07 and T1 template spaces were left unaltered.

As the neuropil and tract masks provided by the Insect Brain Name working group (Ito et al., 2014) only cover half a brain (IBN), we extended the IBN template space into a new whole-brain template (named IBNWB) to improve the quality of the bridging registration between the IBN files and the other whole-brain templates. The green channel (n-syb-GFP) of the tricolour confocal data provided was taken, duplicated and flipped about the medio-lateral axis using Fiji (Schindelin et al., 2012). The Fiji plugin 'Pairwise stitching' (Preibisch et al., 2009) was used to stitch the two stacks together with an offset of 392 pixels. This offset was chosen by eye as the one from the range 385–400 pixels that produced the most anatomically correct result. The intensity of the overlapping region was set using the 'linear blend' method. We attempted to improve on this alignment using the Fourier phase correlation method that the plugin also implements, but this gave poor results: the algorithm favoured overlapping the optic lobes, leaving a half central brain on each of the left and right sides.
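The 'linear blend' used in the overlap can be sketched in 2D on toy arrays: weights for the two images ramp linearly across the overlap so there is no visible seam. The offset and array sizes below are invented for illustration and do not reflect Fiji's internal implementation:

```python
import numpy as np

# Sketch: stitch two images side by side with a fixed offset and a
# linear intensity blend across the overlapping columns.

def stitch_linear_blend(left, right, offset):
    h, w = left.shape
    overlap = w - offset
    out = np.zeros((h, offset + right.shape[1]))
    out[:, :offset] = left[:, :offset]
    out[:, w:] = right[:, overlap:]
    # weights ramp 1 -> 0 for the left image across the overlap
    ramp = np.linspace(1, 0, overlap)
    out[:, offset:w] = ramp * left[:, offset:] + (1 - ramp) * right[:, :overlap]
    return out

left = np.ones((4, 10))
right = np.zeros((4, 10))
stitched = stitch_linear_blend(left, right, offset=6)  # shape (4, 16)
```

For the real IBNWB template the same idea applies per z-slice, with the 392-pixel offset chosen by eye as described above.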

As the template space is synthesised from an affine transformation of the original IBN template, we only considered an affine bridging registration between IBN and IBNWB. The n-syb-GFP labelling used in the IBN template strongly labels a large collection of cell bodies close to the cortex, posterior of the superior lateral protocerebrum and lateral horn, that are not labelled by nc82 or Dlg and hence the warping registrations from IBNWB to the other whole brain templates are less accurate in this region.

Construction of averaged template spaces

CMTK's avg_adm tool was used to iteratively produce new averaged seed brains given a set of template spaces and an initial seed brain drawn from the set. In each round, template spaces are registered to the seed brain and averaged to produce a new seed brain. After all rounds are complete, a final affine registration between the latest seed brain and a flipped version is calculated and then halved, resulting in a final brain that is centered in the middle of the image. The FCWB template was produced in this manner using 17 female and 9 male brains. We have developed documented tools to help users make average templates, here: https://github.com/jefferislab/MakeAverageBrain.
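The iterative averaging scheme can be sketched with a drastically simplified 'registration' step. Here translation-only alignment by centroid stands in for the full CMTK registration, and the toy 'brains' are point clouds; this only illustrates the register-then-average loop, not avg_adm itself:

```python
import numpy as np

# Toy version of iterative template averaging: each round, 'register'
# every brain to the current seed (centroid alignment as a stand-in for
# a real registration) and average the aligned results into a new seed.

def make_average(brains, n_rounds=3):
    seed = brains[0]
    for _ in range(n_rounds):
        aligned = []
        for b in brains:
            shift = seed.mean(axis=0) - b.mean(axis=0)  # crude 'registration'
            aligned.append(b + shift)
        seed = np.mean(aligned, axis=0)  # new seed = average of aligned brains
    return seed

rng = np.random.default_rng(1)
base = rng.random((50, 3))
# Three 'brains': the same point cloud under different translations.
brains = [base + t for t in ([0, 0, 0], [5, 1, 0], [-2, 3, 4])]
avg = make_average(brains)
```

In the real pipeline each round uses full non-rigid registrations, so the average sharpens over successive rounds rather than converging in one step as in this toy case.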

Application of registrations to images, traced neurons and surface data

CMTK provides two commands, reformatx and streamxform, that use a registration to reformat images and transform points, respectively. The R package nat wraps these commands and can use them to transform neuroanatomical data, stored as objects in the R session, between template spaces. A 3D surface model of the standard neuropil segmentation (Ito et al., 2014) was generated from the labelled image stack using Amira, read into R using nat, transformed into the different template spaces via JFRC2, and saved as new 3D surfaces. These can then be used to segment neurons in their original space, providing interesting volumetric data for a neuron such as the relative density of neuropil innervation.

Flies

Wild-type (Canton S, Bloomington Stock Center, Indiana University) and transgenic strains were kept on standard yeast/agar medium at 25°C. Transgenics were a GH146-lexA line and the dFasciculin-II-GFP protein trap line (courtesy of M. Landgraf). Lines were balanced with CyO, Dfd-GMR-YFP or TM6b, Sb, Dfd-GMR-YFP balancer chromosomes (Bloomington Stock Center, Indiana University).

Larval dissection, immunohistochemistry and imaging

Flies were mated a day before dissection and laid eggs on apple-juice-based media with a spot of yeast paste overnight at 25°C. Adults and large hatched larvae were subsequently removed, and small embryos (approx. the length of an egg) were dissected in Sorensen’s saline (pH 7.2, 0.075 M). A hypodermic needle (30 ½ G; Microlance) was used to sever the mouth hooks of each larva, at which point the CNS extruded along with viscera, and was gently separated and stuck to a cover glass coated with poly-L-lysine (Sigma-Aldrich) in a bubble of solution. The CNSs were then fixed in 4% formaldehyde (Fisher Scientific) in Sorensen’s saline for 15 min at room temperature, and subsequently permeabilised in PBT (phosphate buffer with 0.3% Triton X-100, Sigma-Aldrich). They were incubated overnight in primary antibodies at 4°C and, after washes in PBT, in secondary antibodies for 2 hr at room temperature. Washes took place in either a bubble of fluid or a shallow dish filled with solution to prevent collapse of the brain lobes into the VNC. For this reason also, confocal stacks were acquired with a 40x dipping lens on a Zeiss LSM 710, with a voxel size of 0.2 × 0.2 × 0.5 microns. Primary antibodies used were chicken anti-GFP (Invitrogen), 1:10,000, mouse IgG1 anti-Fasciclin II (DSHB), 1:10, rat anti-N-Cadherin (DSHB) and mouse IgG1 anti-Discs large-1, 1:50. Secondary antibodies used were goat anti-mouse CF568, 1:600, goat anti-chicken Alexa488 and goat anti-mouse CF647, 1:600. Some antibodies and dissection training were kindly supplied by M. Landgraf.

Visualisation

The majority of images shown in this manuscript were generated in RStudio. 3D images were plotted with natverse functions that depend on the R package rgl (Murdoch, 2001); 2D plots were generated using ggplot2 (Wickham, 2016). 3D images of confocal data were visualised using Amira 6.0 and ParaView. Figures were generated using Adobe Illustrator.

Data availability

The bridging and mirroring registrations are deposited in two version-controlled repositories at http://github.com with revisions uniquely identified by the SHA-1 hash function. As some template spaces may have multiple versions, we identify each version by its SHA-1 hash as this is uniquely dependent on the data contained in each file. Since we use the distributed version control system git, any user can clone a complete, versioned history of these repositories. We have also taken a repository snapshot at the time of the release of this paper on the publicly funded http://zenodo.org site, which associates the data with permanent digital object identifiers (DOIs). To simplify data access for colleagues, we have provided spatially calibrated template spaces for the main template spaces in use by the Drosophila community in a single standard format, NRRD. These brain images have permanent DOIs listed in Table 2. We have also generated registrations for the entire FlyCircuit single neuron and FlyLight datasets. The registered images have been deposited at http://virtualflybrain.org. The R packages nat.flybrains and elmr in the natverse also contain easy-to-use functions for deploying these registrations. The complete software toolchain for the construction and application of registrations consists exclusively of open-source code released under the GNU Public License on http://github.com and http://sourceforge.net. A full listing of these resources is available at http://jefferislab.org/si/bridging. All these steps will ensure that these resources remain available for many years to come (as recommended by Ito, 2010).

Acknowledgements

We are very grateful to the original data providers including Ann-Shyn Chiang, Gerry Rubin, Moritz Helmstaedter, Herwig Baier, Stanley Heinze, Arnim Jenett, Tzumin Lee, Kazunori Shinomiya and Kei Ito for generously sharing their image data with the research community. We specifically thank Arnim Jenett, Kazunori Shinomiya and Kei Ito for sharing the nc82-based D. melanogaster neuropil segmentation. We thank M-J Dolan for providing confocal microscopy exemplar images. Images from FlyCircuit were obtained from the NCHC (National Center for High-performance Computing) and NTHU (National Tsing Hua University), Hsinchu, Taiwan. We thank the Virtual Fly Brain team including MC, David Osumi-Sutherland, Robert Court, Cahir O'Kane and Douglas Armstrong for making some of our processed data available online through https://virtualflybrain.org. We note that data integration work with the virtualflybrain.org website was supported in part by an award from the Isaac Newton Trust to MC and Dr Cahir O'Kane. We thank Tom Kazimiers for help navigating the CATMAID API. We thank Alex Vourvoukelis, Alex von Klemperer, and Colin J Akerman for sharing unpublished reconstruction data.

We thank members of the Jefferis laboratory and the Drosophila Connectomics group for comments on this manuscript along with Jan Clemens, Jamie Jeanne and Stanley Heinze. We thank Jake Grimmett and Toby Darling for assistance with the LMB's computer cluster. This work made use of the Computational Morphometry Toolkit, supported by the National Institute of Biomedical Imaging and Bioengineering (NIBIB). We thank early users of the natverse for their help finding bugs and suggesting features, including but not limited to: István Taisz, Shanice Bailey, William Morris, Kathi Eichler, Dana Gallii, Sebastian Cachero, Erika Dona, Shahar Frechter, Konrad Heinz, Fiona Love, Paavo Huoviala, Amelia Edmondson-Stait and Lisa Marin.

This work was supported by the MRC (MC-U105188491), Starting and Consolidator grants (649111) from the European Research Council, and the Wellcome Trust (203261/Z/16/Z) to GSXEJ, the Boehringer Ingelheim Fonds and Herchel Smith Studentship (ASB) and a Fitzwilliam College Research Fellowship (JDM).

Funding Statement

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Contributor Information

Gregory SXE Jefferis, Email: jefferis@mrc-lmb.cam.ac.uk.

K VijayRaghavan, National Centre for Biological Sciences, Tata Institute of Fundamental Research, India.


Funding Information

This paper was supported by the following grants:

  • Medical Research Council MC-U105188491 to Alexander S Bates, James D Manton, Gregory SXE Jefferis.

  • H2020 European Research Council 649111 to Alexander S Bates, James D Manton, Marta Costa, Gregory SXE Jefferis.

  • Wellcome 203261/Z/16/Z to Sridhar R Jagannathan, Marta Costa, Philipp Schlegel, Gregory SXE Jefferis.

  • Boehringer Ingelheim Fonds to Alexander S Bates.

  • Herchel Smith Fund to Alexander S Bates.

  • Fitzwilliam College, University of Cambridge to James D Manton.

Additional information

Competing interests

No competing interests declared.

Author contributions

Data curation, Software, Formal analysis, Validation, Investigation, Visualization, Methodology, Writing - original draft, Writing - review and editing.

Data curation, Software, Formal analysis, Validation, Investigation, Visualization, Methodology, Writing - original draft, Writing - review and editing.

Software, Investigation, Visualization, Methodology, Writing - review and editing.

Data curation, Software, Investigation, Methodology, Writing - review and editing.

Software, Investigation, Methodology, Writing - review and editing.

Software.

Conceptualization, Data curation, Software, Formal analysis, Supervision, Funding acquisition, Validation, Investigation, Methodology, Project administration, Writing - review and editing.

Additional files

Transparent reporting form

Data availability

All code is described at http://natverse.org/ which links to individual git repositories at https://github.com/natverse.

References

  1. Anderson K, Bones B, Robinson B, Hass C, Lee H, Ford K, Roberts TA, Jacobs B. The morphology of supragranular pyramidal neurons in the human insular cortex: a quantitative Golgi study. Cerebral Cortex. 2009;19:2131–2144. doi: 10.1093/cercor/bhn234. [DOI] [PubMed] [Google Scholar]
  2. Anderson K, Yamamoto E, Kaplan J, Hannan M, Jacobs B. Neurolucida lucivid versus Neurolucida Camera: a quantitative and qualitative comparison of three-dimensional neuronal reconstructions. Journal of Neuroscience Methods. 2010;186:209–214. doi: 10.1016/j.jneumeth.2009.11.024. [DOI] [PubMed] [Google Scholar]
  3. Arganda-Carreras I, Manoliu T, Mazuras N, Schulze F, Iglesias JE, Bühler K, Jenett A, Rouyer F, Andrey P. A statistically representative atlas for mapping neuronal circuits in the Drosophila adult brain. Frontiers in Neuroinformatics. 2018;12:13. doi: 10.3389/fninf.2018.00013. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Ascoli GA, Donohue DE, Halavi M. NeuroMorpho.Org: a central resource for neuronal morphologies. Journal of Neuroscience. 2007;27:9247–9251. doi: 10.1523/JNEUROSCI.2055-07.2007. [DOI] [PMC free article] [PubMed] [Google Scholar]
  5. Aso Y, Hattori D, Yu Y, Johnston RM, Iyer NA, Ngo TT, Dionne H, Abbott LF, Axel R, Tanimoto H, Rubin GM. The neuronal architecture of the mushroom body provides a logic for associative learning. eLife. 2014;3:e04577. doi: 10.7554/eLife.04577. [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Aso Y, Rubin GM. Dopaminergic neurons write and update memories with cell-type-specific rules. eLife. 2016;5:e16135. doi: 10.7554/eLife.16135. [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Avants BB, Tustison N, Song G. Advanced normalization tools (ANTS). The Insight Journal. 2009;2:1–35. [Google Scholar]
  8. Bates AS, Janssens J, Jefferis GSXE, Aerts S. Neuronal cell types in the fly: single-cell anatomy meets single-cell genomics. Current Opinion in Neurobiology. 2019;56:125–134. doi: 10.1016/j.conb.2018.12.012. [DOI] [PubMed] [Google Scholar]
  9. Bates AS, Schlegel P, Roberts RJW, Drummond N, Tamimi IFM, Turnbull R, Zhao X, Marin EC, Popovici PD, Dhawan S, Jamasb A, Javier A, Li F, Rubin GM, Waddell S, Bock DD, Costa M, Jefferis GSXE. Complete connectomic reconstruction of olfactory projection neurons in the fly brain. bioRxiv. 2020 doi: 10.1101/2020.01.19.911453. [DOI] [PMC free article] [PubMed]
  10. Berck ME, Khandelwal A, Claus L, Hernandez-Nunez L, Si G, Tabone CJ, Li F, Truman JW, Fetter RD, Louis M, Samuel AD, Cardona A. The wiring diagram of a glomerular olfactory system. eLife. 2016;5:e14859. doi: 10.7554/eLife.14859. [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Billeci L, Magliaro C, Pioggia G, Ahluwalia A. NEuronMOrphological analysis tool: open-source software for quantitative morphometrics. Frontiers in Neuroinformatics. 2013;7:2. doi: 10.3389/fninf.2013.00002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  12. Bock DD, Lee WC, Kerlin AM, Andermann ML, Hood G, Wetzel AW, Yurgenson S, Soucy ER, Kim HS, Reid RC. Network anatomy and in vivo physiology of visual cortical neurons. Nature. 2011;471:177–182. doi: 10.1038/nature09802. [DOI] [PMC free article] [PubMed] [Google Scholar]
  13. Bogovic JA, Otsuna H, Heinrich L, Ito M, Jeter J, Meissner GW, Nern A, Colonell J, Malkesman O, Ito K, Saalfeld S. An unbiased template of the Drosophila brain and ventral nerve cord. bioRxiv. 2018 doi: 10.1101/376384. [DOI] [PMC free article] [PubMed]
  14. Bône A, Louis M, Martin B, Durrleman S. Shape in Medical Imaging. Springer International Publishing; 2018. Deformetrica 4: an open-source software for statistical shape analysis; pp. 3–13. [DOI] [Google Scholar]
  15. Brand AH, Perrimon N. Targeted gene expression as a means of altering cell fates and generating dominant phenotypes. Development. 1993;118:401–415. doi: 10.1242/dev.118.2.401. [DOI] [PubMed] [Google Scholar]
  16. Brandt R, Rohlfing T, Rybak J, Krofczik S, Maye A, Westerhoff M, Hege HC, Menzel R. Three-dimensional average-shape atlas of the honeybee brain and its applications. The Journal of Comparative Neurology. 2005;492:1–19. doi: 10.1002/cne.20644. [DOI] [PubMed] [Google Scholar]
  17. Brown KM, Donohue DE, D'Alessandro G, Ascoli GA. A cross-platform freeware tool for digital reconstruction of neuronal arborizations from image stacks. Neuroinformatics. 2005;3:343–360. doi: 10.1385/NI:3:4:343. [DOI] [PubMed] [Google Scholar]
  18. Busch S, Selcho M, Ito K, Tanimoto H. A map of octopaminergic neurons in the Drosophila brain. The Journal of Comparative Neurology. 2009;513:643–667. doi: 10.1002/cne.21966. [DOI] [PubMed] [Google Scholar]
  19. Cachero S, Ostrovsky AD, Yu JY, Dickson BJ, Jefferis GSXE. Sexual dimorphism in the fly brain. Current Biology. 2010;20:1589–1601. doi: 10.1016/j.cub.2010.07.045. [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Cannon RC, Turner DA, Pyapali GK, Wheal HV. An on-line archive of reconstructed hippocampal neurons. Journal of Neuroscience Methods. 1998;84:49–54. doi: 10.1016/S0165-0270(98)00091-0. [DOI] [PubMed] [Google Scholar]
  21. Chiang AS, Lin CY, Chuang CC, Chang HM, Hsieh CH, Yeh CW, Shih CT, Wu JJ, Wang GT, Chen YC, Wu CC, Chen GY, Ching YT, Lee PC, Lin CY, Lin HH, Wu CC, Hsu HW, Huang YA, Chen JY, Chiang HJ, Lu CF, Ni RF, Yeh CY, Hwang JK. Three-dimensional reconstruction of brain-wide wiring networks in Drosophila at single-cell resolution. Current Biology. 2011;21:1–11. doi: 10.1016/j.cub.2010.11.056. [DOI] [PubMed] [Google Scholar]
  22. Clemens J, Girardin CC, Coen P, Guan XJ, Dickson BJ, Murthy M. Connecting neural codes with behavior in the auditory system of Drosophila. Neuron. 2015;87:1332–1343. doi: 10.1016/j.neuron.2015.08.014. [DOI] [PMC free article] [PubMed] [Google Scholar]
  23. Clemens J, Coen P, Roemschied FA, Pereira TD, Mazumder D, Aldarondo DE, Pacheco DA, Murthy M. Discovery of a new song mode in Drosophila reveals hidden structure in the sensory and neural drivers of behavior. Current Biology. 2018;28:2400–2412. doi: 10.1016/j.cub.2018.06.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Clements J, Dolafi T, Umayam L, Neubarth NL, Berg S, Scheffer LK, Plaza SM. neuPrint: analysis tools for EM connectomics. bioRxiv. 2020 doi: 10.1101/2020.01.16.909465. [DOI] [PMC free article] [PubMed]
  25. Cook SJ, Jarrell TA, Brittin CA, Wang Y, Bloniarz AE, Yakovlev MA, Nguyen KCQ, Tang LT, Bayer EA, Duerr JS, Bülow HE, Hobert O, Hall DH, Emmons SW. Whole-animal connectomes of both Caenorhabditis elegans sexes. Nature. 2019;571:63–71. doi: 10.1038/s41586-019-1352-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  26. Costa M, Manton JD, Ostrovsky AD, Prohaska S, Jefferis GSXE. NBLAST: rapid, sensitive comparison of neuronal structure and construction of neuron family databases. Neuron. 2016;91:293–311. doi: 10.1016/j.neuron.2016.06.012. [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. Csardi G, Nepusz T. The igraph software package for complex network research. InterJournal, Complex Systems. 2006;1695:1–9. [Google Scholar]
  28. Cuntz H, Forstner F, Borst A, Häusser M. One rule to grow them all: a general theory of neuronal branching and its practical application. PLOS Computational Biology. 2010;6:e1000877. doi: 10.1371/journal.pcbi.1000877. [DOI] [PMC free article] [PubMed] [Google Scholar]
  29. Dacks AM, Christensen TA, Hildebrand JG. Phylogeny of a serotonin-immunoreactive neuron in the primary olfactory center of the insect brain. The Journal of Comparative Neurology. 2006;498:727–746. doi: 10.1002/cne.21076. [DOI] [PubMed] [Google Scholar]
  30. de Vries L, Pfeiffer K, Trebels B, Adden AK, Green K, Warrant E, Heinze S. Comparison of navigation-related brain regions in migratory versus non-migratory noctuid moths. Frontiers in Behavioral Neuroscience. 2017;11:158. doi: 10.3389/fnbeh.2017.00158. [DOI] [PMC free article] [PubMed] [Google Scholar]
  31. Dolan MJ, Belliart-Guérin G, Bates AS, Frechter S, Lampin-Saint-Amaux A, Aso Y, Roberts RJV, Schlegel P, Wong A, Hammad A, Bock D, Rubin GM, Preat T, Plaçais PY, Jefferis GSXE. Communication from learned to innate olfactory processing centers is required for memory retrieval in Drosophila. Neuron. 2018a;100:651–668. doi: 10.1016/j.neuron.2018.08.037. [DOI] [PMC free article] [PubMed] [Google Scholar]
  32. Dolan M-J, Frechter S, Bates AS, Dan C, Huoviala P, Roberts RJV, Schlegel P, Dhawan S, Tabano R, Dionne H, Christoforou C, Close K, Sutcliffe B, Giuliani B, Li F, Costa M, Ihrke G, Meissner G, Bock D, Aso Y, Rubin G, Jefferis GSXE. Neurogenetic dissection of the Drosophila innate olfactory processing center. bioRxiv. 2018b doi: 10.1101/404277. [DOI]
  33. Dolan MJ, Frechter S, Bates AS, Dan C, Huoviala P, Roberts RJ, Schlegel P, Dhawan S, Tabano R, Dionne H, Christoforou C, Close K, Sutcliffe B, Giuliani B, Li F, Costa M, Ihrke G, Meissner GW, Bock DD, Aso Y, Rubin GM, Jefferis GS. Neurogenetic dissection of the Drosophila lateral horn reveals major outputs, diverse behavioural functions, and interactions with the mushroom body. eLife. 2019;8:e43079. doi: 10.7554/eLife.43079. [DOI] [PMC free article] [PubMed] [Google Scholar]
  34. Duong T. ks: Kernel Density Estimation and Kernel Discriminant Analysis for Multivariate Data in R. Journal of Statistical Software. 2007;21:1–16. doi: 10.18637/jss.v021.i07. [DOI] [Google Scholar]
  35. Durrleman S, Prastawa M, Charon N, Korenberg JR, Joshi S, Gerig G, Trouvé A. Morphometry of anatomical shape complexes with dense deformations and sparse parameters. NeuroImage. 2014;101:35–49. doi: 10.1016/j.neuroimage.2014.06.043. [DOI] [PMC free article] [PubMed] [Google Scholar]
  36. Economo MN, Clack NG, Lavis LD, Gerfen CR, Svoboda K, Myers EW, Chandrashekar J. A platform for brain-wide imaging and reconstruction of individual neurons. eLife. 2016;5:e10566. doi: 10.7554/eLife.10566. [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Eichler K, Li F, Litwin-Kumar A, Park Y, Andrade I, Schneider-Mizell CM, Saumweber T, Huser A, Eschbach C, Gerber B, Fetter RD, Truman JW, Priebe CE, Abbott LF, Thum AS, Zlatic M, Cardona A. The complete connectome of a learning and memory centre in an insect brain. Nature. 2017;548:175–182. doi: 10.1038/nature23455. [DOI] [PMC free article] [PubMed] [Google Scholar]
  38. El Jundi B, Warrant EJ, Pfeiffer K, Dacke M. Neuroarchitecture of the dung beetle central complex. Journal of Comparative Neurology. 2018;526:2612–2630. doi: 10.1002/cne.24520. [DOI] [PubMed] [Google Scholar]
  39. Elseberg J, Magnenat S, Siegwart R. Comparison of nearest-neighbor-search strategies and implementations for efficient shape registration. Journal of Software Engineering for Robotics. 2012;3:2–12. [Google Scholar]
  40. Farhoodi R, Lansdell BJ, Kording KP. Quantifying how staining methods bias measurements of neuron morphologies. Frontiers in Neuroinformatics. 2019;13:36. doi: 10.3389/fninf.2019.00036. [DOI] [PMC free article] [PubMed] [Google Scholar]
  41. Felsenberg J, Jacob PF, Walker T, Barnstedt O, Edmondson-Stait AJ, Pleijzier MW, Otto N, Schlegel P, Sharifi N, Perisse E, Smith CS, Lauritzen JS, Costa M, Jefferis GSXE, Bock DD, Waddell S. Integration of parallel opposing memories underlies memory extinction. Cell. 2018;175:709–722. doi: 10.1016/j.cell.2018.08.021. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Feng L, Zhao T, Kim J. neuTube 1.0: a new design for efficient neuron reconstruction software based on the SWC format. Eneuro. 2015;2:ENEURO.0049-14.2014. doi: 10.1523/ENEURO.0049-14.2014. [DOI] [PMC free article] [PubMed] [Google Scholar]
  43. Frechter S, Bates AS, Tootoonian S, Dolan MJ, Manton J, Jamasb AR, Kohl J, Bock D, Jefferis GSXE. Functional and anatomical specificity in a higher olfactory centre. eLife. 2019;8:e44590. doi: 10.7554/eLife.44590. [DOI] [PMC free article] [PubMed] [Google Scholar]
  44. Funke J, Tschopp F, Grisaitis W, Sheridan A, Singh C, Saalfeld S, Turaga SC. Large scale image segmentation with structured loss based deep learning for connectome reconstruction. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2019;41:1669–1680. doi: 10.1109/TPAMI.2018.2835450. [DOI] [PubMed] [Google Scholar]
  45. Gensel JC, Schonberg DL, Alexander JK, McTigue DM, Popovich PG. Semi-automated Sholl analysis for quantifying changes in growth and differentiation of neurons and glia. Journal of Neuroscience Methods. 2010;190:71–79. doi: 10.1016/j.jneumeth.2010.04.026. [DOI] [PMC free article] [PubMed] [Google Scholar]
  46. Glaser JR, Glaser EM. Neuron imaging with neurolucida--a PC-based system for image combining microscopy. Computerized Medical Imaging and Graphics. 1990;14:307–317. doi: 10.1016/0895-6111(90)90105-K. [DOI] [PubMed] [Google Scholar]
  47. Grosjean Y, Rytz R, Farine JP, Abuin L, Cortot J, Jefferis GSXE, Benton R. An olfactory receptor for food-derived odours promotes male courtship in Drosophila. Nature. 2011;478:236–240. doi: 10.1038/nature10428. [DOI] [PubMed] [Google Scholar]
  48. Halavi M, Polavaram S, Donohue DE, Hamilton G, Hoyt J, Smith KP, Ascoli GA. NeuroMorpho.Org implementation of digital neuroscience: dense coverage and integration with the NIF. Neuroinformatics. 2008;6:241–252. doi: 10.1007/s12021-008-9030-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  49. Heinrich L, Funke J, Pape C, Nunez-Iglesias J, Saalfeld S. Synaptic cleft segmentation in non-isotropic volume electron microscopy of the complete Drosophila brain. arXiv. 2018 https://arxiv.org/abs/1805.02718
  50. Heinze S, Reppert SM. Anatomical basis of sun compass navigation I: the general layout of the monarch butterfly brain. The Journal of Comparative Neurology. 2012;520:1599–1628. doi: 10.1002/cne.23054. [DOI] [PubMed] [Google Scholar]
  51. Helmstaedter M, Briggman KL, Turaga SC, Jain V, Seung HS, Denk W. Connectomic reconstruction of the inner plexiform layer in the mouse retina. Nature. 2013;500:168–174. doi: 10.1038/nature12346. [DOI] [PubMed] [Google Scholar]
  52. Ho SY, Chao CY, Huang HL, Chiu TW, Charoenkwan P, Hwang E. NeurphologyJ: an automatic neuronal morphology quantification method and its application in pharmacological discovery. BMC Bioinformatics. 2011;12:230. doi: 10.1186/1471-2105-12-230. [DOI] [PMC free article] [PubMed] [Google Scholar]
  53. Horne JA, Langille C, McLin S, Wiederman M, Lu Z, Xu CS, Plaza SM, Scheffer LK, Hess HF, Meinertzhagen IA. A resource for the Drosophila antennal lobe provided by the connectome of glomerulus VA1v. eLife. 2018;7:e37550. doi: 10.7554/eLife.37550. [DOI] [PMC free article] [PubMed] [Google Scholar]
  54. Huber W, Carey VJ, Gentleman R, Anders S, Carlson M, Carvalho BS, Bravo HC, Davis S, Gatto L, Girke T, Gottardo R, Hahne F, Hansen KD, Irizarry RA, Lawrence M, Love MI, MacDonald J, Obenchain V, Oleś AK, Pagès H, Reyes A, Shannon P, Smyth GK, Tenenbaum D, Waldron L, Morgan M. Orchestrating high-throughput genomic analysis with Bioconductor. Nature Methods. 2015;12:115–121. doi: 10.1038/nmeth.3252. [DOI] [PMC free article] [PubMed] [Google Scholar]
  55. Huoviala P, Dolan M-J, Love F, Frechter S, Roberts RJV, Mitrevica Z, Schlegel P, Bates ASS, Aso Y, Rodrigues T, Cornwall H, Stensmyr M, Bock D, Rubin GM, Costa M, Jefferis GSXE. Neural circuit basis of aversive odour processing in Drosophila from sensory input to descending output. bioRxiv. 2018 doi: 10.1101/394403. [DOI]
  56. Ito K. Technical and organizational considerations for the long-term maintenance and development of the digital brain atlases and web-based databases. Frontiers in Systems Neuroscience. 2010;4:26. doi: 10.3389/fnsys.2010.00026. [DOI] [PMC free article] [PubMed] [Google Scholar]
  57. Ito M, Masuda N, Shinomiya K, Endo K, Ito K. Systematic analysis of neural projections reveals clonal composition of the Drosophila brain. Current Biology. 2013;23:644–655. doi: 10.1016/j.cub.2013.03.015. [DOI] [PubMed] [Google Scholar]
  58. Ito K, Shinomiya K, Ito M, Armstrong JD, Boyan G, Hartenstein V, Harzsch S, Heisenberg M, Homberg U, Jenett A, Keshishian H, Restifo LL, Rössler W, Simpson JH, Strausfeld NJ, Strauss R, Vosshall LB, Insect Brain Name Working Group. A systematic nomenclature for the insect brain. Neuron. 2014;81:755–765. doi: 10.1016/j.neuron.2013.12.017. [DOI] [PubMed] [Google Scholar]
  59. Jacobs B, Driscoll L, Schall M. Life-span dendritic and spine changes in Areas 10 and 18 of human cortex: a quantitative Golgi study. The Journal of Comparative Neurology. 1997;386:661–680. doi: 10.1002/(SICI)1096-9861(19971006)386:4<661::AID-CNE11>3.0.CO;2-N. [DOI] [PubMed] [Google Scholar]
  60. Jacobs B, Schall M, Prather M, Kapler E, Driscoll L, Baca S, Jacobs J, Ford K, Wainwright M, Treml M. Regional dendritic and spine variation in human cerebral cortex: a quantitative Golgi study. Cerebral Cortex. 2001;11:558–571. doi: 10.1093/cercor/11.6.558. [DOI] [PubMed] [Google Scholar]
  61. Jacobs B, Lubs J, Hannan M, Anderson K, Butti C, Sherwood CC, Hof PR, Manger PR. Neuronal morphology in the African elephant (Loxodonta africana) neocortex. Brain Structure and Function. 2011;215:273–298. doi: 10.1007/s00429-010-0288-3. [DOI] [PubMed] [Google Scholar]
  62. Jacobs B, Harland T, Kennedy D, Schall M, Wicinski B, Butti C, Hof PR, Sherwood CC, Manger PR. The neocortex of cetartiodactyls. II. Neuronal morphology of the visual and motor cortices in the giraffe (Giraffa camelopardalis). Brain Structure and Function. 2015;220:2851–2872. doi: 10.1007/s00429-014-0830-9. [DOI] [PubMed] [Google Scholar]
  63. Jacobs B, Lee L, Schall M, Raghanti MA, Lewandowski AH, Kottwitz JJ, Roberts JF, Hof PR, Sherwood CC. Neocortical neuronal morphology in the newborn giraffe (Giraffa camelopardalis tippelskirchi) and African elephant (Loxodonta africana). The Journal of Comparative Neurology. 2016;524:257–287. doi: 10.1002/cne.23841. [DOI] [PubMed] [Google Scholar]
  64. Jacobs B, Garcia ME, Shea-Shumsky NB, Tennison ME, Schall M, Saviano MS, Tummino TA, Bull AJ, Driscoll LL, Raghanti MA, Lewandowski AH, Wicinski B, Ki Chui H, Bertelsen MF, Walsh T, Bhagwandin A, Spocter MA, Hof PR, Sherwood CC, Manger PR. Comparative morphology of gigantopyramidal neurons in primary motor cortex across mammals. Journal of Comparative Neurology. 2018;526:496–536. doi: 10.1002/cne.24349. [DOI] [PubMed] [Google Scholar]
  65. Januszewski M, Kornfeld J, Li PH, Pope A, Blakely T, Lindsey L, Maitin-Shepard J, Tyka M, Denk W, Jain V. High-precision automated reconstruction of neurons with flood-filling networks. Nature Methods. 2018;15:605–610. doi: 10.1038/s41592-018-0049-4. [DOI] [PubMed] [Google Scholar]
  66. Jeanne JM, Fişek M, Wilson RI. The organization of projections from olfactory glomeruli onto higher-order neurons. Neuron. 2018;98:1198–1213. doi: 10.1016/j.neuron.2018.05.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  67. Jeanne JM, Wilson RI. Convergence, divergence, and reconvergence in a feedforward network improves neural speed and accuracy. Neuron. 2015;88:1014–1026. doi: 10.1016/j.neuron.2015.10.018. [DOI] [PMC free article] [PubMed] [Google Scholar]
  68. Jefferis GSXE, Potter CJ, Chan AM, Marin EC, Rohlfing T, Maurer CR, Luo L. Comprehensive maps of Drosophila higher olfactory centers: spatially segregated fruit and pheromone representation. Cell. 2007;128:1187–1203. doi: 10.1016/j.cell.2007.01.040. [DOI] [PMC free article] [PubMed] [Google Scholar]
  69. Jenett A, Rubin GM, Ngo TT, Shepherd D, Murphy C, Dionne H, Pfeiffer BD, Cavallaro A, Hall D, Jeter J, Iyer N, Fetter D, Hausenfluck JH, Peng H, Trautman ET, Svirskas RR, Myers EW, Iwinski ZR, Aso Y, DePasquale GM, Enos A, Hulamm P, Lam SC, Li HH, Laverty TR, Long F, Qu L, Murphy SD, Rokicki K, Safford T, Shaw K, Simpson JH, Sowell A, Tae S, Yu Y, Zugates CT. A GAL4-driver line resource for Drosophila neurobiology. Cell Reports. 2012;2:991–1001. doi: 10.1016/j.celrep.2012.09.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  70. Kandel BM, Cook PA, Tustison NJ, Muschelli J. ANTsRCore: Core Software Infrastructure for ANTsR. GitHub. 2019 https://antsx.github.io/ANTsRCore/
  71. Katz WT, Plaza SM. DVID: distributed versioned image-oriented dataservice. Frontiers in Neural Circuits. 2019;13:5. doi: 10.3389/fncir.2019.00005. [DOI] [PMC free article] [PubMed] [Google Scholar]
  72. Kim KM, Son K, Palmore GT. Neuron image analyzer: automated and accurate extraction of neuronal data from low quality images. Scientific Reports. 2015;5:17062. doi: 10.1038/srep17062. [DOI] [PMC free article] [PubMed] [Google Scholar]
  73. Klapoetke NC, Nern A, Peek MY, Rogers EM, Breads P, Rubin GM, Reiser MB, Card GM. Ultra-selective looming detection from radial motion opponency. Nature. 2017;551:237–241. doi: 10.1038/nature24626. [DOI] [PMC free article] [PubMed] [Google Scholar]
  74. Kohl J, Ostrovsky AD, Frechter S, Jefferis GSXE. A bidirectional circuit switch reroutes pheromone signals in male and female brains. Cell. 2013;155:1610–1623. doi: 10.1016/j.cell.2013.11.025. [DOI] [PMC free article] [PubMed] [Google Scholar]
  75. Kunst M, Laurell E, Mokayes N, Kramer A, Kubo F, Fernandes AM, Förster D, Dal Maschio M, Baier H. A cellular-resolution atlas of the larval zebrafish brain. Neuron. 2019;103:21–38. doi: 10.1016/j.neuron.2019.04.034. [DOI] [PubMed] [Google Scholar]
  76. Kurylas AE, Rohlfing T, Krofczik S, Jenett A, Homberg U. Standardized atlas of the brain of the desert Locust, Schistocerca gregaria. Cell and Tissue Research. 2008;333:125–145. doi: 10.1007/s00441-008-0620-x. [DOI] [PubMed] [Google Scholar]
  77. Kvon EZ, Kazmar T, Stampfel G, Yáñez-Cuna JO, Pagani M, Schernhuber K, Dickson BJ, Stark A. Genome-scale functional characterization of Drosophila developmental enhancers in vivo. Nature. 2014;512:91–95. doi: 10.1038/nature13395. [DOI] [PubMed] [Google Scholar]
  78. Lafarge T, Pateiro-López B, Possolo A, Dunkers J. R implementation of a polyhedral approximation to a 3D set of points using the α-Shape. Journal of Statistical Software. 2014;56:1–19. doi: 10.18637/jss.v056.i04. [DOI] [Google Scholar]
  79. Lai SL, Lee T. Genetic mosaic with dual binary transcriptional systems in Drosophila. Nature Neuroscience. 2006;9:703–709. doi: 10.1038/nn1681. [DOI] [PubMed] [Google Scholar]
  80. Lee T, Luo L. Mosaic analysis with a repressible cell marker (MARCM) for Drosophila neural development. Trends in Neurosciences. 2001;24:251–254. doi: 10.1016/S0166-2236(00)01791-4. [DOI] [PubMed] [Google Scholar]
  81. Li Y, Wang D, Ascoli GA, Mitra P, Wang Y. Metrics for comparing neuronal tree shapes based on persistent homology. PLOS ONE. 2017;12:e0182184. doi: 10.1371/journal.pone.0182184. [DOI] [PMC free article] [PubMed] [Google Scholar]
  82. Li PH, Lindsey LF, Januszewski M, Zheng Z, Bates AS, Taisz I, Tyka M, Nichols M, Li F, Perlman E, Maitin-Shepard J, Blakely T, Leavitt L, Jefferis GSXE, Bock D, Jain V. Automated reconstruction of a serial-section EM Drosophila brain with flood-filling networks and local realignment. bioRxiv. 2019 doi: 10.1101/605634. [DOI]
  83. Løfaldli BB, Kvello P, Mustaparta H. Integration of the antennal lobe glomeruli and three projection neurons in the standard brain atlas of the moth Heliothis virescens. Frontiers in Systems Neuroscience. 2010;4:5. doi: 10.3389/neuro.06.005.2010. [DOI] [PMC free article] [PubMed] [Google Scholar]
  84. Longair MH, Baker DA, Armstrong JD. Simple neurite tracer: open source software for reconstruction, visualization and analysis of neuronal processes. Bioinformatics. 2011;27:2453–2454. doi: 10.1093/bioinformatics/btr390. [DOI] [PubMed] [Google Scholar]
  85. Luan H, Peabody NC, Vinson CR, White BH. Refined spatial manipulation of neuronal function by combinatorial restriction of transgene expression. Neuron. 2006;52:425–436. doi: 10.1016/j.neuron.2006.08.028. [DOI] [PMC free article] [PubMed] [Google Scholar]
  86. Mann K, Gallen CL, Clandinin TR. Whole-brain calcium imaging reveals an intrinsic functional network in Drosophila. Current Biology. 2017;27:2389–2396. doi: 10.1016/j.cub.2017.06.076. [DOI] [PMC free article] [PubMed] [Google Scholar]
  87. Masse NY, Cachero S, Ostrovsky AD, Jefferis GSXE. A mutual information approach to automate identification of neuronal clusters in Drosophila brain images. Frontiers in Neuroinformatics. 2012;6:21. doi: 10.3389/fninf.2012.00021. [DOI] [PMC free article] [PubMed] [Google Scholar]
  88. Meijering E, Jacob M, Sarria J-CF, Steiner P, Hirling H, Unser M. Design and validation of a tool for neurite tracing and analysis in fluorescence microscopy images. Cytometry. 2004;58A:167–176. doi: 10.1002/cyto.a.20022. [DOI] [PubMed] [Google Scholar]
  89. Murdoch D. R-project.org; 2001. [Google Scholar]
  90. Myatt DR, Hadlington T, Ascoli GA, Nasuto SJ. Neuromantic - from semi-manual to semi-automatic reconstruction of neuron morphology. Frontiers in Neuroinformatics. 2012;6:4. doi: 10.3389/fninf.2012.00004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  91. Namiki S, Dickinson MH, Wong AM, Korff W, Card GM. The functional organization of descending sensory-motor pathways in Drosophila. eLife. 2018;7:e34272. doi: 10.7554/eLife.34272. [DOI] [PMC free article] [PubMed] [Google Scholar]
  92. Nanda S, Allaham MM, Bergamino M, Polavaram S, Armañanzas R, Ascoli GA, Parekh R. Doubling up on the fly: NeuroMorpho.Org meets big data. Neuroinformatics. 2015;13:127–129. doi: 10.1007/s12021-014-9257-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  93. Narro ML, Yang F, Kraft R, Wenk C, Efrat A, Restifo LL. NeuronMetrics: software for semi-automated processing of cultured neuron images. Brain Research. 2007;1138:57–75. doi: 10.1016/j.brainres.2006.10.094. [DOI] [PMC free article] [PubMed] [Google Scholar]
  94. Nern A, Pfeiffer BD, Rubin GM. Optimized tools for multicolor stochastic labeling reveal diverse stereotyped cell arrangements in the fly visual system. PNAS. 2015;112:E2967–E2976. doi: 10.1073/pnas.1506763112. [DOI] [PMC free article] [PubMed] [Google Scholar]
  95. Oh SW, Harris JA, Ng L, Winslow B, Cain N, Mihalas S, Wang Q, Lau C, Kuan L, Henry AM, Mortrud MT, Ouellette B, Nguyen TN, Sorensen SA, Slaughterbeck CR, Wakeman W, Li Y, Feng D, Ho A, Nicholas E, Hirokawa KE, Bohn P, Joines KM, Peng H, Hawrylycz MJ, Phillips JW, Hohmann JG, Wohnoutka P, Gerfen CR, Koch C, Bernard A, Dang C, Jones AR, Zeng H. A mesoscale connectome of the mouse brain. Nature. 2014;508:207–214. doi: 10.1038/nature13186. [DOI] [PMC free article] [PubMed] [Google Scholar]
  96. Ohyama T, Schneider-Mizell CM, Fetter RD, Aleman JV, Franconville R, Rivera-Alba M, Mensh BD, Branson KM, Simpson JH, Truman JW, Cardona A, Zlatic M. A multilevel multimodal circuit enhances action selection in Drosophila. Nature. 2015;520:633–639. doi: 10.1038/nature14297. [DOI] [PubMed] [Google Scholar]
  97. Otsuna H, Ito M, Kawase T. Color depth MIP mask search: a new tool to expedite Split-GAL4 creation. bioRxiv. 2018 doi: 10.1101/318006. [DOI]
  98. Pacheco DA, Thiberge SY, Pnevmatikakis E, Murthy M. Auditory activity is diverse and widespread throughout the central brain of Drosophila. bioRxiv. 2019 doi: 10.1101/709519. [DOI] [PMC free article] [PubMed]
  99. Pascual A, Huang KL, Neveu J, Préat T. Neuroanatomy: brain asymmetry and long-term memory. Nature. 2004;427:605–606. doi: 10.1038/427605a. [DOI] [PubMed] [Google Scholar]
  100. Peng H, Bria A, Zhou Z, Iannello G, Long F. Extensible visualization and analysis for multidimensional images using Vaa3D. Nature Protocols. 2014;9:193–208. doi: 10.1038/nprot.2014.011. [DOI] [PubMed] [Google Scholar]
  101. Pfeiffer BD, Jenett A, Hammonds AS, Ngo TT, Misra S, Murphy C, Scully A, Carlson JW, Wan KH, Laverty TR, Mungall C, Svirskas R, Kadonaga JT, Doe CQ, Eisen MB, Celniker SE, Rubin GM. Tools for neuroanatomy and neurogenetics in Drosophila. PNAS. 2008;105:9715–9720. doi: 10.1073/pnas.0803697105. [DOI] [PMC free article] [PubMed] [Google Scholar]
  102. Pool M, Thiemann J, Bar-Or A, Fournier AE. NeuriteTracer: a novel ImageJ plugin for automated quantification of neurite outgrowth. Journal of Neuroscience Methods. 2008;168:134–139. doi: 10.1016/j.jneumeth.2007.08.029. [DOI] [PubMed] [Google Scholar]
  103. Preibisch S, Saalfeld S, Tomancak P. Globally optimal stitching of tiled 3D microscopic image acquisitions. Bioinformatics. 2009;25:1463–1465. doi: 10.1093/bioinformatics/btp184. [DOI] [PMC free article] [PubMed] [Google Scholar]
  104. R Development Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2011. http://www.r-project.org [Google Scholar]
  105. Rees CL, Moradi K, Ascoli GA. Weighing the evidence in Peters' Rule: does neuronal morphology predict connectivity? Trends in Neurosciences. 2017;40:63–71. doi: 10.1016/j.tins.2016.11.007. [DOI] [PMC free article] [PubMed] [Google Scholar]
  106. Rein K, Zöckler M, Mader MT, Grübel C, Heisenberg M. The Drosophila standard brain. Current Biology. 2002;12:227–231. doi: 10.1016/S0960-9822(02)00656-5. [DOI] [PubMed] [Google Scholar]
  107. Reyes LD, Harland T, Reep RL, Sherwood CC, Jacobs B. Golgi analysis of neuron morphology in the presumptive somatosensory cortex and visual cortex of the Florida manatee (Trichechus manatus latirostris). Brain, Behavior and Evolution. 2016;87:105–116. doi: 10.1159/000445495. [DOI] [PubMed] [Google Scholar]
  108. Robie AA, Hirokawa J, Edwards AW, Umayam LA, Lee A, Phillips ML, Card GM, Korff W, Rubin GM, Simpson JH, Reiser MB, Branson K. Mapping the neural substrates of behavior. Cell. 2017;170:393–406. doi: 10.1016/j.cell.2017.06.032. [DOI] [PubMed] [Google Scholar]
  109. Rohlfing T, Maurer CR. Nonrigid image registration in shared-memory multiprocessor environments with application to brains, breasts, and bees. IEEE Transactions on Information Technology in Biomedicine. 2003;7:16–25. doi: 10.1109/TITB.2003.808506. [DOI] [PubMed] [Google Scholar]
  110. RStudio Team. RStudio: Integrated Development Environment for R. Boston, MA: RStudio, Inc; 2015. http://www.rstudio.com [Google Scholar]
  111. Rueckert D, Sonoda LI, Hayes C, Hill DL, Leach MO, Hawkes DJ. Nonrigid registration using free-form deformations: application to breast MR images. IEEE Transactions on Medical Imaging. 1999;18:712–721. doi: 10.1109/42.796284. [DOI] [PubMed] [Google Scholar]
  112. Ryan K, Lu Z, Meinertzhagen IA. The CNS connectome of a tadpole larva of Ciona intestinalis (L.) highlights sidedness in the brain of a chordate sibling. eLife. 2016;5:e16962. doi: 10.7554/eLife.16962. [DOI] [PMC free article] [PubMed] [Google Scholar]
  113. Saalfeld S, Cardona A, Hartenstein V, Tomancak P. CATMAID: collaborative annotation toolkit for massive amounts of image data. Bioinformatics. 2009;25:1984–1986. doi: 10.1093/bioinformatics/btp266. [DOI] [PMC free article] [PubMed] [Google Scholar]
  114. Saumweber T, Rohwedder A, Schleyer M, Eichler K, Chen YC, Aso Y, Cardona A, Eschbach C, Kobler O, Voigt A, Durairaja A, Mancini N, Zlatic M, Truman JW, Thum AS, Gerber B. Functional architecture of reward learning in mushroom body extrinsic neurons of larval Drosophila. Nature Communications. 2018;9:1104. doi: 10.1038/s41467-018-03130-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  115. Sayin S, De Backer JF, Siju KP, Wosniack ME, Lewis LP, Frisch LM, Gansen B, Schlegel P, Edmondson-Stait A, Sharifi N, Fisher CB, Calle-Schuler SA, Lauritzen JS, Bock DD, Costa M, Jefferis GSXE, Gjorgjieva J, Grunwald Kadow IC. A neural circuit arbitrates between persistence and withdrawal in hungry Drosophila. Neuron. 2019;104:544–558. doi: 10.1016/j.neuron.2019.07.028. [DOI] [PMC free article] [PubMed] [Google Scholar]
  116. Scheffer LK, Meinertzhagen IA. The fly brain atlas. Annual Review of Cell and Developmental Biology. 2019;35:637–653. doi: 10.1146/annurev-cellbio-100818-125444. [DOI] [PubMed] [Google Scholar]
  117. Schindelin J, Arganda-Carreras I, Frise E, Kaynig V, Longair M, Pietzsch T, Preibisch S, Rueden C, Saalfeld S, Schmid B, Tinevez JY, White DJ, Hartenstein V, Eliceiri K, Tomancak P, Cardona A. Fiji: an open-source platform for biological-image analysis. Nature Methods. 2012;9:676–682. doi: 10.1038/nmeth.2019. [DOI] [PMC free article] [PubMed] [Google Scholar]
  118. Schlager S. Chapter 9 - Morpho and Rvcg – Shape Analysis in R: R-Packages for Geometric Morphometrics, Shape Analysis and Surface Manipulations. In: Zheng G, Li S, Székely G, editors. Statistical Shape and Deformation Analysis. Academic Press; 2017. pp. 217–256. [DOI] [Google Scholar]
  119. Schlegel P, Costa M, Jefferis GSXE. Learning from connectomics on the fly. Current Opinion in Insect Science. 2017;24:96–105. doi: 10.1016/j.cois.2017.09.011. [DOI] [PubMed] [Google Scholar]
  120. Schmitz SK, Hjorth JJ, Joemai RM, Wijntjes R, Eijgenraam S, de Bruijn P, Georgiou C, de Jong AP, van Ooyen A, Verhage M, Cornelisse LN, Toonen RF, Veldkamp WJ, Veldkamp W. Automated analysis of neuronal morphology, synapse number and synaptic recruitment. Journal of Neuroscience Methods. 2011;195:185–193. doi: 10.1016/j.jneumeth.2010.12.011. [DOI] [PubMed] [Google Scholar]
  121. Schneider-Mizell CM, Gerhard S, Longair M, Kazimiers T, Li F, Zwart MF, Champion A, Midgley FM, Fetter RD, Saalfeld S, Cardona A. Quantitative neuroanatomy for connectomics in Drosophila. eLife. 2016;5:e12059. doi: 10.7554/eLife.12059. [DOI] [PMC free article] [PubMed] [Google Scholar]
  122. Shan Xu C, Januszewski M, Lu Z, Takemura S-Y, Hayworth K, Huang G, Shinomiya K, Maitin-Shepard J, Ackerman D, Berg S, Blakely T, Bogovic J, Clements J, Dolafi T, Hubbard P, Kainmueller D, Katz W, Kawase T, Khairy K, Leavitt L, Ph L, Lindsey L, Neubarth N, Olbris DJ, Otsuna H, Troutman ET, Umayam L, Zhao T, Ito M, Goldammer J, Wolff T, Svirskas R, Schlegel P, Neace ER, Knecht CJ, Alvarado CX, Bailey D, Ballinger S, Borycz JA, Canino B, Cheatham N, Cook M, Dreyer M, Duclos O, Eubanks B, Fairbanks K, Finley S, Forknall N, Francis A, Hopkins GP, Joyce EM, Kim S, Kirk NA, Kovalyak J, Lauchie SA, Lohff A, Maldonado C, Manley EA, McLin S, Mooney C, Ndama M, Ogundeyi O, Okeoma N, Ordish C, Padilla N, Patrick C, Paterson T, Phillips EE, Phillips EM, Rampally N, Ribeiro C, Robertson MK, Rymer JT, Ryan SM, Sammons M, Scott AK, Scott AL, Shinomiya A, Smith C, Smith K, Smith NL, Sobeski MA, Suleiman A, Swift J, Takemura S, Talebi I, Tarnogorska D, Tenshaw E, Tokhi T, Walsh JJ, Yang T, Horne JA, Li F, Parekh R, Rivlin PK, Jayaraman V, Ito K, Saalfeld S, George R, Meinertzhagen I, Rubin GM, Hess HF, Scheffer LK, Jain V, Plaza SM. A connectome of the adult Drosophila Central Brain. bioRxiv. 2020 doi: 10.1101/2020.01.21.911859. [DOI] [PMC free article] [PubMed]
  123. Shih CT, Sporns O, Yuan SL, Su TS, Lin YJ, Chuang CC, Wang TY, Lo CC, Greenspan RJ, Chiang AS. Connectomics-based analysis of information flow in the Drosophila brain. Current Biology. 2015;25:1249–1258. doi: 10.1016/j.cub.2015.03.021. [DOI] [PubMed] [Google Scholar]
  124. Sholl DA. Dendritic organization in the neurons of the visual and motor cortices of the cat. Journal of Anatomy. 1953;87:387–406. [PMC free article] [PubMed] [Google Scholar]
  125. Stepanyants A, Chklovskii DB. Neurogeometry and potential synaptic connectivity. Trends in Neurosciences. 2005;28:387–394. doi: 10.1016/j.tins.2005.05.006. [DOI] [PubMed] [Google Scholar]
  126. Stone T, Webb B, Adden A, Weddig NB, Honkanen A, Templin R, Wcislo W, Scimeca L, Warrant E, Heinze S. An anatomically constrained model for path integration in the bee brain. Current Biology. 2017;27:3069–3085. doi: 10.1016/j.cub.2017.08.052. [DOI] [PMC free article] [PubMed] [Google Scholar]
  127. Strutz A, Soelter J, Baschwitz A, Farhan A, Grabe V, Rybak J, Knaden M, Schmuker M, Hansson BS, Sachse S. Decoding odor quality and intensity in the Drosophila brain. eLife. 2014;3:e04147. doi: 10.7554/eLife.04147. [DOI] [PMC free article] [PubMed] [Google Scholar]
  128. Studholme C, Hill DLG, Hawkes DJ. An overlap invariant entropy measure of 3D medical image alignment. Pattern Recognition. 1999;32:71–86. doi: 10.1016/S0031-3203(98)00091-0. [DOI] [Google Scholar]
  129. Sümbül U, Song S, McCulloch K, Becker M, Lin B, Sanes JR, Masland RH, Seung HS. A genetic and computational approach to structurally classify neuronal types. Nature Communications. 2014;5:3512. doi: 10.1038/ncomms4512. [DOI] [PMC free article] [PubMed] [Google Scholar]
  130. Takemura SY, Bharioke A, Lu Z, Nern A, Vitaladevuni S, Rivlin PK, Katz WT, Olbris DJ, Plaza SM, Winston P, Zhao T, Horne JA, Fetter RD, Takemura S, Blazek K, Chang LA, Ogundeyi O, Saunders MA, Shapiro V, Sigmund C, Rubin GM, Scheffer LK, Meinertzhagen IA, Chklovskii DB. A visual motion detection circuit suggested by Drosophila connectomics. Nature. 2013;500:175–181. doi: 10.1038/nature12450. [DOI] [PMC free article] [PubMed] [Google Scholar]
  131. Takemura SY, Xu CS, Lu Z, Rivlin PK, Parag T, Olbris DJ, Plaza S, Zhao T, Katz WT, Umayam L, Weaver C, Hess HF, Horne JA, Nunez-Iglesias J, Aniceto R, Chang LA, Lauchie S, Nasca A, Ogundeyi O, Sigmund C, Takemura S, Tran J, Langille C, Le Lacheur K, McLin S, Shinomiya A, Chklovskii DB, Meinertzhagen IA, Scheffer LK. Synaptic circuits and their variations within different columns in the visual system of Drosophila. PNAS. 2015;112:13711–13716. doi: 10.1073/pnas.1509820112. [DOI] [PMC free article] [PubMed] [Google Scholar]
  132. Takemura SY, Aso Y, Hige T, Wong A, Lu Z, Xu CS, Rivlin PK, Hess H, Zhao T, Parag T, Berg S, Huang G, Katz W, Olbris DJ, Plaza S, Umayam L, Aniceto R, Chang LA, Lauchie S, Ogundeyi O, Ordish C, Shinomiya A, Sigmund C, Takemura S, Tran J, Turner GC, Rubin GM, Scheffer LK. A connectome of a learning and memory center in the adult Drosophila brain. eLife. 2017;6:e26975. doi: 10.7554/eLife.26975. [DOI] [PMC free article] [PubMed] [Google Scholar]
  133. Tanaka NK, Endo K, Ito K. Organization of antennal lobe-associated neurons in adult Drosophila melanogaster brain. The Journal of Comparative Neurology. 2012;520:4067–4130. doi: 10.1002/cne.23142. [DOI] [PubMed] [Google Scholar]
  134. Tirian L, Dickson B. The VT GAL4, LexA and split-GAL4 driver line collections for targeted expression in the Drosophila nervous system. bioRxiv. 2017 doi: 10.1101/198648. [DOI]
  135. Tobin WF, Wilson RI, Lee WA. Wiring variations that enable and constrain neural computation in a sensory microcircuit. eLife. 2017;6:e24838. doi: 10.7554/eLife.24838. [DOI] [PMC free article] [PubMed] [Google Scholar]
  136. Travis K, Ford K, Jacobs B. Regional dendritic variation in neonatal human cortex: a quantitative golgi study. Developmental Neuroscience. 2005;27:277–287. doi: 10.1159/000086707. [DOI] [PubMed] [Google Scholar]
  137. Treweek JB, Chan KY, Flytzanis NC, Yang B, Deverman BE, Greenbaum A, Lignell A, Xiao C, Cai L, Ladinsky MS, Bjorkman PJ, Fowlkes CC, Gradinaru V. Whole-body tissue stabilization and selective extractions via tissue-hydrogel hybrids for high-resolution intact circuit mapping and phenotyping. Nature Protocols. 2015;10:1860–1896. doi: 10.1038/nprot.2015.122. [DOI] [PMC free article] [PubMed] [Google Scholar]
  138. Venken KJ, Simpson JH, Bellen HJ. Genetic manipulation of genes and cells in the nervous system of the fruit fly. Neuron. 2011;72:202–230. doi: 10.1016/j.neuron.2011.09.021. [DOI] [PMC free article] [PubMed] [Google Scholar]
  139. Wan Y, Long F, Qu L, Xiao H, Hawrylycz M, Myers EW, Peng H. BlastNeuron for automated comparison, retrieval and clustering of 3D neuron morphologies. Neuroinformatics. 2015;13:487–499. doi: 10.1007/s12021-015-9272-7. [DOI] [PubMed] [Google Scholar]
  140. Wearne SL, Rodriguez A, Ehlenberger DB, Rocher AB, Henderson SC, Hof PR. New techniques for imaging, digitization and analysis of three-dimensional neural morphology on multiple scales. Neuroscience. 2005;136:661–680. doi: 10.1016/j.neuroscience.2005.05.053. [DOI] [PubMed] [Google Scholar]
  141. Wickham H. R Packages: Organize, Test, Document, and Share Your Code. O’Reilly Media, Inc.; 2015.
  142. Wickham H. Ggplot2: Elegant Graphics for Data Analysis. New York: Springer-Verlag; 2016. [DOI] [Google Scholar]
  143. Winnubst J, Bas E, Ferreira TA, Wu Z, Economo MN, Edson P, Arthur BJ, Bruns C, Rokicki K, Schauder D, Olbris DJ, Murphy SD, Ackerman DG, Arshadi C, Baldwin P, Blake R, Elsayed A, Hasan M, Ramirez D, Dos Santos B, Weldon M, Zafar A, Dudman JT, Gerfen CR, Hantman AW, Korff W, Sternson SM, Spruston N, Svoboda K, Chandrashekar J. Reconstruction of 1,000 projection neurons reveals new cell types and organization of long-range connectivity in the mouse brain. Cell. 2019;179:268–281. doi: 10.1016/j.cell.2019.07.042. [DOI] [PMC free article] [PubMed] [Google Scholar]
  144. Wolff T, Rubin GM. Neuroarchitecture of the Drosophila central complex: a catalog of Nodulus and asymmetrical body neurons and a revision of the protocerebral bridge catalog. Journal of Comparative Neurology. 2018;526:2585–2611. doi: 10.1002/cne.24512. [DOI] [PMC free article] [PubMed] [Google Scholar]
  145. Wong DC, Lovick JK, Ngo KT, Borisuthirattana W, Omoto JJ, Hartenstein V. Postembryonic lineages of the Drosophila brain: ii. identification of lineage projection patterns based on MARCM clones. Developmental Biology. 2013;384:258–289. doi: 10.1016/j.ydbio.2013.07.009. [DOI] [PMC free article] [PubMed] [Google Scholar]
  146. Wu M, Nern A, Williamson WR, Morimoto MM, Reiser MB, Card GM, Rubin GM. Visual projection neurons in the Drosophila lobula link feature detection to distinct behavioral programs. eLife. 2016;5:e21022. doi: 10.7554/eLife.21022. [DOI] [PMC free article] [PubMed] [Google Scholar]
  147. Yu JY, Kanai MI, Demir E, Jefferis GSXE, Dickson BJ. Cellular organization of the neural circuit that drives Drosophila courtship behavior. Current Biology. 2010;20:1602–1614. doi: 10.1016/j.cub.2010.08.025. [DOI] [PubMed] [Google Scholar]
  148. Yu HH, Awasaki T, Schroeder MD, Long F, Yang JS, He Y, Ding P, Kao JC, Wu GY, Peng H, Myers G, Lee T. Clonal development and organization of the adult Drosophila central brain. Current Biology. 2013;23:633–643. doi: 10.1016/j.cub.2013.02.057. [DOI] [PMC free article] [PubMed] [Google Scholar]
  149. Zhao XC, Kvello P, Løfaldli BB, Lillevoll SC, Mustaparta H, Berg BG. Representation of pheromones, interspecific signals, and plant odors in higher olfactory centers; mapping physiologically identified antennal-lobe projection neurons in the male heliothine moth. Frontiers in Systems Neuroscience. 2014;8:186. doi: 10.3389/fnsys.2014.00186. [DOI] [PMC free article] [PubMed] [Google Scholar]
  150. Zhao T, Plaza SM. Automatic neuron type identification by neurite localization in the Drosophila Medulla. arXiv. 2014 https://arxiv.org/abs/1409.1892
  151. Zheng Z, Lauritzen JS, Perlman E, Robinson CG, Nichols M, Milkie D, Torrens O, Price J, Fisher CB, Sharifi N, Calle-Schuler SA, Kmecova L, Ali IJ, Karsh B, Trautman ET, Bogovic JA, Hanslovsky P, Jefferis GSXE, Kazhdan M, Khairy K, Saalfeld S, Fetter RD, Bock DD. A complete electron microscopy volume of the brain of adult Drosophila melanogaster. Cell. 2018;174:730–743. doi: 10.1016/j.cell.2018.06.019. [DOI] [PMC free article] [PubMed] [Google Scholar]

Decision letter

Editor: K VijayRaghavan1

In the interests of transparency, eLife publishes the most substantive revision requests and the accompanying author responses.

Thank you for submitting your article "The natverse: a versatile toolbox for combining and analysing neuroanatomical data" for consideration by eLife. Your article has been reviewed by three peer reviewers, and the evaluation has been overseen by K VijayRaghavan as the Senior and Reviewing Editor. The reviewers have opted to remain anonymous.

The reviewers have discussed the reviews with one another and the Reviewing Editor has drafted this decision to help you prepare a revised submission.

Summary:

This manuscript documents a large number of computational neuroanatomy software packages, and associated workflow pipelines. Many of these tools are potentially useful for neuroanatomy work in a variety of species. That said, the need for these tools has been largely driven by the availability of large public neuroanatomy datasets from the Drosophila brain at single-cell (and often single-synapse) spatial resolution; therefore, it is appropriate that the authors focus their descriptions here on the workflow of the Drosophila neurobiologist, with additional frequent specific instances drawn from other species. Many of the tools described here have already been released publicly by the authors, and they have proven to be useful to many other researchers, as evidenced by the citations they have already accrued. Now the authors have augmented these existing tools with new ones that extend the functionality of the existing tools. This manuscript describes and rationalizes the entire workflow that these tools enable. Broadly speaking, this workflow allows researchers to assemble brain-wide network diagrams with relative ease – beginning with a cell (or cell-type) of interest, researchers can use these tools to identify the upstream and downstream connections of that cell, to locate similar cell types, to identify and classify morphological variants of those cells, to search for genetic driver lines to target cells of interest and their upstream and downstream partners, and to analyze the subcellular distribution of presynapses and postsynapses within cells of interest. Among the key elements here are the bridging registrations that bring different datasets into same 3D coordinate space, allowing researchers to leverage the enormous (and rapidly growing) amount of detailed single-cell and single-synapse-scale neuroanatomical data in Drosophila. 
This type of computational neuroanatomy workflow is currently revolutionizing Drosophila neurobiology, due in large part to the work of these authors. Without it, each individual lab would write their own custom methods to accomplish a fraction of what is possible in the natverse – the authors should be commended for developing these tools and making them accessible to the community quickly. Overall, this suite of methods represents an incredibly important tool for many disciplines within neuroscience. They are already having a transformative impact on the field of Drosophila systems/circuits neuroscience, and the deployment of these tools for mouse and fish datasets is highly valuable.

Once the manuscript is appropriately revised we would strongly support publication.

Essential revisions:

1) As this is a tools and resources paper, we would like to make a suggestion that might help – to borrow from their own language – the casual user. In the Results section, the authors lay out the various kinds of analyses possible with the natverse suite of tools. Reading the paper, what can be done is increasingly impressive. It might be nice to have this laid out in the front end of the paper in some form. Take, for example, subsection “Bridging template spaces in D. melanogaster” that deals with bridging template spaces – performing in silico intersections of Gal4 lines will help a researcher narrow their search for clean lines. Or in subsection “EM to LM and back again” – researchers are often trying to find 'their neuron' described by light microscopy in an EM volume and vice versa. It will be very nice to state these and other possibilities in the beginning giving the casual user a flavour of what can be done before diving into the details.

2) Skeletonization of neurons: the authors have been leading the way in development of these neuroanatomy tools and have had to make choices about how to connect between different data sets. The choice of skeletonizing a neuron was a useful one, but it is clear that one loses valuable information by skeletonization. Connectome projects are generating beautiful flood filled reconstructions of neurons (e.g. see Li et al., 2019) – the authors develop fafbseg to connect flood-filled segments together based on a manually-traced skeleton. But, it would be useful for the authors to discuss how their EM to LM to EM tools could be adapted to starting with neuron volumes (that they get from fafbseg or from other connectome projects) versus skeletons. They mention this issue very briefly in the Discussion, but don't address how well these tools would work on volume data, or whether new tools would need to be developed, and what the constraints would be.

3) Many users of these resources are generating in vivo neural imaging data and it will be increasingly important to register in vivo brain activity and templates to fixed brain templates (for example, see Mann et al., 2017 or Pacheco et al., 2019). It would be useful for the authors to discuss this issue and to provide such registrations.

4) It would be useful for the authors to generate a small number of example pipeline codes, with some explanation in the code for each step. A small repository with data to use with these example pieces of code will allow a first-time user to run the code locally and get results in a short time, and then to use these pieces of code as a starting point for further analyses. The examples shown in the paper cover all of the basic analysis examples, and adding code that one can simply run to generate the figures shown in the paper, will help new users (if this is included somewhere, apologies)

5) In some cases, a short video with example use will make it much easier to understand and use the tools appropriately. For example for the functions 'nlscan' and 'plot3D'. Generally, short example videos are useful. Such videos already exist for NBLAST and VFB.

6) Having one-line installation is definitely useful and saves time, but a few steps are still needed before running it – this is OK, but please describe them to save troubleshooting time. This reviewer tried installation on two machines, and got the same error on both – API rate limit exceeded for xx. Perhaps a description like this would help:

Install R, Rstudio.

Create a Github account if you don't have one, then run:

install.packages("usethis")
usethis::browse_github_pat()
usethis::edit_r_environ()

then update the environment variable with the personal access token, following the instructions in Rstudio - copy your personal access token as instructed.

Paste to …\.Renviron, which should have, for example, these two lines:

GITHUB_PAT=xx

eLife. 2020 Apr 14;9:e53350. doi: 10.7554/eLife.53350.sa2

Author response


Essential revisions:

1) As this is a tools and resources paper, we would like to make a suggestion that might help – to borrow from their own language – the casual user. In the Results section, the authors lay out the various kinds of analyses possible with the natverse suite of tools. Reading the paper, what can be done is increasingly impressive. It might be nice to have this laid out in the front end of the paper in some form. Take, for example, subsection “Bridging template spaces in D. melanogaster” that deals with bridging template spaces – performing in silico intersections of Gal4 lines will help a researcher narrow their search for clean lines. Or in subsection “EM to LM and back again” – researchers are often trying to find 'their neuron' described by light microscopy in an EM volume and vice versa. It will be very nice to state these and other possibilities in the beginning giving the casual user a flavour of what can be done before diving into the details.

This is a good suggestion. We have added a new paragraph at the end of the Introduction so that the reader has an overview of the key applications discussed in the Results.

2) Skeletonization of neurons: the authors have been leading the way in development of these neuroanatomy tools and have had to make choices about how to connect between different data sets. The choice of skeletonizing a neuron was a useful one, but it is clear that one loses valuable information by skeletonization. Connectome projects are generating beautiful flood filled reconstructions of neurons (e.g. see Li et al., 2019) – the authors develop fafbseg to connect flood-filled segments together based on a manually-traced skeleton. But, it would be useful for the authors to discuss how their EM to LM to EM tools could be adapted to starting with neuron volumes (that they get from fafbseg or from other connectome projects) versus skeletons. They mention this issue very briefly in the Discussion, but don't address how well these tools would work on volume data, or whether new tools would need to be developed, and what the constraints would be.

We have amended the subsection “Neuron skeleton data” to address neuron volume data briefly, and state in the Discussion that updating the natverse to work with, and implement specific algorithms for, volume representations for neurons is a future goal (Discussion, second paragraph). We agree with the reviewer that it will become increasingly important as volume data for EM neurons becomes more prevalent.

3) Many users of these resources are generating in vivo neural imaging data and it will be increasingly important to register in vivo brain activity and templates to fixed brain templates (for example, see Mann et al., 2017 or Pacheco et al., 2019). It would be useful for the authors to discuss this issue and to provide such registrations.

We feel that it is out of the scope of the current work (which already covers a wide range of applications) to provide an example of registering functional imaging data. However, the reviewer has raised an important point, and we have now adapted the penultimate paragraph of the Discussion to note how one might link functional imaging data to, for example, EM data.

4) It would be useful for the authors to generate a small number of example pipeline codes, with some explanation in the code for each step. A small repository with data to use with these example pieces of code will allow a first-time user to run the code locally and get results in a short time, and then to use these pieces of code as a starting point for further analyses. The examples shown in the paper cover all of the basic analysis examples, and adding code that one can simply run to generate the figures shown in the paper, will help new users (if this is included somewhere, apologies).

This is included in our GitHub repo: https://github.com/natverse/nat.examples. We have reworded part of the fourth paragraph of the Introduction, to make the nature of this code a little clearer.

5) In some cases, a short video with example use will make it much easier to understand and use the tools appropriately. For example for the functions 'nlscan' and 'plot3D'. Generally, short example videos are useful. Such videos already exist for NBLAST and VFB.

The reviewer makes a good point. We have now included Videos 1-5 (20 minutes in total) that demonstrate some of our functions in RStudio, giving examples for plotting neurons, bridging neurons between template brains, using NBLAST and comparing EM and LM data. We will also add this to the natverse website. The video can also be found as a single 20 min file here:

https://drive.google.com/open?id=1wBpAG4V9bBujdLcryWk8P-K7YTpgfZTO

6) Having one-line installation is definitely useful and saves time, but a few steps are still needed before running it – this is OK, but please describe them to save troubleshooting time. This reviewer tried installation on two machines, and got the same error on both – API rate limit exceeded for xx. Perhaps a description like this would help:

Install R, Rstudio.

Create a Github account if you don't have one, then run:

install.packages("usethis")
usethis::browse_github_pat()
usethis::edit_r_environ()

then update the environment variable with the personal access token, following the instructions in Rstudio - copy your personal access token as instructed.

Paste to …\.Renviron, which should have, for example, these two lines:

GITHUB_PAT=xx

This is a very good point that we had largely overlooked during testing, since even our users without a natverse installation had already authenticated with GitHub. We have now provided what we hope is a significantly improved installation procedure which should remove the need for GitHub authentication in most situations, while helping the user through the setup process listed above if necessary. We also provide more detailed installation instructions covering this point. Finally, we also suggest a simplified install of “core” packages that bypasses this requirement altogether. We have updated our Materials and methods (subsection “R packages for Neuroanatomy”) as well as the instructions at http://natverse.org/install. Essentially

install.packages("natmanager")

# Then basic install
natmanager::install("core")

# OR full install
natmanager::install("natverse")

should handle most install cases without requiring the end user to authenticate with GitHub.
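For the remaining cases where GitHub authentication is still needed, the manual setup the reviewers describe amounts to placing a token line in an .Renviron file. A minimal shell sketch of that step (the token value "xx" is a placeholder, not a real token, and the file is written to a temporary directory rather than the home directory a real setup would use):

```shell
# Sketch only: write a placeholder GitHub personal access token into an
# .Renviron file. A real setup would target ~/.Renviron and use a genuine
# token generated from GitHub account settings.
RENVIRON="$(mktemp -d)/.Renviron"
printf 'GITHUB_PAT=xx\n' > "$RENVIRON"
cat "$RENVIRON"   # prints: GITHUB_PAT=xx
```

R reads .Renviron at startup, so the GITHUB_PAT variable is then available to package installation helpers without any interactive authentication.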

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    Transparent reporting form

    Data Availability Statement

    The bridging and mirroring registrations are deposited in two version controlled repositories at http://github.com with revisions uniquely identified by the SHA-1 hash function. As some template spaces may have multiple versions, we identify each version by its SHA-1 hash as this is uniquely dependent on the data contained in each file. Since we use the distributed version control system, git, any user can clone a complete, versioned history of these repositories. We have also taken a repository snapshot at the time of the release of this paper on the publicly funded http://zenodo.org site, which associates the data with permanent digital object identifiers (DOIs). To simplify data access for colleagues, we have provided spatially calibrated template spaces for the main template spaces in use by the Drosophila community in a single standard format, NRRD. These brain images have permanent DOIs listed in Table 2. We have also generated registrations for the entire FlyCircuit single neuron and FlyLight datasets. The registered images have been deposited at http://virtualflybrain.org. The R packages nat.flybrains and elmr in the natverse also contain easy-to-use functions for deploying these registrations. The complete software toolchain for the construction and application of registrations consists exclusively of open source code released under the GNU Public License and released on http://github.com and http://sourceforge.net. A full listing of these resources is available at http://jefferislab.org/si/bridging. All these steps will ensure that these resources will be available for many years to come (as has been recommended by Ito, 2010).
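    As a small illustration of the versioning scheme described above, the following self-contained sketch (a throwaway local repository created with mktemp; no network access, and the file name and commit message are invented for the example) shows how a git commit's SHA-1 hash pins an exact data snapshot, so recording `git rev-parse HEAD` after cloning a registration repository identifies the precise data version used in an analysis:

```shell
# Sketch: git names every snapshot by a SHA-1 hash derived from its
# contents, so the hash alone identifies an exact data version.
REPO="$(mktemp -d)"
cd "$REPO"
git init -q
printf 'registration data v1\n' > reg.txt
git add reg.txt
git -c user.email=demo@example.com -c user.name=demo commit -qm 'add registration'
HASH="$(git rev-parse HEAD)"
echo "${#HASH}"   # a SHA-1 object name is 40 hexadecimal characters
```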

    All code is described at http://natverse.org/ which links to individual git repositories at https://github.com/natverse.

