Genomics, Proteomics & Bioinformatics. 2019 Dec 2;17(4):381–392. doi: 10.1016/j.gpb.2019.09.003

How Big Data and High-performance Computing Drive Brain Science

Shanyu Chen 1,2,a, Zhipeng He 1,2,b, Xinyin Han 1,2,c, Xiaoyu He 1,2,d, Ruilin Li 1,2,e, Haidong Zhu 1,2,f, Dan Zhao 1,2,g, Chuangchuang Dai 1,2,h, Yu Zhang 1,2,i, Zhonghua Lu 1,j, Xuebin Chi 1,2,4,k, Beifang Niu 1,2,3,⁎,l
PMCID: PMC6943776  PMID: 31805369

Abstract

Brain science accelerates the study of intelligence and behavior, contributes fundamental insights into human cognition, and offers prospective treatments for brain disease. Faced with the challenges posed by imaging technologies and deep learning computational models, big data and high-performance computing (HPC) play essential roles in studying brain function, brain diseases, and large-scale brain models or connectomes. We review the driving forces behind big data and HPC methods applied to brain science, including deep learning, powerful data analysis capabilities, and computational performance solutions, each of which can be used to improve diagnostic accuracy and research output. This work reinforces predictions that big data and HPC will continue to improve brain science by making ultrahigh-performance analysis possible, by improving data standardization and sharing, and by providing new neuromorphic insights.

Keywords: Brain science, Big data, High-performance computing, Brain connectomes, Deep learning

Introduction

A human brain has about 100 billion neurons [1], and communication between neurons takes place across approximately 10^15 synaptic connections [2]. The brain is responsible for human intelligence. Compromised brain function resulting from brain disease causes more than 6.3% of the global disease burden in terms of disability-adjusted life years. Moreover, the World Health Organization (WHO) projects that the proportion of brain disease in the global disease burden will increase by 12% from 2005 to 2030 [3]. In 2010, around $2.5 trillion was spent on research on brain disease, an amount estimated to increase to $6 trillion by 2030 [4]. According to WHO statistics, brain diseases account for 12% of the world's total deaths [5]; encephalopathy accounts for 16.8% of deaths in low- and middle-income countries and 13.2% in high-income countries [5].

Driven by the rising incidence of brain diseases, brain research is important for understanding brain function, promoting the diagnosis and treatment of brain diseases, and advancing the development of brain-like intelligence. In 2013, the European Union announced the Human Brain Project (HBP) to strengthen neuroscience research and gain a comprehensive understanding of brain function through worldwide research [6]. The USA, China, and many other countries and organizations have also focused on and invested in brain-research projects (Table 1). The governments of these countries have attempted to build platforms for studying the human brain using neuroinformatics, brain simulation, and brain-tissue science to create validation models that run on supercomputers [7].

Table 1.

National brain projects

[Table 1 is provided as an image: fx1.gif]

Note: Brain/MINDS, Brain Mapping by Integrated Neurotechnologies for Disease Studies; MIRI, Multi-Investigator Research Initiative; PSG, platform support grants.

The development of big data technology and HPC has contributed to insights gained from these brain-research projects. Big data methods have improved the details of brain scans in several ways, thereby laying the foundation for the generation of new knowledge that can drive understanding of the human brain even further. HPC methods have advanced data storage, computational accuracy, and computational speed, and thereby assist in the processing of vast and very complex data sets. The development and popularization of brain science, big data, and HPC methods are shown in Figure 1. Data, methods, and computing power are being continuously added to brain science research. Particularly after 2006, brain science combined with big data and deep learning has become a research hotspot. Subsequently, the support of neural networks and HPC for brain science research has also been enhanced considerably. An overall increase in interdependence has been observed from 2000 to 2018.

Figure 1.

Figure 1

Research overview of big data and HPC methods in brain science

A. The heatmap shows the changes in the number of articles published annually from 2000 to 2018 in four research directions of brain science: brain science with HPC; brain science with deep learning; brain science with big data; and brain science with neural networks. Articles on brain science with deep learning, brain science with big data, and brain science with neural networks reached their highest numbers in 2013, whereas articles on brain science with HPC reached their highest number in 2018. All articles were retrieved by searching the keywords "brain science, HPC" (BS-HPC), "brain science, deep learning" (BS-DL), "brain science, big data" (BS-BD), or "brain science, neural network" (BS-NN) in Google Scholar in September 2019. B. Combinations of brain science with big data or HPC methods. Big data provide a wealth of knowledge and data, from which neural networks and deep learning methods can extract features that represent brain functions, mechanisms, or diseases. Big data can also be used to build computational models. HPC provides storage space and formidable computing power for the study of brain science. HPC, high-performance computing.

Combination and revolution

Brain science is based primarily on biological insights and data-driven, bottom-up experimental research. Figure 2 shows the current research fields of brain science: brain function/mechanisms, diagnosis of brain disease, brain-like intelligence, and refinement of sub-areas. These seemingly different research areas are interrelated. In biology, "classical reductionism" suggests that each entity comprises smaller parts. That is, the aging process, decision-making principles, pathogenesis, and brain cognition are the "macroscopic" consequences of "microscopic" behaviors and reactions in the brain. In this sense, "microscopic" implies finer-grained observation, more knowledge and, most importantly, big data. In addition, the connections between these microscopic parts are usually linear [8], and there are approximately 10^15 such connections in the brain [9]. Thus, exploring microscopic characteristics across such a large number of structures and complex connections consumes extensive computational resources. Moreover, as brain science advances, the demand for resolution grows without bound.

Figure 2.

Figure 2

General classification of research activities in brain science

This figure shows the main research directions in contemporary brain science.

The acquisition cycle of functional magnetic resonance imaging (fMRI) data has accelerated from 4 s to 1 s, and the resolution has increased from 5 mm^3 to 1 mm^3 [10]. Faster acquisition and finer detail bring remarkable scientific discoveries but also new challenges. For example, genome-wide association studies combined with functional and structural brain-imaging phenotypes have verified that genes for iron transport and storage are associated with the magnetic susceptibility of subcortical brain tissue [11]. However, a single genome contains around 180 GB of uncompressed data, equivalent to 30 copies of 3 billion bases, and 100 GB of compressed information needs to be retained over time [12]. Therefore, the ability to run data analyses and workflows, meet data storage and simulation requirements, and even push the limits of computational speed are critical factors in brain science. Here, big data and HPC bear the brunt of these challenges. Advances in data storage and mining technologies allow users to retain increasing amounts of data, directly or indirectly, and to analyze such data for valuable discoveries [13]. HPC provides a high-speed computing environment suited to high-throughput, multitasking workloads, including the use of clusters of multiple processors or computers operating as a single machine [14].
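The genome storage figures quoted above can be sanity-checked with back-of-envelope arithmetic. The sketch below is ours; the encoding of roughly 2 bytes per sequenced base (base call plus quality score) and the ~1.8:1 compression ratio are assumptions chosen to match the quoted totals, not figures from the cited source.

```python
# Back-of-envelope check of the genome storage figures quoted above.
GENOME_BASES = 3e9          # haploid human genome, ~3 billion bases
COVERAGE = 30               # "30 copies of 3 billion bases"
BYTES_PER_BASE = 2          # base call + quality score (our assumption)

raw_bytes = GENOME_BASES * COVERAGE * BYTES_PER_BASE
raw_gb = raw_bytes / 1e9
print(f"uncompressed: ~{raw_gb:.0f} GB")      # ~180 GB, matching the text

# A ~1.8:1 compression ratio would leave ~100 GB to retain long-term.
compressed_gb = raw_gb / 1.8
print(f"compressed:   ~{compressed_gb:.0f} GB")
```

The arithmetic shows the two figures in the text are mutually consistent under these assumptions.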

In recent decades, research funding agencies and scientists have placed great importance on the use of big data and HPC techniques in brain science. With regard to big data, the HBP's fifth sub-project is to build a Neuroinformatics Computing Platform (NCP) for integrating multidimensional brain data (e.g., molecules, genes, cells, networks) and providing powerful analysis of brain data to simplify models of actual neuronal circuits [15]. The Allen Human Brain Atlas is a unique multi-modal atlas that maps gene expression across adult brains [16], including magnetic resonance imaging (MRI), diffusion tensor imaging, histology, and gene-expression data [17]. The BrainSpan Atlas of the developing human brain is a unique resource for exploring human brain development; it comprises in situ hybridization data, RNA sequencing and microarray data, as well as broad and detailed anatomical analysis of gene expression during human brain development [17]. The seventh sub-project of the HBP has been to establish a platform for high-performance analytics and computing, designed to provide the HBP and the neuroscience community with HPC systems that meet their particular needs [18]. Currently, the HBP has four tier-0 supercomputer centers: Barcelona Supercomputing Centre, Cineca, Centro Svizzero di Calcolo Scientifico, and Jülich Supercomputing Centre (JSC) [6]. The JUROPA supercomputer at the JSC has been used to develop an ultrahigh-resolution three-dimensional (3D) human brain model called BigBrain [19]. An account of research in brain science combined with other fields is shown in Figure 3. The amount of literature on brain science that encompasses the other four fields detailed in Figure 3 increased from 2000 to 2018. The correlation coefficients in Figure 3 demonstrate that these combinations have stimulated productivity in brain science.

Figure 3.

Figure 3

Research status of brain science in combination with other fields

This figure shows the number of articles listed in Google™ Scholar each year from 2000 to 2018 for the following terms: BS, BS-NN, BS-BD, BS-DL, and BS-HPC. The figure comprises three parts. The histogram shows trends in the number of articles on BS combined with each of the other four fields. Each thumbnail in the lower triangular area consists of a correlation ellipse, a scattergram of the corresponding row and column, and its LOWESS smoothing curve. The correlation ellipse indicates the correlation between the corresponding row and column; a flatter oval indicates a stronger correlation. The LOWESS smoothing curve shows the trend between the two sets of data over time. Each thumbnail in the upper triangle contains the correlation coefficient of the corresponding row and column. For example, the value 0.44 in the first row and second column is the correlation coefficient between BS and BS-NN. BS, brain science; BS-NN, brain science with neural networks; BS-BD, brain science with big data; BS-DL, brain science with deep learning; BS-HPC, brain science with high-performance computing.
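The coefficients in the upper triangle are ordinary Pearson correlations between two yearly article-count series. A minimal sketch of the computation follows; the counts below are invented for illustration and are not the actual Google Scholar data behind the figure.

```python
import numpy as np

# Hypothetical yearly article counts for two search terms, 2000-2018
# (illustrative only; not the figure's actual data).
years = np.arange(2000, 2019)
bs    = 1000 + 150 * (years - 2000)                       # "brain science"
bs_nn = 200 + 40 * (years - 2000) + 30 * np.sin(years)    # "BS-NN"

# Pearson correlation, as reported in the upper triangle of Figure 3.
r = np.corrcoef(bs, bs_nn)[0, 1]
print(f"r = {r:.2f}")
```

With both series dominated by an upward trend, r is close to 1, which is why trending fields show large coefficients even when year-to-year fluctuations differ.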

Research advances enabled by big data and HPC

Function and mechanisms within the brain

Brain activity consists of dynamic sets of input sensations and spontaneous responses [20]. However, the brain is very active even in the absence of explicit inputs or outputs [21]. Researchers often create brain-simulation models (e.g., simulations of brain molecules and brain cells) or use fMRI data of living human brains to gain insight into brain structure. For example, one MRI time-course of 512 echo-planar images of a resting human brain, obtained every 250 milliseconds, showed fluctuations of physiologic origin in signal intensity in each pixel. Such data can reveal functional connections in the motor cortex of the resting human brain [22]. Furthermore, recent functional imaging studies have revealed co-activation in a distributed network of cortical regions that characterizes the quiescent state or "default mode" of the human brain [23]. Subtle imaging produces large data sets that require large storage space and the capability to undertake high-resolution analysis [24]. Applications that mimic brain structures could simulate about 100 billion neurons at the molecular level, each requiring 0.1 megabyte (MB) to 10 terabytes (TB) of memory [25]. fMRI divides the brain into tens of thousands of voxels and then images the entire brain continuously with high temporal resolution; with one scan as a processing unit, the entire brain generates a very large amount of high-dimensional data [26]. Although extant computational models often use feature combinations (e.g., Suk et al. [27]) and even multimodal feature fusion (Liu et al. [28]) to mitigate the "curse of dimensionality" in studies on brain function and mechanisms, their capacity to leverage computational power to analyze highly multidimensional data is limited.
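The per-neuron memory figures quoted above imply staggering aggregate requirements for whole-brain molecular simulation. The arithmetic below is ours, derived directly from the 0.1 MB–10 TB range and the 100-billion-neuron count in the text.

```python
# Aggregate memory implied by the per-neuron figures quoted in the text.
NEURONS = 1e11                     # ~100 billion neurons
MB, TB, PB, EB = 1e6, 1e12, 1e15, 1e18   # bytes per unit

low  = NEURONS * 0.1 * MB          # 0.1 MB per neuron
high = NEURONS * 10 * TB           # 10 TB per neuron

print(f"low estimate:  {low / PB:.0f} PB")     # 10 PB
print(f"high estimate: {high / EB:.0e} EB")    # 1e+06 EB
```

Even the lower bound (10 PB) exceeds the memory of most clusters, which is why such simulations are confined to the largest supercomputers.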

Deep learning computational models use eigenvector sets to represent biological information. This strategy enables computational models consisting of multiple processing layers to learn data representations with multiple levels of abstraction [29]. Li et al. [30] used image-analysis technology combined with the Statistical Parametric Mapping (SPM) and Voxel-Based Morphometry (VBM) toolboxes to visualize cerebrospinal fluid and other brain components. In addition, Vidaurre et al. [31] proposed a framework that used a hidden Markov model to infer consistent and interpretable dynamic brain networks across different data sets. Spitzer et al. [32] proposed a model based on a convolutional neural network (CNN) to predict 13 regions of the human visual system. A deep neural network is a complex hierarchical structure similar to a biological neural network: each layer consists of several "artificial neurons", and networks with more layers can capture more abstract features. The mappings from input layers to output layers are typically non-linear, and some deep learning models even extend to brain mechanisms at the 3D level (e.g., Payan et al. [33], Hosseini-Asl [34]).
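At its core, each convolutional layer in models like Spitzer et al.'s slides a small filter across an image and records the filter's response at every position. The NumPy sketch below shows that operation in isolation; the fixed vertical-edge kernel is purely illustrative (a trained CNN would learn its filters from data, and this is not the architecture of any cited model).

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel over the image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# Toy "image slice": a bright vertical stripe on a dark background.
img = np.zeros((6, 6))
img[:, 3] = 1.0

# Hand-written vertical-edge kernel (illustrative only).
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)

fmap = conv2d(img, kernel)
print(fmap.shape)   # (4, 4): a feature map highlighting the stripe's edges
```

Stacking many such layers, with learned kernels and non-linearities between them, is what lets CNNs progressively abstract from raw voxels to region labels.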

Whether using data or computational models, it is clear that the exploration of brain function and mechanisms demands a great deal of storage capacity and powerful computing capability. Big data and HPC approaches are, therefore, necessary in all brain science studies that generate such data. This is especially true if supercomputers are used to model brain function. For instance, Apache Spark with an autoencoder for fMRI [35] and JUROPA at the JSC, which implemented an ultrahigh-resolution 3D model of the human brain [36], have both greatly improved computational performance and the speed of data processing. In addition, researchers in China have used the Tianhe-1A supercomputer to run NEST (NEural Simulation Tool), showing that a single compute node can simulate a neural network of 7.3 × 10^4 neurons and that 128 compute nodes can simulate 5.6 × 10^6 neurons [12]. Using an NVIDIA graphics processing unit (GPU) accelerates the Groningen MAchine for Chemical Simulation (GROMACS) [37] by a factor of 3–4, thereby reducing the time spent on simulations of molecular dynamics in the brain from days to hours. HPC also helps the virtual epileptic patient (VEP) brain model explore the system parameter space and fit and validate the brain model, thereby promoting large-scale brain models that encourage the development of personalized treatment and intervention strategies [38]. HPC methods not only allow new studies in brain science to overcome traditional hardware and software constraints, they also foster a new research dynamic whereby intrinsically biological problems can be studied by relying on HPC methods alone. The advantage of this development is that HPC methods enable in-depth quantification that was previously impossible [19]. For instance, the Salk Institute has established a 3D molecular computational model to understand the human brain through neural communication processes in the ciliary ganglion of the chicken; the model adjusts data parameters with the aid of a central HPC system at the San Diego Supercomputer Center.
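The Tianhe-1A NEST figures quoted above imply sub-linear scaling: 128 nodes simulate fewer neurons than 128 single nodes would in the ideal case. The arithmetic below is ours, and assumes per-node capacity would be constant under perfect scaling.

```python
# Parallel efficiency implied by the NEST-on-Tianhe-1A figures in the text.
single_node = 7.3e4      # neurons simulated on one compute node
nodes = 128
observed = 5.6e6         # neurons simulated on 128 nodes

ideal = single_node * nodes            # perfect linear scaling
efficiency = observed / ideal
print(f"ideal: {ideal:.2e} neurons, efficiency: {efficiency:.0%}")
```

The result (roughly 60% of ideal) is typical of large spiking-network simulations, where inter-node spike communication grows with the node count.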

Research on brain disease

The diagnosis and treatment of brain diseases—especially Alzheimer’s disease (AD) and Parkinson’s disease (PD)—is an important focus of clinical research into brain disease in many countries. AD is a progressive neurologic disease that presents as memory and cognitive dysfunctions [39]. Major clinical manifestations include memory impairment, aphasia, agnosia, altered personality, and behavior changes [40]. The failure rate of treatment in the human brain in clinical trials for AD reached 99.6% from 2002 to 2013 [41]. In 2010, the total number of people with dementia globally was approximately 35.6 million. This figure is expected to double every 20 years to reach 65.7 million in 2030 and 115.4 million in 2050 [42]. PD is a common dyskinetic disease in the elderly that affects 1.5–26 per 100,000 of the general adult population [43]. Projections suggest that the number of patients with PD aged over 50 years may reach 8.7–9.3 million worldwide by 2030 [44]. Research on brain disease has become a high priority in several countries. The Japan Brain Project [45] seeks to improve the understanding of human brain diseases such as AD and schizophrenia via experimentation on marmoset brains. The China Brain Project seeks to study pathogenic mechanisms and to develop efficacious diagnostic and therapeutic approaches for developmental disorders of the brain (e.g., autism, intellectual disabilities), neuropsychiatric disorders (e.g., depression, addiction), and neurodegenerative disorders (e.g., AD, PD) [46]. The urgency to reduce the escalating societal burden associated with these disorders and the ineffectiveness of current therapies has resulted in calls for early diagnoses at pre-symptomatic and prodromal stages so that early intervention may halt or delay disease progression [46].

With a more profound understanding of brain function and mechanisms driven by big data and HPC approaches, researchers are better able to diagnose and treat disease. More information about brain diseases is key to improving their diagnosis and treatment, so it is necessary to use big data and HPC methods to build detailed and effective computational disease models. For example, big data collected from traditional medical imaging and advanced wearable sensing devices (see Shamir et al. [47]) have produced huge volumes of information. Gupta and colleagues [48] used an autoencoder to learn the features of 2D patches from MRI. Yang and co-workers [49] used MRI data from the Open Access Series of Imaging Studies and the Alzheimer's Disease Neuroimaging Initiative (ADNI) to develop a classification of AD. Machine learning and deep learning methods for disease diagnosis (e.g., support vector machines (SVMs) [50], Gaussian kernel SVMs [51], an enhanced logistic regression model [52], and deep belief networks [53]) have also proven indispensable in improving efficiency. Big data combined with deep learning models not only increase information generation and analytical efficiency but also improve accuracy. Experimental accuracy for AD, mild cognitive impairment (MCI), and MCI-converter diagnoses has reached 95.90% on the ADNI data set [25]. Moreover, Sarraf et al. [54] used fMRI data with a CNN based on LeNet-5 to diagnose AD, achieving an accuracy of 96.86% on test data. Payan et al. [33] used a 3D CNN to achieve an accuracy of up to 95.39% when classifying patients as AD or healthy controls.
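Of the classifiers listed above, logistic regression is the simplest to sketch end to end. The toy below trains a plain logistic regression by gradient descent on synthetic two-class "imaging features"; it is purely illustrative and is not the enhanced model of [52] or any real ADNI pipeline, and the Gaussian class separation is an assumption made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "imaging features": two Gaussian classes, e.g. control (0)
# vs. patient (1). Real pipelines extract such features from MRI/fMRI.
n, d = 200, 5
X = np.vstack([rng.normal(0.0, 1.0, (n, d)),
               rng.normal(1.5, 1.0, (n, d))])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Plain logistic regression trained by batch gradient descent.
w, b, lr = np.zeros(d), 0.0, 0.1
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))      # sigmoid predictions
    w -= lr * (X.T @ (p - y)) / len(y)      # gradient of log-loss
    b -= lr * np.mean(p - y)

p = 1 / (1 + np.exp(-(X @ w + b)))
acc = np.mean((p > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

On well-separated synthetic classes the accuracy is high; the diagnostic accuracies quoted in the text come from far richer features and models, but the training loop has this same shape.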

While big data and deep learning methods have yielded notable benefits, they have also introduced new computational challenges that can be resolved only by HPC methods. The Machine from Hewlett Packard contains 160 TB of single-pool memory (equivalent to 160 million books held in active memory). This helped the German Center for Neurodegenerative Diseases speed up its genomics pipeline ninefold, with some processes that originally took approximately 25 min reduced to around 36 seconds [22]. Moreover, a research team at Friedrich-Alexander-Universität (Germany) used a supercomputer at the Regionales Rechenzentrum Erlangen to carry out all-atom molecular dynamics simulations in explicit solvent, totaling 0.7 μs, on five Aβ9-42 oligomers (monomers through pentamers) to study Aβ peptides, an important factor in AD [55]. All in all, these examples show the indispensable nature of HPC approaches.

Brain models and connectomes

The human brain is a complex multi-scale structure in space and time that produces fine molecular, cellular, and neuronal phenomena [56], [57]. Neuroimaging can provide brain images with high temporal and spatial resolution, but dynamic information about the brain is lacking. Therefore, in brain science research, simulation tools, brain models, and connectomes have been developed gradually to provide simulated information about neurons, brain structures, and networks. Simulation tools focus on individual neurons and the corresponding models of ion channels [58]. For example, the GEneral NEtwork SImulation System (GENESIS) was designed to simulate neural networks using standard and flexible methods to obtain detailed and realistic models [59]. A model of the human brain is a "reference brain" that provides important biological details: it outlines the spatial framework and brain composition from a macroscopic perspective and helps researchers extract and analyze microscopic data, from molecular processes to behaviors, for modeling and simulation. The brain connectome is the "Google™ Maps" of brain models: it provides precise human brain coordinates and helps researchers connect detailed neural wiring with human brain cognition and behavior. Connectomes map elements to human brain networks dynamically [60], where circuit abnormalities presumably reflect a complex interplay between genes and the environment [61]. Large-scale models of the human brain and connectomes not only provide basic insights into brain structure [62] but also serve as biomarkers of brain diseases, helping researchers explain diseases such as AD and PD [63], [64], [65], [66], [67] and even understand sex-based differences in human behavior [68].

Vast human brain structures and high-resolution imaging technology make brain models and connectomes, in essence, big data sets. The brain model of the HBP consists of 100 neocortical columns [69], and the Defense Advanced Research Projects Agency (DARPA)'s SyNAPSE project aims to simulate 500 billion neurons [70], [71], [72]. Izhikevich et al. published a detailed large-scale thalamocortical model that simulates one million multicompartmental spiking neurons [73]. On a microscopic scale, brain connectomes contain approximately 10^10–10^11 neurons and 10^14–10^15 synapses [74]. At the macroscopic scale, the cortical-hypothalamic junction contains hundreds of brain regions and thousands of comprehensive pathway data sets [75]. Currently, the Human Connectome Project (HCP) already has 7-T diffusion magnetic resonance imaging (dMRI) and 3-T resting-state fMRI (R-fMRI) data [76], [77], [78]. Big data arise not only from structure but also from high-resolution imaging: when building a brain model, an optical microscope sufficient to track a single neuron has a resolution of 0.25–0.5 microns, and an electron microscope capable of displaying synaptic or chemical signals has a resolution of nanometers [79]. Diffusion tensor imaging [80] and the four main MRI modes (structural MRI, task fMRI, dMRI, and R-fMRI) can be used to measure connectivity in the brain [61] at a resolution of 1–3 mm or even smaller [81]. Brain models and connectomes, as big data sets, provide abundant information and knowledge to drive the development of brain-research programs. The Izhikevich neuron model had influenced more than 3000 academic studies by 2019. Based on the 500-subject release of the HCP, the Budapest Reference Connectome Server v2.0 and v3.0 generated consensus connectomes from 96 and 477 subjects, respectively [82], [83]. Disease research by the HCP also applies HCP-style data to people at risk of or suffering from brain disease (e.g., anxiety, depression, epilepsy) [84].
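Part of what makes million-neuron models like Izhikevich's tractable is that each neuron reduces to two coupled equations (v' = 0.04v² + 5v + 140 − u + I; u' = a(bv − u); reset v ← c, u ← u + d when v ≥ 30 mV). A single-neuron sketch with the standard regular-spiking parameters from Izhikevich's 2003 paper follows; the input current and time step are our choices for the demo.

```python
# Izhikevich single-neuron model (regular-spiking cortical parameters).
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = -65.0, b * -65.0        # membrane potential (mV), recovery variable
I, dt = 10.0, 0.5              # constant input current, time step (ms)

spikes = 0
for _ in range(int(1000 / dt)):            # simulate 1 second
    v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                          # spike: reset membrane, adapt
        v, u = c, u + d
        spikes += 1

print(f"{spikes} spikes in 1 s of simulated time")
```

Two state variables and four parameters per neuron is why this model scales to millions of units, where conductance-based models with dozens of ion-channel variables do not.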

The human brain is not only a big data set but also a complex mathematical object. Hence, building models of the human brain and connectomes requires powerful platforms for data storage and processing. To meet the HPC requirements of brain models, the Izhikevich neuron model [85] used a Beowulf cluster [86] with 60 processors of 3 GHz each, the HBP and the SyNAPSE project used the IBM Blue Gene supercomputer, and Spaun used eight Core Xeon processors (2.53 GHz) [71]. The HCP infrastructure uses IBM HPC systems from the WU Center for High Performance Computing to execute pipelines and user-submitted jobs, meeting high-throughput data-processing requirements of approximately 200,000 inputs and outputs per second [71], [87]. In addition, the HCP has established a set of informatics tools, including ConnectomeDB [88] and Connectome Workbench [89], to collect, process, share, and visualize high-throughput data [90]. Beyond brain models and connectomes, workflows and simulation tools are also moving toward high-performance and distributed computing. Simulation tools such as GENESIS [59], NEURON [91], NEST [84], the Network and Cache Simulator (NCS) [92], Neosim [93], and SpikeNET [94] have been extended to support parallel processing systems to improve performance. After parallelization, NEURON achieved almost linear acceleration: running 40,000 realistic cells on 2000 processors of the IBM Blue Gene supercomputer requires an integration time of 9.8 seconds and a communication time of 1.3 seconds [95]. Also, the BigBrain-based 3D-PLI workflow uses the JUDGE and JUROPA supercomputers at the JSC to meet its requirements for data reading, analysis, and calculation [96]. The JUROPA supercomputer has 26,304 cores, and its Rpeak reaches 308.3 TFlop/s. With HICANN from Heidelberg University, brain activity can be simulated at 10,000 times normal speed, compressing one day into about ten seconds [72]. The HBP estimates that supercomputers with exaflop computing speed and exabyte memory could simulate a complete human brain (1000 times the scale of a rodent brain) [69]. With the development of ultrahigh-performance computers and computing environments, a model of the whole brain and dynamic brain connectomes will eventually be completed.
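Two of the performance figures above can be checked with simple arithmetic (ours, derived only from the numbers quoted in the text):

```python
# NEURON on Blue Gene: fraction of runtime spent on communication.
integration, communication = 9.8, 1.3          # seconds, from the text
overhead = communication / (integration + communication)
print(f"communication overhead: {overhead:.0%}")   # ~12% of total runtime

# HICANN: one simulated day at 10,000x real time.
seconds_per_day = 24 * 60 * 60
wall_clock = seconds_per_day / 10_000
print(f"one day compresses to {wall_clock:.2f} s")  # 8.64 s, i.e. ~ten seconds
```

The modest ~12% communication share is what the text means by "almost linear acceleration", and 8.64 s matches the quoted "one day into ten seconds".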

Prospects for brain science

Using big data and HPC methods to address problems in high-dimensional brain science research is important. Hence, research programs on brain science (especially large government-led programs) are becoming more reliant on supercomputers and large databases. Table 1 introduces the basic concepts of some national brain programs, including commonly used platforms and supercomputers, and Table 2 lists several large databases for brain science research. At this stage, governments and funding agencies have largely resolved the economic problems, and several mature HPC environments (some customized for brain science research) are available. However, three main limitations exist. First, computing resources are scattered. As shown in Table 3, Sequoia, K, JURECA, and Tianhe-1A are among the top 500 supercomputers in the world, ranked 13th, 20th, 52nd, and 87th, respectively (June 2019). These supercomputers are located in the USA, Japan, Germany, and China, respectively. In the near future, we hope that a cooperative high-performance platform, shared high-performance resources, and universal standards for the analysis of brain science data can be created. The second limitation concerns storage capacity and computational performance. As shown in Figures 1 and 3, the correlation coefficient between brain science and HPC is smaller than the others, but research output in these fields increased from 2000 to 2018. In contrast, although the correlation coefficients are larger, the number of published articles on brain science combined with the other fields decreased slightly after 2013. These results could be because brain science using HPC is constantly evolving, whereas the support provided by HPC is insufficient and limits the further development of brain science with other fields.
As one of the most important elements of a brain model, the imaging data for neurons can readily reach the petabyte (PB) level at a resolution of microns or even nanometers, far beyond the scale of the human genome. The Sequoia supercomputer can simulate 530 billion neurons and 137 trillion synapses, but this is less than 1% of the information-processing capacity of the human brain [10]. The K supercomputer analyzed approximately 1% of the neuronal connections in a simulated human brain; this implies that simulating all of the brain's neuronal connections would require 100 times the performance of the best current supercomputer. The JSC has developed an algorithm that can simulate 100% of neuronal connections but, unfortunately, no supercomputer can yet run it [97]. Alternatively, a neuromorphic computer that breaks with the conventional Von Neumann architecture could be developed [98], or traditional supercomputers could even be combined with neuromorphic computing. The third limitation is the lack of randomness and dynamics. To fully grasp the relationship between the human brain and consciousness, emotions, and even thinking, a dynamic and complete brain model is ultimately what is needed. The "perfect" brain model should be able to simulate all dynamic neural activity (not just static images), support switching of visual imaging at any resolution, and even provide brain coordinates at any scale. This requires not only sufficient storage space and flexible, variable computing environments, but also meticulous process design and impeccable visualization tools. To further meet these random and dynamic requirements, such tools should be distributed and parallel, and even highly portable.

Table 2.

Databases related to brain science

[Table 2 is provided as an image: fx2.gif]

Note: AD, Alzheimer's disease; BIRN, Biomedical Informatics Research Network; BP, bipolar disorder; MRI, magnetic resonance imaging; fMRI, functional MRI; tfMRI, task-based fMRI; HCP, Human Connectome Project; BCP, Baby Connectome Project; ABCD, Adolescent Brain Cognitive Development; OASIS, Open Access Series of Imaging Studies.

Table 3.

Supercomputers related to brain science

[Table 3 is provided as an image: fx3.gif]

In general, the development of brain science is destined to be inseparable from the support of big data and HPC. Big data, HPC, and brain science promote one another and will help break through current technical bottlenecks. A detailed and accurate atlas of the brain and a fully simulated dynamic computer model will be produced through big data and HPC methods, and could be used alongside mouse models for developing new drugs for brain disease.

Conclusion

Big data and HPC have become indispensable for: (i) exploring brain function; (ii) determining the mechanisms of brain disease; and (iii) building whole-brain models and dynamic connectomes. Big data provides large databases of knowledge about the brain, such as the Allen Human Brain Atlas, and efficient frameworks for big data analysis, such as Apache Spark. HPC methods use platforms such as the JSC and supercomputers such as Sequoia, and can solve the computational-performance challenges posed by large data sets and complex models. HPC methods require more storage space but provide increasingly powerful simulation capabilities that reduce runtimes for complex simulations from days to hours. Big data combined with deep learning models can increase the diagnostic accuracy for AD to more than 90%. HPC has also helped transform biology into a science of deep quantitative analysis and has enabled breakthroughs in characterizing neural communication.

Brain science will continue to develop in a more comprehensive, precise, and detailed direction with the support of governments and the scientific community. Over time, big data analysis for brain science will be standardized, HPC resources will be shared and coordinated, and new ultrahigh-performance forms of computing will become widely available. However, as big data methods generate ever-larger pools of data, ever-faster and more powerful computational methods will be required to analyze them. The "one-way ratchet" of this relationship therefore suggests that brain science will only grow more reliant on big data and HPC methods in the future.

Competing interests

The authors have declared no competing interests.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant No. 31771466), the National Key R&D Program of China (Grant Nos. 2018YFB0203903, 2016YFC0503607, and 2016YFB0200300), the Transformation Project in Scientific and Technological Achievements of Qinghai, China (Grant No. 2016-SF-127), the Special Project of Informatization of Chinese Academy of Sciences, China (Grant No. XXH13504-08), the Strategic Pilot Science and Technology Project of Chinese Academy of Sciences, China (Grant No. XDA12010000), and the 100-Talents Program of Chinese Academy of Sciences, China (awarded to BN).

Handled by Xiangdong Fang

Footnotes

Peer review under responsibility of Beijing Institute of Genomics, Chinese Academy of Sciences and Genetics Society of China.

References

  • 1.von Bartheld C.S. Myths and truths about the cellular composition of the human brain: a review of influential concepts. J Chem Neuroanat. 2017;93:2–15. doi: 10.1016/j.jchemneu.2017.08.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Murre J.M.J., Sturdy D.P.F. The connectivity of the brain: multi-level quantitative analysis. Biol Cybern. 1995;73:529. doi: 10.1007/BF00199545. [DOI] [PubMed] [Google Scholar]
  • 3.World Health Organization. Annexes and index. In: Campanini B., editor. Neurological Disorders: Public Health Challenges. WHO Press; Switzerland: 2006. pp. 183–218. [Google Scholar]
  • 4.Marquez P.V., Saxena S. Making mental health a global priority. Cerebrum. 2016;2016 cer-10–6. [PMC free article] [PubMed] [Google Scholar]
  • 5.Dua T., Cumbrera M., Mathers C., Saxena S. Global burden of neurological disorders: estimates and projections. In: Campanini B., editor. Neurological Disorders: Public Health Challenges. WHO Press; Switzerland: 2006. pp. 27–39. [Google Scholar]
  • 6.Shepherd G.M., Mirsky J.S., Healy M.D., Singer M.S., Skoufos E., Hines M.S. The Human Brain Project: neuroinformatics tools for integrating, searching and modeling multidisciplinary neuroscience data. Trends Neurosci. 1998;21:460–468. doi: 10.1016/s0166-2236(98)01300-9. [DOI] [PubMed] [Google Scholar]
  • 7.Markram H., Meier K., Lippert T., Grillner S., Frackowiak R., Dehaene S. Introducing the Human Brain Project. Procedia Comput Sci. 2011;7:39–42. [Google Scholar]
  • 8.Geerts H., Dacks P.A., Devanarayan V., Haas M., Khatchaturian Z., Gordon M.F. From big data to smart data in Alzheimer's disease: the brain health modeling initiative to foster actionable knowledge. Alzheimers Dement. 2016;12:1014–1021. doi: 10.1016/j.jalz.2016.04.008. [DOI] [PubMed] [Google Scholar]
  • 9.Poo M.M. Whereto the mega brain projects? Natl Sci Rev. 2014;1:12–14. [Google Scholar]
  • 10.Liu Y.D., Hu D.W. Computing in the viewpoint of brain research. Chin J Comput. 2017;40:2148–2166. (in Chinese with an English abstract) [Google Scholar]
  • 11.Elliott L.T., Sharp K., Alfaro-Almagro F., Shi S., Miller K.L., Douaud G. Genome-wide association studies of brain imaging phenotypes in UK Biobank. Nature. 2018;562:210. doi: 10.1038/s41586-018-0571-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Becker M., Schultze H., Schultze J.L. Personalized medicine: the need for exascale data handling. Schriften des Forschungszentrums Jülich IAS Series. 2018;40:18–19. [Google Scholar]
  • 13.Michael K., Miller K.W. Big data: new opportunities and new challenges [guest editors' introduction] Computer. 2013;46:22–24. [Google Scholar]
  • 14.Buyya R. High performance cluster computing: architectures and systems (volume 1). 1st ed. Prentice Hall; Upper Saddle River: 1999. [Google Scholar]
  • 15.Calimera A., Macii E., Poncino M. The Human Brain Project and neuromorphic computing. Funct Neurol. 2013;28:191–196. [PMC free article] [PubMed] [Google Scholar]
  • 16.Hawrylycz M.J., Lein E.S., Guillozet-Bongaarts A.L., Shen E.H., Ng L., Miller J.A. An anatomically comprehensive atlas of the adult human brain transcriptome. Nature. 2012;489:391–399. doi: 10.1038/nature11405. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Sunkin S.M., Ng L., Lau C., Dolbeare T., Gilbert T.L., Thompson C.L. Allen Brain Atlas: an integrated spatio-temporal portal for exploring the central nervous system. Nucleic Acids Res. 2013;41:D996–1008. doi: 10.1093/nar/gks1042. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Amunts K. The EU's Human Brain Project (HBP) Flagship–accelerating brain science discovery and collaboration. CEUR-WS. 2017;2022:187–188. [Google Scholar]
  • 19.Casanova H., Berman F., Bartol T., Gokcay E., Sejnowski T., Birnbaum A. The virtual instrument: support for grid-enabled mcell simulations. Int J HighPerform C. 2004;18:3–17. doi: 10.1177/1094342004041290. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Wu X., Xu L., Yao L. Big data analysis of the human brain’s functional interactions based on fMRI. Chin Sci Bull. 2014;59:5059–5065. [Google Scholar]
  • 21.Fox M.D., Raichle M.E. Spontaneous fluctuations in brain activity observed with functional magnetic resonance imaging. Nat Rev Neurosci. 2007;8:700–711. doi: 10.1038/nrn2201. [DOI] [PubMed] [Google Scholar]
  • 22.Biswal B., Zerrin Yetkin F., Haughton V.M., Hyde J.S. Functional connectivity in the motor cortex of resting human brain using echo-planar MRI. Magn Reson Med. 1995;34:537–541. doi: 10.1002/mrm.1910340409. [DOI] [PubMed] [Google Scholar]
  • 23.Greicius M.D., Srivastava G., Reiss A.L., Menon V., Raichle M.E. Default-mode network activity distinguishes Alzheimer's disease from healthy aging: evidence from functional MRI. Proc Natl Acad Sci U S A. 2004;101:4637–4642. doi: 10.1073/pnas.0308627101. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Luo S.Q. Study on digitized atlas of the human brain. Zhongguo Yi Liao Qi Xie Za Zhi. 2001;25:91. (in Chinese with an English abstract) [PubMed] [Google Scholar]
  • 25.Lippert T., Orth B. Supercomputing infrastructure for simulations of the human brain. BrainComp. 2013;8603:198–212. [Google Scholar]
  • 26.Zhao X.W., Junzhong J.I., Liang P.P. The human brain functional parcellation based on fMRI data. Chin Sci Bull. 2016;61:2035–2052. [Google Scholar]
  • 27.Suk H.-I., Shen D. Deep learning-based feature representation for AD/MCI classification. Miccai. 2013;8150:583–590. doi: 10.1007/978-3-642-40763-5_72. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Liu S., Liu S., Cai W., Che H., Pujol S., Kikinis R. Multimodal neuroimaging feature learning for multiclass diagnosis of Alzheimer's disease. IEEE T Bio-Med Eng. 2015;62:1132–1140. doi: 10.1109/TBME.2014.2372011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.LeCun Y., Bengio Y., Hinton G. Deep learning. Nature. 2015;521:436. doi: 10.1038/nature14539. [DOI] [PubMed] [Google Scholar]
  • 30.Li K.C., Yang X.P. Imaging diagnosis of Parkinson's disease. Diagn Theory Pract. 2005;4:273–274. (in Chinese with an English abstract) [Google Scholar]
  • 31.Vidaurre D., Abeysuriya R., Becker R., Quinn A.J., Alfaro-Almagro F., Smith S.M. Discovering dynamic brain networks from big data in rest and task. Neuroimage. 2017;180:646–656. doi: 10.1016/j.neuroimage.2017.06.077. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Spitzer H., Amunts K., Harmeling S., Dickscheid T. Parcellation of visual cortex on high-resolution histological brain sections using convolutional neural networks. Proc IEEE Int Symp Biomed Imaging. 2017:920–923. [Google Scholar]
  • 33.Payan A., Montana G. Predicting Alzheimer's disease: a neuroimaging study with 3D convolutional neural networks. Comput Sci arXiv. 2015;1502 1502.02506. [Google Scholar]
  • 34.Hosseini-Asl E., Ghazal M., Mahmoud A., Aslantas A., Shalaby A.M., Casanova M.F. Alzheimer's disease diagnostics by a 3D deeply supervised adaptable convolutional network. Front Biosci. 2016;23:584–596. doi: 10.2741/4606. [DOI] [PubMed] [Google Scholar]
  • 35.Makkie M., Huang H., Zhao Y., Vasilakos A.V., Liu T. Fast and scalable distributed deep convolutional autoencoder for fMRI big data analytics. Neurocomputing. 2017;325:20–30. doi: 10.1016/j.neucom.2018.09.066. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Amunts K., Lepage C., Borgeat L., Mohlberg H., Dickscheid T., Rousseau M.É. BigBrain: an ultrahigh-resolution 3D human brain model. Science. 2013;340:1472–1475. doi: 10.1126/science.1235381. [DOI] [PubMed] [Google Scholar]
  • 37.Abraham M.J., Murtola T., Schulz R., Páll S., Smith J.C., Hess B. GROMACS: High performance molecular simulations through multi-level parallelism from laptops to supercomputers. Softwarex. 2015;1–2:19–25. [Google Scholar]
  • 38.Jirsa V.K., Proix T., Perdikis D., Woodman M.M., Wang H., Gonzalez-Martinez J. The virtual epileptic patient: individualized whole-brain models of epilepsy spread. Neuroimage. 2016;145:377–388. doi: 10.1016/j.neuroimage.2016.04.049. [DOI] [PubMed] [Google Scholar]
  • 39.Forman M.S., Trojanowski J.Q., Lee V.M. Neurodegenerative diseases: a decade of discoveries paves the way for therapeutic breakthroughs. Nat Med. 2004;10:1055–1063. doi: 10.1038/nm1113. [DOI] [PubMed] [Google Scholar]
  • 40.Budson A.E., Solomon P.R. Chapter 4 – Alzheimer's disease dementia and mild cognitive impairment due to Alzheimer's disease. Mem Loss Alzheimers Dis Dement. 2016;2016:47–69. [Google Scholar]
  • 41.Cummings J.L., Morstorf T., Zhong K. Alzheimer’s disease drug-development pipeline: few candidates, frequent failures. Alzheimers Res Ther. 2014;6:37. doi: 10.1186/alzrt269. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.Murray C.J., Lopez A.D., World Health Organization, World Bank, Harvard School of Public Health. The global burden of disease: a comprehensive assessment of mortality and disability from diseases, injuries, and risk factors in 1990 and projected to 2020: summary. Harvard University Press; Cambridge: 1996. p. 41. [Google Scholar]
  • 43.Xiong J.Z., Xia S.R., Li J.S. The Study on the auto-classification of Parkinson’s disease based on MR imaging. Chin Digit Med. 2016;11:8–10. [Google Scholar]
  • 44.Xu J., Gong D., Man C., Fan Y. Parkinson's disease and risk of mortality: meta-analysis and systematic review. Acta Neurol Scand. 2014;129:71–79. doi: 10.1111/ane.12201. [DOI] [PubMed] [Google Scholar]
  • 45.Okano H., Sasaki E., Yamamori T., Iriki A., Shimogori T., Yamaguchi Y. Brain/MINDS: a Japanese national brain project for marmoset neuroscience. Neuron. 2016;92:582. doi: 10.1016/j.neuron.2016.10.018. [DOI] [PubMed] [Google Scholar]
  • 46.Poo M.M., Du J.L., Ip N.Y., Xiong Z.Q., Xu B., Tan T. China brain project: basic neuroscience, brain diseases, and brain-inspired computing. Neuron. 2016;92:591–596. doi: 10.1016/j.neuron.2016.10.050. [DOI] [PubMed] [Google Scholar]
  • 47.Shamir R.R., Dolber T., Noecker A.M., Walter B.L., Mcintyre C.C. Machine learning approach to optimizing combined stimulation and medication therapies for Parkinson's disease. Brain Stimul. 2015;8:1025–1032. doi: 10.1016/j.brs.2015.06.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Gupta A., Maida A.S., Ayhan M. Natural image bases to represent neuroimaging data. Proc Int Conf Mach Learn. 2014:987–994. [Google Scholar]
  • 49.Yang W., Lui R.L.M., Gao J.H., Chan T.F., Yau S.T., Sperling R.A. Independent component analysis-based classification of Alzheimer's MRI data. J Alzheimers Dis. 2011;24:775–783. doi: 10.3233/JAD-2011-101371. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50.Klöppel S., Stonnington C.M., Barnes J., Chen F., Chu C., Good C.D. Accuracy of dementia diagnosis—a direct comparison between radiologists and a computerized method. Brain. 2008;131:2969. doi: 10.1093/brain/awn239. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51.Janoušová E., Vounou M., Wolz R., Gray K.R., Rueckert D., Montana G. Biomarker discovery for sparse classification of brain imagesin Alzheimer's disease. Ann Bmva. 2012;2012:1–11. [Google Scholar]
  • 52.Batmanghelich N., Taskar B., Davatzikos C. A general and unifying framework for feature construction, in image-based pattern classification. Inf Process Med Imaging. 2009;21:423–434. doi: 10.1007/978-3-642-02498-6_35. [DOI] [PubMed] [Google Scholar]
  • 53.Al-Fatlawi A.H., Jabardi M.H., Ling S.H. Efficient diagnosis system for Parkinson's disease using deep belief network. IEEE CEC. 2016:1324–1330. [Google Scholar]
  • 54.Sarraf S., Tofighi G. Deep learning-based pipeline to recognize Alzheimer's disease using fMRI data. Ftc. 2016;2016:816–820. [Google Scholar]
  • 55.Horn A.H.C., Sticht H. Amyloid-β42 oligomer structures from fibrils: a systematic molecular dynamics study. J Phys Chem B. 2010;114:2219–2226. doi: 10.1021/jp100023q. [DOI] [PubMed] [Google Scholar]
  • 56.Bassett D.S., Gazzaniga M.S. Understanding complexity in the human brain. Trends Cogn Sci. 2011;15:200–209. doi: 10.1016/j.tics.2011.03.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Ed B., Anna B., Bassett D.S., Alex F., Manfred K., David M. Generic aspects of complexity in brain imaging data and other biological systems. Neuroimage. 2009;47:1125–1134. doi: 10.1016/j.neuroimage.2009.05.032. [DOI] [PubMed] [Google Scholar]
  • 58.Schutter E.D. A consumer guide to neuronal modeling software. Trends Neurosci. 1992;15:462–464. [Google Scholar]
  • 59.Wilson M.A., Bhalla U.S., Uhley J.D., Bower J.M. GENESIS: a system for simulating neural networks. Adv Neural Inf Process Syst. 1989;1989:485–492. [Google Scholar]
  • 60.Sporns O. The human connectome: a complex network. Ann N Y Acad Sci. 2012;136:109–125. doi: 10.1111/j.1749-6632.2010.05888.x. [DOI] [PubMed] [Google Scholar]
  • 61.Van Essen D.C., Barch D.M. The human connectome in health and psychopathology. World Psychiatry. 2015;14:154–157. doi: 10.1002/wps.20228. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Hagmann P., Cammoun L., Gigandet X., Meuli R., Honey C.J., Wedeen V.J. Mapping the structural core of human cerebral cortex. PLoS Biol. 2008;6 doi: 10.1371/journal.pbio.0060159. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 63.Kaiser M. The potential of the human connectome as a biomarker of brain disease. Front Hum Neurosci. 2013;7:484. doi: 10.3389/fnhum.2013.00484. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 64.Dai Z., He Y. Disrupted structural and functional brain connectomes in mild cognitive impairment and Alzheimer's disease. Neurosci Bull. 2014;30:217–232. doi: 10.1007/s12264-013-1421-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 65.Dickerson B.C., Brickhouse M., Mcginnis S., Wolk D.A. Alzheimer's disease: the influence of age on clinical heterogeneity through the human brain connectome. Alzheimers Dement (Amst) 2017;6:122–135. doi: 10.1016/j.dadm.2016.12.007. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.Wang J., Zuo X., Dai Z., Xia M., Zhao Z., Zhao X. Disrupted functional brain connectome in individuals at risk for Alzheimer's disease. Biol Psychiatry. 2013;73:472–481. doi: 10.1016/j.biopsych.2012.03.026. [DOI] [PubMed] [Google Scholar]
  • 67.Surmeier D.J., Obeso J.A., Halliday G.M. Selective neuronal vulnerability in Parkinson disease. Nat Rev Neurosci. 2017;18:101. doi: 10.1038/nrn.2016.178. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.Ingalhalikar M., Smith A., Parker D., Satterthwaite T.D., Elliott M.A., Ruparel K. Sex differences in the structural connectome of the human brain. Proc Natl Acad Sci U S A. 2014;111:823–828. doi: 10.1073/pnas.1316909110. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Markram H. The human brain project. Sci Am. 2012;306:50–55. doi: 10.1038/scientificamerican0612-50. [DOI] [PubMed] [Google Scholar]
  • 70.Eliasmith C., Stewart T.C., Choo X., Bekolay T., Dewolf T., Tang C. A large-scale model of the functioning brain. Science. 2012;338:1202. doi: 10.1126/science.1225266. [DOI] [PubMed] [Google Scholar]
  • 71.Eliasmith C., Trujillo O. The use and abuse of large-scale brain models. Curr Opin Neurobiol. 2014;25:1–6. doi: 10.1016/j.conb.2013.09.009. [DOI] [PubMed] [Google Scholar]
  • 72.Hsu J. IBM's new brain [news] IEEE Spectr. 2014;51:17–19. [Google Scholar]
  • 73.Izhikevich E.M., Edelman G.M. Large-scale model of mammalian thalamocortical systems. Proc Natl Acad Sci USA. 2008;105:3593–3598. doi: 10.1073/pnas.0712231105. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 74.Sporns O. The human connectome: origins and challenges. Neuroimage. 2013;80:53–61. doi: 10.1016/j.neuroimage.2013.03.023. [DOI] [PubMed] [Google Scholar]
  • 75.Sporns O., Tononi G., Kötter R. The human connectome: a structural description of the human brain. PLoS Comput Biol. 2005;1 doi: 10.1371/journal.pcbi.0010042. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Vu A.T., Auerbach E., Lenglet C., Moeller S., Sotiropoulos S.N., Jbabdi S. High resolution whole brain diffusion imaging at 7 T for the human connectome project. Neuroimage. 2015;122:318–331. doi: 10.1016/j.neuroimage.2015.08.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.Essen D.C.V., Ugurbil K., Auerbach E., Barch D., Behrens T.E.J., Bucholz R. The human connectome project: a data acquisition perspective. Neuroimage. 2012;62:2222–2231. doi: 10.1016/j.neuroimage.2012.02.018. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 78.Smith S.M., Beckmann C.F., Andersson J.L.R., Auerbach E.J., Bijsterbosch J.D., Douaud G. Resting-state fMRI in the human connectome project. Neuroimage. 2013;80:144–168. doi: 10.1016/j.neuroimage.2013.05.039. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Landhuis E. Neuroscience: big brain, big data. Nature. 2017;541:559. doi: 10.1038/541559a. [DOI] [PubMed] [Google Scholar]
  • 80.Horn A., Ostwald D., Reisert M., Blankenburg F. The structural-functional connectome and the default mode network of the human brain. Neuroimage. 2014;102:142–151. doi: 10.1016/j.neuroimage.2013.09.069. [DOI] [PubMed] [Google Scholar]
  • 81.Uğurbil K., Xu J., Auerbach E.J., Moeller S., An T.V., Duarte-Carvajalino J.M. Pushing spatial and temporal resolution for functional and diffusion MRI in the Human Connectome Project. Neuroimage. 2013;80:80–104. doi: 10.1016/j.neuroimage.2013.05.012. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 82.Szalkai B., Kerepesi C., Varga B., Grolmusz V. The budapest reference connectome server v2.0. Neurosci Lett. 2015;595:60–62. doi: 10.1016/j.neulet.2015.03.071. [DOI] [PubMed] [Google Scholar]
  • 83.Szalkai B., Kerepesi C., Varga B. Grolmusz V. Parameterizable consensus connectomes from the human connectome project: the budapest reference connectome server v3.0. Cogn Neurodyn. 2016;11:1–4. doi: 10.1007/s11571-016-9407-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 84.Gewaltig M.O., Diesmann M. NEST (neural simulation tool) Scholarpedia J. 2007;2:1430. [Google Scholar]
  • 85.Izhikevich E.M. Simple model of spiking neurons. IEEE Trans Neural Netw. 2003;14:1569–1572. doi: 10.1109/TNN.2003.820440. [DOI] [PubMed] [Google Scholar]
  • 86.Sterling T., Lusk E., Gropp W. MIT Press; Cambridge: 2002. Beowulf cluster computing with linux. [Google Scholar]
  • 87.Marcus D.S., John H., Timothy O., Michael H., Glasser M.F., Fred P. Informatics and data mining tools and strategies for the human connectome project. Front Neuroinform. 2011;5:4. doi: 10.3389/fninf.2011.00004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 88.Hodge M.R., Horton W., Brown T., Herrick R., Olsen T., Hileman M.E. ConnectomeDB—sharing human brain connectivity data. Neuroimage. 2016;124:1102–1107. doi: 10.1016/j.neuroimage.2015.04.046. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89.Glasser M.F., Smith S.M., Marcus D.S., Andersson J.L.R., Auerbach E.J., Behrens T.E.J. The human connectome project's neuroimaging approach. Nat Neurosci. 2016;19:1175. doi: 10.1038/nn.4361. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 90.Elam J.S., Van Essen D. Human Connectome Project. In: Jaeger D., Jung R., editors. Encyclopedia of computational neuroscience. Springer New York; New York: 2013. pp. 1–4. [Google Scholar]
  • 91.Hines M.L., Carnevale N.T. The NEURON simulation environment. Neural Comput. 2014;9:1179–1209. doi: 10.1162/neco.1997.9.6.1179. [DOI] [PubMed] [Google Scholar]
  • 92.Davison B.D. NCS: network and cache simulator – an introduction. In: Technical report DCS-TR-444 [Internet] New Brunswick: The State University of New Jersey. 2011 https://rucore.libraries.rutgers.edu/rutgers-lib/59031/ [Google Scholar]
  • 93.Goddard N., Hood G., Howell F., Hines M., De Schutter E. NEOSIM: portable large-scale plug and play modelling. Neurocomputing. 2001;38:1657–1661. [Google Scholar]
  • 94.Delorme A., Gautrais J., Van Rullen R., Thorpe S. SpikeNET: A simulator for modeling large networks of integrate and fire neurons. Neurocomputing. 1999;26:989–996. [Google Scholar]
  • 95.Migliore M., Cannia C., Lytton W.W., Markram H., Hines M.L. Parallel network simulations with NEURON. J Comput Neurosci. 2006;21:119. doi: 10.1007/s10827-006-7949-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 96.Amunts K., Bücker O., Axer M. Towards a multiscale, high-resolution model of the human brain. Braincomp. 2013;8603:3–14. [Google Scholar]
  • 97.Markram H., Muller E., Ramaswamy S., Reimann M.W., Abdellah M., Sanchez C.A. Reconstruction and simulation of neocortical microcircuitry. Cell. 2015;163:456. doi: 10.1016/j.cell.2015.09.029. [DOI] [PubMed] [Google Scholar]
  • 98.Jordan J., Ippen T., Helias M., Kitayama I., Sato M., Igarashi J. Extremely scalable spiking neuronal network simulation code: from laptops to exascale computers. Front Neuroinform. 2018;12:2. doi: 10.3389/fninf.2018.00002. [DOI] [PMC free article] [PubMed] [Google Scholar]
